
digitalmars.D - Library standardization

reply "Koroskin Denis" <2korden gmail.com> writes:
First of all, I don't want to start a Tango vs. Phobos vs. ??? flame war.
But the way Tango and Phobos evolve is not the best one.

The current situation is: someone writes code, probably nice code, and it is
added to the main trunk. The problem is that the interface is implementation
driven, not the other way around. It is not discussed, and that's bad. Tests
first, then code, as Kent Beck said. Of course, the implementation can affect
the interface, but only after a trial.

I mean, what we need is a detailed document (probably a wikified one) with
detailed library interfaces, their rationale, use cases, examples, and stress
tests, but NO implementation! The implementation is important too, but only to
end users, and not for standardization. A reference implementation will
follow, I promise. It shouldn't be fast; it should be CORRECT and standard
compliant in the first place, and it should pass the D Library Stress Test.

We need some kind of committee to endorse that, and a separate newsgroup
section. Drafts should be stored in the wiki.

As such, my suggestion is to revive the digitalmars.dtl group!
The condition is that it should be regularly monitored by Walter/Andrei or
any other person assigned to the duty.

We should discuss and answer the following questions:

- How should the DTL be organized (a bunch of files, or structured like
Java/C#/Tango)?
- What modules should it consist of?
- What classes does it provide?
- What interfaces do these classes expose?
- What feature set of D should it use?
- Templates vs. an object-oriented approach?

The library should document all of this. An extensive set of functional and
unit tests should also be provided.
Reference implementations for D1/D2 will exist. However, the library should
be design driven, not implementation driven.
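A design-first module could be sketched like this (module and names are hypothetical, not an actual DTL proposal): the interface and its unit tests are agreed on first, with only a declared factory, and the reference implementation fills in the bodies later.

```d
// Hypothetical sketch: dtl/spec/list.d, a "design first" specification module.
module dtl.spec.list;

interface List(T)
{
    size_t length();      // number of stored elements
    void add(T item);     // append item at the end
    T opIndex(size_t i);  // element i; must throw if i is out of range
}

// To be provided by the reference implementation (declaration only here).
List!(T) makeReferenceList(T)();

// The tests are part of the specification, written before any code exists.
unittest
{
    List!(int) l = makeReferenceList!(int)();
    assert(l.length == 0);
    l.add(42);
    assert(l.length == 1);
    assert(l[0] == 42);
}
```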

Any module/class/method should be removed if Walter is not satisfied with it.
The invariant should hold that, at any given moment, Walter is satisfied with
every piece of the library.

I believe this is the only way we can create a single, powerful standard
library.
Apr 18 2008
next sibling parent reply e-t172 <e-t172 akegroup.org> writes:
Koroskin Denis wrote:
 Current situation is, someone writes code, probably nice one, and it is
 added to main trunk. Problem is, interface is implementation driven, not
 otherwise. It is not discussed. And that's bad. Tests first, then code,
 Kent Beck said. Of course, implementation can affect interface, but only
 after trial.

I have to agree on this one. As a side note, I definitely think we *need* "real" header files (like .h files in C/C++), which separate the API from its implementation. I see four advantages:

- Clearer presentation for the user of the API. The user is only interested in the API, not in the implementation: if the user has to go through the implementation to understand what the library does, that is a sign of bad documentation and should be avoided. With header files, all the information the user needs is in one place, without the "noise" of the implementation throughout the file.

- More efficient. For example, on a Linux distro, if you want to write a program using a library, you install the "dev" package of the library, which only contains header files. There is no point in including the implementation in the package, because it is not useful to the user (and definitely not useful if you only want to compile a project, not modify it). Take glibc as an example: if you needed the entire source code of glibc every time you wanted to compile a program, it would be a pure waste in terms of disk space and compiler efficiency.

- Clear separation of "compiled" code and compile-time code. That is, if a library provides "normal" code (accessed through an API) and compile-time code (which is compiled into the application that uses the library, not into the library itself), the two can be clearly distinguished: "normal" code will consist only of declarations in the header file, while compile-time code will be entirely defined in the header file. That way, the user knows what IS in the library (the .a or .so file) and what will be compiled into his application. (Of course, this is already possible, just not as "clearly" for the user.)

- The ability to distribute closed-source libraries. I'm against closed-source libraries, but I know that a lot of people need them.

Of course, header files also mean additional maintenance issues. But it shouldn't be too hard to write a program which automatically extracts the declarations and compile-time code out of a .d file and uses them to generate a header file. This way, each time a .d file is modified, the Makefile (or any other build system) would automatically trigger regeneration of the associated header file.

P.S.: I'm not talking about .di files here. The last time I tried to generate .di files, the implementation was still included in them.
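For illustration only (file and function names are made up), the separation being asked for mirrors C headers: the shipped "header" keeps declarations and documentation, while the implementation stays in the library:

```d
// mylib.d -- full implementation, compiled into the library (.a/.so).
module mylib;

/// Counts occurrences of c in s.
int count(char[] s, char c)
{
    int n = 0;
    foreach (ch; s)
        if (ch == c)
            n++;
    return n;
}

// mylib.di -- the generated "header" a user would read instead:
//
//     module mylib;
//     /// Counts occurrences of c in s.
//     int count(char[] s, char c);
```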
Apr 19 2008
next sibling parent reply "Janice Caron" <caron800 googlemail.com> writes:
On 19/04/2008, e-t172 <e-t172 akegroup.org> wrote:
  I have to agree on this one. As a side note, I definitely think we *need*
 "real" header files (like .h files in C/C++), which separates the API and

What if my functions may be inlined? What if my functions are template functions? What if my functions are capable of compile-time function execution? What if my functions generate strings for use in string mixins?

Besides which, I don't want to have to maintain two separate files! Those days are gone, and good riddance to them.
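A sketch of why (illustrative code, not from any library): each of these constructs is expanded or evaluated during the *caller's* compilation, so its body has to be visible to the caller and cannot be stripped into a declaration-only header:

```d
// 1. Template: instantiated in the caller's module, so the caller's
//    compiler needs the full body.
T max2(T)(T a, T b)
{
    return a > b ? a : b;
}

// 2. Compile-time function execution: evaluated while compiling the
//    caller, which again requires the body.
int square(int x) { return x * x; }
const int sq5 = square(5); // computed at compile time

// 3. String mixin generator: its return value is pasted into the
//    caller's source, so the generator must also be runnable at the
//    caller's compile time.
char[] makeGetter(char[] name)
{
    return "int get_" ~ name ~ "() { return " ~ name ~ "; }";
}
```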
Apr 19 2008
parent e-t172 <e-t172 akegroup.org> writes:
Janice Caron wrote:
 On 19/04/2008, e-t172 <e-t172 akegroup.org> wrote:
  I have to agree on this one. As a side note, I definitely think we *need*
 "real" header files (like .h files in C/C++), which separates the API and

What if my functions may be inlined? What if my functions are template functions? What if my functions are capable of compile-time-function-execution? What if my functions generate strings for use in string mixins? Besides which - I don't want to have to maintain two separate files! Those days are gone, and good riddance to them.

You did not read my entire message, did you?
Apr 19 2008
prev sibling next sibling parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
e-t172 wrote:
 Koroskin Denis wrote:
 Current situation is, someone writes code, probably nice one, and it 
 is added
 to main trunk. Problem is, interface is implementation driven, not 
 otherwise.
 It is not discussed. And that's bad. Tests first, then code, Kent Beck 
 said.
 Of course, implementation can affect interface, but only after trial.

I have to agree on this one. As a side note, I definitely think we *need* "real" header files (like .h files in C/C++), which separate the API from its implementation. I see four advantages:

I think the automatic .di generation is a better approach. I just wish it generated human-readable output rather than stripping all indentation. --bb
Apr 19 2008
next sibling parent reply e-t172 <e-t172 akegroup.org> writes:
Bill Baxter wrote:
 e-t172 wrote:
 Koroskin Denis wrote:
 Current situation is, someone writes code, probably nice one, and it 
 is added
 to main trunk. Problem is, interface is implementation driven, not 
 otherwise.
 It is not discussed. And that's bad. Tests first, then code, Kent Beck 
 said.
 Of course, implementation can affect interface, but only after trial.

I have to agree on this one. As a side note, I definitely think we *need* "real" header files (like .h files in C/C++), which separate the API from its implementation. I see four advantages:

I think the automatic .di generation is a better approach. I just wish it generated human-readable output rather than stripping all indentation.

Automatic .di generation would be just great if it actually stripped implementation (see my P.S.).
Apr 19 2008
next sibling parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
e-t172 wrote:
 Bill Baxter wrote:
 e-t172 wrote:
 Koroskin Denis wrote:
 Current situation is, someone writes code, probably nice one, and it 
 is added
 to main trunk. Problem is, interface is implementation driven, not 
 otherwise.
 It is not discussed. And that's bad. Tests first, then code, Kent 
 Beck said.
 Of course, implementation can affect interface, but only after trial.

I have to agree on this one. As a side note, I definitely think we *need* "real" header files (like .h files in C/C++), which separate the API from its implementation. I see four advantages:

I think the automatic .di generation is a better approach. I just wish it generated human-readable output rather than stripping all indentation.

Automatic .di generation would be just great if it actually stripped implementation (see my P.S.).

See Janice's comments. .di files contain the implementation of templates and of functions that will be inlined. These things must be in the header file to work. They do not contain all the implementation, as you suggest. Or at least they're not supposed to. --bb
Apr 19 2008
parent reply e-t172 <e-t172 akegroup.org> writes:
Bill Baxter wrote:
 See Janice's comments.  .di files contain the implementation of 
 templates and of functions that will be inlined.  These things must be 
 in the header file to work.  They do not contain all implementation as 
 you suggest.  Or at least they're not supposed to.

Okay, I didn't know that. So .di files address the issue, I guess. However, there are three remaining problems with the way .di files are generated:

- Like you said, indentation is stripped. This makes .di files quite ugly. Considering that .di files will often be read directly by the user of the API, this is a problem.

- There should be some kind of feature to automatically copy the "documentation comments" (ddoc, doxygen, etc.) from the .d files to the .di files when they are generated. A solution would be to automatically include all comments which are not in implementation code.

- If I understand your statement correctly, it means the D compiler decides on its own whether to inline a function or not. I don't think that's a good idea, because it will lead to very strange problems and unexpected behaviour when dealing with shared libraries. (Actually, this is not a .di issue but a more general one.)
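For instance (contents illustrative), a friendlier generated .di would keep the indentation and carry the ddoc comments across, while still dropping ordinary function bodies:

```d
// mylib.di -- what a user-facing generated header could look like
module mylib;

/**********************************
 * Counts occurrences of c in s.
 * Returns: the number of matches.
 */
int count(char[] s, char c);
```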
Apr 19 2008
next sibling parent reply e-t172 <e-t172 akegroup.org> writes:
Janice Caron wrote:
 I don't think it's a good
 idea, because it will lead to very strange problems and unexpected behaviour
 when dealing with shared libraries.

It's an optimisation decision, so it should make no difference whatsoever, except to make your code run faster or slower.

When using shared libraries, it makes a big difference. Imagine I am writing a shared library. Consider the following function:

void foo() { /* do something complex here */ }

This function is big, so it is likely that it will not be inlined by the compiler. It will be included in the shared library as a symbol, which will be referenced by the caller. So I compile and release my shared library, version 1.0.0. There is absolutely no problem at this point.

After the first release of my shared library, I finally find out that foo() did not have to be that complex. In fact, it can be simplified. So I rewrite it, without changing the API:

void foo() { /* do something simple here */ }

This function is now simple, so it is likely that it will be inlined by the compiler. Therefore, it will not be included in the shared library. So I compile my shared library, version 1.0.1. When I install it on my system, all hell breaks loose: all the programs that were using foo() crash at startup because they cannot find foo() in the shared library. What?! But I never changed my API?! How is that possible?

There is clearly a problem here.
Apr 19 2008
next sibling parent reply e-t172 <e-t172 akegroup.org> writes:
e-t172 wrote:
 So, I compile my shared library, version 1.0.1. When I install it on my 
 system, all hell breaks loose: all the programs that were using foo() 
 are crashing at startup because they do not find foo() in the shared 
 library. What?! But I never changed my API?! How is that possible?
 
 There is clearly a problem here.

The same kind of problem arises if you release a general update of your shared library without changing the API: the programs that use the shared library will pick up the updated versions if and only if those functions are not inlined. This is normal, but because inlining a function is not the programmer's decision in D, the program will use an ugly and unpredictable mix of updated and non-updated functions that will inevitably lead to grave problems (DLL hell reloaded?).
Apr 19 2008
parent Bill Baxter <dnewsgroup billbaxter.com> writes:
e-t172 wrote:
 e-t172 wrote:
 So, I compile my shared library, version 1.0.1. When I install it on 
 my system, all hell breaks loose: all the programs that were using 
 foo() are crashing at startup because they do not find foo() in the 
 shared library. What?! But I never changed my API?! How is that possible?

 There is clearly a problem here.

The same kind of problem arises if you release a general update of your shared library without changing the API: the programs that use the shared library will pick up the updated versions if and only if those functions are not inlined. This is normal, but because inlining a function is not the programmer's decision in D, the program will use an ugly and unpredictable mix of updated and non-updated functions that will inevitably lead to grave problems (DLL hell reloaded?).

There is an 'export' attribute that I think is supposed to be used to say that a function in a DLL is callable. http://www.digitalmars.com/d/1.0/attribute.html#ProtectionAttribute Presumably this could cause the implementation not to be included in a .di file. I don't know whether that happens currently. --bb
Apr 19 2008
prev sibling parent reply e-t172 <e-t172 akegroup.org> writes:
Janice Caron wrote:
 That assumption is false. Just because a function is small enough to
 be inlined, doesn't mean its object code won't be in the library. In
 fact, if I understand this correctly, the object code will always be
 in the library.
 
 The decision as to whether or not to inline cannot be made at the
 library level. To make that decision most optimally, the compiler also
 needs to know the calling code.
 
 If all calls to a function are inlined, the linker should not link it
 into the final executable, thereby avoiding code bloat.

True. But that does not resolve the problem of a general update to the shared library (see my reply to myself). My point is, inlining functions without the developer's consent is likely to cause grave problems when writing shared libraries. There should be a way to tell the compiler: "Hey, I want you NOT to inline this function, even if that sounds stupid, because I want it to be in MY shared library, so I can update it whenever I want." Maybe the export attribute does this already, as Bill Baxter was suggesting.
Apr 19 2008
next sibling parent Bill Baxter <dnewsgroup billbaxter.com> writes:
e-t172 wrote:
 Janice Caron wrote:
 That assumption is false. Just because a function is small enough to
 be inlined, doesn't mean its object code won't be in the library. In
 fact, if I understand this correctly, the object code will always be
 in the library.

 The decision as to whether or not to inline cannot be made at the
 library level. To make that decision most optimally, the compiler also
 needs to know the calling code.

 If all calls to a function are inlined, the linker should not link it
 into the final executable, thereby avoiding code bloat.

True. But that does not resolve the problem of a general update to the shared library (see my reply to myself). My point is, inlining functions without the developer's consent is likely to cause grave problems when writing shared libraries. There should be a way to tell the compiler: "Hey, I want you NOT to inline this function, even if that sounds stupid, because I want it to be in MY shared library, so I can update it whenever I want." Maybe the export attribute does this already, as Bill Baxter was suggesting.

Doesn't look like export affects it. I think this is the relevant code, in dmd/src/dmd/func.c:

void FuncDeclaration::bodyToCBuffer(OutBuffer *buf, HdrGenState *hgs)
{
    if (fbody &&
        (!hgs->hdrgen || hgs->tpltMember || canInline(1,1))
       )
    ...

No mention of "if isExported" there. --bb
Apr 19 2008
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
e-t172 wrote:
 My point is, inlining functions without the developer's consent is 
 likely to cause grave problems when writing shared libraries. There 
 should be a way to tell the compiler "Hey, I want you NOT to inline this 
 function, even if that sound stupid, because I want it to be in MY 
 shared library, so I can update it whenever I want". Maybe the export 
 attribute does this already, as Bill Baxter was suggesting.

In the .di file you ship with the library,

int foo();

will mean that foo() will never be inlined.
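Concretely (file contents illustrative), that gives the library author the control asked for above: trimming the body out of the shipped .di pins every call to the library's own copy, while bodies that are kept remain inlinable.

```d
// mylib.di -- hand-checked header shipped with the shared library.
module mylib;

int foo();  // body removed: callers can never inline foo(), so it can
            // be updated in the library without recompiling clients

int twice(int x) { return x * 2; }  // body kept: small helper we are
                                    // happy to let callers inline
```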
Apr 20 2008
parent reply e-t172 <e-t172 akegroup.org> writes:
Walter Bright wrote:
 e-t172 wrote:
 My point is, inlining functions without the developer's consent is 
 likely to cause grave problems when writing shared libraries. There 
 should be a way to tell the compiler "Hey, I want you NOT to inline 
 this function, even if that sound stupid, because I want it to be in 
 MY shared library, so I can update it whenever I want". Maybe the 
 export attribute does this already, as Bill Baxter was suggesting.

In the .di file you ship with the library, int foo(); will mean that foo() will never be inlined.

When compiling the library, will the compiler always put foo() in the object, even if it "thinks" it should be inlined?
Apr 21 2008
parent Walter Bright <newshound1 digitalmars.com> writes:
e-t172 wrote:
 When compiling the library, will the compiler always put foo() in the 
 object, even if it "thinks" it should be inlined?

Yes.
Apr 21 2008
prev sibling next sibling parent reply Sean Kelly <sean invisibleduck.org> writes:
== Quote from Janice Caron (caron800 googlemail.com)'s article
 On 19/04/2008, e-t172 <e-t172 akegroup.org> wrote:
  If I understand your statement correctly, it means the D compiler decides
 on its own whether to inline a function or not.

better optimisation decisions than the programmer.

Except sometimes the programmer doesn't want code exposed, even if it means faster execution.
 I don't think it's a good
 idea, because it will lead to very strange problems and unexpected behaviour
 when dealing with shared libraries.

It's an optimisation decision, so it should make no difference whatsoever, except to make your code run faster or slower.

Exposing the implementation also means that implementation changes can force client code to be recompiled, and there is more risk of headers getting out of sync. I think the synchronization issue is what he meant by strange problems. Sean
Apr 19 2008
parent reply Sean Kelly <sean invisibleduck.org> writes:
== Quote from Janice Caron (caron800 googlemail.com)'s article
 On 19/04/2008, Sean Kelly <sean invisibleduck.org> wrote:
 Except sometimes the programmer doesn't want code exposed, even if
  it means faster execution.

make two copies of the .d file - one with implementations, and one without. Compile the one with, to make the library object file, and distribute the one without.

The whole point of automatic header generation is to avoid the issues associated with manually maintaining header files.
 Exposing implementation can also have implementation changes cause
  client code to have to be recompiled.

be recompiled. That's why we have makefiles and other build systems.

No. Changing a source file should merely require the application to be re-linked. As someone who has worked on programs that can take half a day to build, I would throw a fit if every source change required a full rebuild of the code that simply includes a header related to that source. Sean
Apr 19 2008
next sibling parent reply Sean Kelly <sean invisibleduck.org> writes:
== Quote from Janice Caron (caron800 googlemail.com)'s article
 On 19/04/2008, Sean Kelly <sean invisibleduck.org> wrote:
  Changing a source file should require the application to be re-linked.

  As someone who has worked on programs that can take half a day to build,
  I would throw a fit if every source change required a full rebuild of code
that
  simply includes a header related to this source.

So you want the ability to import a module, but not have to rebuild dependent files if that module changes? Good luck with that one.

Apparently you've never used C/C++. I apologize for the misunderstanding. Sean
Apr 19 2008
next sibling parent Lars Ivar Igesund <larsivar igesund.net> writes:
Janice Caron wrote:

 On 19/04/2008, Sean Kelly <sean invisibleduck.org> wrote:
 So you want the ability to import a module, but not have to rebuild
 dependent files if that module changes? Good luck with that one.

 Apparently you've never used C/C++. I apologize for the misunderstanding.

Touché. But I was talking about D. OK, so you're basically saying you want D to have header files, like C. Fair enough. The prospect doesn't thrill me, but I would be intrigued to know how many other people want this.

The .di generation feature was made in response to very many people requesting this.

--
Lars Ivar Igesund
blog at http://larsivi.net
DSource, #d.tango & #D: larsivi
Dancing the Tango
Apr 19 2008
prev sibling next sibling parent reply e-t172 <e-t172 akegroup.org> writes:
Janice Caron wrote:
 OK, so you're basically saying you want D to have header files, like
 C. Fair enough. The prospect doesn't thrill me, but I would be
 intrigued to know how other many people want this.

Right now, probably not that many. However, when D gets really popular and people begin to work on really big projects (several tens of megabytes of code), they will be very annoyed if they don't have header files. And everyone will be annoyed: the library writers, the users, and the distributions that package the library.
Apr 19 2008
parent reply Chris R. Miller <lordSaurontheGreat gmail.com> writes:
e-t172 Wrote:

 Janice Caron wrote:
 OK, so you're basically saying you want D to have header files, like
 C. Fair enough. The prospect doesn't thrill me, but I would be
 intrigued to know how other many people want this.

Right now, probably not so much. However, when D gets really popular and people begin to use really big projects (several tens of megabytes of code), they will be very annoyed if they don't have header files. And everyone will be annoyed: the library writers, the users, and the distributions that package the library.

I don't think so. As a counterargument, Java doesn't have header files, and there are projects of epic proportions in Java. Just the JDK is 2,033,027 lines of code in 7,069 files over 480 directories(1). Java is also, for better or for worse, a model language as far as stability is concerned. Java doesn't have header files, and Java does just fine. I, personally, can't find a reason to use header files in D.

1) I should know. http://www.fsdev.net/wiki/source-scope
Apr 19 2008
next sibling parent "Hans W. Uhlig" <huhlig gmail.com> writes:
Chris R. Miller wrote:
 e-t172 Wrote:
 
 Janice Caron wrote:
 OK, so you're basically saying you want D to have header files, like
 C. Fair enough. The prospect doesn't thrill me, but I would be
 intrigued to know how other many people want this.

people begin to use really big projects (several tens of megabytes of code), they will be very annoyed if they don't have header files. And everyone will be annoyed: the library writers, the users, and the distributions that package the library.

I don't think so. As a counterargument, Java doesn't have header files, and there are projects of epic proportions in Java. Just the JDK is 2,033,027 lines of code in 7,069 files over 480 directories(1). Java is also, for better or for worse, a model language as far as stability and language is concerned. Java doesn't have header files, and Java does just fine. I - personally - can't find a reason to use header files in D. 1) I should know. http://www.fsdev.net/wiki/source-scope

Java doesn't use standard object files. Java .class files have all the necessary metadata inside the object file, and linking and inlining are all done at runtime.
Apr 19 2008
prev sibling parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Chris R. Miller wrote:
 e-t172 Wrote:
 

 1) I should know.  http://www.fsdev.net/wiki/source-scope

Sounds useful. Does that tool only work on Java code? The description is not clear on that point. --bb
Apr 20 2008
next sibling parent Chris R. Miller <lordSaurontheGreat gmail.com> writes:
Bill Baxter Wrote:

 Chris R. Miller wrote:
 e-t172 Wrote:
 

 1) I should know.  http://www.fsdev.net/wiki/source-scope

Sounds useful. Does that tool only work on Java code? The description is not clear on that point.

At the moment it only works for Java. It uses a pluggable class to scan the files, so it would be fairly simple to modify it to use another language. I've been meaning to write a new version in D, since I don't plan on maintaining it in Java. I just haven't had the time to do so. The sources are included inside the Jar file if you feel you want to try modifying it. I don't think I ever bothered to document that code, so it could be a little confusing. Let me know (off-list, of course) if you need any help.
Apr 21 2008
prev sibling parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Bill Baxter wrote:
 Chris R. Miller wrote:
 e-t172 Wrote:

 1) I should know.  http://www.fsdev.net/wiki/source-scope

Sounds useful. Does that tool only work on Java code? The description is not clear on that point. --bb

descent.metrics can report code statistics for D. It uses the Descent parse tree, so it does things like counting the number of statements, etc., as well.
Apr 21 2008
parent Ary Borenszweig <ary esperanto.org.ar> writes:
Robert Fraser wrote:
 Bill Baxter wrote:
 Chris R. Miller wrote:
 e-t172 Wrote:

 1) I should know.  http://www.fsdev.net/wiki/source-scope

Sounds useful. Does that tool only work on Java code? The description is not clear on that point. --bb

descent.metrics can report code statistics for D. It uses the descent parse tree, so it does stuff like counts the number of statements, etc., as well.

And if we get bindings done correctly for the next release, it will be able to report anything of this: http://eclipse-metrics.sourceforge.net/
Apr 22 2008
prev sibling next sibling parent reply Lars Noschinski <lars-2008-1 usenet.noschinski.de> writes:
* Janice Caron <caron800 googlemail.com> [08-04-19 19:15]:
On 19/04/2008, Sean Kelly <sean invisibleduck.org> wrote:
 So you want the ability to import a module, but not have to rebuild

> Good luck with that one. Apparently you've never used C/C++. I apologize for the misunderstanding.

Touché. But I was talking about D. OK, so you're basically saying you want D to have header files, like C. Fair enough. The prospect doesn't thrill me, but I would be intrigued to know how many other people want this.

How does Java handle this case? They also do not have header files there.
Apr 19 2008
next sibling parent reply "Jarrett Billingsley" <kb3ctd2 yahoo.com> writes:
"Lars Noschinski" <lars-2008-1 usenet.noschinski.de> wrote in message 
news:20080419230108.GB7752 lars.home.noschinski.de...
* Janice Caron <caron800 googlemail.com> [08-04-19 19:15]:
On 19/04/2008, Sean Kelly <sean invisibleduck.org> wrote:
 So you want the ability to import a module, but not have to rebuild

> Good luck with that one. Apparently you've never used C/C++. I apologize for the misunderstanding.

Touch. But I was talking about D. OK, so you're basically saying you want D to have header files, like C. Fair enough. The prospect doesn't thrill me, but I would be intrigued to know how other many people want this.

How does Java handle this case? They also do not have header files there.

As far as I know, a compiled Java .class file can work in place of the source file from which it was generated. The .class file contains all the declarations in a table, so the compiler doesn't have to parse any code. The equivalent isn't really possible with any current implementation of D, since all D compilers now use typical object file formats instead of a custom format. I mean, I suppose it's _possible_ to embed some information into the object file and have the compiler read it out, but I have no idea what that would entail, or whether all object file formats would support it, etc.
Apr 19 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Jarrett Billingsley wrote:
 As far as I know, a compiled Java .class file can work in place of the 
 source file from which it was generated.  The .class file contains all the 
 declarations in a table, so the compiler doesn't have to parse any code.

That's right, the compiler just uses the .class file.
 The eqivalent isn't really possible with any current implementation of D, 
 since all D compilers now use typical object file formats instead of a 
 custom format.  I mean, I suppose it's _possible_ to embed some information 
 into the object file and have the compiler read it out, but I have no idea 
 what that would entail or if all object file formats would support that, 
 etc.. 

Yes, it is certainly possible to embed the information in the .obj file. But it's kind of "why bother". Furthermore, in the future, the 1:1 correspondence between .obj files and source files may go away.
Apr 20 2008
parent reply "Hans W. Uhlig" <huhlig gmail.com> writes:
Walter Bright wrote:
 Jarrett Billingsley wrote:
 As far as I know, a compiled Java .class file can work in place of the 
 source file from which it was generated.  The .class file contains all 
 the declarations in a table, so the compiler doesn't have to parse any 
 code.

That's right, the compiler just uses the .class file.
 The eqivalent isn't really possible with any current implementation of 
 D, since all D compilers now use typical object file formats instead 
 of a custom format.  I mean, I suppose it's _possible_ to embed some 
 information into the object file and have the compiler read it out, 
 but I have no idea what that would entail or if all object file 
 formats would support that, etc.. 

Yes, it is certainly possible to embed the information in the .obj file. But it's kind of "why bother". Furthermore, in the future, the 1:1 correspondence between .obj files and source files may go away.

That would probably be a good thing, but why can't an obj file have its "implementation header" inside the object? It would bloat the object a little, but having it in there would allow for better optimization (through compartmentalizing, limited loading, and such) and the same no-header-files property as Java.
Apr 20 2008
parent Robert Fraser <fraserofthenight gmail.com> writes:
Hans W. Uhlig wrote:
 Walter Bright wrote:
 Jarrett Billingsley wrote:
 As far as I know, a compiled Java .class file can work in place of 
 the source file from which it was generated.  The .class file 
 contains all the declarations in a table, so the compiler doesn't 
 have to parse any code.

That's right, the compiler just uses the .class file.
 The eqivalent isn't really possible with any current implementation 
 of D, since all D compilers now use typical object file formats 
 instead of a custom format.  I mean, I suppose it's _possible_ to 
 embed some information into the object file and have the compiler 
 read it out, but I have no idea what that would entail or if all 
 object file formats would support that, etc.. 

Yes, it is certainly possible to embed the information in the .obj file. But it's kind of "why bother". Furthermore, in the future, the 1:1 correspondence between .obj files and source files may go away.

That would probably be a good thing but why can a obj file not have its "implementation header" inside the object. It would bloat the object a little but having it in there would allow for better optimization(through compartmentalizing, limited loading and such) and the same non header requirement as java.

Then the compiler needs to parse object files
Apr 20 2008
prev sibling parent Michel Fortin <michel.fortin michelf.com> writes:
On 2008-04-19 19:01:09 -0400, Lars Noschinski 
<lars-2008-1 usenet.noschinski.de> said:

 * Janice Caron <caron800 googlemail.com> [08-04-19 19:15]:
 OK, so you're basically saying you want D to have header files, like
 C. Fair enough. The prospect doesn't thrill me, but I would be
 intrigued to know how many other people want this.

How does Java handle this case? It does not have header files either.

In Java, if I'm not mistaken, you don't need the source of a class to use it in your code. The compiled class file contains everything the compiler needs to know about a class and the signatures of the methods in it. Inlining is done by the virtual machine at runtime. Because no inlining is performed at compile time, if you change the implementation of a class it doesn't matter which version you compiled against, as long as the API is compatible between the two versions.

-- 
Michel Fortin
michel.fortin michelf.com
http://michelf.com/
Apr 19 2008
prev sibling next sibling parent reply Edward Diener <eddielee_no_spam_here tropicsoft.com> writes:
Sean Kelly wrote:
 == Quote from Janice Caron (caron800 googlemail.com)'s article
 On 19/04/2008, Sean Kelly <sean invisibleduck.org> wrote:
  Changing a source file should require the application to be re-linked.

  As someone who has worked on programs that can take half a day to build,
  I would throw a fit if every source change required a full rebuild of code
that
  simply includes a header related to this source.

dependent files if that module changes? Good luck with that one.

Apparently you've never used C/C++. I apologize for the misunderstanding.

I think there is a miscommunication here. In C++ terms, if a source file changes in such a way as to change the declaration for other dependent code which needs to use that declaration, then obviously the other dependent code needs to be recompiled just to pick up the new declaration. Otherwise there should be no need to recompile dependent code if just the definition and its functionality changes.

In D, one of the negatives of having everything in a single .d file is that there is no separation between the declaration and the definition, since the definition defines the declaration. Unfortunately this appears to lead to the fact that if the .d file changes, then any other .d file which imports the original one ( has a dependency on the original one ) needs to be recompiled.

In having a single file where the definition is the declaration, as in D as well as Java, C#, and Python, you win some and you lose some. In the case of rebuilding code, you definitely lose some as opposed to C++'s dual header/source file paradigm. However, in a dynamically typed, interpreted language such as Python, there is no compilation stage, so winning or losing is not an issue for this case.
Apr 19 2008
next sibling parent "Hans W. Uhlig" <huhlig gmail.com> writes:
Edward Diener wrote:
 Sean Kelly wrote:
 == Quote from Janice Caron (caron800 googlemail.com)'s article
 On 19/04/2008, Sean Kelly <sean invisibleduck.org> wrote:
  Changing a source file should require the application to be re-linked.

  As someone who has worked on programs that can take half a day to 
 build,
  I would throw a fit if every source change required a full rebuild 
 of code that
  simply includes a header related to this source.

dependent files if that module changes? Good luck with that one.

Apparently you've never used C/C++. I apologize for the misunderstanding.

I think there is a miscommunication here. In C++ terms, if a source file changes in such a way as to change the declaration for other dependent code which needs to use that declaration, then obviously the other dependent code needs to be recompiled just to pick up the new declaration. Otherwise there should be no need to recompile dependent code if just the definition and its functionality changes.

In D, one of the negatives of having everything in a single .d file is that there is no separation between the declaration and the definition, since the definition defines the declaration. Unfortunately this appears to lead to the fact that if the .d file changes, then any other .d file which imports the original one ( has a dependency on the original one ) needs to be recompiled.

In having a single file where the definition is the declaration, as in D as well as Java, C#, and Python, you win some and you lose some. In the case of rebuilding code, you definitely lose some as opposed to C++'s dual header/source file paradigm. However, in a dynamically typed, interpreted language such as Python, there is no compilation stage, so winning or losing is not an issue for this case.

Perhaps this is where we need some form of Library tag for a module, i.e. "this module will become a library with a 'fixed' API and should be treated thusly." Generate an API file full of metadata and allow the compiler to compare new versions against that API. This gains two advantages: a) the compiler lets the programmer know when he breaks the API (syntactically); b) all interface functions are defined in the API file, so all that's necessary is said API file. I am unsure if .di files can accommodate this. Oh, and a note about .di files: why not just run them through a pretty printer?
Apr 19 2008
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Edward Diener wrote:
 In D one of the negatives of having everything in a single .d file is 
 that there is no separation between the declaration and the definition 
 since the definition defines the declaration. Unfortunately this appears 
 to lead to the fact that if the .d file changes then any other .d file 
 which imports the original one ( has a dependency on the original one ) 
 needs to be recompiled.

But it is quite possible to separate D modules into "headers" and "implementations", if desired. Phobos does this for the gc, for example. It can also be done automatically by generating .di files.
Apr 19 2008
next sibling parent reply Edward Diener <eddielee_no_spam_here tropicsoft.com> writes:
Walter Bright wrote:
 Edward Diener wrote:
 In D one of the negatives of having everything in a single .d file is 
 that there is no separation between the declaration and the definition 
 since the definition defines the declaration. Unfortunately this 
 appears to lead to the fact that if the .d file changes then any other 
 .d file which imports the original one ( has a dependency on the 
 original one ) needs to be recompiled.

But it is quite possible to separate D modules into "headers" and "implementations", if desired. Phobos does this for the gc, for example. It can also be done automatically by generating .di files.

Is there an example of code separated into "headers" and "implementation" ? I do not see how this can be done. Can one just declare a D function or class in a "header" and then import that file and provide an implementation for what had previously just been declared ? What about templates ? I do not see how a D template can just be declared in one .d file and then implemented in another .d file.

Where in the documentation are .di files described ? I recall reading something about .di files somewhere, probably in the spec, but I can no longer find it.
Apr 20 2008
next sibling parent e-t172 <e-t172 akegroup.org> writes:
Edward Diener a écrit :
 Is there an example of code separated into "headers" and 
 "implementation" ? I do not realize how this can be done. Can one just 
 declare a D function or class in a "header" and then import that file 
 and provide an implementation for what had previously just been declared 
 ? What about templates ? I do not see how a D template can just be 
 declared in one .d file and then implemented in another .d file.

I'm not sure - I think this would work:

    class Foo
    {
        void bar();
        int bar2();
        ...
    }

This would be in the .di file, with the actual implementation of bar() and bar2() in the corresponding .d file. Like C. Templates are a special case: templates are compile-time code, they cannot be "compiled", so their implementation is not stripped in the .di file.
Apr 20 2008
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
Edward Diener wrote:
 Is there an example of code separated into "headers" and 
 "implementation" ? I do not realize how this can be done. Can one just 
 declare a D function or class in a "header" and then import that file 
 and provide an implementation for what had previously just been declared 
 ? What about templates ? I do not see how a D template can just be 
 declared in one .d file and then implemented in another .d file.
 
 Where in the documentation are .di files described ? I recall reading 
 something about .di files somewhere, probably in the spec, but I can no 
 longer find it.

All a .di file is is a file with the function bodies removed, i.e.:

    int foo() { return 3; }

becomes:

    int foo();

There is a compiler switch, -H, to generate these files automatically, but they can also be written by hand. As in C++, templates cannot be compiled without their bodies, so the bodies of the templates are not stripped.

The main difference from C++ is that you don't *need* to generate those header files; the compiler is just fine reading the implementation files and extracting what it needs. The .di files are used when you want to hide the implementation, or to speed up compilation.
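[Editor's note: a minimal sketch of the split described above. The module and member names are illustrative, not from the thread.]

```d
// mylib.di -- the "header": generated with `dmd -H mylib.d`
// or written by hand. Function bodies are stripped.
module mylib;

int foo();                 // body lives only in mylib.d

class Counter
{
    void bump();           // method bodies stripped here too
    int  value();
}

// As in C++, a template cannot be compiled without its body,
// so template bodies are NOT stripped from the .di file.
T twice(T)(T x) { return x + x; }
```

Client code simply does `import mylib;` and links against the compiled mylib object file; only template instantiations pull implementation code into the client.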
Apr 20 2008
parent reply Edward Diener <eddielee_no_spam_here tropicsoft.com> writes:
Walter Bright wrote:
 Edward Diener wrote:
 Is there an example of code separated into "headers" and 
 "implementation" ? I do not realize how this can be done. Can one just 
 declare a D function or class in a "header" and then import that file 
 and provide an implementation for what had previously just been 
 declared ? What about templates ? I do not see how a D template can 
 just be declared in one .d file and then implemented in another .d file.

 Where in the documentation are .di files described ? I recall reading 
 something about .di files somewhere, probably in the spec, but I can 
 no longer find it.

All a .di file is is a file with the function bodies removed, i.e.: int foo() { return 3; } becomes: int foo();

That makes sense !
 
 There is a compiler switch to generate these files automatically, -H, 
 but they can be done manually.

I see it now in the online compiler documentation.
 
 As in C++, templates cannot be compiled without their bodies, so the 
 bodies of the templates are not stripped.
 
 The main difference with C++ is you don't *need* to generate those 
 header files, the compiler is just fine reading the implementation files 
 and extracting what it needs. The .di files are used when desiring to 
 hide the implementation, or for speeding up the compilation.

Good idea !

Now I have a revolutionary suggestion for D, made before, no doubt, in C++. I will make it here and then pursue it by Wiki if others are interested.

Not everybody in the world believes that all software must be open source. ( gasp ! calls of "oh, no, no" !! People fainting dead away !!! ) There are a few, obviously misguided and benighted souls such as yours truly, who actually believe that one should be able to write software and sell it on the market, and that to do so one has the actual right of not having to distribute the source code, which one created from the biblical sweat of one's brow ( or ache of one's fingers, or thought of one's mind ).

Doing this in C++ while writing templates is impossible in current implementations, because the 'export' keyword, which hardly anyone ever wanted to implement anyway except for Daveed Vandevoorde, never promised that the separation between template declarations and template definitions would enable one to distribute only the template declarations without the template definitions. But it was always possible to conceive that the template definitions could be "compiled" down into some intermediate unreadable format which could become part of the binary distribution, a la shared libs, static libs, exes etc. Since no one in C++ ever thought it important enough to protect template source as an intellectual right which should not need to be distributed in easily readable form, no one in C++ ever decided it was important enough to standardize the idea of some intermediate "compilable" form by which template source could be distributed but remain unread by the end user.

Would it be possible that Mr. Walter Bright is sympathetic to the notion that template source should be "compilable" down into some sort of unreadable format which enables the D compiler to read it, but not others to discover its source form ?
I realize that the very source code which might enable D to do this would itself need to be protected from prying eyes, so that the format of the "compilable" template source could not be easily reverse engineered. Before someone cries that any format can be reverse engineered with enough effort, I want merely to say that no doubt .lib and .obj files can be reverse engineered to a certain extent, but at some highly difficult level few, if any, are going to bother. So I see no reason why template source code could not be "obfuscated" in a similar way.

Do not get me wrong in thinking that I believe source should never be available. I applaud libraries like Boost and D's own libraries, where the source is there for anybody to study. But these are not implementations being sold to make a direct profit and, even if they were, it is certainly the choice of the developers whether or not they want to distribute their source code.

As a user of libraries largely based on templates, whether C++ or D, once I have faith in the quality of the library I feel I have no need to look at the source code in order to use it successfully, and the actual template declarations should be enough, from the user's point of view, to interact with the library. Unfortunately, in C++ and D this is not the case, due to the way templates are currently implemented. But I think it could be the case if D pursued the line of thought that the end user of the template code, as opposed to the compiler itself, really has no need to interact with the template definition, as opposed to the template declaration, in order to use the template successfully.

And with all that I will just let the sparks fly if there are any, or just watch the embers and ashes decompose ( I am in a pseudo-poetic mood on this fine spring day where I live ).
Apr 20 2008
next sibling parent reply Don Clugston <nospam nospam.com.au> writes:
Edward Diener Wrote:
 As a user of libraries largely based on templates, whether C++ or D, 
 once I have faith in the quality of the library I feel I have no need to 
 look at the source code in order to use it successfully and the actual 
 template declarations should be enough, from the user's point of view, 
 to interact with the library. Unfortunately in C++ and D this is not the 
 case due to the way templates are currently implemented. But I think it 
 could be the case if D pursued the line of thought that the end user of 
 the template code, as opposed to the compiler itself, really has no need 
 to interact with the template definition, as opposed to the template 
 declaration, in order to use the template successfully.

You can always follow the example of Microsoft (who seem to have embraced the open-source concept in .NET???? <g> -- it's trivially decompilable) -- run the template code through an obfuscator before distribution. CTFE is more interesting. I think we'll see far fewer uses of templates in D than in C++; frequently, CTFE will be used instead. And it's easier to imagine CTFE being compiled than templates being compiled.
Apr 20 2008
next sibling parent Bill Baxter <dnewsgroup billbaxter.com> writes:
Don Clugston wrote:
 Edward Diener Wrote:
 As a user of libraries largely based on templates, whether C++ or D, 
 once I have faith in the quality of the library I feel I have no need to 
 look at the source code in order to use it successfully and the actual 
 template declarations should be enough, from the user's point of view, 
 to interact with the library. Unfortunately in C++ and D this is not the 
 case due to the way templates are currently implemented. But I think it 
 could be the case if D pursued the line of thought that the end user of 
 the template code, as opposed to the compiler itself, really has no need 
 to interact with the template definition, as opposed to the template 
 declaration, in order to use the template successfully.

You can always follow the example of Microsoft (who seem to have embraced the open-source concept in .NET???? <g> -- it's trivially decompilable) -- run the template code through an obfuscator before distribution. CTFE is more interesting. I think we'll see far fewer uses of templates in D than in C++; frequently, CTFE will be used instead. And it's easier to imagine CTFE being compiled than templates being compiled.

Can you elaborate? Since CTFE funcs are essentially run by an interpreter I don't see how you would handle pre-compilation of them easily. Any easier than could be done with templates at least. --bb
Apr 20 2008
prev sibling parent Edward Diener <eddielee_no_spam_here tropicsoft.com> writes:
Don Clugston wrote:
 Edward Diener Wrote:
 As a user of libraries largely based on templates, whether C++ or D, 
 once I have faith in the quality of the library I feel I have no need to 
 look at the source code in order to use it successfully and the actual 
 template declarations should be enough, from the user's point of view, 
 to interact with the library. Unfortunately in C++ and D this is not the 
 case due to the way templates are currently implemented. But I think it 
 could be the case if D pursued the line of thought that the end user of 
 the template code, as opposed to the compiler itself, really has no need 
 to interact with the template definition, as opposed to the template 
 declaration, in order to use the template successfully.

You can always follow the example of Microsoft (who seem to have embraced the open-source concept in .NET???? <g> -- it's trivially decompilable) -- run the template code through an obfuscator before distribution.

No obfuscated form that the D compiler must still be able to read as source is going to hide the source more than trivially.
 
 CTFE is more interesting. I think we'll see far fewer uses of templates in D
than C++; frequently, CTFE will be used instead. And it's easier to imagine
CTFE being compiled, than templates being compiled.

Fair enough, but as I understand it, CTFE currently must deal with values known at compile time. Templates normally generate code that executes at run time, with values created at run time. I may be wrong, but I doubt that templates can all be subsumed by CTFE.
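[Editor's note: for reference, a small sketch of the distinction being discussed. The function name is illustrative; an ordinary D function is evaluated by CTFE only when its result is demanded in a compile-time context.]

```d
// A plain function: CTFE-able, but nothing special about it.
int factorial(int n)
{
    return n <= 1 ? 1 : n * factorial(n - 1);
}

// Initializing a compile-time constant forces the compiler to
// run factorial itself -- this is CTFE.
const int f5 = factorial(5);
static assert(f5 == 120);

void main()
{
    int n = 5;
    int r = factorial(n); // same function, evaluated at run time
}
```

This is why CTFE needs all inputs known at compile time, while templates merely *generate* code whose inputs may only exist at run time.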
Apr 20 2008
prev sibling parent Walter Bright <newshound1 digitalmars.com> writes:
See my reply in a new thread.
Apr 20 2008
prev sibling parent reply Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Walter Bright wrote:
 Edward Diener wrote:
 In D one of the negatives of having everything in a single .d file is 
 that there is no separation between the declaration and the definition 
 since the definition defines the declaration. Unfortunately this 
 appears to lead to the fact that if the .d file changes then any other 
 .d file which imports the original one ( has a dependency on the 
 original one ) needs to be recompiled.

But it is quite possible to separate D modules into "headers" and "implementations", if desired. Phobos does this for the gc, for example. It can also be done automatically by generating .di files.

Walter, seriously now, have you read e-t172's scenario with attention? (news://news.digitalmars.com:119/fuhq4t$26dg$1 digitalmars.com)

In what he describes, the automatic generation of .di files is not safe, because some pieces of "implementation" get stuck in the .di files (worse, in a way that is indeterministic to the programmer). Because of that, a change in the implementation of a shared library, without a change in the interface, may nonetheless require the users of the library to recompile their code. Now if you recall, that goes precisely against the purpose of a shared library (being able to update the shared library alone, without recompiling the whole program).

So the creators of shared libraries, in order to create a proper library, have to search through the exposed .di files and look for places where "implementation details" are exposed (and then either remove them, or check whether they have changed since previous versions). That's tedious, unnecessary, and error-prone unless fully automated.

(I wonder why this issue hasn't cropped up before. The problem may occur only in very rare circumstances, but still, it's there.)

-- 
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
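[Editor's note: a tiny illustration of the hazard described above. The file and function names are hypothetical; the premise, per the thread, is that automatic .di generation can retain the bodies of small inlining-candidate functions.]

```d
// lib.d -- shared library implementation (hypothetical example)
module lib;

// A one-liner like this is a prime inlining candidate, so an
// auto-generated lib.di may retain its body. A client compiled
// against that lib.di can inline the body directly. Changing
// the return value here and rebuilding only the shared library
// then leaves the stale value baked into unrebuilt clients --
// an implementation change that silently forces recompilation.
int answer() { return 42; }
```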
Apr 25 2008
parent reply Christopher Wright <dhasenan gmail.com> writes:
Bruno Medeiros wrote:
 (I wonder why this issue hasn't cropped up before. The problem may occur 
 only in very rare circumstances, but still, it's there.)

Probably because not many people use .di files.
Apr 25 2008
parent reply Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Christopher Wright wrote:
 Bruno Medeiros wrote:
 (I wonder why this issue hasn't cropped up before. The problem may 
 occur only in very rare circumstances, but still, it's there.)

Probably because not many people use .di files.

Sean K. does, for example, and probably others of the Tango crew. I wonder how they deal with that. Do they manually check .di files and fix them? Sean, care to chime in? :)

-- 
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Apr 26 2008
parent Lars Ivar Igesund <larsivar igesund.net> writes:
Bruno Medeiros wrote:

 Christopher Wright wrote:
 Bruno Medeiros wrote:
 (I wonder why this issue hasn't cropped up before. The problem may
 occur only in very rare circumstances, but still, it's there.)

Probably because not many people use .di files.

Sean K. does, for example, and probably others of the Tango crew. I wonder how they deal with that. Do they manually check .di files and fix them? Sean, care to chime in? :)

I'll do it for him - I think the problem is real, just that dynamic libraries with stable interfaces/ABIs that need to be upheld aren't in wide use (if used at all) in D. I think Sean's suggestion of tying header generation to the -inline flag will be important to have implemented by the time we get there.

-- 
Lars Ivar Igesund
blog at http://larsivi.net
DSource, #d.tango & #D: larsivi
Dancing the Tango
Apr 26 2008
prev sibling parent reply Sean Kelly <sean invisibleduck.org> writes:
== Quote from Janice Caron (caron800 googlemail.com)'s article
 On 19/04/2008, Sean Kelly <sean invisibleduck.org> wrote:
 So you want the ability to import a module, but not have to rebuild

> Good luck with that one. Apparently you've never used C/C++. I apologize for the misunderstanding.

OK, so you're basically saying you want D to have header files, like C. Fair enough. The prospect doesn't thrill me, but I would be intrigued to know how many other people want this.

Personally, I'd just like the auto header generator to provide some means of not outputting bodies of any functions at all. The easiest way to accomplish this would be to make the feature sensitive to the -inline switch. Bonus points would be awarded for preserving the formatting of the original file, but I suspect that would be difficult to accomplish. Sean
Apr 19 2008
next sibling parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Sean Kelly wrote:
 == Quote from Janice Caron (caron800 googlemail.com)'s article
 On 19/04/2008, Sean Kelly <sean invisibleduck.org> wrote:
 So you want the ability to import a module, but not have to rebuild

> Good luck with that one. Apparently you've never used C/C++. I apologize for the misunderstanding.

OK, so you're basically saying you want D to have header files, like C. Fair enough. The prospect doesn't thrill me, but I would be intrigued to know how many other people want this.

Personally, I'd just like the auto header generator to provide some means of not outputting bodies of any functions at all. The easiest way to accomplish this would be to make the feature sensitive to the -inline switch. Bonus points would be awarded for preserving the formatting of the original file, but I suspect that would be difficult to accomplish.

-inline has the wrong sense, though. Most people are probably happy with things the way they are, so you'd want a -noinline flag for those folks who want to prevent outputting function bodies. Except, if you're going to make a flag that's just for not outputting function bodies, you might as well call it -Hnoimpl or something. I can't actually use -inline on my project because it pushes my modules over OPTLINK's fixup limit.

--bb
Apr 19 2008
parent reply "Hans W. Uhlig" <huhlig gmail.com> writes:
Bill Baxter wrote:
 Sean Kelly wrote:
 == Quote from Janice Caron (caron800 googlemail.com)'s article
 On 19/04/2008, Sean Kelly <sean invisibleduck.org> wrote:
 So you want the ability to import a module, but not have to rebuild

> Good luck with that one. Apparently you've never used C/C++. I apologize for the misunderstanding.

OK, so you're basically saying you want D to have header files, like C. Fair enough. The prospect doesn't thrill me, but I would be intrigued to know how many other people want this.

Personally, I'd just like the auto header generator to provide some means of not outputting bodies of any functions at all. The easiest way to accomplish this would be to make the feature sensitive to the -inline switch. Bonus points would be awarded for preserving the formatting of the original file, but I suspect that would be difficult to accomplish.

-inline has the wrong sense though. Most people are probably happy with things the way they are, so you'd want a -noinline flag for those folks who want to prevent outputting function bodies. Except if you're going to make a flag that's just for not outputting function bodies you might as well call it -Hnoimpl or something. I can't actually use -inline on my project because it pushes my modules over OPTLINK'S fixup limit. --bb

Why not simply make a module have a modifier? Instead of:

    module Time;

use:

    library module Time;

Then it changes how the module itself is handled.
Apr 19 2008
parent Bill Baxter <dnewsgroup billbaxter.com> writes:
Hans W. Uhlig wrote:

 Why not simply make a module have a modifier.
 
 instead of
 module Time;
 use
 library module Time;
 
 Then it changes how the module itself is handled

I don't think that should be a source file's decision to make. It seems clear to me it should be the responsibility of the build tools. --bb
Apr 19 2008
prev sibling parent Robert Fraser <fraserofthenight gmail.com> writes:
Sean Kelly wrote:
 Bonus points would be awarded for preserving
 the formatting of the original file, but I suspect that would be difficult
 to accomplish.

Well, there is a formatter for D code available... with 316 different options ;-P.
Apr 19 2008
prev sibling next sibling parent e-t172 <e-t172 akegroup.org> writes:
Janice Caron a écrit :
  Changing a source file should require the application to be re-linked.

  As someone who has worked on programs that can take half a day to build,
  I would throw a fit if every source change required a full rebuild of code
that
  simply includes a header related to this source.

So you want the ability to import a module, but not have to rebuild dependent files if that module changes? Good luck with that one.

It's very easy to do with header files: if the header file has changed, you have to recompile all dependent files; if only the .d file has changed, you just have to relink your application. In the case of a .d file included in a shared library, you only need to rebuild the library, and all programs will automatically use the new library (that's the whole point of shared libraries).

For this system to work, you have to know what needs to be put in the header (including inlined functions) and what needs to be put in the implementation (non-inlined functions). If the programmer doesn't know whether a function will be inlined or not, this will obviously not work.

Sean was just noticing that this problem also affects the efficiency of a build system. IMHO, shared libraries are a much bigger issue, but I agree the two have to be addressed.
Apr 19 2008
prev sibling parent reply Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Janice Caron wrote:
 
 I didn't say it was /easy/. I just said there's nothing to stop you
 doing it. I'm not a fan of closed source software, so I have no wish
 to make it easy. :-) The point is, if that's what you want to do, you
 can.
 

Certainly you realize that for D to be successful, it needs to fully support both open-source and closed-source scenarios. And since you are a person who (frequently) comments on D's design and makes language change proposals, you should take closed-source software into consideration, instead of "having no wish to make it easy".

-- 
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Apr 25 2008
parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Janice Caron wrote:
 On 25/04/2008, Bruno Medeiros <brunodomedeiros+spam com.gmail> wrote:
  And since you are a person that (frequently) wishes to comment on D's
 design, and make language change proposals,

We can all make proposals. However, so far as I am aware, only /one/ of my proposals has ever been taken up. (That was the proposal that "const T" should mean the same thing as "const(T)"). All the rest have fallen by the wayside, because I'm not the one who gets the final say. I'm not the one you have to convince!

I meant that it was important for the consideration of others who read and review your proposals, not just for Walter or for the actual approval of the proposal. Although it's rare, sometimes people come up with proposals that go against the stated goals of D, like a proposal to remove pointers from the language because they are not necessary and we already have class references and 'ref'-kind references. Indeed, pointers don't add any expressiveness to the language, but they allow for optimizations that are otherwise not possible, and allowing the writing of fast and low-level code is one of the goals of D. (This is an extreme example, but I have seen other not-so-extreme examples.)

-- 
Bruno Medeiros - Software Developer, MSc. in CS/E graduate
http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Apr 25 2008
prev sibling next sibling parent reply Lars Noschinski <lars-2008-1 usenet.noschinski.de> writes:
* e-t172 <e-t172 akegroup.org> [08-04-19 15:12]:
- If I understand your statement correctly, it means the D compiler decides on 
its own whether to inline a function or not. I don't think it's a good idea, 
because it will lead to very strange problems and unexpected behaviour when 
dealing with shared libraries. (actually this is not a .di issue, but a more 
general one).

gcc does that, too (for C). And it does not cause problems, as there is always an out-of-line version generated for all exported functions (i.e. functions not declared static).
Apr 19 2008
parent e-t172 <e-t172 akegroup.org> writes:
Lars Noschinski a écrit :
 gcc does that, too (for C). And it does not cause problems, as there is
 always a out-of-line version generated for all exported functions (i.e.
 functions not declared static).

I'm talking about shared libraries. GCC doesn't do that when using shared libraries; it can't, since it only has access to the header files of the library, not its implementation. However, it is possible GCC does that inside of a project, but there is no problem there, since everything is done at compile time (as opposed to the shared library case, where linking is done at runtime).
Apr 20 2008
prev sibling parent reply Walter Bright <newshound1 digitalmars.com> writes:
e-t172 wrote:
 - Like you said, indentation is stripped. This makes .di files quite 
 ugly. Considering that .di files will often be directly read by the user 
 of the API, this is a problem.

The .di files are meant for the compiler to read, not the user. They're supposed to strip out all the extra whitespace and comments. Think of them as "precompiled headers."
 - There should be some kind of feature to automatically copy the 
 "documentation comments" (ddoc, doxygen, etc) from the .d files to the 
 .di files when they are generated. A solution would be to automatically 
 include all comments which are not in implementation code.

The human-readable form is the ddoc output.
 - If I understand your statement correctly, it means the D compiler 
 decides on its own whether to inline a function or not.

That's right.
 I don't think 
 it's a good idea, because it will lead to very strange problems and 
 unexpected behaviour when dealing with shared libraries. (actually this 
 is not a .di issue, but a more general one).

It shouldn't lead to any observable behavior difference (other than runtime speed and code size). Inlining should be the purview of the compiler, much like which variables are to be enregistered. The only reason there's a switch on the command line for it is because it's easier to debug code if it hasn't been inlined.
Apr 20 2008
next sibling parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Walter Bright wrote:
 e-t172 wrote:
 - Like you said, indentation is stripped. This makes .di files quite 
 ugly. Considering that .di files will often be directly read by the 
 user of the API, this is a problem.

The .di files are meant for the compiler to read, not the user. They're supposed to strip out all the extra whitespace and comments. Think of them as "precompiled headers."
 - There should be some kind of feature to automatically copy the 
 "documentation comments" (ddoc, doxygen, etc) from the .d files to the 
 .di files when they are generated. A solution would be to 
 automatically include all comments which are not in implementation code.

The human-readable form is the ddoc output.

I find it much easier to open up a header file in my text editor to search for the functions I'm looking for. Easier than it is to go over to my browser and hope that the documentation exists and is complete. I can also run grep on files easily from within my editor to search for patterns in text files. My browser can't do that. But maybe ddoc could be enhanced to 1) generate doc stubs for *all* public members, not just the ones the author remembered to document, and 2) generate plain text .d files with just doc comments and function signatures etc. But I think it would be easier to make -H generate nice output than doing those two. --bb
Apr 20 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Bill Baxter wrote:
 The human-readable form is the ddoc output.

I find it much easier to open up a header file in my text editor to search for the functions I'm looking for. Easier than it is to go over to my browser and hope that the documentation exists and is complete. I can also run grep on files easily from within my editor to search for patterns in text files. My browser can't do that. But maybe ddoc could be enhanced to 1) generate doc stubs for *all* public members, not just the ones the author remembered to document.

It deliberately does not do that, because this allows the library writer to have some exposed public members that are not part of the official API for it. The official API he should provide ddoc comments for, even if they are nothing more than:

    ///
    void foo();

which will cause foo() to be included in the ddoc output.
 2) generate plain text .d files with just doc comments and function 
 signatures etc.

But that's what ddoc is for. BTW, there's nothing about ddoc that forces the output to be html. You can write a .ddoc macro file which will cause ddoc to generate plain text.
 But I think it would be easier to make -H generate nice output than 
 doing those two.

The whole point of ddoc is to create nice human readable output, and the point of .di is to generate fast, precompiled 'headers'. I don't understand the argument for a third option. Perhaps what you're asking for is a D source code pretty-printer?
Apr 20 2008
next sibling parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Walter Bright wrote:
 Bill Baxter wrote:
 The human-readable form is the ddoc output.

I find it much easier to open up a header file in my text editor to search for the functions I'm looking for. Easier than it is to go over to my browser and hope that the documentation exists and is complete. I can also run grep on files easily from within my editor to search for patterns in text files. My browser can't do that. But maybe ddoc could be enhanced to 1) generate doc stubs for *all* public members, not just the ones the author remembered to document.

It deliberately does not do that, because this allows the library writer to have some exposed public members that are not part of the official API for it. The official API he should provide ddoc comments for, even if they are nothing more than:

    ///
    void foo();

which will cause foo() to be included in the ddoc output.

That's an unrealistic expectation of developers' diligence. It's too easy to forget a method or accidentally leave off a * in the intended doc comment, turning it into just a plain comment. What ddoc should have is a special comment tag to *suppress* doc generation for a particular public member.
 2) generate plain text .d files with just doc comments and function 
 signatures etc.

But that's what ddoc is for. BTW, there's nothing about ddoc that forces the output to be html. You can write a .ddoc macro file which will cause ddoc to generate plain text.

Right, that's what I meant. That would be great I think. And it would strip out all the macro junk to make the doc comments truly human readable. Maybe replace $(B foo) with *foo* and $(I foo) with /foo/, etc.
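The replacement Bill suggests fits ddoc's macro mechanism as Walter describes it: macro definitions in a .ddoc file supplied on the command line override the built-in HTML-producing ones. A minimal sketch (a full plain-text target would need to override more of the predefined macros, e.g. DDOC_DECL and the section macros):

```ddoc
DDOC = $(BODY)
B    = *$0*
I    = /$0/
```

Assuming the file is saved as macros.ddoc, it would be picked up with something like `dmd -D -o- macros.ddoc module.d`.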
 But I think it would be easier to make -H generate nice output than 
 doing those two.

The whole point of ddoc is to create nice human readable output, and the point of .di is to generate fast, precompiled 'headers'. I don't understand the argument for a third option.

The argument is that I would rather have plain text format documentation that I can read easily in my text editor. I'm a programmer. I like to work inside my text editor. Header files do that pretty well in C++. I still think making -H a little more usable would be easier, but I agree that a plain-text DDOC target would be the best way to go. But ddoc is too much of a pain if it's going to just silently omit methods that I or other authors forgot to put a doc comment on.
 Perhaps what you're asking for is a D source code pretty-printer?

Not really. --bb
Apr 21 2008
parent reply Walter Bright <newshound1 digitalmars.com> writes:
Bill Baxter wrote:
 Walter Bright wrote:
    ///
    void foo();

 which will cause foo() to be included in the ddoc output.

That's an unrealistic expectation of developers' diligence. It's too easy to forget a method or accidentally leave off a * in the intended doc comment, turning it into just a plain comment.

What are the consequences of him forgetting to do so? Nothing disastrous.
 What ddoc should have is a special comment tag to *suppress* doc 
 generation for a particular public member.

I don't agree, I think it adds complexity with little benefit.
 I still think making -H a little more usable would be easier, but I 
 agree that a plain-text DDOC target would be the best way to go.  But 
 ddoc is too much of a pain if it's going to just silently omit methods 
 that I or other authors forgot to put a doc comment on.

I don't think it is asking too much of programmers to at least mark the functions that are part of the public face of their code.
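The convention Walter describes can be seen in a short sketch (module and function names here are hypothetical, made up for illustration):

```d
module api;

/// Computes a simple additive checksum. The ddoc comment marks this
/// as part of the official API, so it appears in the generated docs.
uint checksum(ubyte[] data)
{
    uint sum = 0;
    foreach (b; data)
        sum += b;
    return sum;
}

// Public, but only a plain comment (no ///): under the behaviour
// described above, ddoc silently omits it from the output.
uint checksumSeed()
{
    return 0;
}
```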
Apr 21 2008
parent Bill Baxter <dnewsgroup billbaxter.com> writes:
Walter Bright wrote:
 Bill Baxter wrote:
 Walter Bright wrote:
    ///
    void foo();

 which will cause foo() to be included in the ddoc output.

That's an unrealistic expectation of developers' diligence. It's too easy to forget a method or accidentally leave off a * in the intended doc comment, turning it into just a plain comment.

What are the consequences of him forgetting to do so? Nothing disastrous.

Except the poor programmer may end up searching through a dozen modules' docs looking for a particular method that they're sure must exist, ultimately conclude that the function doesn't exist, and finally waste more time reimplementing it. Only to find out later that it was in some module, it just didn't have a doc comment. I can't recall the specifics but I'm pretty sure I've had this exact sequence happen to me before with something in Phobos.
 What ddoc should have is a special comment tag to *suppress* doc 
 generation for a particular public member.

I don't agree, I think it adds complexity with little benefit.

I think it adds little complexity with significant benefit.
 I still think making -H a little more usable would be easier, but I 
 agree that a plain-text DDOC target would be the best way to go.  But 
 ddoc is too much of a pain if it's going to just silently omit methods 
 that I or other authors forgot to put a doc comment on.

I don't think it is asking too much of programmers to at least mark the functions that are part of the public face of their code.

You just can't count on the diligence of programmers. Especially when it comes to documentation or commenting their code. If it doesn't generate a compiler error, many programmers just aren't going to care, and many of the ones who do care just won't notice. Your position is completely unrealistic. Many people don't put /any/ ddoc comments in their code. Is that because they wanted all their methods to be hidden? If I want to use DDoc on their library it would be nice if I could at least get the public API out of it. Any way you slice it, not generating doc for unlabeled public parts is just not as useful as generating it. --bb
Apr 21 2008
prev sibling next sibling parent reply Sean Kelly <sean invisibleduck.org> writes:
== Quote from Walter Bright (newshound1 digitalmars.com)'s article
 Bill Baxter wrote:
 The human-readable form is the ddoc output.

I find it much easier to open up a header file in my text editor to search for the functions I'm looking for. Easier than it is to go over to my browser and hope that the documentation exists and is complete. I can also run grep on files easily from within my editor to search for patterns in text files. My browser can't do that. But maybe ddoc could be enhanced to 1) generate doc stubs for *all* public members, not just the ones the author remembered to document.

It deliberately does not do that, because this allows the library writer to have some exposed public members that are not part of the official API for it. The official API he should provide ddoc comments for, even if they are nothing more than:

    ///
    void foo();

which will cause foo() to be included in the ddoc output.

Out of curiosity, would you advocate having public class members that were not intended to be a part of the class interface? I haven't been able to think of a reason why I would. The package attribute seems to cover every instance where I'd want to expose some special functionality outside a module.
 2) generate plain text .d files with just doc comments and function
 signatures etc.

But that's what ddoc is for. BTW, there's nothing about ddoc that forces the output to be html. You can write a .ddoc macro file which will cause ddoc to generate plain text.
 But I think it would be easier to make -H generate nice output than
 doing those two.

The whole point of ddoc is to create nice human readable output, and the point of .di is to generate fast, precompiled 'headers'. I don't understand the argument for a third option. Perhaps what you're asking for is a D source code pretty-printer?

Sounds like more of a fancy header generator. I know that I tend to look at source code before documentation, for example, so there is some value in having that code be easily readable. And until I began running a pretty printer on the generated headers in Tango, we did receive fairly regular complaints that the headers weren't easily readable. But that said, I'm not sure there's any reason to have pretty printed headers be a compiler feature. Certainly not if doing so is unnecessarily complicated at any rate. Sean
Apr 21 2008
next sibling parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Sean Kelly wrote:
 The package attribute seems to cover every instance
 where I'd want to expose some special functionality outside a module.

But it doesn't work. A package function isn't added to the vtable, so you can't expose a function that says "this function should be used only within this package AND is designed to be overridden (by other classes in the package)".
Apr 21 2008
parent reply Sean Kelly <sean invisibleduck.org> writes:
== Quote from Robert Fraser (fraserofthenight gmail.com)'s article
 Sean Kelly wrote:
 The package attribute seems to cover every instance
 where I'd want to expose some special functionality outside a module.

But it doesn't work. A package function isn't added to the vtable, so you can't expose a function that says "this function should be used only within this package AND is designed to be overridden (by other classes in the package)".

Um, what? Who would ever want to do such a thing? And why do you think it should work? Static class member functions aren't virtual either. Sean
Apr 22 2008
parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Sean Kelly wrote:
 == Quote from Robert Fraser (fraserofthenight gmail.com)'s article
 Sean Kelly wrote:
 The package attribute seems to cover every instance
 where I'd want to expose some special functionality outside a module.

But it doesn't work. A package function isn't added to the vtable, so you can't expose a function that says "this function should be used only within this package AND is designed to be overridden (by other classes in the package)".

Um, what? Who would ever want to do such a thing? And why do you think it should work? Static class member functions aren't virtual either. Sean

Me. I wanted to create a reflection package for flute, where a number of cooperating classes provide reflection information. One class is used to provide stack traces, which are done in a system-specific manner (that is, differently on Windows and Unix), and so the Windows class and the Unix class both extend a single abstract base class. Only one of the methods in this stack trace provider class should be accessible outside the package -- getStackTrace() ("get a stack trace for this executing address"). However, a different method, getLineInfo() ("scan the debug info for the file/line of this executing address"), is used by a different function in the package (but not within that module).

In summary:

- There's a set of cooperating modules in a package
- Some of the functionality in a class needs to be exposed only within that package
- That functionality is not "static", it relies on member variables
- That functionality relies on virtual dispatch

My solution? Put everything in one huge module and make the functions that were going to be package-protected module-private.
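The design described above can be sketched as follows (all names are hypothetical, not taken from flute); the commented-out declaration is exactly the member that D's 'package' attribute cannot make virtual:

```d
module pkg.base;

// Hedged sketch of the stack-trace provider design discussed above.
abstract class StackTraceProvider
{
    /// The one method meant to be callable from outside the package.
    abstract char[][] getStackTrace(size_t address);

    // Wanted: visible only within the package, yet overridable by the
    // platform-specific subclasses. Because 'package' methods are not
    // placed in the vtable, this cannot be both package *and* virtual:
    //package abstract char[] getLineInfo(size_t address);
}

// pkg.windows and pkg.posix would then each provide:
// class WindowsProvider : StackTraceProvider { ... }
// class PosixProvider : StackTraceProvider { ... }
```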
Apr 22 2008
parent reply Sean Kelly <sean invisibleduck.org> writes:
== Quote from Robert Fraser (fraserofthenight gmail.com)'s article
 Sean Kelly wrote:
 == Quote from Robert Fraser (fraserofthenight gmail.com)'s article
 Sean Kelly wrote:
 The package attribute seems to cover every instance
 where I'd want to expose some special functionality outside a module.

But it doesn't work. A package function isn't added to the vtable, so you can't expose a function that says "this function should be used only within this package AND is designed to be overridden (by other classes in the package)".

Um, what? Who would ever want to do such a thing? And why do you think it should work? Static class member functions aren't virtual either. Sean

Me. I wanted to create a reflection package for flute, where a number of cooperating classes provide reflection information. One class is used to provide stack traces, which are done in a system-specific manner (that is, differently on Windows and Unix), and so the Windows class and the Unix class both extend a single abstract base class. Only one of the methods in this stack trace provider class should be accessible outside the package -- getStackTrace() ("get a stack trace for this executing address"). However, a different method, getLineInfo() ("scan the debug info for the file/line of this executing address"), is used by a different function in the package (but not within that module). In summary: there's a set of cooperating modules in a package; some of the functionality in a class needs to be exposed only within that package; that functionality is not "static", it relies on member variables; and that functionality relies on virtual dispatch. My solution? Put everything in one huge module and make the functions that were going to be package-protected module-private.

Oh, I see what you mean. I suppose one could argue that package methods are effectively static, but it still seems a bit weird to me. Can they access instance variables? Sean
Apr 22 2008
parent reply Robert Fraser <fraserofthenight gmail.com> writes:
Sean Kelly wrote:
 Oh, I see what you mean.  I suppose one could argue that package methods are
 effectively static, but it still seems a bit weird to me.  Can they access
 instance variables?
 
 
 Sean

Yes, of course. What's your argument for "effectively static"? "package" is just a protection attribute, just like "public", "private", and "protected"; I don't see the static-ness involved at all. Maybe "package" means something different in another language? I'm just coming from a Java background where package-protection is the default & quite common.
Apr 22 2008
next sibling parent reply Bill Baxter <dnewsgroup billbaxter.com> writes:
Robert Fraser wrote:
 Sean Kelly wrote:
 Oh, I see what you mean.  I suppose one could argue that package 
 methods are
 effectively static, but it still seems a bit weird to me.  Can they 
 access instance
 variables?


 Sean

Yes, of course. What's your argument for "effectively static"? "package" is just a protection attribute, just like "public", "private", and "protected"; I don't see the static-ness involved at all. Maybe "package" means something different in another language? I'm just coming from a Java background where package-protection is the default & quite common.

What is a "package" in Java exactly? Does a package correspond to one source file in Java? Or a directory like D? I seem to remember some rule of only one main class per .java and the name of the class must match the name of the file. --bb
Apr 22 2008
parent Robert Fraser <fraserofthenight gmail.com> writes:
Bill Baxter wrote:
 What is a "package" in Java exactly?  Does a package correspond to one 
 source file in Java?  Or a directory like D?

A directory, just like in D.
 I seem to remember some rule of only one main class per .java and the 
 name of the class must match the name of the file.

One public class. You can have any number of package-protected classes with any names you want (as long as there's no conflicts).
Apr 22 2008
prev sibling next sibling parent Robert Fraser <fraserofthenight gmail.com> writes:
Robert Fraser wrote:
 Yes, of course.

Sorry, that came off a bit pompous.
Apr 22 2008
prev sibling parent reply Sean Kelly <sean invisibleduck.org> writes:
== Quote from Robert Fraser (fraserofthenight gmail.com)'s article
 Sean Kelly wrote:
 Oh, I see what you mean.  I suppose one could argue that package methods are
 effectively static, but it still seems a bit weird to me.  Can they access
instance
 variables?

Yes, of course. What's your argument for "effectively static"? "package" is just a protection attribute, just like "public", "private", and "protected"; I don't see the static-ness involved at all. Maybe "package" means something different in another language? I'm just coming from a Java background where package-protection is the default & quite common.

Yeah, I was just being dumb. I think the reasoning is that, because 'package' is simply a slightly looser 'private', it should follow the same rules. I think this could be argued either way, but it does make some sense. I don't suppose it's possible to implement a protected interface?

    package interface I {
        void fn();
    }

    class C : package I {
        package void fn() {}
    }

I'd like to believe that this would work, but it's hard to say. The other options I can think of would be some variation of the above--i.e. providing an interface for package-level operations. I've done this sort of thing in C++ before to get around the "friends have access to everything" issue. Sean
Apr 23 2008
parent Robert Fraser <fraserofthenight gmail.com> writes:
Sean Kelly wrote:
 == Quote from Robert Fraser (fraserofthenight gmail.com)'s article
 Sean Kelly wrote:
 Oh, I see what you mean.  I suppose one could argue that package methods are
 effectively static, but it still seems a bit weird to me.  Can they access
instance
 variables?

Yes, of course. What's your argument for "effectively static"? "package" is just a protection attribute, just like "public", "private", and "protected"; I don't see the static-ness involved at all. Maybe "package" means something different in another language? I'm just coming from a Java background where package-protection is the default & quite common.

Yeah, I was just being dumb. I think the reasoning is that, because 'package' is simply a slightly looser 'private', it should follow the same rules. I think this could be argued either way, but it does make some sense. I don't suppose it's possible to implement a protected interface?

It doesn't make sense even for "private" because "private" means "private to this module", not "private to this class". "final" means (unless it's overriding something else) "don't put this in the vtable". But it's less of an issue for private than it is for package.
     package interface I {
         void fn();
     }
 
     class C : package I {
         package void fn() {}
     }
 
 I'd like to believe that this would work, but it's hard to say.  The other
 options I can think of would be some variation of the above--i.e. providing
 an interface for package-level operations.  I've done this sort of thing in
 C++ before to get around the "friends have access to everything" issue.

It works, but it's still a design issue ;P.
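The point above that D's "private" means private to the enclosing module, not to the class, can be sketched in a few lines (hypothetical names):

```d
module widgets;

class Gadget
{
    private int secret;   // private to the *module*, not just the class
}

void tweak(Gadget g)
{
    g.secret = 42;        // compiles: tweak() lives in the same module
}
```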
Apr 23 2008
prev sibling parent Michel Fortin <michel.fortin michelf.com> writes:
On 2008-04-21 13:43:20 -0400, Sean Kelly <sean invisibleduck.org> said:

 Out of curiosity, would you advocate having public class members that were
 not intended to be a part of the class interface?  I haven't been able to think
 of a reason why I would. The package attribute seems to cover every instance
 where I'd want to expose some special functionality outside a module.

There are plenty of those in the D/Objective-C bridge, because mixins can't access private members from other modules, which means that, to use a mixin from another module, all symbols in the template must be reachable from the mixin scope. That said, perhaps it could be improved. I'd really like to make a few more things private. -- Michel Fortin michel.fortin michelf.com http://michelf.com/
Apr 22 2008
prev sibling parent Bruno Medeiros <brunodomedeiros+spam com.gmail> writes:
Walter Bright wrote:
 BTW, there's nothing about ddoc that forces 
 the output to be html. You can write a .ddoc macro file which will cause 
 ddoc to generate plain text.
 
 

Yes. Which is crappy, instead of good. -- Bruno Medeiros - Software Developer, MSc. in CS/E graduate http://www.prowiki.org/wiki4d/wiki.cgi?BrunoMedeiros#D
Apr 25 2008
prev sibling next sibling parent reply e-t172 <e-t172 akegroup.org> writes:
Walter Bright wrote:
 e-t172 wrote:
 - Like you said, indentation is stripped. This makes .di files quite 
 ugly. Considering that .di files will often be directly read by the 
 user of the API, this is a problem.

The .di files are meant for the compiler to read, not the user. They're supposed to strip out all the extra whitespace and comments. Think of them as "precompiled headers."

I agree with Bill Baxter: .di files should be more than that.
 - If I understand your statement correctly, it means the D compiler 
 decides on its own whether to inline a function or not.

That's right.
 I don't think it's a good idea, because it will lead to very strange 
 problems and unexpected behaviour when dealing with shared libraries. 
 (actually this is not a .di issue, but a more general one).

It shouldn't lead to any observable behavior difference (other than runtime speed and code size). Inlining should be the purview of the compiler, much like which variables are to be enregistered.

I'm tired of repeating myself, so, copy/paste: "The same kind of problem arises if you release a general update of your shared library without changing the API: the programs that use the shared library will use the updated versions if and only if those are not inlined. This is normal, but because inlining a function is not the programmer's decision in D, the program will use an ugly and unpredictable mix of updated and non-updated functions that will inevitably lead to grave problems (DLL hell reloaded?)." In other words: when building a shared library, functions should *never* be inlined unless the programmer says so.
Apr 21 2008
parent Bill Baxter <dnewsgroup billbaxter.com> writes:
e-t172 wrote:
Walter Bright wrote:
 e-t172 wrote:
 - Like you said, indentation is stripped. This makes .di files quite 
 ugly. Considering that .di files will often be directly read by the 
 user of the API, this is a problem.

The .di files are meant for the compiler to read, not the user. They're supposed to strip out all the extra whitespace and comments. Think of them as "precompiled headers."

I agree with Bill Baxter: .di files should be more than that.

But now I understand that Walter wants .di files to be more like precompiled headers. That's fine. It opens up the possibility for doing things like byte-compiling them to speed up compilation (if that would make any difference). The thing that I want them for is for plain text API documentation, and for that they're not so great, both because of the indentation and because the comments are stripped. So now I think what would be best is: 1) for the compiler to generate ddoc output for everything that's public so that I can get some output even from modules which aren't using ddoc. 2) for there to be a set of to-plaintext ddoc macros (perhaps built in). This is something I could work on, but I don't see the point if 1) is not dealt with first. --bb
Apr 21 2008
prev sibling parent Robert Fraser <fraserofthenight gmail.com> writes:
Walter Bright wrote:
 - There should be some kind of feature to automatically copy the 
 "documentation comments" (ddoc, doxygen, etc) from the .d files to the 
 .di files when they are generated. A solution would be to 
 automatically include all comments which are not in implementation code.

The human-readable form is the ddoc output.

IDEs generally expect the documentation to be there along with the header. Sure, there could be a way to link extra docs (i.e. Eclipse JDT allows you to link Javadoc for .class files), but this is extra work for the programmer.
Apr 21 2008
prev sibling next sibling parent "Janice Caron" <caron800 googlemail.com> writes:
On 19/04/2008, Sean Kelly <sean invisibleduck.org> wrote:
 So you want the ability to import a module, but not have to rebuild

> Good luck with that one. Apparently you've never used C/C++. I apologize for the misunderstanding.

Touché. But I was talking about D. OK, so you're basically saying you want D to have header files, like C. Fair enough. The prospect doesn't thrill me, but I would be intrigued to know how many other people want this.
Apr 19 2008
prev sibling parent "Janice Caron" <caron800 googlemail.com> writes:
On 25/04/2008, Bruno Medeiros <brunodomedeiros+spam com.gmail> wrote:
  And since you are a person that (frequently) wishes to comment on D's
 design, and make language change proposals,

We can all make proposals. However, so far as I am aware, only /one/ of my proposals has ever been taken up. (That was the proposal that "const T" should mean the same thing as "const(T)"). All the rest have fallen by the wayside, because I'm not the one who gets the final say. I'm not the one you have to convince!
Apr 25 2008
prev sibling parent Sean Kelly <sean invisibleduck.org> writes:
== Quote from Bill Baxter (dnewsgroup billbaxter.com)'s article
 e-t172 wrote:
 Koroskin Denis a écrit :
 Current situation is, someone writes code, probably nice one, and it
 is added
 to main trunk. Problem is, interface is implementation driven, not
 otherwise.
 It is not discussed. And thats bad. Tests first, then code, Kent Beck
 said.
 Of course, implementation can affect interface, but only after trial.

I have to agree on this one. As a side note, I definitely think we *need* "real" header files (like .h files in C/C++), which separate the API from its implementation. I see four advantages:

generated human-readable output rather than stripping all indentation.

Same here. In fact, I looked into modifying the front end to do this, but unfortunately, it isn't so simple. The output is effectively generated from the syntax tree rather than "left in" as the input is analyzed. What I've been doing for Tango is running uncrustify on the .di files before packaging them for distribution. Sean
Apr 19 2008
prev sibling next sibling parent "Janice Caron" <caron800 googlemail.com> writes:
On 19/04/2008, e-t172 <e-t172 akegroup.org> wrote:
  You did not read my entire message, did you?

Yes, I read your entire message. Your presumption was incorrect.
Apr 19 2008
prev sibling next sibling parent "Janice Caron" <caron800 googlemail.com> writes:
On 19/04/2008, e-t172 <e-t172 akegroup.org> wrote:
  If I understand your statement correctly, it means the D compiler decides
 on its own whether to inline a function or not.

That is correct. The rationale is that the compiler is able to make better optimisation decisions than the programmer.
 I don't think it's a good
 idea, because it will lead to very strange problems and unexpected behaviour
 when dealing with shared libraries.

It's an optimisation decision, so it should make no difference whatsoever, except to make your code run faster or slower.
Apr 19 2008
prev sibling next sibling parent reply "Janice Caron" <caron800 googlemail.com> writes:
On 19/04/2008, e-t172 <e-t172 akegroup.org> wrote:
 But it
 shouldn't be too hard to write a program which automatically extract the
 declarations and compile-time code out of a .d file, and use it to generate
 a header file.

That would mean that the interface would be driven by the implementation. I believe that the original poster was wanting to have the implementation contrained by the interface, which is an entirely different question.
Apr 19 2008
parent e-t172 <e-t172 akegroup.org> writes:
Janice Caron wrote:
 On 19/04/2008, e-t172 <e-t172 akegroup.org> wrote:
 But it
 shouldn't be too hard to write a program which automatically extract the
 declarations and compile-time code out of a .d file, and use it to generate
 a header file.

That would mean that the interface would be driven by the implementation. I believe that the original poster was wanting to have the implementation contrained by the interface, which is an entirely different question.

I agree. I was just submitting this specific "feature" for those who prefer writing code the way it currently is.
Apr 19 2008
prev sibling next sibling parent "Koroskin Denis" <2korden gmail.com> writes:
On Sat, 19 Apr 2008 15:48:51 +0400, Janice Caron <caron800 googlemail.com>  
wrote:

 On 19/04/2008, e-t172 <e-t172 akegroup.org> wrote:
  I have to agree on this one. As a side note, I definitely think we  
 *need*
 "real" header files (like .h files in C/C++), which separates the API  
 and

What if my functions may be inlined? What if my functions are template functions? What if my functions are capable of compile-time-function-execution? What if my functions generate strings for use in string mixins?

Then, yes, you should provide source code for these functions, too. That's the Boost way. Most of the library stuff is template-heavy and stored in *.hpp files. That's their biggest advantage - you don't need to compile Boost (well, most of it). And that's their biggest disadvantage - it's not human readable.
 Besides which - I don't want to have to maintain two separate files!
 Those days are gone, and good riddance to them.

I used to code in C#, and in those days my best friend was Reflector. You don't need any source code documentation with it, because it allows you to easily navigate between classes and shows you all the information for any class: public/private methods/properties, implementation (if needed), base class, interfaces, etc. Great one! A must-have when developing for dotNET. We should have a similar one for D! Imagine, you point it at the Tango root folder and get a complete object hierarchy! Damn, I should definitely look into starting such a project.
Apr 19 2008
prev sibling next sibling parent "Janice Caron" <caron800 googlemail.com> writes:
On 19/04/2008, e-t172 <e-t172 akegroup.org> wrote:
  This function is now simple, it is likely it will be inlined by the
 compiler. Therefore, it will not be included in the shared library.

That assumption is false. Just because a function is small enough to be inlined, doesn't mean its object code won't be in the library. In fact, if I understand this correctly, the object code will always be in the library. The decision as to whether or not to inline cannot be made at the library level. To make that decision most optimally, the compiler also needs to know the calling code. If all calls to a function are inlined, the linker should not link it into the final executable, thereby avoiding code bloat.
Apr 19 2008
prev sibling next sibling parent reply Sean Kelly <sean invisibleduck.org> writes:
== Quote from e-t172 (e-t172 akegroup.org)'s article
 Koroskin Denis a écrit :
 - Ability to distribute closed-source libraries. I'm against
 closed-source libraries, but I know that a lot of people need them.
 Of course, header files also means additional maintenance issues. But it
 shouldn't be too hard to write a program which automatically extract the
 declarations and compile-time code out of a .d file, and use it to
 generate a header file. This way, each time a .d file is modified, the
 Makefile (or any other build system) would automatically trigger the
 regeneration of the associated header file.
 P.S.: I'm not talking about .di files here. Last time I tried to
 generate .di files, implementation was still included in them.

The bodies of functions that can be inlined are included by default in .di files. Recently, I suggested that this feature be sensitive to the -inline flag when creating the .di file. It would be easy to implement and it seems logical to boot. Feel free to submit it as an enhancement request, along with any other ideas you have on this issue.

Sean
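Sean's suggestion is easy to picture. A sketch of what `dmd -H` does today versus the proposed -inline-sensitive behavior (illustrative content only; the exact formatting of generated .di files varies by compiler version):

```d
// foo.d - source handed to `dmd -H foo.d`
int twice(int x) { return x * 2; }

// foo.di today: the body is retained because twice() is small enough
// to be a candidate for inlining:
//     int twice(int x) { return x * 2; }

// foo.di under the proposal, when -inline is NOT passed: declaration
// only, so no implementation leaks into the generated header:
//     int twice(int x);
```

The design choice is that the same flag that controls whether the compiler inlines also controls whether callers get the source they would need to inline.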
Apr 19 2008
parent Yigal Chripun <yigal100 gmail.com> writes:
Sean Kelly wrote:
 == Quote from e-t172 (e-t172 akegroup.org)'s article
 Koroskin Denis a écrit :
 - Ability to distribute closed-source libraries. I'm against
 closed-source libraries, but I know that a lot of people need them.
 Of course, header files also means additional maintenance issues. But it
 shouldn't be too hard to write a program which automatically extract the
 declarations and compile-time code out of a .d file, and use it to
 generate a header file. This way, each time a .d file is modified, the
 Makefile (or any other build system) would automatically trigger the
 regeneration of the associated header file.
 P.S.: I'm not talking about .di files here. Last time I tried to
 generate .di files, implementation was still included in them.

The bodies of functions that can be inlined are included by default in .di files. Recently, I suggested that this feature be sensitive to the -inline flag when creating the .di file. It would be easy to implement and it seems logical to boot. Feel free to submit it as an enhancement request, or any other ideas you have on this issue. Sean

Question/Suggestion: where does dmd currently put the inlined function bodies inside the .di file? They should be in a section of the file clearly separated from the interface. What I mean is that the .di file should look something like this:

---
indented interface definition with comments/documentation
separator [like a comment saying: below is compiler stuff, don't touch]
inlined function bodies/other implementation code needed by the compiler
---

Another thing: maybe D needs a separate file type for compile-time code. For example, a .dc file [for D compile-time]. This file would contain only compile-time code like templates/ctfe functions/macros/mixin definitions etc., and regular .d files would only be allowed to contain code for run-time and code that uses the compile-time generated symbols. This separates code meant for regular compilation from code written for compile-time evaluation.

That would help simplify the language, i.e. instead of using static if, just use regular if: if it's in a .dc file then it'll be evaluated at compile-time. If you have a function that can be run both at compile time and run time, then you should be able to "import" it into your .d file without rewriting it [could be implemented via mixin or alias].

This would help the user of the code to understand what code is used at what stage, and it would help the compiler to provide different handling for each case [for example, you need to compile only the .d files into a library, since the .dc files should be provided to the user as is, so the user could provide them to his compiler]. IDE writers would benefit from this as well.

What do you think?
--Yigal
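The two-section layout Yigal describes might look like this (a purely hypothetical .di format, not what dmd emits today - in particular, real D would not accept the same function both declared and defined in one module; the duplication is part of the proposed format, where the second section is for the compiler only):

```d
// mylib.di - sketch of the proposed two-section layout
module mylib;

// --- interface section: indented declarations with documentation,
// --- meant for human readers
/// Returns twice its argument.
int twice(int x);

// ===== below is compiler stuff, don't touch =====

// --- implementation section: bodies the compiler still needs
// --- (for inlining and CTFE), kept out of the reader's way
int twice(int x) { return x * 2; }
```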
Apr 19 2008
prev sibling next sibling parent "Janice Caron" <caron800 googlemail.com> writes:
On 19/04/2008, Sean Kelly <sean invisibleduck.org> wrote:
 Except sometimes the programmer doesn't want code exposed, even if
  it means faster execution.

There's nothing actually to stop you from withholding the source. Just make two copies of the .d file - one with implementations, and one without. Compile the one with, to make the library object file, and distribute the one without.
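Janice's two-copies approach, sketched with a hypothetical module name:

```d
/* mylib.d - full source, compiled into the library and kept private */
module mylib;

int secretSum(int a, int b)
{
    return a + b;  // the implementation you don't want to ship
}

/* The stripped copy distributed to users carries the same declarations
   with no bodies (shown as a comment since both files share one listing):

   module mylib;
   int secretSum(int a, int b);
*/
```

Clients compile against the stripped copy and link the library object file; only the declarations ever leave your machine.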
 Exposing implementation can also have implementation changes cause
  client code to have to be recompiled.

Changing /any/ source file should require all dependent source file to be recompiled. That's why we have makefiles and other build systems.
Apr 19 2008
prev sibling next sibling parent "Janice Caron" <caron800 googlemail.com> writes:
On 19/04/2008, Sean Kelly <sean invisibleduck.org> wrote:
 There's nothing actually to stop you from withholding the source. Just
 make two copies of the .d file - one with implementations, and one
 without. Compile the one with, to make the library object file, and
 distribute the one without.

 The whole point of automatic header generation is to avoid the issues
 associated with manually maintaining header files.

I didn't say it was /easy/. I just said there's nothing to stop you doing it. I'm not a fan of closed source software, so I have no wish to make it easy. :-) The point is, if that's what you want to do, you can.
  Changing a source file should require the application to be re-linked.

  As someone who has worked on programs that can take half a day to build,
  I would throw a fit if every source change required a full rebuild of
  code that simply includes a header related to this source.

So you want the ability to import a module, but not have to rebuild dependent files if that module changes? Good luck with that one.
Apr 19 2008
prev sibling next sibling parent reply "Hans W. Uhlig" <huhlig gmail.com> writes:
e-t172 wrote:
 Koroskin Denis a écrit :
 Current situation is, someone writes code, probably nice one, and it 
 is added
 to main trunk. Problem is, interface is implementation driven, not 
 otherwise.
 It is not discussed. And thats bad. Tests first, then code, Kent Beck 
 said.
 Of course, implementation can affect interface, but only after trial.

I have to agree on this one. As a side note, I definitely think we *need* "real" header files (like .h files in C/C++), which separate the API from its implementation. I see four advantages:

- Clearer presentation for the user of the API. The user is only interested in the API, not in the implementation: if the user has to go through the implementation to understand what the library does, it is a result of bad documentation, and should be avoided. With header files, all the information the user needs is put in one place, without the "noise" of the implementation throughout the file.

This is true, but why present code in the form of prototypes instead of auto-built documentation a la Java? While the current JavaDocs lack some quality, they make up for a lot by auto-generating documentation for just about everything. No headers necessary, just an HTML file you can read to know everything public (and if the programmer did any documentation, you know what it is, what it returns and what it does).
 
 - More efficient. For example, on a Linux distro, if you want to write a 
 program using a library, you need to install the "dev" package of the 
 library, which only contains header files. There is no point of 
 including the implementation in the package, because it is not useful to 
 the user (and definitely not useful if you only want to compile a 
 project, not modify it). See the glibc as an example : if you needed the 
 entire source code of the glibc every time you wanted to compile a 
 program, this would have been a pure waste in terms of disk space and 
 compiler efficiency.

Why should you need extra code files for development? I have always found this annoying: you either need the library or you don't. If the library has API documentation, why would you need headers?
 
 - Clear separation of "compiled" code and compile-time code. That is, if 
 a library provides "normal" code (accessed by an API) and compile-time 
 code (which is compiled in the application that uses the library, not in 
 the library itself), the two can be clearly distinguished: "normal" code 
 will only consist of declarations in the header file, while compile-time 
 code will be entirely defined in the header file. That way, the user 
 knows what IS in the library (the .a or .so file), and what will be 
 compiled in his application. (of course this is already possible, just 
 not as "clearly" for the user)

Ok, I can see this for fixed constants used by the library. Otherwise, why would you need them?
 
 - Ability to distribute closed-source libraries. I'm against 
 closed-source libraries, but I know that a lot of people need them.

Again, I say API Docs!
 
 Of course, header files also means additional maintenance issues. But it 
 shouldn't be too hard to write a program which automatically extract the 
 declarations and compile-time code out of a .d file, and use it to 
 generate a header file. This way, each time a .d file is modified, the 
 Makefile (or any other build system) would automatically trigger the 
 regeneration of the associated header file.

Self Built API Docs to the rescue!
 
 P.S.: I'm not talking about .di files here. Last time I tried to 
 generate .di files, implementation was still included in them.

Apr 19 2008
parent reply e-t172 <e-t172 akegroup.org> writes:
Hans W. Uhlig a écrit :
 e-t172 wrote:
 - More efficient. For example, on a Linux distro, if you want to write 
 a program using a library, you need to install the "dev" package of 
 the library, which only contains header files. There is no point of 
 including the implementation in the package, because it is not useful 
 to the user (and definitely not useful if you only want to compile a 
 project, not modify it). See the glibc as an example : if you needed 
 the entire source code of the glibc every time you wanted to compile a 
 program, this would have been a pure waste in terms of disk space and 
 compiler efficiency.

Why should You need extra code files for development. I have always found this annoying, you either need the library or you don't. If the library has API documentation, why would you need headers.

Because API documentation is documentation, not D code that can be parsed by a compiler. If I want to use a library, I have to write:

    import mylib;

For that to work, you either need the header file "mylib.di" or the D file "mylib.d".
 - Clear separation of "compiled" code and compile-time code. That is, 
 if a library provides "normal" code (accessed by an API) and 
 compile-time code (which is compiled in the application that uses the 
 library, not in the library itself), the two can be clearly 
 distinguished: "normal" code will only consist of declarations in the 
 header file, while compile-time code will be entirely defined in the 
 header file. That way, the user knows what IS in the library (the .a 
 or .so file), and what will be compiled in his application. (of course 
 this is already possible, just not as "clearly" for the user)

Ok, I can see this for fixed constants used by the library. Otherwise, why would you need them.

Huh? Sorry, I don't understand.
 - Ability to distribute closed-source libraries. I'm against 
 closed-source libraries, but I know that a lot of people need them.

Again, I say API Docs!

See my first comment. There is no way API documentation can replace header files. Unless, of course, you tell the compiler how to parse and understand the API documentation, but that's just awkward.
Apr 20 2008
parent reply "Hans W. Uhlig" <huhlig gmail.com> writes:
Jesse Phillips wrote:
 On Sun, 20 Apr 2008 12:27:51 +0200, e-t172 wrote:
 
 Hans W. Uhlig a écrit :
 e-t172 wrote:
 - More efficient. For example, on a Linux distro, if you want to write
 a program using a library, you need to install the "dev" package of
 the library, which only contains header files. There is no point of
 including the implementation in the package, because it is not useful
 to the user (and definitely not useful if you only want to compile a
 project, not modify it). See the glibc as an example : if you needed
 the entire source code of the glibc every time you wanted to compile a
 program, this would have been a pure waste in terms of disk space and
 compiler efficiency.

 Why should You need extra code files for development. I have always
 found this annoying, you either need the library or you don't. If the
 library has API documentation, why would you need headers.

 Because API documentation is documentation, not D code that can be
 parsed by a compiler. If I want to use a library, I have to write: import
 mylib; For that to work, you either need the header file "mylib.di" or
 the D file "mylib.d".
 - Clear separation of "compiled" code and compile-time code. That is,
 if a library provides "normal" code (accessed by an API) and
 compile-time code (which is compiled in the application that uses the
 library, not in the library itself), the two can be clearly
 distinguished: "normal" code will only consist of declarations in the
 header file, while compile-time code will be entirely defined in the
 header file. That way, the user knows what IS in the library (the .a
 or .so file), and what will be compiled in his application. (of course
 this is already possible, just not as "clearly" for the user)

 Ok, I can see this for fixed constants used by the library. Otherwise,
 why would you need them.

 - Ability to distribute closed-source libraries. I'm against
 closed-source libraries, but I know that a lot of people need them.

 Again, I say API Docs!

 See my first comment. There is no way API documentation can replace
 header files. Unless, of course, you tell the compiler how to parse and
 understand the API documentation, but that's just awkward.

I'm sorry, but in reading through the posts I do not see how headers could replace API documentation. And here is the distinction I would like to put forth: what the programmer needs vs what the compiler needs.

API documentation should be all you need to get information about the function you are calling. That is it for the user; they shouldn't need anything more. Having a header file would just be redundant and not displayable on the web.

Now there is the compiler. Many points for having header files have come up, as they provide a means to compile library code without the source, among other things. And it appears your view is that these header files should be created automatically. All of this sounds exactly like the reasons for having di files, which seem to have some problems, though much of that is speculation as to whether they would be a problem or not.

So my thoughts here would be: create examples of what should be doable, show how it fails, and try to get di files fixed.

Then they need not be human readable. API docs are for human consumption; .di files are for compiler consumption.
Apr 20 2008
parent reply Yigal Chripun <yigal100 gmail.com> writes:
Hans W. Uhlig wrote:
 Jesse Phillips wrote:
 On Sun, 20 Apr 2008 12:27:51 +0200, e-t172 wrote:

 Hans W. Uhlig a écrit :
 e-t172 wrote:
 - More efficient. For example, on a Linux distro, if you want to write
 a program using a library, you need to install the "dev" package of
 the library, which only contains header files. There is no point of
 including the implementation in the package, because it is not useful
 to the user (and definitely not useful if you only want to compile a
 project, not modify it). See the glibc as an example : if you needed
 the entire source code of the glibc every time you wanted to compile a
 program, this would have been a pure waste in terms of disk space and
 compiler efficiency.

 Why should You need extra code files for development. I have always
 found this annoying, you either need the library or you don't. If the
 library has API documentation, why would you need headers.

 Because API documentation is documentation, not D code that can be
 parsed by a compiler. If I want to use a library, I have to write: import
 mylib; For that to work, you either need the header file "mylib.di" or
 the D file "mylib.d".
 - Clear separation of "compiled" code and compile-time code. That is,
 if a library provides "normal" code (accessed by an API) and
 compile-time code (which is compiled in the application that uses the
 library, not in the library itself), the two can be clearly
 distinguished: "normal" code will only consist of declarations in the
 header file, while compile-time code will be entirely defined in the
 header file. That way, the user knows what IS in the library (the .a
 or .so file), and what will be compiled in his application. (of course
 this is already possible, just not as "clearly" for the user)

 Ok, I can see this for fixed constants used by the library. Otherwise,
 why would you need them.

 - Ability to distribute closed-source libraries. I'm against
 closed-source libraries, but I know that a lot of people need them.

 Again, I say API Docs!

 See my first comment. There is no way API documentation can replace
 header files. Unless, of course, you tell the compiler how to parse and
 understand the API documentation, but that's just awkward.

I'm sorry, but in reading through the posts I do not see how headers could replace API documentation. And here is the distinction I would like to put forth, what the programmer needs vs what the compiler needs. API documentation should be all you need to get information about the function you are calling. That is it for the user, they shouldn't need anything more, having a header file would just be redundant and not displayable on the web. Now there is the compiler. Many points for having header files have come up as they provide a means to compile library code without the source and other things. And it appears your view is that these header files should be created automatically. And all of this sounds exactly like the reasons for having di files. Which seems to have some problems, and it seems that it is a lot of speculation as to if it would be a problem or not. So my thoughts here would be: create examples as to would should be doable, show how it fails, and try to get di files fixed.

then they need not be human readable. api docs are for human consumption .di are for compiler consumption

In a perfect world, you are right. However, not all programmers like writing documentation. What if the provided documentation is only partial? You've got a compiled lib and a .di file, both not human readable as you propose. Where do you get the API from? Sometimes you need to look at the code, and it helps if you only need to look at the header files that define the API and not all the code of the project, which might be very large.

--Yigal
Apr 20 2008
next sibling parent "Hans W. Uhlig" <huhlig gmail.com> writes:
Yigal Chripun wrote:
 Hans W. Uhlig wrote:
 Jesse Phillips wrote:
 On Sun, 20 Apr 2008 12:27:51 +0200, e-t172 wrote:

 Hans W. Uhlig a écrit :
 e-t172 wrote:
 - More efficient. For example, on a Linux distro, if you want to write
 a program using a library, you need to install the "dev" package of
 the library, which only contains header files. There is no point of
 including the implementation in the package, because it is not useful
 to the user (and definitely not useful if you only want to compile a
 project, not modify it). See the glibc as an example : if you needed
 the entire source code of the glibc every time you wanted to compile a
 program, this would have been a pure waste in terms of disk space and
 compiler efficiency.

 Why should You need extra code files for development. I have always
 found this annoying, you either need the library or you don't. If the
 library has API documentation, why would you need headers.

 Because API documentation is documentation, not D code that can be
 parsed by a compiler. If I want to use a library, I have to write: import
 mylib; For that to work, you either need the header file "mylib.di" or
 the D file "mylib.d".
 - Clear separation of "compiled" code and compile-time code. That is,
 if a library provides "normal" code (accessed by an API) and
 compile-time code (which is compiled in the application that uses the
 library, not in the library itself), the two can be clearly
 distinguished: "normal" code will only consist of declarations in the
 header file, while compile-time code will be entirely defined in the
 header file. That way, the user knows what IS in the library (the .a
 or .so file), and what will be compiled in his application. (of course
 this is already possible, just not as "clearly" for the user)

 Ok, I can see this for fixed constants used by the library. Otherwise,
 why would you need them.

 - Ability to distribute closed-source libraries. I'm against
 closed-source libraries, but I know that a lot of people need them.

 Again, I say API Docs!

 See my first comment. There is no way API documentation can replace
 header files. Unless, of course, you tell the compiler how to parse and
 understand the API documentation, but that's just awkward.

 I'm sorry, but in reading through the posts I do not see how headers
 could replace API documentation. And here is the distinction I would like to put forth, what the programmer needs vs what the compiler needs. API documentation should be all you need to get information about the function you are calling. That is it for the user, they shouldn't need anything more, having a header file would just be redundant and not displayable on the web. Now there is the compiler. Many points for having header files have come up as they provide a means to compile library code without the source and other things. And it appears your view is that these header files should be created automatically. And all of this sounds exactly like the reasons for having di files. Which seems to have some problems, and it seems that it is a lot of speculation as to if it would be a problem or not. So my thoughts here would be: create examples as to what should be doable, show how it fails, and try to get di files fixed.

then they need not be human readable. api docs are for human consumption .di are for compiler consumption

 In a perfect world, you are right. however, not all programmers like writing documentation. what if the provided documentation is only partial? you've got a compiled lib and a .di file both not human readable as you propose. where do you get the API from?

Same place the DI file came from. Auto-generated documentation like javadocs can provide the exact same information as a header file can, plus any extra documentation that the programmer did provide. So:

.d   - source file
.did - D interface Documentation
.di  - D interface "Header"
.do  - D object file (not sure what d uses as default. is it .obj?)

Each (group of) "source" file(s) should end up creating 3 files from it: Interface, Documentation and object.
 sometimes you need to look at the code and it helps if you only need to
 look at the header files that define the API and not all the code of the
 project which might be very large.

Why? Automatically generated documentation shouldn't be hard to create. I hate to say it, but Java did a good job in this respect, even if their way of going about it needs to be upgraded.
 
 --Yigal

Apr 20 2008
prev sibling next sibling parent reply "Hans W. Uhlig" <huhlig gmail.com> writes:
Jesse Phillips wrote:
 On Sun, 20 Apr 2008 22:18:33 +0300, Yigal Chripun wrote:
 
 Hans W. Uhlig wrote:
 Jesse Phillips wrote:
 On Sun, 20 Apr 2008 12:27:51 +0200, e-t172 wrote:

 Hans W. Uhlig a écrit :
 e-t172 wrote:
 - More efficient. For example, on a Linux distro, if you want to
 write a program using a library, you need to install the "dev"
 package of the library, which only contains header files. There is
 no point of including the implementation in the package, because it
 is not useful to the user (and definitely not useful if you only
 want to compile a project, not modify it). See the glibc as an
 example : if you needed the entire source code of the glibc every
 time you wanted to compile a program, this would have been a pure
 waste in terms of disk space and compiler efficiency.

 Why should You need extra code files for development. I have always
 found this annoying, you either need the library or you don't. If the
 library has API documentation, why would you need headers.

 Because API documentation is documentation, not D code that can be
 parsed by a compiler. If I want to use a library, I have to write: import
 mylib; For that to work, you either need the header file "mylib.di" or
 the D file "mylib.d".
 - Clear separation of "compiled" code and compile-time code. That
 is, if a library provides "normal" code (accessed by an API) and
 compile-time code (which is compiled in the application that uses
 the library, not in the library itself), the two can be clearly
 distinguished: "normal" code will only consist of declarations in
 the header file, while compile-time code will be entirely defined
 in the header file. That way, the user knows what IS in the library
 (the .a or .so file), and what will be compiled in his application.
 (of course this is already possible, just not as "clearly" for the
 user)

 Ok, I can see this for fixed constants used by the library.
 Otherwise, why would you need them.

 - Ability to distribute closed-source libraries. I'm against
 closed-source libraries, but I know that a lot of people need them.

 Again, I say API Docs!

 See my first comment. There is no way API documentation can replace
 header files. Unless, of course, you tell the compiler how to parse and
 understand the API documentation, but that's just awkward.

 I'm sorry, but in reading through the posts I do not see how headers
 could replace API documentation. And here is the distinction I would like to put forth, what the programmer needs vs what the compiler needs. API documentation should be all you need to get information about the function you are calling. That is it for the user, they shouldn't need anything more, having a header file would just be redundant and not displayable on the web. Now there is the compiler. Many points for having header files have come up as they provide a means to compile library code without the source and other things. And it appears your view is that these header files should be created automatically. And all of this sounds exactly like the reasons for having di files. Which seems to have some problems, and it seems that it is a lot of speculation as to if it would be a problem or not. So my thoughts here would be: create examples as to what should be doable, show how it fails, and try to get di files fixed.

then they need not be human readable. api docs are for human consumption .di are for compiler consumption

 In a perfect world, you are right. however, not all programmers like writing documentation. what if the provided documentation is only partial? you've got a compiled lib and a .di file both not human readable as you propose. where do you get the API from? sometimes you need to look at the code and it helps if you only need to look at the header files that define the API and not all the code of the project which might be very large. --Yigal

Yeah, so. The generated di files already do most of what is asked; sure, they aren't very human readable, and there are whatever other problems come with inline code. If these files aren't doing enough, request correction in di generation, not an entirely new useless file.

If indentation and such are such a problem, just run it through a pretty printer.
 
 Here are my thoughts on the lackluster readability of di files. For one,
 if you're using a library with no documentation, you're likely going to
 have other problems than just the di files.

Very much agreed
 2nd you could request it be cleaned up. 

Again, Pretty printer
 3rd, write a program to generate html that is human readable from the

Why not make this part of the compiler?
Apr 20 2008
parent Bill Baxter <dnewsgroup billbaxter.com> writes:
Hans W. Uhlig wrote:

 Yeah, so. The generated di files already do most of what is asked, 
 sure it isn't very human readable and what ever other problems that 
 come with inline code. If these files aren't doing enough, request 
 correction in di generation, not an entirely new useless file.

If indentation and such are such a problem, just run it through a pretty printer.

The compiler has a feature to generate .di files. Why would you not want it to do that job nicely? Yeh, your suggestion is an ok work-around but the right solution is to just improve the built-in .di generator. All I'm talking about is adding some white space appropriately. I also wouldn't say making such changes should be a very high priority. It's just one of those bazillion bits of polishing that remains to be done, but should be done, some day. --bb
Apr 20 2008
prev sibling parent Bill Baxter <dnewsgroup billbaxter.com> writes:
Jesse Phillips wrote:
 On Sun, 20 Apr 2008 22:18:33 +0300, Yigal Chripun wrote:

 Here are my thoughts on the lackluster readability of di files. ...
 2nd you could request it be cleaned up.

By the way, I did this one already a while ago: http://d.puremagic.com/issues/show_bug.cgi?id=1427 --bb
Apr 20 2008
prev sibling next sibling parent Jesse Phillips <jessekphillips gmail.com> writes:
On Sun, 20 Apr 2008 12:27:51 +0200, e-t172 wrote:

 Hans W. Uhlig a écrit :
 e-t172 wrote:
 - More efficient. For example, on a Linux distro, if you want to write
 a program using a library, you need to install the "dev" package of
 the library, which only contains header files. There is no point of
 including the implementation in the package, because it is not useful
 to the user (and definitely not useful if you only want to compile a
 project, not modify it). See the glibc as an example : if you needed
 the entire source code of the glibc every time you wanted to compile a
 program, this would have been a pure waste in terms of disk space and
 compiler efficiency.

Why should You need extra code files for development. I have always found this annoying, you either need the library or you don't. If the library has API documentation, why would you need headers.

Because API documentation is documentation, not D code that can be parsed by a compiler. If I want to use a library, I have to write: import mylib; For that to work, you either need the header file "mylib.di" or the D file "mylib.d".
 - Clear separation of "compiled" code and compile-time code. That is,
 if a library provides "normal" code (accessed by an API) and
 compile-time code (which is compiled in the application that uses the
 library, not in the library itself), the two can be clearly
 distinguished: "normal" code will only consist of declarations in the
 header file, while compile-time code will be entirely defined in the
 header file. That way, the user knows what IS in the library (the .a
 or .so file), and what will be compiled in his application. (of course
 this is already possible, just not as "clearly" for the user)

Ok, I can see this for fixed constants used by the library. Otherwise, why would you need them.

Huh? Sorry, I don't understand.
 - Ability to distribute closed-source libraries. I'm against
 closed-source libraries, but I know that a lot of people need them.

Again, I say API Docs!

See my first comment. There is no way API documentation can replace header files. Unless, of course, you tell the compiler how to parse and understand the API documentation, but that's just awkward.

I'm sorry, but in reading through the posts I do not see how headers could replace API documentation. And here is the distinction I would like to put forth: what the programmer needs vs what the compiler needs.

API documentation should be all you need to get information about the function you are calling. That is it for the user; they shouldn't need anything more. Having a header file would just be redundant and not displayable on the web.

Now there is the compiler. Many points for having header files have come up, as they provide a means to compile library code without the source, among other things. And it appears your view is that these header files should be created automatically. All of this sounds exactly like the reasons for having di files, which seem to have some problems, though much of that is speculation as to whether they would be a problem or not.

So my thoughts here would be: create examples of what should be doable, show how it fails, and try to get di files fixed.
Apr 20 2008
prev sibling parent Jesse Phillips <jessekphillips gmail.com> writes:
On Sun, 20 Apr 2008 22:18:33 +0300, Yigal Chripun wrote:

 Hans W. Uhlig wrote:
 Jesse Phillips wrote:
 On Sun, 20 Apr 2008 12:27:51 +0200, e-t172 wrote:

 Hans W. Uhlig a écrit :
 e-t172 wrote:
 - More efficient. For example, on a Linux distro, if you want to
 write a program using a library, you need to install the "dev"
 package of the library, which only contains header files. There is
 no point of including the implementation in the package, because it
 is not useful to the user (and definitely not useful if you only
 want to compile a project, not modify it). See the glibc as an
 example : if you needed the entire source code of the glibc every
 time you wanted to compile a program, this would have been a pure
 waste in terms of disk space and compiler efficiency.

found this annoying, you either need the library or you don't. If the library has API documentation, why would you need headers?

parsed by a compiler. If I want to use a library, I have to write: import mylib; For that to work, you either need the header file "mylib.di" or the D file "mylib.d".
 - Clear separation of "compiled" code and compile-time code. That
 is, if a library provides "normal" code (accessed by an API) and
 compile-time code (which is compiled in the application that uses
 the library, not in the library itself), the two can be clearly
 distinguished: "normal" code will only consist of declarations in
 the header file, while compile-time code will be entirely defined
 in the header file. That way, the user knows what IS in the library
 (the .a or .so file), and what will be compiled in his application.
 (of course this is already possible, just not as "clearly" for the
 user)

Otherwise, why would you need them.

 - Ability to distribute closed-source libraries. I'm against
 closed-source libraries, but I know that a lot of people need them.


header files. Unless, of course, you tell the compiler how to parse and understand the API documentation, but that's just awkward.

I'm sorry, but in reading through the posts I do not see how headers could replace API documentation. Here is the distinction I would like to put forth: what the programmer needs vs. what the compiler needs.

API documentation should be all you need to get information about the function you are calling. That is it for the user; they shouldn't need anything more. A header file would just be redundant, and not displayable on the web.

Then there is the compiler. Many points in favor of header files have come up, as they provide a means to compile against library code without the source, among other things. And it appears your view is that these header files should be created automatically. All of this sounds exactly like the rationale for .di files, which seem to have some problems, though it is largely speculation as to whether they would actually be a problem or not. So my thoughts here would be: create examples of what should be doable, show how it fails, and try to get .di files fixed.

then they need not be human readable. API docs are for human consumption; .di files are for compiler consumption.

In a perfect world, you are right. However, not all programmers like writing documentation. What if the provided documentation is only partial? You've got a compiled lib and a .di file, both not human readable as you propose. Where do you get the API from? Sometimes you need to look at the code, and it helps if you only need to look at the header files that define the API, not all the code of the project, which might be very large. --Yigal

Yeah, so? The generated .di files already do most of what is asked, granted they aren't very human readable and have whatever other problems come with inline code. If these files aren't doing enough, request corrections to .di generation, not an entirely new file. Here are my thoughts on the lackluster readability of .di files: for one, if you're using a library with no documentation, you're likely going to have bigger problems than the .di files. Second, you could request that the generated output be cleaned up. Third, write a program that generates human-readable HTML from the .di files.
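That third suggestion could start as small as the sketch below (dmd can already emit the header itself, e.g. with its -H switch; the function here is hypothetical and does no real parsing, it only escapes the header text and wraps it in a page - a real tool would parse declarations and cross-link them):

```python
import html

def di_to_html(di_source: str, title: str = "API") -> str:
    """Wrap the text of a .di header in a minimal HTML page.

    Sketch only: escapes the source and preserves its layout verbatim
    inside a <pre> block, so special characters like < and & are safe.
    """
    body = html.escape(di_source)
    return (
        f"<html><head><title>{html.escape(title)}</title></head>"
        f"<body><pre>{body}</pre></body></html>"
    )

# Usage: feed it the contents of a generated header file.
page = di_to_html("int computeChecksum(const(ubyte)[] data);", "mylib")
```

Even this trivial version makes a .di file browsable on the web, which was the readability complaint in the first place.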
Apr 20 2008
prev sibling next sibling parent Tower Ty <towerty msn.com.au> writes:
Koroskin Denis Wrote:

 First of all, I don't want to start Tango vs. Phobos vs. ??? flame war.
 But the way Tango or Phobos evolves is not the best one.
 
 Current situation is, someone writes code, probably nice one, and it is  
 added
 to main trunk. Problem is, interface is implementation driven, not  
 otherwise.
 It is not discussed. And that's bad. Tests first, then code, Kent Beck said.
 Of course, implementation can affect interface, but only after trial.
 
 I mean, what we need is a detailed document (probably, wikified one) with
 detailed library interfaces, their rationale, use cases, examples, stress
 tests but NO implementation! Implementation is important, too, but only to  
 end
 users, and not for standardization. Reference implementation will follow, I
 promise. It shouldn't be fast, it should be CORRECT and standard compliant  
 in
 the first place, and it should pass D Library Stress Test.
 
 We need some kind of committee that would endorse that. And a separate  
 newsgroup
 section. Drafts should be stored in wiki.
 
 As such, my suggestion is to revive digitalmars.dtl group!
 The condition is it should be regularly monitored by Walter/Andrei or any  
 other
 person, that will be assigned for a duty.
 
 We should discuss and answer the following questions:
 
 - How DTL should be organized (bunch of files or structured like in  
 Java/C#/Tango)?
 - What modules should it consist of?
 - What classes does it provide?
 - What interfaces these classes expose?
 - What feature set of D should it use?
 - Templates vs. Object Oriented approach
 
 The library should document all these. Extensive set of functional and  
 unit tests should also be provided.
 Reference implementation for D1/D2 will exist. However, library should be  
 design driven, not implementation driven.
 
 Any module/class/method should be removed if Walter is not satisfied with  
 it.
 Invariant should be held, that at any given moment Walter is satisfied  
 with every piece of the library.
 
 I believe this is the only way we can create single powerful standard  
 library.

Come from a C++ environment and want to change it all back to C++? Simple: get out of D and go back to C.
Apr 19 2008
prev sibling parent Lars Ivar Igesund <larsivar igesund.net> writes:
Koroskin Denis wrote:

 First of all, I don't want to start Tango vs. Phobos vs. ??? flame war.
 But the way Tango or Phobos evolves is not the best one.
 
 Current situation is, someone writes code, probably nice one, and it is
 added
 to main trunk. Problem is, interface is implementation driven, not
 otherwise.
 It is not discussed. And that's bad. Tests first, then code, Kent Beck
 said. Of course, implementation can affect interface, but only after
 trial.

You don't have a perfectly correct image of how Tango evolves - writing code certainly is far from enough for inclusion in Tango. Everything that is included is heavily scrutinized - to the degree that is possible in an open source project. Without a shortcut in the process here and there, Tango would never see a single addition.
 
 I mean, what we need is a detailed document (probably, wikified one) with
 detailed library interfaces, their rationale, use cases, examples, stress
 tests but NO implementation! Implementation is important, too, but only to
 end
 users, and not for standardization. Reference implementation will follow,
 I promise. It shouldn't be fast, it should be CORRECT and standard
 compliant in
 the first place, and it should pass D Library Stress Test.

While it is very important to consider the interface that should be used, cementing it _prior_ to implementation is crazy, and not at all good for either usability or the implementation itself. FWIW, though, we do invite users to be involved in such processes - we even use our wiki for it, and we have several going. As of now, their success is somewhat contended - mostly because they require a lot of people to have the necessary mental bandwidth (which is already limited, since this is an open source project).
 We need some kind of committee that would endorse that. And a separate
 newsgroup
 section. Drafts should be stored in wiki.

Design by committee has a bad reputation in D, it seems, and until D actually reaches a higher level of usage, this will not even be possible to consider. And even if D gets there, I'm certainly not sure that a design process such as the one described above is smart. Even Java's JSRs (which may or may not be considered successful) are almost always based on existing implementations, and got there because the initial implementation had seen some success.
 As such, my suggestion is to revive digitalmars.dtl group!
 The condition is it should be regularly monitored by Walter/Andrei or any
 other
 person, that will be assigned for a duty.

Similar things have been suggested many times, sometimes for a similar topic, but nope - it won't happen.
 
 We should discuss and answer the following questions:
 
 - How DTL should be organized (bunch of files or structured like in
 Java/C#/Tango)?
 - What modules should it consist of?
 - What classes does it provide?
 - What interfaces these classes expose?
 - What feature set of D should it use?
 - Templates vs. Object Oriented approach

I personally thought that this kind of development died sometime in the 80's. You certainly wouldn't get a single line of implementation in the next 3 years (at least).
 The library should document all these. Extensive set of functional and
 unit tests should also be provided.

Indeed, that should be a goal of any library, and it even is for Tango - but beyond what is possible to extract automatically, there is a hard limit - because we're operating in an open source environment.
 Reference implementation for D1/D2 will exist. However, library should be
 design driven, not implementation driven.

If anything, the development of a library should be functionality/usability driven. Design driven is madness - what happens if, after implementation, the whole design turns out to suck? Like the initial versions of the JRE... You will then have spent inordinate amounts of time on something that is not usable. Having the design discussions among the brightest minds won't help - even they cannot know offhand which designs will work, although a team experienced with a language may get a notion of that after some years.
 Any module/class/method should be removed if Walter is not satisfied with
 it.
 Invariant should be held, that at any given moment Walter is satisfied
 with every piece of the library.

It has long been obvious that Walter will not get involved in any heavy library lifting.
 I believe this is the only way we can create single powerful standard
 library.

You would get a single non-existent standard library. As it is now, you have two alternatives, at least one of which is well designed (but of course not perfect).

--
Lars Ivar Igesund
blog at http://larsivi.net
DSource, #d.tango & #D: larsivi
Dancing the Tango
Apr 19 2008