Static Library Support: Project Dependencies on Static Libraries

Last time I was dealing with the project build types, which allowed DDT to produce static libraries out of a project. That's a vital element for a larger project infrastructure, so its implementation should be a priority. This time there was another problem related to larger projects: it's all nice and good to have lib files produced, but we couldn't really do anything with them at all.

Actually, as with most of my development, I relied heavily on what Bruno did before. As far as I understood the existing code, the heavy lifting was already done. That is, project dependencies on libraries were implemented as a skeleton, only it wasn't useful… yet. The only thing left to implement was to use the dependent projects' output (the static libraries themselves) as input modules, and their source directories as input directories, in the referring project. But nothing is as easy as it looks. I've just encountered a quite disturbing fact: the DLTK documentation is virtually non-existent. There are some vague articles in their wiki, but the API documentation is of hardly any use. That makes it quite hard to develop anything on a DLTK basis, and frankly I'm worried about the scenario in which DDT hits a serious problem with the DLTK. The actual issue I encountered looked quite simple: how can I get the libraries from a different project and resolve their paths relative to the workspace? So far I haven't found anything useful for this. For the time being, I had to hack the references to other projects with a simple "../[projectname]/src" as the import directory and "../[projectname]/lib/[projectname].lib" as the input file. The real solution would be to get access to the other project's DeeBuildOptions object, which could provide the necessary settings: the output file, its build type, and, from its IScriptProject info, the source folders, as there could be more than one.
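The interim hack can be sketched in a few lines. This is my own illustration, not code from the DDT code base: the class and method names are hypothetical, and the real fix would query the dependent project's DeeBuildOptions instead of hard-coding the default layout like this.

```java
// Hypothetical sketch of the interim hack: given the name of a referenced
// project, build the workspace-relative import directory and static library
// path that get passed to the compiler. Assumes the default 'src'/'lib'
// layout and the Windows '.lib' extension.
public class ProjectDependencyHack {

    // Import directory of the referenced project, assuming the default layout.
    static String importDir(String projectName) {
        return "../" + projectName + "/src";
    }

    // Static library produced by the referenced project, assuming the default layout.
    static String libraryFile(String projectName) {
        return "../" + projectName + "/lib/" + projectName + ".lib";
    }

    public static void main(String[] args) {
        System.out.println(importDir("mylib"));   // ../mylib/src
        System.out.println(libraryFile("mylib")); // ../mylib/lib/mylib.lib
    }
}
```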

In other words, to use this new feature the user has to make sure that the directory structure matches the default and that there are no additional source folders. It also probably won't work with anything but DMD, and possibly only on Windows. The next target is to eliminate these limitations.

The corresponding change set in the source code: r341098a21488


Static library support

After some messing around with EGit, I finally came to understand how to push my changes to the Google Code clone. It came as a surprise, because I was sure that I understood the Git architecture. The problem was that I didn't have proper refspecs for the Google Code clone's repository.

Today's exercise was to add some support for static libraries. I found that the DeeBuildOptions class had some traces of support for different build types, but it was hard-set to EXECUTABLE without an option to change it. For the static library case, the DMD compiler offers a “-lib” option to build static libraries. In addition, I found that there was already a combo box for setting the build type in the DeeProjectOptionsBlock, which is responsible for handling the compiler options in the UI.

So this part of the job was pretty easy: uncomment the relevant part of the DeeProjectOptionsBlock.createControl method that shows the combo box for the build type setting. To handle the default cases better than we currently do, I hid all the properties of the DeeBuildOptions class and added some handling of default values. If no output directory has been set yet, it will depend on the build type: if the build type is executable or dynamic library, the output directory is 'bin'; if the build type is static library, it will be the 'lib' directory. However, I noticed a tiny little problem here. The D programming language is a native language, which means the compiled modules are subject to linking. That is, the real output of the build process is the executable/library/dynamic library, and the object files themselves are a kind of by-product of this process. In most of the build systems I have seen, there is a separation between the intermediate files (object files) and the final output. As a next step I would like to add an intermediate directory for the object files (of course, if the user wishes, he can keep the output files mixed with the object files).
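The default-value logic above can be sketched as follows. This is a minimal illustration under my own names, not the actual DeeBuildOptions API; the extensions assume the Windows/DMD conventions mentioned elsewhere in this post.

```java
// Minimal sketch of the build-type-dependent defaults: when the user hasn't
// set an output directory or artifact extension, derive it from the build type.
// Enum and method names are my own, not DDT's.
public class BuildDefaults {

    enum BuildType { EXECUTABLE, LIB_DYNAMIC, LIB_STATIC }

    // Default output directory: 'lib' for static libraries, 'bin' otherwise.
    static String defaultOutputDir(BuildType type) {
        return type == BuildType.LIB_STATIC ? "lib" : "bin";
    }

    // Default artifact extension on Windows with DMD (assumed):
    // '.lib' for static libraries, '.dll' for dynamic, '.exe' for executables.
    static String defaultExtension(BuildType type) {
        switch (type) {
            case LIB_STATIC:  return ".lib";
            case LIB_DYNAMIC: return ".dll";
            default:          return ".exe";
        }
    }

    public static void main(String[] args) {
        System.out.println(defaultOutputDir(BuildType.EXECUTABLE)); // bin
        System.out.println(defaultOutputDir(BuildType.LIB_STATIC)); // lib
        System.out.println(defaultExtension(BuildType.LIB_STATIC)); // .lib
    }
}
```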

On the UI side, with a small hack, I update only those DeeBuildOptions fields that were actually edited, so that my changes to the default values can take effect. But I also need to update the artifact name field, which I just realized I didn't; next commit. If you switch to the LIB_STATIC build type, the output folder and the output extension will change to 'lib'.

Also, before committing these changes I merged Bruno's recent changes (mostly GDC-related stuff).

The corresponding change set in the source code: r080d4e50bc25

IModelElement Icons

My previous experiments with DDT were about changing how the Outline View looks in DDT. It was a good exercise to understand the basics of Eclipse plug-in development and to learn Bruno's code. However, it didn't work out as I expected, as it took quite a while to understand how the DLTK assists the plug-in in building a model from the source code. This IModelElement hierarchy represents the basic model of the project's tree, and is therefore what any graphical representation uses, as in the Script View or the Outline View.

Later, Bruno unified the whole thing in the DeeModelElementLabelProvider, which works in three different contexts: Script View (navigation), Outline View (view), and Completion Proposal (CodeAssist). As I can't settle for the current icons, I decided to change this bit of the code. Bruno made it clear that he wants to use the private/public/protected modifiers as an overlay, but I would rather not do that for fields and methods. The reason is simple: the overlaid icons have limited space, and a field (variable) or a method (function) could easily have quite a few modifiers, such as “private static immutable”; having three overlays would look quite crowded.

As a result, I brought my previous patch into play, adding the JDT's icons for methods and fields. For the moment I didn't add anything else, but it only needs a little painting to get a private/public/protected overlay for other elements, such as classes, structs, enums, interfaces, etc.

I couldn't find all of my previous work, however: previously I had added the types/return types to all these fields, similarly to JDT. Anyway, this is what it looks like right now:

The corresponding change set in the source code: r427c36a3c2e4

Features worth looking at

DDT is still quite immature, so plenty of features are missing for it to be a productive development environment. Browsing the code and messing with the product itself, I decided to come up with an approximate list of features that should be implemented to make DDT worth using as an IDE for D development in the long run. The following list will change a lot in the future, as I revisit it in light of ongoing development.

Static and dynamic library support: In a real software ecosystem, static libraries are more essential than almost anything else. Adding static library support seems easy, but I haven't tried building a dynamic library with D yet.

Debugger support: This is perhaps the biggest one. There is no development IDE without a useful debugger interface, and unfortunately DDT is lacking one. This is also quite a problem with D itself: as far as my experiments went, the only compiler that produces meaningful debug info is DMD, but we have limited support for command-line based debuggers. GDB would be a perfect choice, but at the moment I have no convincing evidence that GDB's D support actually works. I need to investigate the matter further.

Refactoring: The CodeAssist for D is quite promising, and I don't see any obstacle to implementing the most popular refactoring strategies: it's just a matter of time and arse.

Unit-testing IDE integration: In software development these days, it is imperative to offer developers a good, reliable testing facility. JUnit has excellent support in Eclipse, where you can track graphically which unit tests are present in the source code and get a good report on their progress in the test view.

As a framework, I think it is worth having a look at the Felt project, as it aims to provide all the agile goodies through several libraries. The DUnit framework in particular provides a unit testing framework that I could build IDE support for. The real deal here is parsing the output of the unit test executable. Unfortunately, it seems to rely on the Tango library, which I find quite disturbing: Tango is an optional library, but this way all software that uses the Felt libraries will depend on Tango. Since the Felt library was last updated quite a long time ago, as I explore it I should remove its dependencies on the Tango library.
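The output-parsing part could look something like the sketch below. The “PASS name / FAIL name” report format here is purely hypothetical, just to show the shape of the problem; the real work would be matching whatever the DUnit executable actually prints.

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical parser for unit test runner output. The line format
// ("PASS <name>" / "FAIL <name>") is an assumption for illustration only.
public class UnitTestOutputParser {

    private static final Pattern LINE = Pattern.compile("^(PASS|FAIL)\\s+(\\S+)$");

    // Returns a map from test name to whether it passed, in output order.
    static Map<String, Boolean> parse(List<String> outputLines) {
        Map<String, Boolean> results = new LinkedHashMap<>();
        for (String line : outputLines) {
            Matcher m = LINE.matcher(line.trim());
            if (m.matches()) {
                results.put(m.group(2), m.group(1).equals("PASS"));
            }
        }
        return results;
    }

    public static void main(String[] args) {
        Map<String, Boolean> r =
            parse(Arrays.asList("PASS testAdd", "FAIL testDiv", "garbage line"));
        System.out.println(r); // {testAdd=true, testDiv=false}
    }
}
```

A real integration would feed these results into an Eclipse test view, much as the JUnit runner does.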

Quick Reminder How to Develop DDT

Since I ran into problems while trying to set up my Eclipse environment to develop the DDT plug-in, I decided to write a quick reminder of how the installation should look. My development environment is Eclipse Indigo (3.7.x) on Windows.

  1. Make sure that Eclipse is completely up to date. (Help/Check for software updates)
  2. First, I need to install the PDE, the plug-in development environment for Eclipse. This is easy; however, there are always problems with the naming. So, open the Help/Install New Software dialog from the menu, select the Indigo repository, untick “Group items by category”, and search for the phrase “Eclipse Plug-in Development Environment”. Install it.
  3. DLTK 3.0: This is the default version line for Eclipse Indigo, so we should find it in the Eclipse repository as well. Open the Help/Install New Software dialog, untick “Group items by category”, and search for the phrase “Dynamic Languages Toolkit – Core Frameworks”. Install it.
  4. Since the original Descent project lives in an SVN repository, I need to install the Subversive SVN client for Eclipse. In the same dialog as before, I search for the phrase “Subversive SVN Team Provider (Incubation)” and go.

As I expected, EGit became part of the Eclipse Indigo release, so no additional installation is required. Yet.

I need to be careful about the import order of the related projects. According to Bruno's tutorial, what I need is to import the descent.compiler project from SVN. In the Package Explorer's context menu, click Import and select SVN/Project from SVN repository. Paste the repository address as the URL and, on the resource page, select the trunk/descent.compiler project. Ta-dam! Oh, wait… the first time you want to import from SVN, the Subversive team provider plug-in will find that you don't have a connector, so one needs to be installed. After several tries, I found that SVNKit 1.35 works just fine.

Now comes the real deal, the source code of DDT. Luckily, it's really simple: again, in the Package Explorer's context menu, click Import, select Git, and in the upcoming dialog click the Clone… button. There it is: paste the URL and select all the branches (well, the branches themselves are a bit funny, because it still seems that I need to add a new remote to get the other branches, but heck, it didn't break anything). Note that this clone isn't my Google Code clone's address; I'll add that later as an upstream clone, while Bruno's will be the downstream update site.

When naming the remote branches, I chose the bruno-* (* = branch name) format, so it's easier to sort them. After all this, the Package Explorer gets littered with projects.

Why D?

This blog is dedicated to my work on the D programming language. Though I myself have not yet worked on a project or coded serious stuff in D, I can see the possibilities and the real need for such a programming language. If you don't know what D is, please refer to the official website.

I think when Walter Bright came to the conclusion that there is a need for a multi-paradigm, imperative, native language, he made the right choice. As a C++ developer myself, I have found several frustrating issues with my main programming language, and despite all the efforts of the recent C++11 standard, these pains don't seem to pass. I don't want to bash C++: it's a fine language that will stay prominent for many years, if not decades, but I think today, in 2011, we need to prepare ourselves to learn from the mistakes of the past and have a clean start. C++ has been dragging many compatibility issues along its long way, and therefore a new language design, free of such a heavy burden, could mean a fresh start in native development.

As of today, the most prevalent programming languages in the native arena are C and C++. C is good as it is: a restricted systems programming language that fits most hardware environments. C++, on the other hand, though it has made astonishing progress with the C++11 standard, has a major flaw: it just can't hold its own against the world of Java and C#. These languages became successful not because of revolutionary new technology, but rather because they keep developers happy and enthusiastic, and that is because it was easy to produce a wide variety of coding tools that make development fun. To maintain the momentum of native development, we need to bring the fun back to it.

C++, unfortunately, cannot offer such easy entertainment. As a project gets more and more complicated, the build time grows enormously; it desperately calls for a module system. Parsing is difficult because of the extent and ambiguity of the C++ grammar. The compiler can hardly be manipulated, the reflection system is partial, and a language-level garbage collector would just make life easier for everyone, provided the coder can decide not to use it in certain, well-delimited scenarios.

In this series of blog posts I document my experiments with, understanding of, and contributions to the D language.