
Openismus Revived

Here’s an update since my last status post in June.

Things have improved for Openismus even more, though we are not complacent. Several proposals, including the ones I mentioned in that last blog post, have resulted in customer contracts. So we are now busy working on the Maliit input method system (virtual keyboard), Wayland, Rygel UPnP/DLNA and Evolution Data Server (EDS). We are even thinking of hiring another developer if we can find someone who is just right.

We are now established in the habit of creating proposals for customers, revising them, and shifting into implementation. We take our customer proposals seriously, making sure that the developers are the main authors and that they don’t leave questions unanswered. If there’s something we might help your company with, then we’d like the chance to convince you too.

Michael Hasselmann is now the main person negotiating new work for us and keeping some of that work on schedule. He has been very successful – a surprise to himself but not to us. We are calling him a Sales Engineer but that doesn’t really do his dedication justice.

Michael can travel much more than I could for the last few years. Right now he’s at the Automotive Linux Summit in the UK and tomorrow he will be at the X Developers Conference in Nürnberg with Jan Arne Petersen (where our interest is mostly in Wayland and input methods). On Monday and Tuesday he’ll visit me in the Munich office.

We have achieved this thanks, of course, to the hard work of our whole team at Openismus. They have fought hard so we can all keep doing worthwhile work that we enjoy. I am proud of them and glad to be part of this.

Generating SPDX files with licensecheck

This week I had to provide an SPDX file to a customer. SPDX seems to be a way to describe the licensing of software components, to help with open source compliance. Here is an official example (though it is probably not up to date with the current SPDX specification).

However, there are no open source tools to create or edit SPDX files, so I created a little openismus-spdx-generator.py Python script that uses Debian’s licensecheck utility to scan a project and then outputs a skeleton SPDX file in RDF format. It is a quick hack with no real error checking and I have barely read the SPDX specification, so please do improve it.
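The idea is simple enough to sketch in a few lines. This is not the script itself, just an illustration of the approach: it shells out to licensecheck and prints an SPDX-like RDF skeleton. The spdx: element names here are stand-ins rather than the exact SPDX vocabulary, so check the specification before relying on them.

```python
#!/usr/bin/env python
# Illustration only: scan a tree with Debian's licensecheck and print a
# skeleton SPDX-like RDF file. The spdx: element names below are stand-ins;
# the real SPDX vocabulary has more required fields.

import subprocess
import sys
from xml.sax.saxutils import escape


def scan(directory):
    """Return (filename, license) pairs reported by licensecheck."""
    output = subprocess.check_output(["licensecheck", "-r", directory])
    results = []
    for line in output.decode("utf-8").splitlines():
        # licensecheck prints lines of the form "path/to/file: LICENSE".
        if ": " in line:
            filename, license_name = line.split(": ", 1)
            results.append((filename.strip(), license_name.strip()))
    return results


def write_skeleton_rdf(results, out=sys.stdout):
    out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    out.write('<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"\n')
    out.write('         xmlns:spdx="http://spdx.org/rdf/terms#">\n')
    for filename, license_name in results:
        out.write('  <spdx:File>\n')
        out.write('    <spdx:fileName>%s</spdx:fileName>\n' % escape(filename))
        # licensecheck's license names do not match SPDX license identifiers,
        # so a real tool would need a mapping table here.
        out.write('    <spdx:licenseConcluded>%s</spdx:licenseConcluded>\n'
                  % escape(license_name))
        out.write('  </spdx:File>\n')
    out.write('</rdf:RDF>\n')


if __name__ == "__main__":
    write_skeleton_rdf(scan(sys.argv[1] if len(sys.argv) > 1 else "."))
```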

My first impression is that SPDX is rather unwieldy. The RDF (XML) format is verbose and seems to focus on being a snapshot of software via checksums of all its source files, rather than specifying a particular version or revision of the software as a whole. I don’t see any attempt to list dependencies and their licenses. More strangely, it looks like the .XLS (Microsoft Excel) spreadsheet format is the preferred format, which sets off my corporate-drones-doing-painfully-silly-things-that-they-believe-are-normal alarm bells.

There are official Java-based SPDX tools to convert between the various SPDX formats, and maybe to validate SPDX files. You’ll need to build them with the Ruby-based buildr build system. Then you are left with some .jar files that have to be run via “java -jar target/whatever.jar the-spdx-file” after setting JAVA_HOME correctly. Java programs are hard enough to package on Linux distros, and I’m sure that the buildr dependency makes packaging these even less likely.

Anyway, the tools crash for me on the provided example files. The git repository has no branches or tags, so it’s hard to know which version is supposed to work and I don’t have confidence that the specification, example files, and tools are in sync with each other at any particular time.

Most of the SPDX file contents will be the result of a scan anyway, so rather than demanding that source code be supplied to me with an SPDX file, I’d generally prefer that the software just had proper COPYING files and source code headers. That seems like an easier requirement to comply with.

It’s all a bit linuxfoundationy.

Openismus Status

As I mentioned in March, Openismus is now down to a small core of super developers.

Jon Nordby and Michael Hasselmann are now fully involved in finding new projects, and traveling when I cannot. This month we were relieved to get the go-ahead on projects for two new customers, for work on the Maliit on-screen keyboard. Those are our first new customers since Nokia disappeared. Hopefully we can talk about them soon.

However, this is still not enough work for us to feel particularly safe. So we have spent some time on open source work that we hope will attract attention from customers who want more of the same. Don’t hesitate to contact me if you’d like our help.

Rygel for a DLNA Player

Rygel currently provides a UPnP and DLNA Server, serving media to DLNA Players and Renderers such as TVs, smartphones, and speakers. It has achieved UPnP and DLNA Certification as a server in the Nokia N9 smartphone, thanks to work by Openismus and others.

But Rygel also already has some mature support for software that wishes to be a DLNA Player. With some work, this could let a GStreamer-based media player become a DLNA Player, able to play video, music, or pictures from DLNA Servers. We believe it could achieve DLNA certification as a Player. As a side-effect, this would add some (DLNA standardized) remote-control functionality to the media player.

This is work that Openismus would like to do for a customer. In fact, we’ve done some of it already, but we haven’t published it yet. The rest of this post describes what we’d like to do. UPDATE: Some of this is now done.

DLNA Player

A DLNA Player is a device that can access media files on a DLNA Server and play them back.

Browsing

A DLNA Player must implement a DLNA Controller (known as a Control Point in UPnP terminology) that implements the “Browse” operation, to request data from the DLNA Server, and must be able to play media in its own UI.

The user interface could either:

  • Display the raw structure as retrieved from the server, allowing direct browsing, or
  • Collect all the data from the server beforehand (maybe caching it), to present it like other media collections, for instance with searching, thumbnails, etc.

As inspiration, we can use code from the open source Helium (a UPnP control point) and Korva (a daemon used by PushUp) programs. We have tested both against the DLNA tool and they seem to pass quite well.

Playing

DLNA Servers provide media via HTTP streaming, so any GStreamer-based player can easily play that content via the souphttpsrc element. Rygel’s own (unfinished) Renderer (its playbin plugin – see below) uses this souphttpsrc element. (Note that souphttpsrc does not use SOAP, and use of SOAP is not required for basic playing. souphttpsrc’s name is based on its use of libsoup, which once also did SOAP, but now does not.)
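For instance, here is a minimal sketch, using the GStreamer 0.10 Python bindings of that era, of playing an HTTP URI from a DLNA Server with playbin2, which picks souphttpsrc automatically for http:// URIs. The URI is a made-up example:

```python
#!/usr/bin/env python
# Minimal sketch: play an HTTP stream from a DLNA Server with playbin2,
# which selects souphttpsrc automatically for http:// URIs.
# GStreamer 0.10 Python bindings; the URI is a made-up example.

import gobject
import gst


def main():
    player = gst.element_factory_make("playbin2", "player")
    player.set_property("uri", "http://192.168.0.2:8001/example-track.mp3")
    player.set_state(gst.STATE_PLAYING)

    # Quit the main loop when the stream ends or an error occurs.
    loop = gobject.MainLoop()
    bus = player.get_bus()
    bus.add_signal_watch()
    bus.connect("message::eos", lambda bus, msg: loop.quit())
    bus.connect("message::error", lambda bus, msg: loop.quit())

    try:
        loop.run()
    finally:
        player.set_state(gst.STATE_NULL)


if __name__ == "__main__":
    main()
```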

DLNA Servers may additionally support RTSP as an alternative to HTTP, but GStreamer supports RTSP too. In any case, most DLNA Servers only support HTTP.

Codecs

The Player must support some standard media codecs according to the DLNA Profile that it claims to support:

The minimum for “Home networked device” (meaning it is usually connected via Ethernet, is rather stationary and is powerful):

  • LPCM
  • JPEG_SM (meaning resolution <= 640×480)
  • A variety of MPEG2 profiles. There are three regions, North America, Europe and Japan, each of which requires a different set. A device can have support for multiple regions.

The minimum for “Mobile networked device” (meaning it is usually connected via Wi-Fi, can be carried around and is less powerful):

  • MP3 and AAC_320 either without any container (ADTS framing) or in a 3GP or MP4 container.
  • JPEG_SM (meaning resolution <= 640×480)
  • AVC_MP4_BL_CIF15_AAC_520, that is, baseline H.264 video at CIF resolution (352×288), 15 FPS, with AAC audio in an MP4 container.

Most of the formats (MP3, MPEG2, AAC, H.264) have their own license and patent issues. MP3 could be covered by using the Fluendo plugin, while MPEG2 and H.264 are usually covered by using a DSP-based implementation. Commercial software decoders without dubious copyright for MPEG2, H.264 and AAC used to be available, for example from MainConcept, though of course not as GStreamer elements. Device vendors still need to make an agreement with the MPEG-LA if they are not already in that patent pool, depending on the predicted sales volume. The MPEG-LA usually doesn’t care what implementation is used, we believe.

DSP integration in GStreamer already exists for SoCs made by TI (the OMAP/DaVinci family). For TI there are two alternative techniques:

  • dspbridge and gst-dsp.
    • This is mostly maintained and used by Nokia (by Felipe Contreras), though we can assume that Nokia has now abandoned it.
    • We have heard that this doesn’t work with recent 3.2 kernels, though that would need to be confirmed.
  • The newer dsplink with gstreamer-ti.
    • gstreamer-ti supports more codecs anyway.

Instead of relying on vendor-specific implementations, the preferred way seems to be via OpenMAX. GStreamer elements for that are included upstream and are freely available.

The MainConcept software decoders basically have a function that takes a blob of encoded data and spits out a blob of decoded data. It should be relatively easy to get a basic GStreamer element going. A Demo SDK is only available upon request and after signing an NDA. As with most commercial software, getting bugs fixed in them is next to impossible.

Possible Architecture: Player/Renderer/Controller

A DLNA Player does not need to be a DLNA Renderer (which could be controlled by another Controller, such as a remote control, for instance to pause or to change color settings). However, we think it would be wise to implement a DLNA Renderer, despite the extra work, because:

  • A DLNA Renderer can be tested automatically without user intervention, meaning that more code and functionality would be tested more often and more accurately.
  • A DLNA Renderer can be controlled by a DLNA remote control device.
  • Code reuse: Implementing Controller functionality to control our own Renderer via local standard UPnP/DLNA messages, instead of custom in-process code, would allow the implementation to be reused in a DLNA remote control device.

The DLNA Renderer services that could be used by a remote-control DLNA Controller device are the standard UPnP AV ones: ConnectionManager, AVTransport, and RenderingControl.

Rygel already provides a simple Renderer via its “playbin” plugin, which uses GStreamer’s souphttpsrc and playbin2 elements. However, this currently just displays directly to the screen and runs out of process, so it would need to be refactored as a shared library (the code is already LGPL). This would avoid interprocess communication of media data, and allow direct manipulation of the playbin2 GStreamer element, for instance, to tell the playbin2 to use the media player’s own display sink.

This already provides the Renderer services needed by a Controller (see above). However, some work would be required to achieve certification.
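To illustrate the kind of direct manipulation meant above: once the Renderer code runs in-process, the embedding media player could simply hand its own sink to the playbin2 element. A rough sketch with the GStreamer 0.10 Python bindings, where the sink element is only an example:

```python
# Sketch of the "direct manipulation" mentioned above: with the Renderer
# running in-process, the embedding media player can tell playbin2 to render
# into the player's own video sink instead of opening its own window.
# GStreamer 0.10 Python bindings; the sink element is only an example.

import gst

playbin = gst.element_factory_make("playbin2", "renderer")

# A real player would create a sink that draws into its own widget
# (for instance via the GstXOverlay interface); xvimagesink stands in here.
video_sink = gst.element_factory_make("xvimagesink", "player-sink")
playbin.set_property("video-sink", video_sink)

# The Renderer's AVTransport implementation would then set whatever URI it
# receives from the Controller and drive the playback states.
playbin.set_property("uri", "http://192.168.0.2:8001/example-video.mp4")
playbin.set_state(gst.STATE_PLAYING)
```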

UPnP Certification

The “Player” device class does not exist yet in UPnP. However, since February 2012 a DLNA Media Player (DMP) must have UPnP certification. We are not sure yet exactly which UPnP certification is required, but we assume that it is a UPnP AV Controller certification with only the server-side interaction tested. This would need clarification with the DLNA.

UPnP certification requires use of the UPnP Certification Test Tool, which runs on Windows and is available to Basic UPnP members, though you need to be a UPnP Implementor member to go through the Certification process.

During UPnP certification the device must support link-local (Auto-IP) address generation. This is only necessary for the sake of the UPnP certification and does not need to be present in the actual device. NetworkManager seems to do either DHCP or link-local addressing. For example, to achieve this on the N9 we set a special gconf key to enable the Auto-IP fallback just for testing, because production devices can’t do Auto-IP out of the box. That workaround is of course not possible for DLNA certification, because you have to send in a production device, but one can request an “Auto-IP voucher” during device registration, which removes the requirement. For a pure software certification this does not apply, of course.

The Player’s interaction with a DLNA Server would be based on GUPnP, whose basic UPnP technologies, such as discovery, eventing, and message syntax have already been well tested during the Nokia N9’s DLNA Server development and certification.

As mentioned above, we have also informally tested Helium’s control point implementation against the UPnP test tool. The test run shows a pass in the basic plumbing and for all mandatory operations from the control point requirement documents (connection manager, content directory, control, and AV transport).

UPnP Testing Procedure

UPnP testing is done in-house. A log file from the test tool together with a packet capture of the test run is then sent to the UPnP forum via its website. The UPnP forum checks the result and awards a certificate. Certification is usually bound to a device and free of charge. See http://upnp.org/sdcps-and-certification/certification/how-to-certify-a-device/

The test tool does its testing partly automatically, if the Player also supports Renderer (see above), and partly by requesting user interaction, either to use the Player’s UI, or to plug and unplug the network cable.

UPnP Test Results for the Rygel Renderer plugin

We have already informally tested the current version of Rygel’s Renderer (playbin) plugin, which is the implementation we recommend using (see above).

The full test results are probably confidential according to the UPnP software license, so we probably cannot share them. However, we will summarize the results here:

The test result shows that we pass everything in the basic UPnP plumbing: discovery, eventing, syntax. We also pass every test that was marked as supported by our implementation. This includes all mandatory operations as defined in the relevant UPnP specification documents as well as the optional “Pause” operation.

DLNA Certification

Special attention must be paid to the required media formats. Most of them are proprietary and might need additional licenses. Those formats are:

  • Minimum for “Home networked device” (meaning it is usually connected via Ethernet, is rather stationary and is powerful):
    • LPCM
    • JPEG_SM (meaning resolution <= 640×480)
    • A variety of MPEG2 profiles. There are three regions, North America, Europe and Japan, each of which requires a different set. A device can have support for multiple regions.
  • Minimum for “Mobile networked device” (meaning it is usually connected via Wi-Fi, can be carried around and is less powerful):
    • MP3 and AAC_320, either without any container (ADTS framing) or in a 3GP or MP4 container.
    • JPEG_SM (meaning resolution <= 640×480)
    • AVC_MP4_BL_CIF15_AAC_520, that is, baseline H.264 video at CIF resolution (352×288), 15 FPS, with AAC audio in an MP4 container.

The main issue will be with the HTTP implementation of whatever displays the media. For instance, if GStreamer is used, it needs to include the fix for Bug #676020.

DLNA Testing Procedure

DLNA Certification requires use of the DLNA testing software, which runs on Windows. Any company would need to become at least a DLNA Contributor member. The membership includes access to the DLNA guidelines (a hard copy can be bought by anyone for 500 USD).

The testing software does its testing partly automatically, if the Player also supports Renderer (see above), and partly by requesting user interaction, either to use the Player’s UI, or to plug and unplug the network cable.

DLNA testing is a two-step process. First a test run is conducted in-house. The device is registered and the results from this test are then sent to the DLNA, which checks whether it is “worthy” of certification. Then the device needs to be sent to a certification laboratory, where the same tests are repeated in accordance with the registration.

UPnP certification is a prerequisite for DLNA certification. DLNA certification is usually bound to a device, but it’s possible to certify software. The only certified software so far is a controller, as far as we know.

DLNA Plugfests

In addition to testing against the certification software, the DLNA offers quarterly so-called plugfests, where members can test their implementations or devices against the devices of other members. Attendance is not free, of course, but the experience with the N9 showed that it really helps to sort out interoperability problems early in the process. Fees are usually 400 USD per person, plus a 500 USD early-bird (1000 USD normal) fee per device class, not including accommodation. A special rate for a hotel near the venue is usually available.

DLNA Test Results for the Rygel Renderer plugin

We have already informally tested the current version of Rygel’s Renderer (playbin) plugin, which is the implementation we recommend using (see above).

The full test results are confidential according to the DLNA software license, so we cannot share them, but we can summarize them here.

Our initial test run shows two problems that prevent a proper test run, because they mask most of the other problems:

  • The default implementation does not claim support for any DLNA-approved media format.
  • There is a problem with the meta-data the renderer sends to the controller (in this case the controller is the test tool). This seems to be a problem with XML escaping of the state variables.

When we work around these problems temporarily, the result shows that we pass almost everything related to basic UPnP/DLNA functionality. This is hardly surprising because we have achieved certification for the Server functionality already, which shares code. The remaining bugs mostly deal with inconsistencies between the evented state variables and the values returned in explicit queries, and a missing function call that is optional in UPnP but apparently required in DLNA.

Summary

Openismus can do this. We can even get it through certification. Clearly we know how.

Please contact us if you’d like us to do this work for your company. We can quickly provide a full proposal with tasks and milestones.

Online Glom is now all Java

Over the last couple of weeks I have reimplemented just enough of the C++ libglom code as Java in Online Glom‘s gwt-glom, removing the need for java-libglom (which wrapped libglom for Java via SWIG). It’s now working and deployed on the Online Glom test server.

This makes both development and deployment much easier. It also made the source code all camelCase so it’s not offensive to the eyes of Java coders.

To replace libglom’s use of GdaSqlBuilder, I used jOOQ. That worked well, thanks to its maintainer, Lukas Eder, who was very helpful and who quickly added some API that I needed.

Now that the code is all Java I really hope that more people will look over the code and point out anything that can be improved. I still don’t know Java like I know C++ so please don’t be shy about telling me that I have made mistakes.

GWT, Javascript, and serialization

Removing the use of java-libglom let me simplify the code, because I can now send objects, such as LayoutItem, to the client without needing to copy each one into a separate LayoutItemDTO object, which existed only because the original was not serializable and so could not be sent from the server (Java) to the client (JavaScript, compiled from Java).

However, this raised some new issues. I wanted some of the objects to contain some extra cached data, so that the client code did not have to calculate it itself, often by retrieving some other related object. Right now these are extra member variables in the classes, but that prevents me from splitting the code off into a new java-glom library.

Furthermore, any class that is sent between the client and server must fully conform to the requirements of the GWT Java-to-JavaScript compiler, even if a particular method will never actually run on the client. For instance, I tried to add clone() methods, for use on the server, but that broke the JavaScript compilation because GWT has no equivalent for Object.clone().

Those restrictions on the Java code that is allowable on the client side (because it will be compiled to JavaScript) were particularly awkward when the compiler (or Maven, or something) refused to give me clues about what was wrong. For instance, it took me 2 frustrating days to fix this small error by breaking the code apart until only the problem code remained. At other times, there was no real error on stdout, but there were clues (in a variety of hard-to-read formats) in the HTML generated by mvn site. Or sometimes, I could see errors when building inside Eclipse, but not outside.

The GWT system works great, but something is inconsistent about how it shows errors, and it can’t be right that some compilation errors only show up when running, rather than when building.

Next steps

As much as I would like to move on to implementing editing, I need to spend some time now on getting some regression tests set up in the maven build. These must create and run temporary PostgreSQL database instances like I do in the Glom autotools build. For instance, I need to check that the new SQL-building code, using jOOQ, is really safe from SQL injection like the libglom code seems to be.

My recent changes also caused the OnlineGlomService async API to be particularly inefficient, sometimes sending far more data back to the server than should be necessary. I will try to avoid that and try to make this API smaller, to avoid so many round trips.

Online Glom: Deployment

Today I deployed the latest Online Glom (gwt-glom) on a new Amazon AWS instance of Ubuntu Precise, again connecting Apache with Tomcat. This time I took notes about exactly how I did it. I wonder if this is something that I should put in a Puppet configuration (I have never used Puppet).

It took me a while to figure this out last time, but now it’s clearer to me. However, I still don’t know how to avoid the need for the /OnlineGlom/ suffix in the URL.

Online Glom: Reports

Over the last few weeks, in occasional spare moments, I have added report generation to Online Glom. It kind of works now, using the regular report definitions from the existing .glom files, and generating reports that look much like the reports in the desktop Glom version. I deployed this at onlineglom.openismus.com/OnlineGlom/ .

I used JasperReports for this, writing some code to map Glom’s report structure to the JasperReports structure. I chose JasperReports mostly based on its popularity. There are various tools and libraries that use it and the JasperSoft company seems to be very successful by supporting it.

However, I will probably replace JasperReports with some custom code instead, because:

  • The JasperReports API is almost completely undocumented. I figured out what to do only by looking at various snippets of code from random places. The project seems to focus on the use of visual tools to build report designs, rather than the use of a generic API by such tools.
  • JasperReports demands absolute positioning, with no automatic layout, and that will never make sense for HTML.

I will try the DynamicJasper wrapper/helper before abandoning JasperReports, but I don’t see how it can avoid the fundamental problem of absolute positioning.

Glom 1.22

I released stable Glom 1.22.0 a few days ago.

For easy installing, I also created a Glom 1.22 package for Ubuntu Precise, in the Openismus PPA. Ubuntu Precise normally has Glom 1.20. I wish there were some similar way to create packages for Fedora, or someone to do that for me.

Besides the multiple bug fixes, Glom 1.22 has a few new features:

  • Details: Foreign key ID fields: Add a New button next to the existing Find button.
  • Allow custom (not related) choices to be translated, with only the original text being stored in the database. This only happens when the choices are restricted, at least for now.
  • Related Choices: Default to showing the primary key, but allow the shown field to be other than the primary key.
  • Related Choices: Allow the user to specify a sort order.
  • Allow the database title to be translated.
  • Added command-line utilities to help with translation via po files.

Openismus Getting Smaller

Openismus has recently had to let some great developers, and good friends, go. We are now much smaller.

This is really sad because it took us a long time to find and train these people, and they would be massive assets if the future looked better. Although they will be greatly missed, I am at least reassured that they will have no problem finding new jobs. I do hope that they don’t settle for work that is anything but worthy of their experience and enthusiasm.

This downsizing happened because we are now finally losing the customer work we had from Nokia, as expected since their February 11th 2011 decision to kill MeeGo. Nokia are not our only customer, but they are by far our largest. That gave us the opportunity to diversify, and we tried, but without much success. That failure is mine. On the one hand, it’s unfortunate that we have been so dependent on one customer. But, on the other hand, we would never have been so big for so long without them. I would do the same again without regrets.

I view the last few years of Openismus very positively:

  • We gave several young developers their first chance to prove themselves as professionals.
  • We trained new developers. They are now established as respected and experienced developers.
  • We made a few contributions to our favorite projects. For instance, natural layout in GTK+ 3.

On a personal level, Openismus has allowed me to work part-time, so I can spend time with my children. My first child was born soon after the company was founded, and for the first year I worked from home, contrary to the myth that founding a company means working all hours of the day and neglecting your family. The tech industry is excessively male, with little understanding for men who want to share in the work of child-rearing, so I’m glad I had the freedom to work part-time. I am highly motivated to keep this freedom.

This has been a disadvantage, of course. For instance, I strongly suspect that I could win more customer work by traveling more to conferences and to customers on site. That has given good results when I’ve managed to do it. But this is simply an impossibility for someone who needs to take care of kids. I think the custom of business travel might be one of the greatest obstacles to women reaching top executive positions. It’s one of many things that won’t improve until men are forced to share more of the burden.

So, Openismus goes on, with some uncertainty. Our specific expertise in the Maliit input methods, in the QtContacts and EDS contacts systems, and in DLNA via Rygel should be very attractive, but time will tell. If you need help with GTK+ or Qt on Linux, from people who really know how, and are not afraid to tell you how, then we are still here and still ready.

 

I like writing regression tests

Over the last few months I have developed the habit of writing tests when fixing bugs in Glom. These run during “make check” or “make distcheck”, so I can be sure that the bugs have not come back. (I know these tests should be smaller and even more numerous, but I can’t yet see a sensible way to break them up.)

I had to do lots of build-system and code work to make this possible, because some code had to be moved out of the UI code, and many tests depend on a temporary local server instance. But now, it’s easy to add new tests and it feels so good.
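For anyone curious about the mechanics, the core trick is to create a throwaway PostgreSQL cluster for each test run. This is not Glom’s actual test code (that lives in C++ inside the autotools build); it is just a rough sketch of the idea in Python, assuming initdb and pg_ctl are on the PATH:

```python
# Rough sketch of the idea behind the self-hosting tests: create a throwaway
# PostgreSQL cluster in a temporary directory, start it on a private port,
# run the test against it, and throw everything away afterwards.
# Not Glom's actual test code; assumes initdb and pg_ctl are on the PATH.

import shutil
import subprocess
import tempfile


def with_temporary_postgres(test_function, port=54321):
    datadir = tempfile.mkdtemp(prefix="test-pg-data-")
    sockdir = tempfile.mkdtemp(prefix="test-pg-sock-")
    try:
        # Create the cluster with a known user and trust auth, for testing only.
        subprocess.check_call(["initdb", "--pgdata", datadir,
                               "--username", "glom_test", "--auth", "trust"])
        # Start the server on a private port and socket directory,
        # waiting (-w) until it is ready to accept connections.
        subprocess.check_call(["pg_ctl", "start", "-w", "-D", datadir,
                               "-o", "-p %d -k %s" % (port, sockdir)])
        try:
            test_function(host=sockdir, port=port, user="glom_test")
        finally:
            subprocess.check_call(["pg_ctl", "stop", "-D", datadir, "-m", "fast"])
    finally:
        shutil.rmtree(datadir, ignore_errors=True)
        shutil.rmtree(sockdir, ignore_errors=True)
```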

When the tests succeed I have a feeling of confidence in the code, instead of a feeling of uncertainty. When they fail I feel glad that I had a chance to fix a regression before releasing a new version. I am surprised at how much more enjoyable this has made coding.