Shift Happens

January 1, 2008

Nice presentation I found on the blog of an old friend.


UML Plugin for XWiki

February 19, 2007

Yesterday I got the idea of making a UML plugin for XWiki based on uml2svg. But since a new uml2svg release was needed first, I could only put this into practice today. Developing XWiki plugins is not as easy as it could be (more about this later), but it was about as easy as I originally expected. Things are not quite polished yet, so I need to test and document the plugin before I can submit it. However, as you can see, it already works:

Entering a UML macro in XWiki

UML diagram in XWiki

Adobe SVG Viewer Discontinued

November 10, 2006

As I anticipated in a post a long time ago, Adobe has decided to discontinue support for Adobe SVG Viewer starting January 1, 2008 (ASV End of Life FAQ). Adobe even plans to remove Adobe SVG Viewer from its download area on January 1, 2009. While this development is not surprising at all, I just hope that by then Renesis and friends will be mature alternatives to Adobe SVG Viewer, because it is hard to believe that Firefox will gain significant market share over IE, especially now that IE7 is out.

Mozilla, Adobe and JavaScript 2.0

November 10, 2006

Adobe has released the code of their ActionScript VM to the Mozilla Foundation, and from now on they will work on it together. This is very good news for JavaScript 2.0 (an implementation of ECMAScript 4) support in Mozilla Firefox (probably arriving sometime after 2008). What’s nice about JavaScript 2.0? Well, it has classes, namespaces and, most importantly, types. More coverage of the Adobe code contribution here:

Semantic Wet Dreams

October 31, 2006

Some days ago I had a discussion about the Semantic Web with Henry Story (one of the creators of BabelFish, now working for Sun). He thinks the Semantic Web is attainable, and he is probably not the only one; I think it’s just a dream some people share, one that will not happen no matter how nice the dream itself might be. I was busy with other (more useful) things, like working on my thesis, so I didn’t have time to write these thoughts down properly until now. Moreover, before posting, I also wanted to read “The Semantic Web Revisited (pdf, 2006)”. Sure, I already knew pretty well what this is all about (after all, my licentiate thesis was on “Semantic Web-Based Agent Communication”) and I had already read their previous vision statements. However, those statements were all at least 5 years old (which by web standards is ancient), so I expected great changes. Changes like coming back down from the clouds into reality. As the title of this post suggests, these changes did not happen. So let me summarize in just one phrase for those who don’t have time to waste reading all this:

It’s the same old bullshit!


Latest Web News

October 29, 2006

The Browser Wars 2: The Empire Strikes Back
Both Firefox 2 and Internet Explorer 7 are now out, and the browser wars have started again. Friendlier than ever before, the new browser war is great news for users and even better news for web developers, who had to live with the countless bugs of Internet Explorer 6. Not that Firefox is bug-free, though: it took them three years to fix what was a show-stopper on the Mac. BTW, if you ever wanted to do me a favor, please vote for these five open Firefox bugs: 293581, 305859, 276431, 231179 and 272288.

Renesis Reloaded
Renesis, the (maybe too) promising SVG viewer, is still alive and under heavy development. When their parent company lost interest in SVG, most of the Renesis developers left and formed a new company – Emia Systems – located in Regensburg, Germany. They have since bought all rights to Renesis from their former parent company.

I expect it will take a long time before it is officially released, especially on platforms other than Windows (it’s .NET), but after Adobe’s acquisition of Macromedia, this is probably our only chance to finally have a solid, multi-platform, multi-browser SVG plugin. And they might even make the player open source.

STIX Fonts: Always Coming Soon
The STIX Fonts project aims to create a free font covering an important part of Unicode, i.e. a free pan-Unicode font. Since the project started more than ten years ago, it has always been “coming soon”, but this time it really looks like they are almost done. A beta test is expected to take place during the next months.

You might wonder why this is so important when commercial pan-Unicode fonts like Microsoft’s Arial Unicode have existed for years. Well, the word commercial tells it all. The only free-enough (shareware) pan-Unicode font that covers all mathematical symbols is Code2000. Code2000 displays these symbols reasonably well, the modest $5 license fee exists only to support James Kass, its generous creator, and with more than 60,000 glyphs it is overall an astonishing achievement. Still, there is a small problem: the focus with Code2000 was on coverage rather than high quality. The STIX Fonts would cover exactly that, while being totally free for everybody.

Finally, you might also wonder why I care about this. Quite simple: the STIX Fonts would cover all the symbols needed to draw LaTeX and MathML formulae in sMArTH, our open source online equation editor.

HTML Back from the Grave
The W3C recently decided to restart work on HTML and “incrementally evolve it” to a point where the transition to XHTML becomes easier and more logical for everybody. This means that the transition from HTML to XHTML is not going well at all, with tools and developers alike producing bad mark-up (“tag soup”). It also means that the voices of the people supporting the WHATWG were finally heard, which is of course a good thing (on the other hand, this does not make the WHATWG right in promoting “HTML5” instead of XHTML), and the new developments seem positive.

However, in my opinion, in their quest to “evolve” their standards, the W3C could become an obstacle to their adoption. It seems evident that it is impossible to make something a true standard (adopted by the wide majority) while making it a moving target at the same time. So maybe the W3C should concentrate on making higher-quality standards, gathering more community support before releasing them, rather than releasing poor standards often and fixing things that are not really broken. “Release early, release often!” is good practice for (open source) developers, not for standards bodies. Multiple incompatible versions are already a problem for HTML, so do we really need more? And do they really expect that continued support for HTML won’t harm the adoption of XHTML?

So what might have been an alternative solution? Now I really don’t know any more. When I first posted this, I thought that deprecating HTML and XHTML Transitional entirely (and maybe removing their validators, which are poor anyway) in favor of XHTML would have been an alternative. Then whoever still wanted to publish “tag soup” online would not adhere to any standard, and whoever wanted to render “tag soup” in a browser would not adhere to any standard; this is the current situation anyway, and it’s quite unlikely to change. However, maybe this could have been an incentive for everybody (web developers, web publishing software developers and web browser developers) to move away from “tag soup” and towards something more “meaningful”. Maybe I am wrong, but I don’t think they can build their semantic wet dreams on “tag soup” as a foundation. But then, who cares about their semantic dreams anyway? At least not me. Not any more.

[Edited the last paragraph: 2006-10-31]

SSL Is Killing Web Interactivity

May 22, 2005

It is well known that server performance degrades considerably for SSL transactions compared to the non-SSL case. Nevertheless, many people running Web servers are (mis)using SSL for a lot of content that is not security-critical. In most cases this leads to server overloading and unacceptably long waiting times for clients. The best example I know has servers that are overloaded 100% of the time because SSL is used for almost all of the administration tasks. Most of these tasks are not security-critical, but they are time-critical for many of us. Using SSL for them is like wearing a firefighter’s full turnout gear to protect yourself from a minor burn at a family barbecue. It is overkill.

As for me, I am starting to grow tired of waiting for Web pages to load over HTTPS. It takes so long that I sometimes give up before it’s done. If there was a way to disable SSL for such services, I would do it right away. The productivity loss with SSL is so high that I am ready to give up security just to be able to get my work done.

Important Facts:

  • SSL increases the computational cost of transactions by a factor of 5 to 7.
  • On a 1.4 GHz Xeon machine the computational demand of an initial handshake is around 175 ms and that of a resumed handshake is around 2 ms.
  • RSA computations are the single most expensive operation in TLS, consuming 20-58% of the time spent in the web server.
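To see what these numbers mean in practice, here is a back-of-the-envelope sketch using the handshake costs quoted above. The single-core capacity model (one second divided by the cost per handshake) is my own deliberate simplification, not a measurement:

```python
# Back-of-the-envelope model of TLS handshake throughput, based on the
# 1.4 GHz Xeon figures quoted above (~175 ms full handshake, ~2 ms resumed).
# Assumes one CPU core doing nothing but handshakes -- a simplification.

FULL_HANDSHAKE_MS = 175.0     # initial (full) handshake
RESUMED_HANDSHAKE_MS = 2.0    # resumed handshake (session reuse)

def max_handshakes_per_sec(cost_ms: float) -> float:
    """Upper bound on handshakes one core can complete per second."""
    return 1000.0 / cost_ms

full = max_handshakes_per_sec(FULL_HANDSHAKE_MS)
resumed = max_handshakes_per_sec(RESUMED_HANDSHAKE_MS)

print(f"full handshakes/s:    {full:.1f}")      # ~5.7 new connections/s
print(f"resumed handshakes/s: {resumed:.1f}")   # 500.0 resumed connections/s
print(f"session reuse speedup: {FULL_HANDSHAKE_MS / RESUMED_HANDSHAKE_MS:.1f}x")
```

So a single core tops out at roughly 6 new SSL connections per second, while session reuse is about 87x cheaper; and even with resumption, the per-request encryption work still leaves the overall factor-of-5-to-7 cost above.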

Further Reading:

  1. V. Beltran, J. Guitart, D. Carrera, J. Torres, E. Ayguadé and J. Labarta, “Performance Impact of Using SSL on Dynamic Web Applications”, XV Jornadas de Paralelismo, Almeria, Spain, September 15-17, 2004. PDF File (144 KB)
  2. K. Kant, R. Iyer and P. Mohapatra, “Architectural Impact of Secure Socket Layer on Internet Servers”, International Conference on Computer Design, pp. 7-14, 2000. PDF File (248 KB)
  3. C. Coarfa, P. Druschel and D. Wallach, “Performance Analysis of TLS Web Servers”, 2002. PDF File (144 KB)
  4. X. He, “A Performance Analysis of Secure HTTP Protocol”. PDF File (154 KB)