J2ME Accessibility

Unfortunately, J2ME MIDP 2.0 has very limited library support for accessibility (not even close). The reason Talks does not read out a Canvas is that everything painted by a Canvas is graphics-based. For example, try creating your own custom text using Canvas, displaying it on the screen, and letting the Talks software read it for you: the text is effectively just graphics, and with no luck, Talks simply ignores it.

For example: with the J2ME MIDP specs, even when you append a long string to a Form, Talks only reads the last sentence if the string is too long.

“This is a very very very
very long sentence” <—– Talks only reads this last part if you are using a Form.

The reason is that text-to-speech software reads text based on where the cursor is pointing.

Third-party text-to-speech vendors seem to develop their applications for a specific platform, such as Symbian OS on Nokia phones, which basically means C++. So if you want third-party software to read everything displayed on the screen, I suggest you use Symbian C++ and the Nokia SDK to develop the app (in other words, build an interface that looks like the native Nokia interface using Symbian C++). Symbian C++ is much more powerful than J2ME, but not as easy and straightforward as J2ME.
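To see the difference in practice, compare appending text to a high-level Form (which the platform exposes as text, so Talks can pick it up, subject to the long-string caveat above) with painting the same text on a Canvas (where it becomes pixels). A minimal MIDP sketch; the class names here are my own, for illustration:

```java
import javax.microedition.lcdui.Canvas;
import javax.microedition.lcdui.Form;
import javax.microedition.lcdui.Graphics;

public class AccessDemo {
    // High-level UI: the platform knows this is text, so Talks can read it.
    static Form buildForm() {
        Form form = new Form("Readable");
        form.append("Hello, this text is exposed to the screen reader.");
        return form;
    }

    // Low-level UI: once painted, the text is only pixels, so Talks ignores it.
    static class SilentCanvas extends Canvas {
        protected void paint(Graphics g) {
            g.drawString("Hello, this text is only pixels.", 0, 0,
                         Graphics.TOP | Graphics.LEFT);
        }
    }
}
```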

Whether TALKS understands which label belongs to which text field depends on whether the focus is moved correctly on both the labels and the text fields when the user navigates (e.g. by wrapping both into a CEikControlGroup and focusing that). Just having fields visually associated with text editors is not enough, as this type of screen layout is not commonly used in the standard S60 UI.

The best way to enter data into multiple fields is by using the S60 control intended for it, called CAknForm (http://wiki.forum.nokia.com/index.php/Forms_in_Symbian_c++). As this is commonly used in many places across the phone’s standard UI, e.g. for editing contact entries, TALKS is set up to support these quite well.

The other option, which I have not tried yet but think might help, is to not rely on third-party text-to-speech software at all and instead use the optional J2ME text-to-speech library, JSR 113: http://jcp.org/en/jsr/detail?id=113. (Currently I am not sure which phones on the market support this library, but the Nokia N95 might.)

Java TimeZone

Recently, I was working on a J2ME app that needed the current system time. I ran into some confusion when I tried to execute the code below:

long time = System.currentTimeMillis(); did not return my local wall-clock hours; instead it returns the number of milliseconds since the epoch, measured in UTC. That means no matter which machine or device you run it on, the value is always UTC-based. You can check the current UTC time on this link.

Since I am in Melbourne, Australia, where my current timezone is UTC+11 (daylight saving has just begun), one workaround to get the local time is to use Calendar and Date in Java:

Calendar myCal = Calendar.getInstance(TimeZone.getTimeZone("UTC"));

// The point here is to shift the UTC instant forward by 11 hours (UTC+11)
Date date = new Date(myCal.getTime().getTime() + (11 * 60 * 60 * 1000));

long time = date.getTime();
int milliseconds = (int) (time % 1000);
int seconds = (int) ((time / 1000) % 60);
int minutes = (int) ((time / 60000) % 60);
int hours = (int) ((time / 3600000) % 24);

date.getTime() returns the shifted value in milliseconds; the divisions above then break it down into hours, minutes and seconds.

Result: 01:10:22 (based on the time I compiled and ran the code)

Useful link: The World Clock.
Tip: when daylight saving ends, you should subtract one hour, i.e. use UTC+10 instead of UTC+11.
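A more robust approach, if your runtime ships a full time-zone database (Java SE does; many MIDP devices only support GMT offsets), is to ask Calendar for the zone by name, so the daylight-saving switch is applied automatically instead of hand-coding +11 or +10. A minimal sketch; the class and method names are my own:

```java
import java.util.Calendar;
import java.util.Date;
import java.util.TimeZone;

public class LocalClock {

    // Format an absolute instant (millis since the epoch, UTC) as HH:MM:SS
    // in the given zone; the zone database applies any DST offset for us.
    static String localTime(long utcMillis, String zoneId) {
        Calendar cal = Calendar.getInstance(TimeZone.getTimeZone(zoneId));
        cal.setTime(new Date(utcMillis));
        return pad(cal.get(Calendar.HOUR_OF_DAY)) + ":"
             + pad(cal.get(Calendar.MINUTE)) + ":"
             + pad(cal.get(Calendar.SECOND));
    }

    static String pad(int v) {
        return (v < 10 ? "0" : "") + v;
    }

    public static void main(String[] args) {
        // No manual +11 / +10 arithmetic needed.
        System.out.println(localTime(System.currentTimeMillis(),
                                     "Australia/Melbourne"));
    }
}
```

Note that named zones like "Australia/Melbourne" are not guaranteed on MIDP, where `TimeZone.getTimeZone` may only understand GMT offsets; there, the manual-offset workaround above still applies.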

Limitation of J2ME Web Service JSR 172 package #2

Continuing from the previous post where I mentioned the limitations of J2ME JSR 172: apparently, not all data types in XML Schema are supported by JSR 172. I am not sure why the specification did not mention these points.

Basically, according to the JSR 172 specification, several XML encoding formats are not supported. When I tried using NetBeans 6.1 to generate the stubs from a given WSDL file, it popped up an error because it could not validate the fields below against the JSR 172 specs:

Limitations mentioned in the JSR 172 specification:

2. SimpleTypes: enumeration and restriction

Limitations the JSR 172 specification does not mention:

1. XSD: Date, Choice, Duration, Time.
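To illustrate what trips the stub generator up, here is a hypothetical schema fragment (the element and type names are invented) using those unmentioned constructs:

```xml
<!-- Hypothetical fragment: constructs like these caused NetBeans to
     reject the WSDL against the JSR 172 specs. -->
<xsd:element name="startDate" type="xsd:date"/>      <!-- xsd:date -->
<xsd:element name="timeout"   type="xsd:duration"/>  <!-- xsd:duration -->
<xsd:complexType name="Payment">
  <xsd:choice>                                       <!-- xsd:choice -->
    <xsd:element name="card" type="xsd:string"/>
    <xsd:element name="cash" type="xsd:string"/>
  </xsd:choice>
</xsd:complexType>
```

One workaround is to regenerate the WSDL with these fields mapped to supported types, e.g. dates and durations carried as xsd:string and parsed in the MIDlet.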

More useful information can be found on this IBM website.

Limitation of J2ME MIDP 2.0

If you are already developing mobile applications using J2ME, I assume you already know about the many features J2ME does not support. In my personal opinion, J2ME is not a very powerful platform in terms of accessibility.

Developing a J2ME application that cooperates with third-party text-to-speech software is very hard. I am not sure about the future, but right now it is very hard. J2ME has no accessibility library package like those in Java Swing or AWT, and nothing like HTML's alt attribute.

If you use Canvas to create your own custom items, none of the third-party text-to-speech software can read the labels, titles or contents. So developing a J2ME application, especially for vision-impaired users, is very challenging. I would suggest using Symbian C/C++ if you are deploying on the Symbian OS platform, because third-party text-to-speech vendors normally target that native platform so that it works well with their own software products.

Here is some information if you choose to develop using Symbian C++.

Limitation of J2ME Web Service JSR 172 package.

My team members and I are currently working on a mobile application that requires web service data. We found that J2ME has limited support for WSDL binding styles. According to the J2ME JSR 172 specification, section 3.2.2 Operation Mode, the only WSDL binding style supported is document/literal.

If we try to use RPC/literal, we get an error when generating the stubs from the WSDL file. It is a pity that this limitation restricts the way WSDL binds the document into SOAP.

There are a few options to overcome this: either go the REST way, or regenerate the WSDL file with the document/literal binding style.
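For reference, a document/literal binding in WSDL 1.1 looks roughly like the fragment below (the service and operation names are made up for illustration); the key attributes a JSR 172 stub generator expects are style="document" and use="literal":

```xml
<binding name="QuoteBinding" type="tns:QuotePortType">
  <soap:binding style="document"
                transport="http://schemas.xmlsoap.org/soap/http"/>
  <operation name="getQuote">
    <soap:operation soapAction=""/>
    <input>
      <soap:body use="literal"/>
    </input>
    <output>
      <soap:body use="literal"/>
    </output>
  </operation>
</binding>
```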

J2ME LightWeight UI Toolkit

Developing a J2ME mobile application using MIDP 1.0 or 2.0 is very challenging when you wish to add stylish graphics and animations. You can use the MIDP low-level graphics library with sprites but, thank goodness, the Java open source community recently released LWUIT. Quote from the website:

LWUIT is a UI library that is bundled together with applications and helps content developers in creating compelling and consistent Java ME applications. LWUIT supports visual components and other UI goodies such as theming, transitions, animation and more.

It is good, and it saves J2ME developers a lot of time figuring out how to design “cool” and lively graphics and animations. It is open-source software, and it should be quick and easy to pick up if you know MIDP well.
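Getting started is short. A minimal sketch, assuming the LWUIT jar is on the build path (the MIDlet class name is my own):

```java
import javax.microedition.midlet.MIDlet;
import com.sun.lwuit.Display;
import com.sun.lwuit.Form;
import com.sun.lwuit.Label;

public class HelloLwuit extends MIDlet {
    public void startApp() {
        Display.init(this);              // bind LWUIT to this MIDlet
        Form form = new Form("Hello");   // LWUIT's Form, not lcdui's
        form.addComponent(new Label("Hello, LWUIT"));
        form.show();
    }
    public void pauseApp() {}
    public void destroyApp(boolean unconditional) {}
}
```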

But if you are working with third-party text-to-speech software such as Nuance Talks to read out text (a label, text field, text box, title or anything else), just forget it. The text-to-speech software cannot read anything out, because LWUIT uses graphics and images as its contents.

[edited since 05/01/2010]
But keep in mind that since the release of the iPhone SDK (C-like, with object orientation) and the Android SDK (J2ME-like, but more advanced), I prefer to develop my mobile applications on these two platforms.