Maps for visually impaired


About you

1 What is your name?
Rajan Vaish
2 What is your email address?
vaish<dot>rajan<at>gmail<dot>com, vaishrajan<at>fedoraproject<dot>org
3 What is your Sugar Labs wiki username?
vaish.rajan
4 What is your IRC nickname?
generoushous
5 What is your primary language? (We have mentors who speak multiple languages and can match you with one of them if you'd prefer.)
English and Hindi. I prefer English.
6 Where are you located, and what hours do you tend to work? (We also try to match mentors by general time zone if possible.)
I am located in India (GMT +5:30). Since I am graduating in May '09, I can work at any time while staying at home (except 4 am to 10 am, India time).
7 Have you participated in an open-source project before? If so, please send us URLs to your profile pages for those projects, or some other demonstration of the work that you have done in open-source. If not, why do you want to work on an open-source project this summer?
Yes, I have. I volunteered and did an online internship for One Laptop per Child (OLPC) in the summer of '08.
OLPC profile URL: http://wiki.laptop.org/go/User:Vaish.rajan
I have been involved in various real-world projects and research work; my resume is available at http://vaish.rajan.googlepages.com/rajanvaish_resume.pdf


About your project

1 What is the name of your project?
Maps for Visually Impaired
2 Describe your project in 10-20 sentences. What are you making? Who are you making it for, and why do they need it? What technologies (programming languages, etc.) will you be using?
Today there are millions of visually impaired people who use computers and the Internet. (According to Wikipedia, the WHO's November 2004 article "Magnitude and causes of visual impairment" estimated that in 2002 there were 161 million visually impaired people in the world, about 2.6% of the world population, of whom 124 million (about 2%) had low vision and 37 million (about 0.6%) were blind.) They use screen readers such as JAWS to email, fill in forms, read articles, and so on. But there is one thing a screen reader cannot do: it cannot read an image (no screen reader can). A map is exactly such an image, and therefore one of the things visually impaired users need most. Large companies such as AOL have worked on this problem (see http://sensations.aol.com/sensations-app-accessible-walking-directions/); their approach is to accompany the imagery returned for a query with text describing it in HTML. Once the query produces text output, a screen reader (or any text-to-speech software) can read it. The text simply needs to conform to the W3C guidelines (http://www.w3.org/WAI/quicktips/Overview.php), using HTML heading tags, to be screen-reader friendly.
For GSoC '09, keeping the time constraints in mind, I intend to develop a directions tool using OpenStreetMap/OpenLayers for visually impaired people (as well as the general public). After the user enters a source and destination in text boxes, the route and walking/driving directions are produced not only on the map but also as text describing the entire route, with major points of interest and fine details such as a square, junction, or traffic signal (similar to what MapQuest currently implements). The text output will conform to the W3C guidelines so that it is easily readable.
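To make the idea concrete, here is a rough sketch in Python (the language I plan to use) of how a list of route steps could be rendered as screen-reader-friendly HTML. The function name and the sample steps are made up for illustration; real steps would come from the routing service described below.

 # Sketch: render routing steps as accessible HTML (a heading plus an
 # ordered list), following the W3C quick tips: meaningful headings and
 # plain text, with no information carried by images alone.
 def steps_to_html(source, destination, steps):
     lines = ['<h2>Walking directions from %s to %s</h2>' % (source, destination),
              '<ol>']
     for instruction, distance_m in steps:
         lines.append('  <li>%s (about %d metres)</li>' % (instruction, distance_m))
     lines.append('</ol>')
     return '\n'.join(lines)
 
 if __name__ == '__main__':
     # Invented sample data, purely for illustration.
     demo = [('Head north on MG Road', 120),
             ('Turn right at the traffic signal onto Park Street', 300),
             ('Destination is on the left, past the square', 80)]
     print(steps_to_html('Railway Station', 'City Library', demo))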
Currently, text-based directions are given in metres/kilometres or miles. I intend to create a unit converter so that the user can choose the convention they are comfortable with, and, since the tool is aimed specifically at visually impaired users, a new unit will be added: "foot steps", in long or short strides. A blind user can then actually count foot steps to reach the destination (a blind user will generally need the service for short distances, since they cannot drive on their own). Directions for cars, walking, bicycles, etc. will also be provided, so that the service can be used by everyone.
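A minimal sketch of the converter with the proposed "foot steps" unit follows; the stride lengths are rough assumptions, not measured data, and would be configurable by the user.

 # Assumed average stride lengths in metres (placeholders, user-configurable).
 STRIDE_M = {'short': 0.5, 'long': 0.75}
 
 def metres_to_steps(distance_m, stride='short'):
     # Convert a distance in metres into a whole number of foot steps.
     return int(round(distance_m / STRIDE_M[stride]))
 
 def metres_to_miles(distance_m):
     return distance_m / 1609.344
 
 print(metres_to_steps(300, 'short'))   # about 600 short steps
 print(metres_to_steps(300, 'long'))    # about 400 long steps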
To implement this, I intend to use CloudMade's libraries/APIs and services such as geocoding and geosearch in combination with routing (these services run on server machines accessed over HTTP, and API wrappers for them exist in Ruby, Java, and Python). I understand the OSM tags used for routing and have explored various other options such as OSMNavigation and LibOSM, GraphServer, the Pyroute library, and services like OpenRouteService and YOURS, which make the implementation entirely feasible. For text-to-speech, visually impaired users can rely on their screen reader; however, to make the system machine-independent, I intend to give the application its own text-to-speech support, building on the University of Washington's open-source online screen reader, WebAnywhere (http://webanywhere.cs.washington.edu/).
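The exact CloudMade/OpenRouteService request and response formats still have to be pinned down during the community bonding period, so the snippet below only sketches the idea (in Python 2, as Sugar used at the time): ROUTING_URL is a placeholder, not a real endpoint, and the JSON reply with an 'instructions' key is an assumption, not documented behaviour of any of these services.

 import json
 import urllib
 
 ROUTING_URL = 'http://example.org/routing'   # placeholder, not a real endpoint
 
 def fetch_route(start_latlon, end_latlon, mode='foot'):
     # Ask the (hypothetical) routing service for a route and return its
     # list of plain-text instructions, assuming a JSON reply such as
     # {"instructions": ["Head north on ...", ...]}.
     query = urllib.urlencode({'start': '%f,%f' % start_latlon,
                               'end': '%f,%f' % end_latlon,
                               'mode': mode})
     reply = urllib.urlopen(ROUTING_URL + '?' + query).read()
     return json.loads(reply)['instructions']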
The application will be fully keyboard accessible, with various shortcut options and a very easy-to-understand interface, so that people with cognitive disabilities can also use the service with ease.
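As an illustration, a shortcut such as Ctrl+R to read the current route aloud could be wired up roughly as below in PyGTK (the toolkit Sugar activities used at the time); the shortcut choice and the read_route_aloud() handler are placeholders, not decided yet.

 import gtk
 
 def on_key_press(window, event, app):
     # React to Ctrl+R by reading the current route aloud (hypothetical handler).
     key = gtk.gdk.keyval_name(event.keyval)
     if (event.state & gtk.gdk.CONTROL_MASK) and key == 'r':
         app.read_route_aloud()
         return True    # stop further handling of the key press
     return False
 
 # window.connect('key-press-event', on_key_press, app)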
The application will benefit millions of children studying in mainstream as well as blind schools, including schools that cannot afford costly computers, software, and hardware.
3 What is the timeline for development of your project?
Since I am graduating in mid-May '09, I can devote all my time until August while working from home, after which I expect joining dates in September from the companies that have offered me jobs (Tata Consultancy Services and Accenture).
I break the timeline down as follows:
1. Community bonding period (April 21 to May 22): Study the finer details of the technologies involved in the project.
2. May 23 to May 31: Create and design the workflow of the project and begin coding.
3. June 1 to July 5: Complete the directions application for blind users by the end of this period; at this stage it is still machine-dependent and requires the user's own screen reader. A cushion of a few days is left before the mid-term evaluation deadline.
4. July 14 to July 28: Implement text-to-speech for the application, fully customised for it, making it machine-independent.
5. July 29 to Aug 5: Add keyboard accessibility features.
6. Aug 6 to Aug 13: Testing and documentation, with a few days of cushion time. End of project.


4 Convince us, in 5-15 sentences, that you will be able to successfully complete your project in the timeline you have described.


You and the community