:'''6 Where are you located, and what hours do you tend to work? (We also try to match mentors by general time zone if possible.) '''
 
:I am located in India (GMT+5:30). Since I am graduating in May '09, I can work at any time from home (except 4 am to 10 am, India time).
    
:'''7 Have you participated in an open-source project before? If so, please send us URLs to your profile pages for those projects, or some other demonstration of the work that you have done in open-source. If not, why do you want to work on an open-source project this summer? '''
 
:OLPC Profile URL: http://wiki.laptop.org/go/User:Vaish.rajan
 
 
:I have been involved in various real-world projects and research work; my resume gives the details: http://vaish.rajan.googlepages.com/rajanvaish_resume.pdf
 
:I am a member of the International Association of Engineers, and I have been a Fedora Ambassador (India list) and a member of my University's LUG for almost a year now, contributing by organizing various University-level workshops.
: For GSoC '09, keeping the time constraints in mind, I intend to develop a directions tool using OpenStreetMap/OpenLayers for visually impaired people (as well as the general public). After the user enters a source and destination in text boxes, the route or walking/driving directions are output not only on maps but also as text explaining the entire route, with major points of interest and small details such as a square, junction, or traffic signal (similar to what MapQuest currently implements), conforming to W3C guidelines so that the output is easily readable.
 
 
: Currently, text-based directions are given in either metres/kilometres or miles. I intend to create a metric converter so that the user can choose the convention he/she is comfortable with, and since the tool is aimed specifically at the visually impaired, a new metric, "foot steps" (with long or short strides), will be included. A blind user can then actually count foot steps to reach his/her destination (a blind user will generally need the service for short distances, since he/she cannot drive). However, directions for cars, walking, bicycles, etc. will be provided too, keeping in mind that the service can be used by all.
 
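The proposed "foot steps" metric reduces to a distance-to-steps conversion. A minimal Python sketch, where the stride lengths and function names are illustrative assumptions and not part of any existing OSM or CloudMade API:

```python
import math

# Illustrative stride lengths in metres per step (assumed values).
STRIDE_M = {"short": 0.6, "long": 0.8}

def distance_to_steps(distance_m, stride="short"):
    """Convert a distance in metres into a count of foot steps."""
    if stride not in STRIDE_M:
        raise ValueError("stride must be 'short' or 'long'")
    # Round up so the user is never told to stop short of the destination.
    return math.ceil(distance_m / STRIDE_M[stride])

def steps_instruction(distance_m, stride="short"):
    """Render one text direction using the foot-steps metric."""
    return "Walk about %d %s-stride steps." % (
        distance_to_steps(distance_m, stride), stride)
```

For example, `steps_instruction(120, "long")` yields "Walk about 150 long-stride steps."; the same converter could also be the place where m/km and miles are interchanged.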
: To implement this, I intend to use CloudMade's libraries/APIs and services such as geocoding and geosearch in combination with routing (these services run on server machines accessed via HTTP, and API wrappers exist in Ruby, Java, and Python). I understand the OSM tags for routing and have explored various other options, such as OSMNavigation and LibOSM, GraphServer, and Pyroute Lib, and services like OpenRouteService and YOURS, which make the implementation entirely feasible. For text to speech, visually impaired people will use a screen reader; however, to make the system machine independent, I intend to build the application's own text-to-speech support on the University of Washington's open-source online screen reader, WebAnywhere (http://webanywhere.cs.washington.edu/ ), which can further be adapted to meet the needs of other applications. A DHTML interface is generally difficult to access through a speech-output interface; to solve this, the embedded XML metadata delivered by the application will be used to generate an audio-formatted representation of the content.
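The last step, turning embedded XML metadata into an audio-friendly linear form, can be sketched as below. The `<route>`/`<step>` schema here is a hypothetical illustration, not the actual output format of CloudMade, OpenRouteService, or YOURS:

```python
import xml.etree.ElementTree as ET

# Hypothetical route metadata as the application might embed it in the page.
SAMPLE = """
<route>
  <step distance="200" unit="m" turn="left">MG Road</step>
  <step distance="50" unit="m" turn="straight">the traffic signal at the junction</step>
</route>
"""

def route_to_sentences(xml_text):
    """Flatten each <step> into one short sentence suitable for speech output."""
    sentences = []
    for step in ET.fromstring(xml_text).iter("step"):
        turn = step.get("turn")
        prefix = "Continue straight" if turn == "straight" else "Turn " + turn
        sentences.append("%s for %s %s along %s." % (
            prefix, step.get("distance"), step.get("unit"), step.text.strip()))
    return sentences
```

A screen reader such as WebAnywhere would then speak these sentences in order, instead of trying to navigate the DHTML map widget itself.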
 
: The application will be fully keyboard accessible, with various shortcut options and a very easy-to-understand interface, so that people with cognitive disabilities can also use the service with ease.
 
 
: The application will benefit millions of children studying in mainstream as well as blind schools, including schools which cannot afford costly computers, software, and hardware.
 
: From me: In its initiative to bring education to every child, this project fills the gap for children who are blind. The text-to-speech screen reader developed during the project will help not only the Maps application but various other applications on the XO, thereby helping Sugar Labs in its mission to bring education to all.
 
: From my mentor, Mr. Nestor Guerrero: The screen reader project looks to empower minorities through an interface that gives full access to diverse applications. We encourage widening the concept of inclusiveness in the Sugar Labs community, closing the gap not only of a child's economic situation but for the disabled too. ( nestorgr<at>gmail<dot>com )
    
:'''2 Sugar Labs will be working to set up a small (5-30 unit) Sugar pilot near each student project that is accepted to GSoC so that you can immediately see how your work affects children in a deployment. We will make arrangements to either supply or find all the equipment needed. Do you have any ideas on where you would like your deployment to be, who you would like to be involved, and how we can help you and the community in your area begin it?'''
 
: http://vaish.rajan.googlepages.com/RajanVaishScreenshotXO.png
 
: This is my XO screenshot; I could not get the Sugar environment running due to temporary download issues (which I discussed on IRC).
    
:'''2 T-Shirt Size. '''
 
[[Category:2009_GSoC_applications]]
 