Q: '''Describe your project in 10-20 sentences. What are you making? Who are you making it for, and why do they need it? What technologies (programming languages, etc.) will you be using?'''
 
This project aims to create an Activity that lets users easily broadcast and stream audio and video, from their webcam and microphone or their computer's display output, to a classroom's central server and share the live feed with their peers. This will be integrated into the Neighborhood View (as a "broadcast audio and video to" option or similar) to allow for the most seamless sharing of broadcasts.
 
Since students may be interested in viewing multiple streams at once (for example, one stream showing the overall demonstration of the experiment and another showing a magnified view of a particular bacteria culture), the interface should be able to automatically resize and tile the individual streams so that all of them are displayed at once while making the best use of screen space.
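
As a rough sketch of how such a tiling layout might be computed (a simple grid-search heuristic; the function name and the assumption that every stream shares one aspect ratio are illustrative, not a settled design):

<source lang="python">
import math


def tile_layout(n_streams, screen_w, screen_h, aspect=4.0 / 3.0):
    """Pick the rows x cols grid giving each stream the largest
    possible tile, assuming all streams share one aspect ratio."""
    best = None
    for cols in range(1, n_streams + 1):
        rows = int(math.ceil(float(n_streams) / cols))
        # Widest tile of the given aspect ratio that fits a grid cell.
        tile_w = min(float(screen_w) / cols,
                     float(screen_h) / rows * aspect)
        if best is None or tile_w > best[0]:
            best = (tile_w, rows, cols)
    return best[1], best[2]


print tile_layout(5, 1200, 900)  # -> (3, 2): three rows of two tiles
</source>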
 
'''Audio and Video Sources'''

For the project, webcams and X11 desktop output would be supported as video sources, and the microphone would be an audio source. These will of course not be hard-coded into the application, but will be represented as generic "media source" plugins. Since this architecture has the potential to replace the standard projector-based lecture presentation mechanism, other media formats, namely PDF lecture slides, might be supported as a future enhancement (beyond the scope of this GSoC) as well.
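
A minimal sketch of what such a generic plugin interface might look like, assuming the pygst 0.10 bindings (the plugin classes and method names are hypothetical; v4l2src, ximagesrc, and alsasrc are the standard GStreamer capture elements):

<source lang="python">
import pygst
pygst.require("0.10")
import gst


class MediaSource(object):
    """A generic source of audio or video for broadcasting."""
    kind = None  # "audio" or "video"

    def make_element(self):
        """Return a GStreamer source element for this media source."""
        raise NotImplementedError


class WebcamSource(MediaSource):
    kind = "video"

    def make_element(self):
        return gst.element_factory_make("v4l2src")


class DesktopSource(MediaSource):
    kind = "video"

    def make_element(self):
        return gst.element_factory_make("ximagesrc")


class MicrophoneSource(MediaSource):
    kind = "audio"

    def make_element(self):
        return gst.element_factory_make("alsasrc")
</source>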
      
'''Video Interaction'''
 
There should be some equivalent of a "pointer" on the video broadcasting side, for highlighting important information. There might also be a "pointer" for pointing out comments on the receiver's side, though this would probably cause more trouble and distraction in the classroom than it would promote useful feedback to the presenter, so it probably won't be implemented.

'''Programming Languages and Libraries'''
Since sugar-chat-activity uses Python, I will likely use it for this project as well. The GStreamer libraries will be used for communicating with the Icecast server and uploading multimedia. Ogg will probably be used as the container, with Theora as the video codec and Vorbis as the audio codec, since this is a relatively well-supported combination (slated for in-browser playback in upcoming browsers) that poses no licensing issues; thanks to the flexibility of GStreamer, any other supported codec and container combination could be used instead.
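
As an illustration, a broadcasting pipeline along these lines might look like the following sketch (the server address, password, and mount point are placeholders):

<source lang="python">
import pygst
pygst.require("0.10")
import gst
import gobject

# Capture webcam video and microphone audio, encode to Theora and
# Vorbis, mux into Ogg, and stream to an Icecast server via shout2send.
pipeline = gst.parse_launch(
    "v4l2src ! ffmpegcolorspace ! videoscale "
    "! video/x-raw-yuv,width=320,height=240 ! theoraenc ! oggmux name=mux "
    "! shout2send ip=192.168.1.1 port=8000 password=hackme mount=/class.ogg "
    "alsasrc ! audioconvert ! vorbisenc ! mux.")
pipeline.set_state(gst.STATE_PLAYING)
gobject.MainLoop().run()
</source>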
Q: '''What is the timeline for development of your project? The Summer of Code work period is 7 weeks long, May 23 - August 10; tell us what you will be working on each week. (As the summer goes on, you and your mentor will adjust your schedule, but it's good to have a plan at the beginning so you have an idea of where you're headed.) Note that you should probably plan to have something "working and 90% done" by the midterm evaluation (July 6-13); the last steps always take longer than you think, and we will consider cancelling projects which are not mostly working by then.'''
'''Prior to the start of the coding period'''
Review the GStreamer and Sugar APIs.
'''Week 1'''
Write the GStreamer-based webcam video capture backend and the X11 desktop video capture backend.
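
For instance, each capture backend could be smoke-tested locally with a simple preview pipeline before any streaming code exists (illustrative only):

<source lang="python">
import pygst
pygst.require("0.10")
import gst
import gobject

# Preview the X11 desktop capture in a local window; substituting
# v4l2src for ximagesrc tests the webcam backend the same way.
pipeline = gst.parse_launch(
    "ximagesrc ! ffmpegcolorspace ! videoscale ! autovideosink")
pipeline.set_state(gst.STATE_PLAYING)
gobject.MainLoop().run()
</source>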
'''Week 2'''
Write the microphone audio capture backend and the Icecast server uploading code.
'''Week 3'''
Write the server-side authentication code and security mechanisms.
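
One plausible approach here (an assumption on my part, not a settled design) is Icecast's URL-based listener authentication: the mount is configured for URL authentication in icecast.xml, Icecast POSTs each listener's credentials to a helper service, and the listener is admitted when the response carries the "icecast-auth-user: 1" header. A minimal sketch of such a helper, with a placeholder credential check:

<source lang="python">
import BaseHTTPServer
import urlparse

ALLOWED = {"student": "secret"}  # placeholder credential store


class AuthHandler(BaseHTTPServer.BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.getheader("content-length", 0))
        fields = urlparse.parse_qs(self.rfile.read(length))
        user = fields.get("user", [""])[0]
        password = fields.get("pass", [""])[0]
        self.send_response(200)
        if ALLOWED.get(user) == password:
            # Icecast admits the listener when it sees this header.
            self.send_header("icecast-auth-user", "1")
        self.end_headers()


BaseHTTPServer.HTTPServer(("", 8080), AuthHandler).serve_forever()
</source>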
'''Milestone 1 Reached'''
'''Week 4'''
This week is set aside for finishing up work from weeks 1-3 if needed, soliciting and addressing feedback on the backend, documenting the backend code on the wiki, and testing and bug-fixing the backend against a rudimentary client-side frontend.
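
For the rudimentary test frontend, something as small as a playbin2-based player pointed at the Icecast mount may suffice (the URL is a placeholder):

<source lang="python">
import pygst
pygst.require("0.10")
import gst
import gobject

# Fetch and play the broadcast straight from the Icecast mount point.
player = gst.element_factory_make("playbin2")
player.set_property("uri", "http://192.168.1.1:8000/class.ogg")
player.set_state(gst.STATE_PLAYING)
gobject.MainLoop().run()
</source>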
'''Week 5'''
Create the client-side frontend, as described in "Interface".
'''Week 6'''
Finish work on the client-side frontend and integrate options to broadcast multimedia into the Neighborhood view.
'''Week 7'''
Finish any remaining work on the client-side frontend, perform a second cycle of feedback gathering and testing, then write the client-side documentation on the wiki.
      
----
 
In that project, the focus was on the interaction of mobile Internet Tablet devices with other household electronics, such as TVs, as well as the exploration of a time-centric desktop computing paradigm. However, the Icecast-based broadcasting architecture there is similar (albeit far more primitive, lacking security features, and restricted to local devices as targets) to what I aim to implement here. Having learned more about the constraints and architecture of mass video broadcasting over Icecast, I believe I now have the necessary knowledge to deliver a production-ready implementation of the described multimedia-broadcasting activity.
 
In addition, I have worked with video APIs (primarily FFmpeg's libavcodec/libavformat and OpenCV, though I have used GStreamer on another project) as part of my current undergraduate research project on emotion recognition from facial and speech features, so I am familiar with performing real-time video and audio capture and with addressing the usual issues that show up, such as laggy capture performance or blocking threads; in this case things should be even simpler, as no data processing needs to be done.
Regarding development pace, I will have no other commitments this summer, so I will be able to devote my full time to development.
    
====You and the community====
 