Having a strong background in git (my old testbed where all of my code went is at github.com/alexhairyman/; unfortunately it has not been contributed to in a long time, as high school and a job took over my free time), I would love to implement the Git backend for the Sugar Journal. Essentially, I want to modify the Journal's code to propagate every change as a commit to a local repository. The end goal is to wrap git in a UI for exploring all previous changes: each commit would contain a metadata file generated by either the datastore or the Journal source code, and perhaps a binary file containing the content itself. Changes to this local repository could then be propagated to any of the git hosting sites (GitHub has a nice Python API that makes it easy to use GitHub-specific features). Alternatively, all of the binary/large data could be stored on Google Drive; the Drive API would make it easy to store data in an existing Drive setup. Dropping git entirely and tracking journal entries with a metadata file would let the computer interact only with Google, and expanding Google Drive storage is remarkably cheap. This would be the first discussion I have with my mentor.
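The commit-per-change idea could be sketched roughly as below. This is only a minimal illustration, assuming the `git` command-line tool is available; the file layout, names, and the `commit_journal_entry` helper are my own invention, not anything from the actual Journal or datastore code.

```python
import os
import subprocess

def commit_journal_entry(repo_path, entry_id, metadata_text, content_bytes):
    """Record one journal entry as a commit in a local git repository.

    Hypothetical layout: each entry gets a human-readable metadata file
    (<entry_id>.meta) plus a binary blob (<entry_id>.bin).
    """
    # Initialise the repository on first use; a no-op re-init otherwise.
    subprocess.run(["git", "init", "-q", repo_path], check=True)

    with open(os.path.join(repo_path, entry_id + ".meta"), "w") as f:
        f.write(metadata_text)
    with open(os.path.join(repo_path, entry_id + ".bin"), "wb") as f:
        f.write(content_bytes)

    # Stage both files and commit; identity is pinned here so the sketch
    # works even without a global git config.
    subprocess.run(["git", "-C", repo_path, "add",
                    entry_id + ".meta", entry_id + ".bin"], check=True)
    subprocess.run(["git", "-C", repo_path,
                    "-c", "user.name=Journal",
                    "-c", "user.email=journal@localhost",
                    "commit", "-q", "-m", "journal: update " + entry_id],
                   check=True)
```

The real implementation would hook this into the datastore's write path so every save produces exactly one commit.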
= Rough Timeline =
 
* The first few weeks (no later than May 15th) will be when I explore the Journal and datastore in depth and talk with my mentor about the best possible way of "gitifying" the events and journal entries.
* The next few weeks (mid-May to the beginning of June) will be spent modifying these components so that each newly created journal entry produces a metadata file associated with its binary or compressed content, both of which will be pushed to a local git repository on the file system. A small, human-readable format is best; I'm thinking YAML could work especially well.
 
* At the beginning of June I hope to start stress testing the system as nothing more than a local repository.
 
* Halfway through June I will begin the remote propagation part, where content hosts such as GitHub, Google Drive, and Imgur come into play. This is where the meat will be: keeping track of what has been pushed and how to access it. I am aiming for Google Drive first, hooking into its API from Python to store the binary data, and then either keeping one large metadata file or doing it the git way.
 
* Interfacing smoothly with the other services' APIs will take a while; I estimate up to late July.
 
* The final weeks of GSoC in August will be the testing and integration phase, where hopefully community members can get their hands on it, test it out, and see how well it fares in the "real world".
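The small human-readable metadata file from the timeline above could look something like the following. The field names here are illustrative guesses at what the Journal tracks, not its real schema, and the YAML is emitted by hand purely to keep the sketch dependency-free (in practice a YAML library would be used).

```python
import hashlib
import time

def entry_metadata(title, mime_type, content_bytes):
    """Build the YAML text for one journal entry's metadata file.

    Hypothetical fields: title, MIME type, modification time, size,
    and a checksum of the associated binary content.
    """
    return (
        "title: %s\n" % title +
        "mime_type: %s\n" % mime_type +
        "mtime: %d\n" % int(time.time()) +
        "size: %d\n" % len(content_bytes) +
        # The checksum lets the sync layer detect whether the
        # binary blob itself changed between commits.
        "sha1: %s\n" % hashlib.sha1(content_bytes).hexdigest()
    )
```

Because the file is plain text, git diffs between commits stay readable, which is part of the appeal of the YAML choice.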
== Impact on the Community ==
'''Me''': Having the Journal propagated to the cloud will let someone switch between devices. Allowing changes at both school and home makes it easier to move between machines and, eventually, to upgrade to bigger hardware. It is essentially a bridge from the Sugar environment to Windows, OS X, Linux, etc. Besides that, hardware fails, and being able to rebound from a failure is a major benefit.
== Miscellaneous ==
[[File:AMH 2015 dev environment.png|thumb]]
