{{Obsolete}}
{{TOCright}}

==Question Support API==
==Project Information==

===Current Goals===
===Setting Up a Development Environment (For Newbies)===
These are links, tutorials, and notes that Brian Long has gathered along the way, together with help from Greg Stevens. Brian is new to the Linux / open-source community and is documenting a development environment so that you, and we, can continue developing this API. The information below is meant to help someone new step into developing for the Sugar environment from Ubuntu, with all dependencies and tools set up to continue development of this API. Next, Brian would like to document how Activity developers can set things up to include and extend our API.
====Dependencies====
 sudo apt-get install python-setuptools
 sudo easy_install pyparsing
 sudo easy_install peak-rules
 sudo easy_install sqlalchemy
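After installing these, a quick sanity check from Python 2 (a convenience snippet only, not part of the API itself) can confirm that the packages are importable:

 # Sanity check: confirm the dependencies installed above are importable.
 try:
     import pyparsing
     import peak.rules
     import sqlalchemy
 except ImportError, e:
     print "Missing dependency:", e
 else:
     print "pyparsing", pyparsing.__version__
     print "SQLAlchemy", sqlalchemy.__version__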
==Motivation==
In the RIT class working on the Math4 projects, many proposed activities require a question database of some kind. A common API or library for accessing databases in different formats, stored either locally or remotely, along with a simple mechanism to specify more complex formatting or presentation than plain text (e.g. to include simple graphics or mathematical notation), would cover the majority of cases where an activity needs some configurable "curriculum data". Eventually this library could be extended to provide hints, explanations, or walkthroughs for questions, in addition to the basic metadata about level, grouping, difficulty, and subject matter that would be part of the base system.

===Envisioned Usage===
Consider a simple flash-card-like activity. It presents a question from a list of questions and allows the student either to select one of the provided answers or to enter their own. The correct answer is then revealed and the student is told whether their answer was correct. If the question has an explanation of the correct answer, the flash-card activity shows that explanation. (Note that this is just a simple usage example; the interaction design of a drilling activity could be very different.) The flash-card activity would use the proposed Quiz API for each of these steps: loading the questions, presenting the possible answers, and checking the student's response. A sketch of this flow appears below.
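The following sketch illustrates that flow. It is not authoritative documentation: it uses only the quizdata calls and question attributes that appear in the "Current implementation" section below (quizdata.open, plain_text, answers, answer, correct), and the question file path is a placeholder.

 # Minimal flash-card loop (sketch only), using the quizdata API as shown
 # in the "Current implementation" section below.
 import quizdata
 from quizdata.text import plain_text
 # Placeholder path; point this at a real GIFT question file.
 questions = quizdata.open("file:///home/olpc/questions.txt?format=gift")
 for q in questions:
     print plain_text(q.text)                    # show the question text
     for label, answer in zip('abcdefghij', q.answers):
         print "  %s) %s" % (label, answer)      # list the possible answers
     choice = raw_input("Your answer: ").strip().lower()
     q.answer = q.answers['abcdefghij'.index(choice)]
     print ("Correct!" if q.correct else "Incorrect.")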
To start with, the library would simply be a time-saving tool for developers needing similar functionality. But as the XS (School Server) becomes more fully developed, the library should integrate the functions provided by the XS to enable automated updates of course material for the current topic of study, so that students can drill material using any tool they prefer while still reporting progress to the instructor through the XS services.

==Current Status==
Currently (14 May 2009) the API supports parsing the GIFT file format well enough to import Multiple Choice and True/False questions, along with complete implementations of the basic functionality of the corresponding question objects. No support for partial-credit answers is implemented yet, nor do other question types work correctly (though most can be parsed to some extent). Export to CSV works as intended, though it is intentionally simple.

Documentation on usage and integration of the API into an activity is in the doc/ directory of [the repository]. A simple but complete usage example, using a console interface, is available in tests/complete_test.py.

===Current implementation===
1. Ensure that PyParsing and PEAK-Rules are installed.

2. Import the quizdata package:
 import quizdata
3. Select plain text output:
 from quizdata.text import plain_text
4. Select the desired question types:
 from quizdata.question import MultipleChoiceQuestion, MissingWordQuestion
5. Import the when function to assist with dispatching on question type:
 from peak.rules import when
6. Handle questions (this section should be rewritten as needed for your activity):
 # this is the base case for any question type we don't handle otherwise.
 def do_question(q):
     print "Unhandled question type.", type(q)
 
 # for multiple choice questions (incl. subclasses) we do this...
 @when(do_question, (MultipleChoiceQuestion,))
 def do_multi_questions(q):
     print plain_text(q.text)
     for a in zip('0123456789', q.answers):
         print "%5s: %s" % a
     answer = int(raw_input())
     q.answer = q.answers[answer]
     print q.correct
 
 # for missing word-style questions, which aren't implemented correctly yet and
 # are a subclass of multiple choice questions, we make sure to ignore them
 # with a more specific rule.
 @when(do_question, (MissingWordQuestion,))
 def do_mw_question(q):
     # XXX: inheritance is annoying here...
     print "Unhandled question type. (MissingWordQuestion)"
7. Open and process the questions (base_path should point at your quizdata checkout):
 from os import path
 questions = quizdata.open("file://%s?format=gift" % path.join(base_path, 'tests', 'multi_choice.txt'))
 for q in questions:
     do_question(q)

=="How to Play/Use" for end user==
This is a requirement of the RIT OLPC seminar. The question doesn't exactly pertain to the API team's project; more applicable questions might be "How is development of the API done?" and "How do activity developers include / extend our API?".
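Since the Current Status section above refers to the GIFT file format (and step 7 opens tests/multi_choice.txt, a GIFT file), here is a small illustrative fragment for readers unfamiliar with it. The questions themselves are made up; the syntax follows standard Moodle GIFT conventions for the two question types that currently import correctly:

 // Multiple choice: "=" marks the correct answer, "~" marks distractors.
 ::addition-1:: What is 3 + 4? {
   =7
   ~6
   ~8
 }
 
 // True/False questions use {T} or {F} (or {TRUE} / {FALSE}).
 ::parity-1:: 10 is an even number. {T}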
==Education Standards==
As a general API rather than a standalone Activity, and given the nature of the API itself, this project does not directly address any specific education standards or learning outcomes. It relies on educators to write questions, or to have questions available in usable formats. As such, the educational standards this project could help meet include '''any''' standard where drilling or question/response evaluation is appropriate. (This covers an even wider range of topic areas than the Math4 focus.)

==A Teacher's Guide Abstract==
==Milestones==
==Community Contacts==
[[User:FGrose]], [[User:Tony37]] (see talk page)

==RIT Project Usage==
==See Also==