2.1: FAQ

    I don't see where to modify the points for a question. I would like to make some questions worth zero points.

    You assign points when you view the questions in the assignment. If you are in "basic" question view, flip the toggle to "advanced" to see the option. Look for "This question is worth [] points."


    What is the difference between the Random Sampling and Algorithmic question options?

    Random Sampling means you build an assignment with a pool of n questions and set it so that each student gets m of those questions randomly sampled into their personal assignment. Each student then (most likely) gets a different set of questions, depending on how large n, m, and your class are. For example, it is used in the prelab problems where students need to demonstrate that they read over the labs; check the prelabs in the course "Chem 2B Lab: General Chemistry II" (in the public courses area) for an example.

    Algorithmic questions are questions that are programmed to have random numbers selected within the question (e.g., different masses or different moles, etc.). To work effectively, the question and its solution need to be explicitly coded, which takes more time than writing a hardcoded question. We are slowly going through the questions to update them to algorithmic versions, but it takes time. While we have a lot of algorithmic questions in ADAPT, the first two chapters of ChemVantage (in the public course or the Commons) have been built with both algorithmic values and significant-figure handling and are the best we have (but the collection is expanding).
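
    As an illustration only (not ADAPT's internal format), an algorithmic question is essentially a template whose numbers and solution are computed together, so every rendering can use different values while staying self-consistent; the mass range, compound, and rounding below are invented for the example.

        import random

        def make_algorithmic_question():
            """Sketch of an algorithmic stoichiometry question: the mass is drawn
            at random and the solution is computed from it, so different renders
            can differ while remaining self-consistent."""
            mass_g = round(random.uniform(5.0, 50.0), 2)   # randomized value
            molar_mass = 58.44                             # NaCl, g/mol
            moles = mass_g / molar_mass
            question = f"How many moles are in {mass_g} g of NaCl?"
            solution = f"{moles:.3f} mol"                  # rounded to three decimals for the example
            return question, solution

        print(make_algorithmic_question())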

    How do I randomize the question order in an assignment?

    There isn't an explicit randomize-order feature in ADAPT, but the "Random Sample" option will work instead. If an instructor creates an assignment of 20 questions and indicates that each student samples 20 questions from this pool, then each student will get a randomized order of questions.
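
    A minimal sketch of the idea (not ADAPT's actual code): sampling m questions from a pool of n gives each student a different subset, and sampling n from n simply returns the whole pool in a shuffled order. The pool and student IDs below are made up.

        import random

        pool = [f"Q{i}" for i in range(1, 21)]   # hypothetical 20-question pool

        def personal_assignment(student_id, m):
            """Randomly sample m questions for one student; m == len(pool)
            returns the whole pool in a shuffled order."""
            rng = random.Random(student_id)      # per-student randomness, for the sketch only
            return rng.sample(pool, m)

        print(personal_assignment("student_A", 5))    # 5 random questions
        print(personal_assignment("student_B", 20))   # all 20 questions, randomized order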


    How do I download the actual submissions for an assignment?

    While the scores for an assignment can be downloaded from the gradebook, the actual submissions can be downloaded via the "Download Submission" button in the Auto-Graded Submissions. Clicking it initiates the download of a CSV file containing each student's last submission. A history of submissions is available upon request.
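
    Once downloaded, the CSV can be opened in a spreadsheet or inspected with a short script; the file name below is a placeholder and the columns depend on the actual export.

        import csv

        # Hypothetical file name; column names depend on the actual export.
        with open("assignment_submissions.csv", newline="") as f:
            reader = csv.reader(f)
            header = next(reader)
            rows = list(reader)

        print("Columns:", header)
        print("Number of student rows:", len(rows))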


    Is ADAPT free?

    Yes, mostly. Verified instructors can access the ADAPT question bank without any cost, regardless of whether they utilize the underlying technology for assessment or class analytics.

    For instructors and students within California, ADAPT can be used as a free homework system, thanks to funding from the State of California that supports its core development. For educators located outside the state of California who wish to utilize ADAPT, a financial contribution is necessary to ensure ongoing support and sustainability of the platform.* Our objective is to maintain these expenses at a low level. We request a fee of $15 per student per academic term when students cover the access fee themselves. This fee is reduced to $12 per student if the institution covers the access cost, and further reduced to $10 per student for campuses that are members of LibreNet. These fees cover all classes that a student may enroll in during the term, without any limitations. Additionally, there is a yearly maximum cost, ensuring that no student pays more than $30 per year, regardless of the number of classes or terms they participate in during that period. Notably, the summer term is typically exempt from charges.

    * "ADAPT developers" can use ADAPT without financial buy-in. If a faculty member desires to join the development team and contribute questions/solutions etc. to the project, they can contact us directly at info@libretexts.org

    Coupling ADAPT to LMSs (via LTI)

    Does integrating ADAPT with your Canvas LMS require a developer key?

    Connecting ADAPT to Canvas does not require a developer key (that is the sort of thing needed for the API, where ADAPT could create or destroy assignments). However, an LTI 1.3 key is required, and that key is used for grade passback. This is the most secure model available: the app has a limited placement where faculty create assignments, and it can only log students in and pass grades back.
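
    For context, LTI 1.3 grade passback is handled through the Assignment and Grade Services part of LTI Advantage: the tool posts a small JSON score message to the line item's scores endpoint on the platform. The sketch below only shows the general shape of such a message; the user ID, values, and endpoint comment are invented for this example and are not taken from ADAPT.

        from datetime import datetime, timezone

        # Hypothetical values; the real userId and line-item URL come from the LTI launch.
        score_message = {
            "userId": "canvas-user-123",
            "scoreGiven": 0.3182,        # proportion correct reported by the tool
            "scoreMaximum": 1.0,
            "activityProgress": "Submitted",
            "gradingProgress": "FullyGraded",
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }
        # This would be POSTed (with an OAuth2 access token) to something like:
        #   <line item URL>/scores
        print(score_message)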

    Is it possible to integrate ADAPT at a Canvas privacy level outside of public?

    In terms of "Private", this would mean that ADAPT just gets an ID from the LMS instead of name/email. ADAPT doesn't current support this mode of operation as it would disrupt the possibility of students then logging in via SSO since we would not have their email. It would also slow down student support quite a bit since we'd have to ask for their "LMS identification". Furthermore, troubleshooting in general could become quite painful.

    Which values control student usage of questions - ADAPT or the LMS?

    The Canvas setting should be unlimited, because grades are passed back with each submission that a student makes (Canvas won't accept grade passbacks beyond the number of attempts dictated in Canvas). The control comes into play on the ADAPT side, since ADAPT won't let students submit beyond the number allotted within the ADAPT assignment.

    • Canvas: should always be unlimited 
    • ADAPT: should be what the instructor wants for student submissions

    If this isn't set correctly, the Canvas grade passback will fail, but you just need to update the setting in Canvas; the passback is retried once every 24 hours and will fix up any scores.
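
    A toy sketch of that division of responsibility (the limit below is made up, and the passback itself is only described in comments):

        # Toy illustration (not ADAPT code) of why Canvas attempts must be unlimited:
        # ADAPT enforces its own cap, and every accepted submission triggers a passback.
        ADAPT_MAX_SUBMISSIONS = 3   # instructor-chosen limit in ADAPT (made up here)

        def handle_submission(attempts_used):
            if attempts_used >= ADAPT_MAX_SUBMISSIONS:
                return "rejected: ADAPT submission limit reached"
            # Grading would happen here; the resulting score is passed back to Canvas.
            # If Canvas capped attempts, it would refuse passbacks beyond that cap.
            return "accepted; score passed back to Canvas"

        for attempt in range(5):
            print(attempt + 1, handle_submission(attempt))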

    Which values control scores of student submissions - ADAPT or the LMS?

    In terms of points, grade passback sends back a "proportion correct" (note that this is just how grade passback works in LTI). If you look at the points awarded, you'll notice that they are both 31.82% of the total (70/220 in ADAPT and 31.82/100 in Canvas). So, you can make the ADAPT points anything you want and the Canvas points anything you want; the "proportion correct" will be sent back, and Canvas does the final computation on its side to produce the actual number of points.

    Do the points have to sync up correctly on ADAPT and LMS for the assignment score that is passed to the LMS?

    The points don't have to match each other at the question level. To get the total score for the assignment, ADAPT sums the points earned ("correct" points) and divides by the total points possible for the assignment. Whatever proportion this is will then be passed back to Canvas.
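
    A minimal sketch of that arithmetic, using the 70/220 example above (the per-question breakdown is made up; only the totals matter):

        # ADAPT side: sum earned points and total possible points for the assignment.
        earned_points = [10, 25, 35]          # hypothetical per-question scores (sum = 70)
        possible_points = [40, 80, 100]       # hypothetical per-question maxima (sum = 220)

        proportion = sum(earned_points) / sum(possible_points)   # 70 / 220 = 0.3182...

        # Canvas side: the proportion is applied to whatever the assignment is worth there.
        canvas_assignment_points = 100
        canvas_score = proportion * canvas_assignment_points

        print(f"proportion correct: {proportion:.4f}")                            # 0.3182
        print(f"Canvas score: {canvas_score:.2f} / {canvas_assignment_points}")   # 31.82 / 100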

    How can I switch my questions to algorithmic so that each student gets a different set of numbers or intra-question structure?

    Only IMathAS and WebWork question types have algorithmic capabilities. Native and H5P questions may have limited variability in terms of multiple-choice order, but they do not have full algorithmic features like IMathAS and WebWork do.

    For WebWork: The random numbers are dictated by a 9-digit seed that is set when the question is first rendered for the student (i.e., when the assignment is first loaded, which loads all of the questions). The seed is then fixed and does not change on subsequent views of the question, so a student never knows whether a question is algorithmic unless they compare questions with fellow students.
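
    As a rough illustration (not WebWork's actual implementation), fixing the seed at first render means every later render regenerates exactly the same "random" values; the seeding scheme below is invented for the example.

        import random

        def render_question(seed):
            """Regenerate the question's random values from its fixed seed."""
            rng = random.Random(seed)
            mass_g = round(rng.uniform(5.0, 50.0), 2)
            volume_mL = round(rng.uniform(100.0, 500.0), 1)
            return mass_g, volume_mL

        seed = random.randint(100_000_000, 999_999_999)   # chosen once, at first render
        print(render_question(seed))   # first view
        print(render_question(seed))   # later views: identical values, because the seed is fixed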

    Again, you have to set the assignment to algorithmic in the "Assignment Properties" to activate this feature. Otherwise, all students get the same numbers.

    When you view the question, you can reset the seed via "Reset Submission", which both erases the submission from the system and resets the seed.

    To view a student's version of the question, you have to log in as that student; then you can see which specific number(s) they were given.

    For IMathAS: Details will be forthcoming.

    I have an assignment that is extra credit in ADAPT. In Canvas I entered the points students can earn as zero (so they are not penalized if they don't do the extra credit). However, this appears to have messed up how ADAPT and Canvas sync, meaning a few students who have completed the assignment and have points in ADAPT still show zero in Canvas. Is there something I need to change in order for this to be corrected?

    ADAPT will always send back the proportion correct. Canvas then takes the number of points for the assignment and multiplies it by that proportion. So, if you assign 0 points in Canvas, students will get 0 points there. If you do not want students to be penalized, you can instead exclude the assignment from the course's final grade in Canvas: https://community.canvaslms.com/t5/Instructor-Guide/How-do-I-exclude-an-assignment-from-the-course-s-final-grades/ta-p/958

    (If you change the Canvas points to something that's not 0, please let us know and we can re-force the grade passback.)

    What are the possible status levels of an assignment?

    Once created, an assignment can have one of four status levels, depending on the dates assigned to it and the date it is viewed (a small sketch of the logic follows the list):

    • Upcoming (before the start date/time). Students cannot see or submit questions for an assignment that is upcoming.
    • Open (after the start but before the first due date). Once the open date/time has passed, the assignment switches to Open and students can view and submit questions (up to the set limit of attempts).
    • Late (after the first due date and before the second/final date, if applicable). If the assignment has a late-submission policy, then a second 'final' due date was set in the assignment properties; if the first due date has passed but not the second, the assignment is marked Late. Students can still view and submit, but may incur a penalty depending on the assignment parameters.
    • Closed (after the second/final date). If the assignment has a no-late-submission policy, then the assignment switches to Closed after the end (due) date. Students can view the questions but cannot submit answers to them.
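
    A minimal sketch of how these statuses follow from the assignment dates (the dates are hypothetical, and a missing final date means there is no late period):

        from datetime import datetime

        def assignment_status(now, start, due, final_due=None):
            """Return the status implied by the assignment dates.
            final_due is the optional second 'final' due date for late submissions."""
            if now < start:
                return "Upcoming"
            if now <= due:
                return "Open"
            if final_due is not None and now <= final_due:
                return "Late"
            return "Closed"

        # Hypothetical dates for illustration.
        start = datetime(2024, 1, 8, 9, 0)
        due = datetime(2024, 1, 15, 23, 59)
        final_due = datetime(2024, 1, 17, 23, 59)

        for now in (datetime(2024, 1, 5), datetime(2024, 1, 10),
                    datetime(2024, 1, 16), datetime(2024, 1, 20)):
            print(now.date(), assignment_status(now, start, due, final_due))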

    2.1: FAQ is shared under a CC BY 4.0 license and was authored, remixed, and/or curated by LibreTexts.
