diff --git a/.coderabbit.yaml b/.coderabbit.yaml index f5bd95f1..911c3f1e 100644 --- a/.coderabbit.yaml +++ b/.coderabbit.yaml @@ -21,6 +21,7 @@ reviews: instructions: | Review files for: - Consistent formatting (e.g., headings, lists, links). + - Anywhere there are tables, they should use `list-table`. - Clear and concise language. - Correct grammar and spelling. - Proper use of rst syntax (e.g., avoid broken links or invalid code blocks). diff --git a/source/img/canvascoursecoursetselect.png b/source/img/canvascoursecoursetselect.png new file mode 100644 index 00000000..b7c5ce50 Binary files /dev/null and b/source/img/canvascoursecoursetselect.png differ diff --git a/source/img/guides/assessment_gb_rubric.png b/source/img/guides/assessment_gb_rubric.png index c767878e..e77b722b 100644 Binary files a/source/img/guides/assessment_gb_rubric.png and b/source/img/guides/assessment_gb_rubric.png differ diff --git a/source/img/guides/assessment_mc_exec.png b/source/img/guides/assessment_mc_exec.png index 859db428..94f9a203 100644 Binary files a/source/img/guides/assessment_mc_exec.png and b/source/img/guides/assessment_mc_exec.png differ diff --git a/source/img/guides/assessment_mc_grading.png b/source/img/guides/assessment_mc_grading.png index 683d158e..87d8a597 100644 Binary files a/source/img/guides/assessment_mc_grading.png and b/source/img/guides/assessment_mc_grading.png differ diff --git a/source/img/guides/freetext-grading.png b/source/img/guides/freetext-grading.png index 82ce662a..c024b7d2 100644 Binary files a/source/img/guides/freetext-grading.png and b/source/img/guides/freetext-grading.png differ diff --git a/source/img/guides/freetext_navigate.png b/source/img/guides/freetext_navigate.png index 84db11f4..3806342e 100644 Binary files a/source/img/guides/freetext_navigate.png and b/source/img/guides/freetext_navigate.png differ diff --git a/source/img/guides/freetextanswer.png b/source/img/guides/freetextanswer.png index 93709531..d800f767 100644 Binary 
files a/source/img/guides/freetextanswer.png and b/source/img/guides/freetextanswer.png differ diff --git a/source/img/guides/freetexticon.png b/source/img/guides/freetexticon.png index ba2e9a7a..2695f2e0 100644 Binary files a/source/img/guides/freetexticon.png and b/source/img/guides/freetexticon.png differ diff --git a/source/img/guides/notpartial.png b/source/img/guides/notpartial.png index 278c675c..5e9fcf0d 100644 Binary files a/source/img/guides/notpartial.png and b/source/img/guides/notpartial.png differ diff --git a/source/img/guides/parsonspuzzlegrading.png b/source/img/guides/parsonspuzzlegrading.png new file mode 100644 index 00000000..939e5aa6 Binary files /dev/null and b/source/img/guides/parsonspuzzlegrading.png differ diff --git a/source/img/guides/partial.png b/source/img/guides/partial.png index 229143d2..d89027ac 100644 Binary files a/source/img/guides/partial.png and b/source/img/guides/partial.png differ diff --git a/source/img/lti-json-url.png b/source/img/lti-json-url.png new file mode 100644 index 00000000..1f7aa89f Binary files /dev/null and b/source/img/lti-json-url.png differ diff --git a/source/instructors/admin/integration/lti1-3Canvas.rst b/source/instructors/admin/integration/lti1-3Canvas.rst index 405b81df..c98eb841 100644 --- a/source/instructors/admin/integration/lti1-3Canvas.rst +++ b/source/instructors/admin/integration/lti1-3Canvas.rst @@ -20,7 +20,6 @@ The Canvas user who carries out these steps must be a system administrator. When copying links, please use the copy button adjacent to each link to ensure accuracy and prevent linking errors. - **In Codio:** 1. Click your username in the top-right corner, then select **Organization** from the menu. In the Organizations area, click the name of your organization. @@ -39,25 +38,66 @@ The Canvas user who carries out these steps must be a system administrator. 5. Click on **Developer Key** and select **+LTI key**. -6. Complete the **Key Name, Title** and **Description** fields. 
Make sure to set the **method** to **Manual Entry**. +(Step 6, Option 1) Using JSON configuration Url +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -|image2| +You can use a JSON configuration URL to automatically configure most Canvas settings: + +.. image:: /img/lti-json-url.png + :alt: Canvas configuration for JSON configuration Url + :width: 80% + +You only need to configure: + +.. list-table:: + :header-rows: 1 + :widths: 30 70 + + * - **Value** + - **What to add** + * - **Key Name:** + - A name for the Tool, i.e.: "Codio" + * - **Owner Email:** + - An email, you can use your own email. + * - **Redirect URIs:** + - Paste the Redirect URL from your Codio integration + * - Set the Method as **Enter URL** + - + * - **JSON URL** + - Paste the URL from **Canvas JSON configuration Url** + +.. note:: + If you use the JSON configuration URL method, skip to **Part 2**. To manually configure everything, continue with the steps below. +(Step 6, Option 2) Completing Canvas Steps Manually +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -+------+--------------------------------------------------------+-----------------------------------------------------------------------+ -| | **Copy** | **Paste** | -+======+========================================================+=======================================================================+ -| 7 | From Codio, under **LTI 1.3 Integration, copy the** | Paste it into the **Canvas Redirect URI** field. | -| | **Redirect URL** | | -+------+--------------------------------------------------------+-----------------------------------------------------------------------+ -| 8 | Copy the **LTI URL** | Paste it into the **Target Link URI field** in Canvas. | -+------+--------------------------------------------------------+-----------------------------------------------------------------------+ -| 9 | Copy the **Initiate Login URL** | Paste it into the **OpenID Connect Initiation URL**. 
| -+------+--------------------------------------------------------+-----------------------------------------------------------------------+ -| 10 | In Canvas, change **JWK Method** to **Public JWK URL**.| | -+------+--------------------------------------------------------+-----------------------------------------------------------------------+ -| 11 | From Codio, copy the **Keyset URL** | Paste it into the **Public JWK URL** field. | -+------+--------------------------------------------------------+-----------------------------------------------------------------------+ +Complete the **Key Name, Title** and **Description** fields. Make sure to set the **method** to **Manual Entry**. + +|image2| + +.. list-table:: + :header-rows: 1 + :widths: 5 50 50 + + * - + - **Copy** + - **Paste** + * - 7 + - From Codio, under **LTI 1.3 Integration, copy the Redirect URL** + - Paste it into the **Canvas Redirect URI** field. + * - 8 + - Copy the **LTI URL** + - Paste it into the **Target Link URI** field in Canvas. + * - 9 + - Copy the **Initiate Login URL** + - Paste it into the **OpenID Connect Initiation URL**. + * - 10 + - In Canvas, change **JWK Method** to **Public JWK URL**. + - + * - 11 + - From Codio, copy the **Keyset URL** + - Paste it into the **Public JWK URL** field. |image3| @@ -100,12 +140,11 @@ Link Selection and Assignment Selection https://static-assets.codio.com/dashboard/images/icons/favicon-16x16.da14ae918fd9bc3b.png -Course Navigation and Editor Button -~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - +Editor Button +~~~~~~~~~~~~~ .. image:: /img/canvascourseeditortselect.png - :alt: Canvas Course Navigation and Editor Button placement + :alt: Canvas Editor Button placement :width: 750px @@ -114,7 +153,31 @@ Course Navigation and Editor Button .. code-tab:: text :caption: Target Link URI - https://apollo.codio.com/lti/resource_selection + https://apollo.codio.com/lti/editor_button + +.. tabs:: + + .. 
code-tab:: text + :caption: Icon URL + + https://static-assets.codio.com/dashboard/images/icons/favicon-16x16.da14ae918fd9bc3b.png + + +Course Navigation +~~~~~~~~~~~~~~~~~ + + +.. image:: /img/canvascoursecoursetselect.png + :alt: Canvas Course Navigation placement + :width: 750px + + +.. tabs:: + + .. code-tab:: text + :caption: Target Link URI + + https://apollo.codio.com/lti/course_navigation .. tabs:: @@ -223,7 +286,7 @@ Updating the fields in Platform Information .. code-tab:: text :caption: Platform ID - https://canvas.instructure.com + https://canvas.instructure.com @@ -240,7 +303,7 @@ Updating the fields in Platform Information .. code-tab:: text :caption: Public Keyset URL - https://[CANVAS DOMAIN]/api/lti/security/jwks + https://[CANVAS DOMAIN]/api/lti/security/jwks 6. **Access Token URL:** @@ -249,7 +312,7 @@ Updating the fields in Platform Information .. code-tab:: text :caption: Access Token URL - https://[CANVAS DOMAIN]/login/oauth2/token + https://[CANVAS DOMAIN]/login/oauth2/token 7. **Authentication Request URL:** @@ -258,7 +321,7 @@ Updating the fields in Platform Information .. code-tab:: text :caption: Authentication Request URL - https://[CANVAS DOMAIN]/api/lti/authorize_redirect + https://[CANVAS DOMAIN]/api/lti/authorize_redirect 8. Click **Create** diff --git a/source/instructors/authoring/assessments/advanced-code-test.rst b/source/instructors/authoring/assessments/advanced-code-test.rst index d2b6ed36..0542c95e 100644 --- a/source/instructors/authoring/assessments/advanced-code-test.rst +++ b/source/instructors/authoring/assessments/advanced-code-test.rst @@ -10,10 +10,10 @@ The advanced code test assessment type allows you to easily implement unit tests To ensure that your test scripts run securely and to prevent student access to your testing files or executables, place them in the **.guides/secure** folder. Create a folder if one does not already exist. 
This folder is not available to students in their assignments and they cannot access it from the command line. Only teachers with editing privileges have access to the **.guides/secure** folder. -.. Note:: If your assignment will contain multiple assessments, Code files and Test Cases for individual assessments should be placed in separate folders to avoid compiling all files. +.. note:: If your assignment will contain multiple assessments, code files and test cases for individual assessments should be placed in separate folders to avoid compiling all files. -Complete each section to set up your advanced code test. For more information on **General**, **Metadata** and **Files** see :ref:`Assessments `. +Complete each section to set up your advanced code test. For more information on **General**, **Metadata** (Optional) and **Files** (Optional), see :ref:`Assessments `. 1. Complete **General**. @@ -27,7 +27,7 @@ Complete each section to set up your advanced code test. For more information on - **Python**: `pycodestyle`_ or `UnitTest`_ - **JavaScript**: `JSHint and JSLint`_ -.. Note:: For more information, see the :ref:`test-types` section or click any test name above to navigate directly to that section. +.. note:: For more information, see the :ref:`test-types` section or click any test name above to navigate directly to that section. - **Language Assessment Subtype** - Click the drop-down and choose a subtype for the selected language type, if applicable. 3. Click **Grading** in the top navigation pane and complete the following fields: - .. image:: /img/guides/ACTGradingScreen.png - :alt: Grading - :width: 500px - - - **Points** - The score given to the student if the code test passes. You can enter any positive numeric value. If this assessment should not be graded, enter 0 points. 
- - **Partial Points** - Toggle to enable a percentage of total points to be given based on the percentage correctly answered. Note that it's not enough to turn partial points on; the advanced code test has to be written to handle partial points. See :ref:`Partial Points ` for more information. - - **Define Number of Attempts** - enable to allow and set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts. - - **Show Rationale to Students** - Toggle to display the rationale for the answer to the student. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total** or **Always** - - **Rationale** - Enter guidance for the assessment. This is always visible to the teacher when the project is opened in the course or when opening the student's project. - - **Use maximum score** - Toggle to enable assessment final score to be the highest score attained of all runs. - -5. **(Optional)** Complete **Metadata**. +.. image:: /img/guides/ACTGradingScreen.png + :alt: Grading + :width: 500px -6. **(Optional)** Complete **Files**. +- **Points** - The score given to the student if the code test passes. You can enter any positive numeric value. If this assessment should not be graded, enter 0 points. +- **Partial Points** - Toggle to enable a percentage of total points to be given based on the percentage correctly answered. Note that it's not enough to turn partial points on; the advanced code test has to be written to handle partial points. See :ref:`Partial Points ` for more information. +- **Define Number of Attempts** - enable to allow and set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts. 
+- **Show Rationale to Students** - Toggle to display the rationale for the answer to the student. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this by selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total**, or **Always**. +- **Rationale** - Enter guidance for the assessment. This is always visible to the teacher when the project is opened in the course or when opening the student's project. +- **Use Maximum Score** - Toggle to make the assessment's final score the highest score attained across all runs. -7. Click **Create** to complete the process. +4. Click **Create** to complete the process. See a Working Example diff --git a/source/instructors/authoring/assessments/auto-grade-scripts.rst b/source/instructors/authoring/assessments/auto-grade-scripts.rst index c21067db..ab02ae65 100644 --- a/source/instructors/authoring/assessments/auto-grade-scripts.rst +++ b/source/instructors/authoring/assessments/auto-grade-scripts.rst @@ -5,18 +5,19 @@ Assignment Level Scripts ======================== + You can use assignment level scripts to evaluate student code, normalize points, and mark for participation grading. Assignment level scripts are added in the **Script Grading** field on the :ref:`Script Grading ` settings page. These scripts can then transfer the grading value into the grading field. Assignment level scripts are run when an assignment is **Marked as Complete**. .. Note:: The script must execute within 3 minutes or a timeout error occurs. There is a maximum size for the feedback that can be returned of 1Mb. If this limit is exceeded, the message **Payload content length greater than maximum allowed: 1048576** will be returned. 
If you are using an LMS platform with Codio, be sure to enter a percentage value in the **Grade Weight** field to maintain compatibility with LMS gradebooks. This value is then transferred into your LMS gradebook once you :ref:`release the grades `. -Secure scripts +Secure Scripts -------------- If you store grading scripts in the **.guides/secure** folder, they run securely and students cannot see the script or the files in the folder. Only instructors can access this folder. -You can find more information about assessment security :ref:`here `. +You can find more information about assessment security in the :ref:`Assessment Security ` page. -Access assessment results +Access Assessment Results ------------------------- You can access student scores for the auto-graded assessments in an assignment. All this information is in JSON format and can be accessed in the ``CODIO_AUTOGRADE_ENV`` environment variable. The following tabs show the format of this data. The first tab shows just the assessment portion of the data and the second depicts all the available values. @@ -258,24 +259,26 @@ The student grade is calculated based on whether they answered the question, not - In the course assignment settings :ref:`Grade Weights ` section, enable **Script Grading** set **Set custom script path** to that file and disable **Assessments Grading**. -Regrade an individual student's assignment ------------------------------------------- -If students have clicked **Mark as Complete** and the custom script is triggered, you can regrade their work by resetting the `complete` switch, and then set it to *complete* again, which triggers the custom script to run again. +Regrade an Individual Student's Assignment +------------------------------------------- +If students have clicked **Mark as Complete** and the custom script has been triggered, you can regrade their work by clicking the three vertical dots next to the student's name to access additional actions. 
Select **Mark as Incomplete**, then click the three vertical dots again and select **Mark as Complete** to retrigger the custom script. -Regrade all student's assignments +Regrade All Students' Assignments --------------------------------- -You can regrade all student's assignments that have already been auto-graded from the **Actions** button on the assignment page. +You can regrade all students' assignments that have already been auto-graded by clicking the **Regrade Completed** button at the bottom of the assignment page. + +1. Navigate to the assignment and click it. +2. Click **Regrade Completed** at the bottom of the page. This is useful if you have found a bug in your assignment level grading script. **Regrade Completed** does not run individual code test assessments. -1. Navigate to the assignment and open it. -2. Click the **Actions** button and then click **Regrade Completed**. This is useful if you have found a bug in your assignment level grading script. **Regrade Completed** does not run individual code test assessments. +Test and Debug Your Grading Scripts +------------------------------------ -Test and debug your grading scripts ----------------------------------- -.. Note:: Codio provides the ability to test your auto-grading scripts when creating your project, this should be done before publishing your project to a course. Once an assignment has been published to the course, any changes made to files in the student workspace (/home/codio/workspace) are not reflected in the published assignment. Grading scripts should be stored in the **.guides/secure** folder. Files in the .guides and guides/secure folders can be published even if students have already started. +Codio provides the ability to test your auto-grading scripts when creating your project; this should be done before publishing your project to a course. 
Once an assignment has been published to the course, any changes made to files in the student workspace (/home/codio/workspace) are not reflected in the published assignment. Grading scripts should be stored in the **.guides/secure** folder. Files in the **.guides** and **.guides/secure** folders can be published even if students have already started. -Test your script in the IDE -........................... +Test Your Script in the IDE +---------------------------- + You can test your auto-grading script in the Codio IDE from the **Education > Test Autograde Script** on the menu bar. This option allows you to specify the location of your auto-grading script and run it against the current project content. It also allows you simulate scores attained by any auto-graded assessments located in the Codio Guide and select which autograded assessments to test. .. image:: /img/autograde-test.png @@ -283,17 +286,18 @@ You can test your auto-grading script in the Codio IDE from the **Education > Te Be sure to take the following into account when using this feature: -- When you click **Test Script**: +1. When you click **Test Script**: + +- All output to ``stdout`` and ``stderr`` are displayed in the dialog. +- The grade returned by your test script is at the bottom of the output section. - - All output to ``stdout`` and ``stderr`` are displayed in the dialog. - - The grade returned by your test script is at the bottom of the output section. +2. ``stdout`` and ``stderr`` output is not available when running the actual auto-grading script (not in test mode) because it runs invisibly when the assignment is marked as complete. Because of this, you should only generate output for testing and debugging. +3. If you want your script to provide feedback to the student, you should output it to a file that can be accessed by the student when opening the project at a later date. 
In this case, you should allow read-only access to the project from the assignment settings after being marked as complete. -- ``stdout`` and ``stderr`` output is not available when running the actual auto-grading script (not in test mode) because it runs invisibly when the assignment is marked as complete. Because of this, you should only generate output for testing and debugging. -- If you want your script to provide feedback to the student, you should output it to a file that can be accessed by the student when opening the project at a later date. In this case, you should allow read-only access to the project from the assignment settings after being marked as complete. +Test Your Script Using Bootstrap Launcher +------------------------------------------ -Test your script using bootstrap launcher -......................................... -You can also use a simple bootstrap launcher that loads and executes the script from a remote location so that you can edit and debug independently of the Codio box. The following example bash script shows a Python script that is located as a Gist on GitHub. This script might be called **.guides/secure/launcher.sh**. +You can also use a simple bootstrap launcher that loads and executes the script from a remote location so that you can edit and debug independently of the Codio box. The following example is a bash launcher script that downloads and runs a Python script from a GitHub Gist. This script would be saved as **.guides/secure/launcher.sh**. .. code:: bash @@ -312,7 +316,7 @@ Sending Points to Codio Codio provides a Python library to facilitate reporting points from your custom scripts. There are four functions in this library: `send_grade`, `send_grade_v2`, `send_partial` and `send_partial_v2`. - .. Note:: Partial points are not used in assignment level scripts, see :ref:`Allow Partial Points ` for more information about setting up partial points. +.. Note:: Partial points are not used in assignment level scripts. 
See :ref:`Allow Partial Points ` for more information about setting up partial points. In order to use this library you need to add the following code to the top of your grading script: @@ -336,23 +340,31 @@ The calls to use these functions are as follows: send_grade(grade) -`grade` - Should be the percent correct for the assessment. +or: .. code:: python send_grade_v2(grade, feedback, format=FORMAT_V2_TXT, extra_credit=None) -`grade` - Should be the percent correct for the assessment. - -`feedback` - The buffer containing the feedback for your student - maximum size is 1 Mb. -`format` - The format can be Markdown, HTML or text and the default is text. +.. list-table:: + :widths: 20 80 + :header-rows: 1 -`extra_credit` - Extra points beyond the value for doing this correctly. These do not get passed to an LMS system automatically, just the percentage correct. + * - Field + - Description + * - ``grade`` + - Should be the percent correct for the assessment. + * - ``feedback`` + - The buffer containing the feedback for your student - maximum size is 1 Mb. + * - ``format`` + - The format can be Markdown, HTML or text and the default is text. + * - ``extra_credit`` + - Extra points beyond the value for doing this correctly. These do not get passed to an LMS system automatically, just the percentage correct. .. _autograde-enhance: -Auto-grading enhancements +Auto-Grading Enhancements ------------------------- The V2 versions of the grading functions allow you to: @@ -362,14 +374,25 @@ The V2 versions of the grading functions allow you to: - Notify (instructors and students) and reopen assignments for a student on grade script failure. 
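To make the calls above concrete, the following is a minimal sketch of an assignment-level grading script. The payload field names (``assessments``, ``points``, ``maxPoints``) and the library import path are assumptions for illustration only; check them against the actual ``CODIO_AUTOGRADE_ENV`` format shown in the tabs above and the import snippet for the Codio grading library.

```python
# Hypothetical sketch of an assignment-level grading script.
# Assumptions (verify against your Codio environment): the
# CODIO_AUTOGRADE_ENV JSON exposes an "assessments" list whose entries
# carry "points" and "maxPoints" fields, and the Codio grading library
# is importable inside the assignment box.
import json
import os


def compute_grade(autograde_env):
    """Return the percent correct (0-100) from the autograde JSON payload."""
    data = json.loads(autograde_env)
    earned = sum(a.get("points", 0) for a in data.get("assessments", []))
    possible = sum(a.get("maxPoints", 0) for a in data.get("assessments", []))
    return round(100 * earned / possible) if possible else 0


def main():
    grade = compute_grade(os.environ["CODIO_AUTOGRADE_ENV"])
    # Only available inside a Codio box; import path is an assumption.
    from lib.grade import send_grade_v2, FORMAT_V2_TXT
    send_grade_v2(grade, "Auto-graded: see your assessment results.",
                  format=FORMAT_V2_TXT)


# main()  # uncomment when running inside the Codio environment
```

Keep any feedback passed to ``send_grade_v2`` under the 1 Mb limit noted earlier, or the script will fail with a payload-size error.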
-If you don't use the send_grade_v2 functions, this URL (passed as an environment variable) can be used:```CODIO_AUTOGRADE_V2_URL``` +If you don't use the send_grade_v2 functions, this URL (passed as an environment variable) can be used: ``CODIO_AUTOGRADE_V2_URL`` These variables allow POST and GET requests with the following parameters: -- **Grade** (```CODIO_AUTOGRADE_V2_URL```) - return 0-100 percent. This is the percent correct out of total possible points. -- **Feedback** - text -- **Format** - html, md, txt - txt is default -- **Penalty** - Penalty is number between 0-100, +.. list-table:: + :widths: 20 80 + :header-rows: 1 + + * - Field + - Description + * - **Grade** (``CODIO_AUTOGRADE_V2_URL``) + - Return 0-100 percent. This is the percent correct out of total possible points. + * - **Feedback** + - Text + * - **Format** + - html, md, or txt; txt is the default + * - **Penalty** + - A number between 0 and 100 + If you want to calculate penalties in the grading script you can use the **completedDate** (in UTC format) in ``CODIO_AUTOGRADE_ENV`` to calculate penalties. See Python example below. @@ -430,7 +453,7 @@ These Python and Bash files that can be loaded by a bootstrap script or as expla main() -Example grading scripts +Example Grading Scripts ----------------------- This section provides example assignment level scripts using the older methods to send grades. diff --git a/source/instructors/authoring/assessments/edit-assessment-points.rst b/source/instructors/authoring/assessments/edit-assessment-points.rst index 56eef98e..52752e97 100644 --- a/source/instructors/authoring/assessments/edit-assessment-points.rst +++ b/source/instructors/authoring/assessments/edit-assessment-points.rst @@ -7,9 +7,11 @@ Edit Assessment Points ====================== To edit assessment points, follow these steps: -1. In the Guide Editor, click the **Assessments** button to view the list of all assessments. -2. 
Click the assessment to open it and modify the points. +2. Then click **View Existing Assessments** in the bottom right corner to view the list of all assessments. + +3. Modify the point value in the box to the left of the assessment you want to update. Once done, click the **Close** button at the bottom left. .. image:: /img/assessmentpoints.png :alt: Edit Assessment Points \ No newline at end of file diff --git a/source/instructors/authoring/assessments/edit-assessment.rst b/source/instructors/authoring/assessments/edit-assessment.rst index 2568e9c5..384368dd 100644 --- a/source/instructors/authoring/assessments/edit-assessment.rst +++ b/source/instructors/authoring/assessments/edit-assessment.rst @@ -8,12 +8,16 @@ Edit an Assessment To edit an assessment, either: -1. In the Guide Editor, click the **Edit: ** button to the right of the assessment. +- In the Guide Editor, click the **Edit: ** button to the right of the assessment. .. image:: /img/guides/editassessmentbutton.png :alt: Edit Assessment -2. Click the **Assessment** button to view the list of all assessments and click the assessment to open it and make your changes. +Or, in the Guide Editor, use the following steps: + 1. Click the **Assessment** button. + 2. Click **View Existing Assessments** in the bottom right corner. + 3. Click the assessment to open it and make your changes. + .. 
image:: /img/guides/editassessmentlist.png :alt: Edit Assessment List \ No newline at end of file diff --git a/source/instructors/authoring/assessments/fill-in-blanks.rst b/source/instructors/authoring/assessments/fill-in-blanks.rst index b916c5a4..6602aaff 100644 --- a/source/instructors/authoring/assessments/fill-in-blanks.rst +++ b/source/instructors/authoring/assessments/fill-in-blanks.rst @@ -7,125 +7,92 @@ Fill in the Blanks ================== A **Fill in the blank question** can use either free text or offer options to be chosen from a drop-down list: - - Free Text Answers - Shows a question where the student must complete the missing words (fill in the blank) by entering the answers. - .. image:: /img/guides/assessments-fitb1.png - :alt: Free Text +.. list-table:: + :widths: 30 70 + :header-rows: 1 - - Drop-Down Answers - The possible answers are available in a drop-down list where the student chooses the correct answer. + * - Answer Type + - Description + * - Free Text Answers + - Shows a question where the student must complete the missing words (fill in the blank) by entering the answers. - .. image:: /img/guides/assessments-fitb2.png - :alt: Drop-Down + .. image:: /img/guides/assessments-fitb1.png + :alt: Free Text + * - Drop-Down Answers + - The possible answers are available in a drop-down list where the student chooses the correct answer. + + .. image:: /img/guides/assessments-fitb2.png + :alt: Drop-Down Assessment Auto-Generation ++++++++++++++++++++++++++ -Assessments can be auto-generated using the text on the current guides page as context. Follow the steps below to auto-generate a Fill in the Blanks assessment: - -1. Select **Fill in the Blanks** assessment from Assessments list. - -2. Press the **Generate** button at bottom right corner. - - .. image:: /img/guides/generate-assessment-button.png - :alt: Generate assessment button - -3. The Generation Prompt will open, press **Generate Using AI** to preview the generated assessment. - - .. 
image:: /img/guides/assessment-generation-prompt.png - :alt: Assessment Generation Prompt +Assessments can be auto-generated using the text on the current guide page as context. For more information, see the :ref:`AI assessment generation ` page. - If you are not satisfied with the result, select **Regenerate** to create a new version of the assessment. You can provide additional guidance in the **Generation Prompt** field. For example, *create assessment based on the first paragraph with 2 correct answers.* - -4. When you are satisfied with the result, press **Apply** and then **Create**. - -More information about generating assessments may be found on the :ref:`AI assessment generation ` page. Assessment Manual Creation ++++++++++++++++++++++++++ -Follow these steps to set up fill in the blank assessments manually: - -1. On the **General** page, enter the following information: - - .. image:: /img/guides/assessment_general.png - :alt: General - - - **Name** - Enter a short name that describes the test. This name is displayed in the teacher dashboard so the name should reflect the challenge and thereby be clear when reviewing. +Follow these steps to set up fill in the blank assessments manually. For more information on **General**, **Metadata** (optional) and **Files** (optional) see :ref:`Assessments `. - Toggle the **Show Name** setting to hide the name in the challenge text the student sees. - - - **Instruction** - Enter the instructions to be shown to the students. +1. Complete **General**. 2. Click **Execution** in the navigation pane and complete the following information: .. image:: /img/guides/assessment_fitb_exec.png :alt: Execution - - **Text** - Enter the question in markdown, including the correct answer specification. For example: +- **Text** - Enter the question in markdown, including the correct answer specification. 
For example: -``A prime number (or a prime) is a <<>> number greater than <<<1>>> that has no positive divisors other than <<<1>>> and <<>>.`` - - - - **Show Possible Values** - Toggle to display possible options for the correct answer: +.. code-block:: text + + A prime number (or a prime) is a <<>> number greater than <<<1>>> that has no positive divisors other than <<<1>>> and <<>>. + +- **Show Possible Values** - Toggle to display possible options for the correct answer: - - For text-free questions, blank fields are available for the student to enter the correct answer. - - For drop-down questions, all the correct values (anything within the `<<< >>>` chevrons) are provided in each of the answer positions in a randomized order. You can also add incorrect answers (one per line). + - For free text questions, blank fields are available for the student to enter the correct answer. + - For drop-down questions, all the correct values (anything within the `<<< >>>` chevrons) are provided in each of the answer positions in a randomized order. You can also add incorrect answers (one per line). - .. image:: /img/guides/distractors.png - :alt: Distractors +.. image:: /img/guides/distractors.png + :alt: Distractors - **Regular Expression Support** +**Regular Expression Support** - Codio regular expression matching follows the Java language standards. +Codio regular expression matching follows the Java language standards. 
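As an illustration only (not Codio's actual implementation), the triple-chevron answer markers shown in the **Text** field above could be pulled out of a question with a short script; `extract_blanks` and the sample sentence are hypothetical:

```python
import re

def extract_blanks(text):
    """Return (prompt_with_blanks, answer_specs) for <<<...>>> markers."""
    # Non-greedy match keeps each <<<answer>>> marker separate.
    answers = re.findall(r"<<<(.*?)>>>", text)
    prompt = re.sub(r"<<<.*?>>>", "____", text)
    return prompt, answers

prompt, answers = extract_blanks(
    "A prime number is greater than <<<1>>> with no divisors other than <<<1>>> and itself."
)
# answers holds the two "1" specs; each marker in the prompt becomes ____
```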
- Examples of regular expressions supported for blank fields: - - - Answer allows any characters - ``<<>>`` - - Answer starts with word "begin" - ``<<>>`` - - Answer ends with word "end" - ``<<>>`` - - Answer can contain one or more spaces in "this is" - ``<<>>`` - - Answer contains 1 or 2 or 3 - ``<<>>`` - - Answer allows color or colour - ``<<>>`` - - Answer allows yes or "yes" - ``<<<"yes", ""yes"">>>`` - - Answer allows hat or cat - ``<<>>`` - - Answer allows i==0 or i == 0 - ``<<>>`` - - Answer must only contain digits - ``<<>>`` - - Answer must only contain non-digits - ``<<>>`` - - Answer requires word character - ``<<>>`` - - Answer requires non-word character - ``<<>>`` - - Answer allows several answers (Place1 or Place2) - ``<<<"Place1", "Place2">>>`` - - Answer requires case sensitivity - ``<<>>`` or ``<<>>`` +Examples of regular expressions supported for blank fields: + +- Answer allows any characters - ``<<>>`` +- Answer starts with word "begin" - ``<<>>`` +- Answer ends with word "end" - ``<<>>`` +- Answer can contain one or more spaces in "this is" - ``<<>>`` +- Answer contains 1 or 2 or 3 - ``<<>>`` +- Answer allows color or colour - ``<<>>`` +- Answer allows yes or "yes" - ``<<<"yes", ""yes"">>>`` +- Answer allows hat or cat - ``<<>>`` +- Answer allows i==0 or i == 0 - ``<<>>`` +- Answer must only contain digits - ``<<>>`` +- Answer must only contain non-digits - ``<<>>`` +- Answer requires word character - ``<<>>`` +- Answer requires non-word character - ``<<>>`` +- Answer allows several answers (Place1 or Place2) - ``<<<"Place1", "Place2">>>`` +- Answer requires case sensitivity - ``<<>>`` or ``<<>>`` 3. Click **Grading** in the navigation pane and complete the following fields: .. image:: /img/guides/assessment_fitb_grading.png :alt: Grading - - **Points** - Enter the score for correctly answering the question. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). 
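Since Codio's matching follows Java's regular-expression standards, answers can be sanity-checked locally before publishing. The sketch below uses Python's `re`, whose syntax overlaps with Java's for simple cases like these; the concrete patterns (`\d+`, an inline case-insensitive flag) are typical examples chosen for illustration, not copied from Codio:

```python
import re

def accepts(pattern, answer):
    # fullmatch requires the whole entry to match,
    # similar in spirit to Java's String.matches().
    return re.fullmatch(pattern, answer) is not None

assert accepts(r"\d+", "2024")            # "must only contain digits"
assert not accepts(r"\d+", "20x4")
assert accepts(r"(?i)colou?r", "Colour")  # "allows color or colour", any case
```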
+- **Points** - Enter the score for correctly answering the question. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). - - **Show Expected Answer** - Toggle to show the students the expected output when they have submitted an answer for the question. To suppress expected output, disable the setting. +- **Show Expected Answer** - Toggle to show students the expected output when they have submitted an answer for the question. To suppress expected output, disable the setting. - - **Define Number of Attempts** - enable to allow and set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts. - - - **Show Rationale to Students** - Toggle to display the rationale for the answer, to the student. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total** or **Always** - - - **Rationale** - Enter guidance for the assessment. This is always visible to the teacher when the project is opened in the course or when opening the student's project. - - **Use maximum score** - Toggle to enable assessment final score to be the highest score attained of all runs. - -4. Click on the **Parameters** tab if you wish to edit/change **Parameterized Assessments** (deprecated) using a script. See :ref:`Parameterized Assessments ` for more information. New parameterized assessments can no longer be set up. - -5. Click **Metadata** in the left navigation pane and complete the following fields: - - .. image:: /img/guides/assessment_metadata.png - :alt: Metadata - - - **Bloom's Level** - Click the drop-down and choose the level of Bloom's Taxonomy: https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/ for the current assessment. 
- - **Learning Objectives** The objectives are the specific educational goal of the current assessment. Typically, objectives begin with Students Will Be Able To (SWBAT). For example, if an assessment asks the student to predict the output of a recursive code segment, then the Learning Objectives could be *SWBAT follow the flow of recursive execution*. - - **Tags** - The **Content** and **Programming Language** tags are provided and required. To add another tag, click **Add Tag** and enter the name and values. +- **Define Number of Attempts** - Enable to set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts. -6. Click **Files** in the left navigation pane and check the check boxes for additional external files to be included with the assessment when adding it to an assessment library. The files are then included in the **Additional content** list. +- **Show Rationale to Students** - Toggle this option to display answer explanations to students. The rationale will appear after they submit their answer and whenever they view the assignment after marking it as complete. Control when students see the rationale by selecting: **Never**, **After x attempts**, **If score is greater than or equal to a % of the total**, or **Always**. - .. image:: /img/guides/assessment_files.png - :alt: Files +- **Rationale** - Enter guidance for the assessment. This is always visible to the teacher when the project is opened in the course or when opening the student's project. +- **Use maximum score** - Toggle to make the assessment's final score the highest score attained across all runs. -7. Click **Create** to complete the process. \ No newline at end of file +4. Click **Create** to complete the process. 
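For drop-down questions, the behavior described earlier (all `<<< >>>` values for a blank offered in randomized order, plus any instructor-supplied incorrect answers) can be sketched as follows; `dropdown_options` and the sample values are hypothetical, not Codio code:

```python
import random

def dropdown_options(correct_values, distractors, seed=None):
    """Pool correct values with distractor answers and shuffle the result."""
    # dict.fromkeys de-duplicates while preserving insertion order.
    options = list(dict.fromkeys(correct_values + distractors))
    random.Random(seed).shuffle(options)
    return options

opts = dropdown_options(["natural", "1", "itself"], ["0", "prime"], seed=42)
# Same pool every time; only the presentation order is randomized.
```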
diff --git a/source/instructors/authoring/assessments/free-text.rst b/source/instructors/authoring/assessments/free-text.rst index 5f8db790..abd6516b 100644 --- a/source/instructors/authoring/assessments/free-text.rst +++ b/source/instructors/authoring/assessments/free-text.rst @@ -5,91 +5,62 @@ Free Text ========= -Free text assessments allow students to answer questions in their own words. Because Free Text assessments allow for LaTeX formatting, this type of assessment is recommended for math assessments. Teachers are then able to review and manually grade their answers. +Free text assessments allow students to answer questions in their own words. This type of assessment is recommended for math assessments since it allows for LaTeX formatting. Teachers are then able to review and manually grade their answers. -Assessment Auto-Generation -++++++++++++++++++++++++++ - -Assessments can be auto-generated using the text on the current guides page as context. Follow the steps below to auto-generate a Free Text assessment: - -1. Select **Free Text** assessment from Assessments list. - -2. Press the **Generate** button at bottom right corner. - - .. image:: /img/guides/generate-assessment-button.png - :alt: Generate assessment button - -3. The Generation Prompt will open, press **Generate Using AI** to preview the generated assessment. - .. image:: /img/guides/assessment-generation-prompt.png - :alt: Assessment Generation Prompt -If you are not satisfied with the result, select **Regenerate** to create a new version of the assessment. You can provide additional guidance in the **Generation Prompt** field. For example, *create assessment based on the first paragraph with 2 correct answers.* - -4. When you are satisfied with the result, press **Apply** and then **Create**. +Assessment Auto-Generation +++++++++++++++++++++++++++ -More information about generating assessments may be found on the :ref:`AI assessment generation ` page. 
+Assessments can be auto-generated using the text on the current guides page as context. For more information about generating assessments, see the :ref:`AI assessment generation ` page. Assessment Manual Creation ++++++++++++++++++++++++++ -Follow these steps to set up a free text assessment manually: +Follow these steps to set up a free text assessment manually. For more information on **General**, **Metadata** (optional), and **Files** (optional), see :ref:`Assessments `. -1. On the **General** page, enter the following information: - - .. image:: /img/guides/assessment_free_general.png - :alt: General - - - **Name** - Enter a short name that describes the test. This name is displayed in the teacher dashboard so the name should reflect the challenge and thereby be clear when reviewing. - - If you want to hide the name in the challenge text the student sees, toggle the **Show Name** setting to disable it. - - - **Instruction** - Enter the instructions in markdown to be shown to the students. +1. Complete **General**. 2. Click **Grading** in the navigation pane and complete the following fields: .. image:: /img/guides/assessment_free_grading.png :alt: Grading - - **Points** - Enter the score for correctly answering the question. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). - - - **Allow Partial Points** - Toggle to enable a percentage of total points to be given based on the percentage of answers they correctly answer. Once you toggle this on, you will be able to add rubric items. Rubric items are negative points, they will be subtracted from the total score of the assessment. - - - **Preview Type** - Choose the input (plaintext or markdown) to be provided by the student. LaTex is also supported and is useful when students need to enter mathematical formulas in their answers. 
The following options are available: - - - **Plaintext** - Students enter ordinary text with no markdown formatting; there is no preview window. - - **Plaintext + LaTeX** - Students enter plaintext with no markdown formatting but offers support for LaTeX commands. A preview window is available where students can see the rendered LaTeX output. - - **Markdown + LaTeX** - Students enter markdown that also offers support for LaTex commands. A preview window is available where students can see the rendered markdown with LaTeX output. - - - **Define Number of Attempts** - enable to allow and set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts. - - - **Show Rationale to Students** - Toggle to display the rationale for the answer, to the student. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total** or **Always** +- **Points** - Enter the score for correctly answering the question. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). - - **Rationale** - Enter guidance for the assessment. This is visible to the teacher when the project is opened in the course or when opening the student's project. This guidance information can also be shown to students after they have submitted their answer and when they reload the assignment after marking it as completed. +- **Partial Points** - Toggle to enable a percentage of total points to be awarded based on the percentage of correct answers. Once you enable this, you can add rubric items. Rubric items are negative points that will be subtracted from the total score. -3. Click on the **Parameters** tab if you wish to edit/change **Parameterized Assessments** (deprecated) using a script. 
See :ref:`Parameterized Assessments ` for more information. New parameterized assessments can no longer be set up. +- **Preview Type** - Choose the format for student input (plaintext, markdown, or LaTeX). LaTeX is useful when students need to enter mathematical formulas in their answers. The following options are available: -4. Click **Metadata** in the left navigation pane and complete the following fields: +.. list-table:: + :widths: 20 80 + :header-rows: 1 - .. image:: /img/guides/assessment_metadata.png - :alt: Metadata + * - Format Type + - Description + * - **Plaintext** + - Students enter ordinary text with no markdown formatting; there is no preview window. + * - **Plaintext + LaTeX** + - Students enter plaintext with no markdown formatting but offers support for LaTeX commands. A preview window is available where students can see the rendered LaTeX output. + * - **Markdown + LaTeX** + - Students enter markdown that also offers support for LaTeX commands. A preview window is available where students can see the rendered markdown with LaTeX output. - - **Bloom's Level** - Click the drop-down and choose the level of Bloom's Taxonomy: https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/ for the current assessment. - - **Learning Objectives** specific educational goal of the current assessment. Typically, objectives begin with Students Will Be Able To (SWBAT). For example, if an assessment asks the student to predict the output of a recursive code segment, then its Learning Objectives could be *SWBAT follow the flow of recursive execution*. - - **Tags** - By default, **Content** and **Programming Language** tags are provided and required. To add another tag, click **Add Tag** and enter the name and values. +- **Define Number of Attempts** - Enable to allow and set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts. 
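For example, with the **Plaintext + LaTeX** preview type selected, a student answering a hypothetical quadratic-formula question could type the following, and the preview pane would render it as a formatted equation (the answer shown is purely illustrative):

```latex
x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
```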
+ +- **Show Rationale to Students** - Toggle to display the rationale to students. This guidance information will be shown after they have submitted their answer and whenever they view the assignment after marking it as completed. You can control the display by selecting **Never**, **After x attempts**, **If score is ≥ x% of total**, or **Always**. -5. Click **Files** in the left navigation pane and check the check boxes for additional external files to be included with the assessment. The files are then included in the **Additional content** list. +- **Rationale** - Enter guidance for the assessment. This is visible to the teacher when the project is opened in the course or when viewing the student's project. It can also be shown to students after submission or when they revisit the assignment after marking it as completed. - .. image:: /img/guides/assessment_files.png - :alt: Files +3. Click **Create** to complete the process. -6. Click **Create** to complete the process. -Grading free text assessments +Grading Free Text Assessments ----------------------------- To review and grade answers given by students in a free text assessment, follow these steps: -1. Select the assignment to view the list of all assessments in the assignment for the student. +1. Select the assignment. + +2. Find the student you want to grade and click the number under **Points** to view all assessments in the assignment. .. image:: /img/guides/freetext-grading.png :alt: Free Text Grading @@ -99,36 +70,41 @@ To review and grade answers given by students in a free text assessment, follow .. image:: /img/guides/freetexticon.png :alt: Free Text Assessments Icon -2. Click any line to view the question and the answer submitted by the student. +3. Click any line to view the question and the answer submitted by the student. -3. In the **Points** for answer field, perform one of the following depending on whether **Allow Partial Points** was enabled or disabled for the question: +4. 
In the **Points** for answer field, perform one of the following depending on whether **Allow Partial Points** was enabled or disabled for the question: - If **Allow Partial Points** was disabled, click **Correct** or **Incorrect**: .. image:: /img/guides/notpartial.png :alt: Allow Partial Points Disabled + :width: 450px - - If **Allow Partial Points** was enabled, select the points to give for the answer up to the maximum points: - + - If **Allow Partial Points** is enabled, assign points for the answer up to the maximum allowed. You can also add rubric items and specify the point value for each: + .. image:: /img/guides/partial.png :alt: Allow Partial Points Enabled + :width: 450px + + +5. In the **Comments** field, enter any information about the grade, which can be viewed by the student when the grade is released, and then click **Submit Comment**. + -4. In the **Comments** field, enter any information (in markdown + LaTeX) about the grade, which can be viewed by the student when the grade is released, and then click **Submit Comment**. +Navigate Student Assessments +---------------------------- -Navigate student assessments -............................. -You can navigate through student assessments using the left (**<**) and right (**>**) arrow buttons at the top of the **Assessments grading** dialog. +You can navigate through student assessments using the left (**<**) and right (**>**) arrow buttons at the bottom of the **Assessments grading** dialog. .. image:: /img/guides/freetext_navigate.png :alt: Navigating Assessments -View graded free text assessments -................................. +View Graded Free Text Assessments +---------------------------------- You can view the points given and the Correct column checked for all free text assessments that have been graded. .. 
image:: /img/guides/freetextanswer.png :alt: View Graded Assessment -Free text assessments that are automatically graded as correct -.............................................................. -You can do this with :ref:`Free Text Autograde `. \ No newline at end of file +Free Text Assessments That Are Automatically Graded as Correct +-------------------------------------------------------------- +Free text assessments can be automatically graded as correct. To learn more, see :ref:`Free Text Autograde `. diff --git a/source/instructors/authoring/assessments/grade-book.rst b/source/instructors/authoring/assessments/grade-book.rst index d3eec04d..6015c4e0 100644 --- a/source/instructors/authoring/assessments/grade-book.rst +++ b/source/instructors/authoring/assessments/grade-book.rst @@ -5,51 +5,39 @@ Grade Book ========== -A Grade Book assessment may be used to manually grade assignments, it will not appear in the student guides. When the instructor opens a student assignment, the Grade Book is available for rubric based grading. Comments, points, and rubric items are visible to the student when the assessment is graded and the grades are released. Grade Book assessments do not show in the total assessment count on student/teacher dashboard. +A Grade Book assessment may be used to manually grade assignments; it will not appear in the student guides. When the instructor opens a student assignment, the Grade Book is available for rubric-based grading. Comments, points, and rubric items are visible to the student when the assessment is graded and the grades are released. Grade Book assessments do not show in the total assessment count on the student/teacher dashboards. For more information on **General**, **Metadata** (optional), and **Files** (optional), see :ref:`Assessments `. -1. On the **General** page, enter a short **Name** that describes the test. This name is displayed in the teacher dashboard and when the teacher is viewing the student work. 
The name should reflect the challenge and thereby be clear when reviewing. - Leaving the name visible will make it easier to locate when grading and we do not recommend toggling the **Show Name** setting to disable it. - - .. image:: /img/guides/assessment_gradebook_general.png - :alt: General +1. Complete **General**. Keep the **Show Name** setting enabled to make submissions easier to locate when grading. 2. Click **Grading** in the navigation pane and complete the following fields: - .. image:: /img/guides/assessment_gradebook_grading.png - :alt: Grading +.. image:: /img/guides/assessment_gradebook_grading.png + :alt: Grading - - **Points** - Enter the score if the student selects the correct answer. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). +- **Points** - Enter the score if the student selects the correct answer. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). - - **Allow Partial Points** - Partial points must be toggled on for this type of assessment. Once you toggle this on, you will be able to add rubric items. Rubric items are negative points, they will be subtracted from the total score of the assessment. +- **Allow Partial Points** - Partial points must be toggled on for this type of assessment. Once you toggle this on, you will be able to add rubric items. Rubric items are negative points; they will be subtracted from the total score of the assessment. .. image:: /img/guides/assessment_gb_rubric.png :alt: Rubric + :width: 350px -3. Click **Metadata** in the left navigation pane and complete the following fields: - - .. image:: /img/guides/assessment_metadata.png - :alt: Metadata - - - **Bloom's Level** - Click the drop-down and choose the level of Bloom's Taxonomy: https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/ for the current assessment. - - **Learning Objectives** specific educational goal of the current assessment. 
Typically, objectives begin with Students Will Be Able To (SWBAT). For example, if an assessment asks the student to predict the output of a recursive code segment, then its Learning Objectives could be *SWBAT follow the flow of recursive execution*. - - **Tags** - By default, **Content** and **Programming Language** tags are provided and required. To add another tag, click **Add Tag** and enter the name and values. - -4. Click **Files** in the left navigation pane and check the check boxes for additional external files to be included with the assessment. The files are then included in the **Additional content** list. +3. Click **Create** to complete the process. - .. image:: /img/guides/assessment_files.png - :alt: Files -5. Click **Save** to complete the process. +Grading a Grade Book Assessment +------------------------------- -The instructor must :ref:`opening a student assignment ` to complete the Grade Book grading. Locate the Grade Book assessment in the student file and click **Grade**. +The instructor must :ref:`open a student assignment ` to complete the Grade Book grading. Locate the Grade Book assessment in the student file and click **Grade**. - .. image:: /img/guides/assessment_gb_opengb.png - :alt: Grade book assessment +.. image:: /img/guides/assessment_gb_opengb.png + :alt: Grade book assessment Click on the 0 or 1 to allocate initial points for overall correctness and then use the other rubric items to subtract points for missing items. There is also a **Points adjust** field if you wish to adjust total points upwards. - .. image:: /img/guides/assessment_gb_grade.png - :alt: Grade book assessment \ No newline at end of file +.. 
image:: /img/guides/assessment_gb_grade.png + :alt: Grade book assessment + :width: 700px \ No newline at end of file diff --git a/source/instructors/authoring/assessments/math-assessments.rst b/source/instructors/authoring/assessments/math-assessments.rst index fdb1b6a0..c654bac0 100644 --- a/source/instructors/authoring/assessments/math-assessments.rst +++ b/source/instructors/authoring/assessments/math-assessments.rst @@ -10,16 +10,16 @@ Math Assessments Codio allows you to set and grade math questions for any type and level of mathematics using the **Free Text** assessment. We only offer manual grading of mathematical expressions or proofs. -Manually graded assessments using free text -******************************************* +Manually Graded Assessments Using Free Text +------------------------------------------- -To create a manually graded math question, you can use the **Free text** assessment type. This allows the students to enter expressions or even full proofs and worked answers using Latex. For more information about Latex, please :ref:`click here `. +To create a manually graded math question, you can use the **Free text** assessment type. This allows the students to enter expressions or even full proofs and worked answers using LaTeX. For more information about LaTeX, see :ref:`LaTeX `. -You can enter Latex in the **Question** and **Rationale** fields. +You can enter LaTeX in the **Question** and **Rationale** fields. -You should also set the **Preview type** drop down to either **Plaintext + Latex** or **Markdown + Latex**. Both of these ensure that the student sees a preview pane beneath their answer entry fully rendered in markdown and/or Latex. Please - :ref:`click here ` to review the free text assessment. +You should also set the **Preview type** dropdown to either **Plaintext + LaTeX** or **Markdown + LaTeX**. Both of these ensure that the student sees a preview pane beneath their answer entry fully rendered in markdown and/or LaTeX. 
See :ref:`Free text assessments ` for more information. -Multiple choice -*************** +Multiple Choice +--------------- -You can also use the multiple choice assessment type to create answers containing properly rendered Latex expressions. +You can also use the multiple choice assessment type to create answers containing properly rendered LaTeX expressions. See :ref:`Multiple Choice ` for more information. diff --git a/source/instructors/authoring/assessments/multiple-choice.rst b/source/instructors/authoring/assessments/multiple-choice.rst index 47df8422..4b4f3dd2 100644 --- a/source/instructors/authoring/assessments/multiple-choice.rst +++ b/source/instructors/authoring/assessments/multiple-choice.rst @@ -5,94 +5,59 @@ Multiple Choice =============== -Multiple choice type assessments provide a question and then single or multiple response options. +Multiple choice assessments allow you to present a question with single or multiple response options. These questions can be created manually or generated using AI. Assessment Auto-Generation -++++++++++++++++++++++++++ +-------------------------- -Assessments can be auto-generated using the text on the current guides page as context. Follow the steps below to auto-generate a Multiple Choice assessment: - -1. Select **Multiple Choice** from the Assessments list. - -2. Press the **Generate** button at bottom right corner. - - .. image:: /img/guides/generate-assessment-button.png - :alt: Generate assessment button - -3. The Generation Prompt will open, press **Generate Using AI** to preview the generated assessment. - - .. image:: /img/guides/assessment-generation-prompt.png - :alt: Assessment Generation Prompt - - - If you are not satisfied with the result, select **Regenerate** to create a new version of the assessment. You can provide additional guidance in the **Generation Prompt** field. For example, *create assessment based on the first paragraph with 2 correct answers.* - -4. 
When you are satisfied with the result, press **Apply** and then **Create**. - - -More information about generating assessments may be found on the :ref:`AI assessment generation ` page. +Assessments can be auto-generated using the text on the current guide page as context. For more information, see the :ref:`AI assessment generation ` page. Assessment Manual Creation -++++++++++++++++++++++++++ +-------------------------- -Follow these steps to set up multiple choice assessments manually: +Follow these steps to set up multiple choice assessments manually. For more information on **Metadata** (optional) and **Files** (optional), see :ref:`Assessments `. 1. On the **General** page, enter the following information: - .. image:: /img/guides/assessment_mc_general.png - :alt: General - - - **Name** - Enter a short name that describes the test. This name is displayed in the teacher dashboard so the name should reflect the challenge and thereby be clear when reviewing. +.. image:: /img/guides/assessment_mc_general.png + :alt: General - If you want to hide the name in the challenge text the student sees, toggle the **Show Name** setting to disable it. +- **Name** - Enter a short name that describes the test. This name appears in the teacher dashboard for easy reference. To hide the name from the student's view of the challenge, toggle **Show Name** to disable it. - - **Question** - Enter the question instruction that is shown to the student. +- **Question** - Enter the question instruction that is shown to the student. 2. Click **Execution** in the navigation pane and complete the following information: - .. image:: /img/guides/assessment_mc_exec.png - :alt: Execution +.. image:: /img/guides/assessment_mc_exec.png + :alt: Execution + :width: 450px - - **Shuffle Answers** - Toggle to randomize the order of the answers so each student sees the answers in a different order. - - **Multiple Response** - Toggle to enable a user to select more than one answer. 
- - **Answers** - Mark the correct answer(s) to the question. You can add as many options as needed. For each answer, toggle to enable as correct answer (for multiple responses), or click the radio button for the correct single response. - - **Ordering** - Use the **Up** and **Down** arrows to change the order in which the answers are presented. +- **Shuffle Answers** - Toggle to randomize the order of the answers so each student sees the answers in a different order. +- **Multiple Response** - Toggle to enable a user to select more than one answer. +- **Answers** - Mark the correct answer(s) to the question. You can add as many options as needed. For each answer, toggle to enable as correct answer (for multiple responses), or click the radio button for the correct single response. +- **Ordering** - Use the **Up** and **Down** arrows to change the order in which the answers are presented. 3. Click **Grading** in the navigation pane and complete the following fields: - .. image:: /img/guides/assessment_mc_grading.png - :alt: Grading - - - **Correct Points** - Enter the score if the student selects the correct answer. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). - - - **Incorrect Points** is the score to be deducted if the student makes an incorrect selection. Typically, this value will be 0 but you can assign any positive numeric value if you wish to penalize guessing. If this assessment is to be ungraded, set '0' points - - - **Allow Partial Points** - Toggle to enable a percentage of total points to be given based on the percentage of answers they correctly answer. +.. image:: /img/guides/assessment_mc_grading.png + :alt: Grading + :width: 450px - - **Show Expected Answer** - Toggle to show the students the expected output when they have submitted an answer for the question. To suppress expected output, disable the setting. 
- - - **Define Number of Attempts** - enable to allow and set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts. - - - **Show Rationale to Students** - Toggle to display the rationale for the answer, to the student. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total** or **Always** +- **Correct Points** - Enter the score if the student selects the correct answer. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). - - **Rationale** - Enter guidance for the assessment. This is always visible to the teacher when the project is opened in the course or when opening the student's project. - - **Use maximum score** - Toggle to enable assessment final score to be the highest score attained of all runs. +- **Incorrect Points** - The score to be deducted if the student makes an incorrect selection. Typically, this value will be 0, but you can assign any positive numeric value if you wish to penalize guessing. If this assessment is to be ungraded, enter zero (0). -4. Click on the **Parameters** tab if you wish to edit/change **Parameterized Assessments** (deprecated) using a script. See :ref:`Parameterized Assessments ` for more information. New parameterized assessments can no longer be set up. +- **Allow Partial Points** - Toggle to enable a percentage of total points to be given based on the percentage of answers the student answers correctly. -5. Click **Metadata** in the left navigation pane and complete the following fields: +- **Use Maximum Score** - Toggle to enable the final assessment score to be the highest score attained across all runs. - .. 
image:: /img/guides/assessment_metadata.png - :alt: Metadata +- **Show Expected Answer** - Choose when students can view the expected output: **never**, **when grades are released**, or **always**. - - **Bloom's Level** - Click the drop-down and choose the level of Bloom's Taxonomy: https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/ for the current assessment. - - **Learning Objectives** The objectives are the specific educational goal of the current assessment. Typically, objectives begin with Students Will Be Able To (SWBAT). For example, if an assessment asks the student to predict the output of a recursive code segment, then the Learning Objectives could be *SWBAT follow the flow of recursive execution*. - - **Tags** - The **Content** and **Programming Language** tags are provided and required. To add another tag, click **Add Tag** and enter the name and values. +- **Define Number of Attempts** - Toggle to enable attempt limits and specify the maximum number of attempts. If disabled, students can make unlimited attempts. -6. Click **Files** in the left navigation pane and check the check boxes for additional external files to be included with the assessment when adding it to an assessment library. The files are then included in the **Additional content** list. +- **Show Rationale to Students** - Toggle to display the rationale for the answer. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this by selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total**, **When grades are released**, or **Always**. - .. image:: /img/guides/assessment_files.png - :alt: Files +- **Rationale** - Enter guidance for the assessment. This is always visible to the teacher when the project is opened in the course or when opening the student's project. -7. 
Click **Create** to complete the process. +4. Click **Create** to complete the process. diff --git a/source/instructors/authoring/assessments/parsons-puzzle.rst b/source/instructors/authoring/assessments/parsons-puzzle.rst index 805e2909..e5b124f3 100644 --- a/source/instructors/authoring/assessments/parsons-puzzle.rst +++ b/source/instructors/authoring/assessments/parsons-puzzle.rst @@ -8,112 +8,90 @@ Parsons Puzzle Parson’s problems are available in Codio as Parsons Puzzles. Parson’s Puzzles are formative assessments that ask students to arrange blocks of scrambled code, allowing them to focus on the purpose and flow of the code (often including a new pattern or feature) instead of syntax. Codio uses `js-parsons `_ for Parson's Puzzles. Assessment Auto-Generation -++++++++++++++++++++++++++ +-------------------------- -Assessments can be auto-generated using the text on the current guides page as context. Follow the steps below to auto-generate a Parsons Puzzle assessment: +Assessments can be auto-generated using the text on the current guide page as context. For more information, see the :ref:`AI assessment generation ` page. -1. Select **Parson's Puzzle** assessment from Assessments list. - -2. Press the **Generate** button at bottom right corner. - - .. image:: /img/guides/generate-assessment-button.png - :alt: Generate assessment button - -3. The Generation Prompt will open, press **Generate Using AI** to preview the generated assessment. - - .. image:: /img/guides/assessment-generation-prompt.png - :alt: Assessment Generation Prompt - -If you are not satisfied with the result, select **Regenerate** to create a new version of the assessment. You can provide additional guidance in the **Generation Prompt** field. For example, *create assessment based on the first paragraph with 2 correct answers.* - -4. When you are satisfied with the result, press **Apply** and then **Create**. 
- -More information about generating assessments may be found on the :ref:`AI assessment generation ` page. Assessment Manual Creation -++++++++++++++++++++++++++ - - -Complete the following steps to set up a **Line Based Grader** Parsons Puzzle assessment. The **Line Based Grader** assessment treats student answers as correct if and only if they match the order and indentation found in **Initial Values**. For incorrect answers, it highlights the lines that were not ordered or indented properly. +-------------------------- -1. On the **General** page, enter the following information: - .. image:: /img/guides/assessment_general.png - :alt: General +Complete the following steps to set up a **Line Based Grader** Parsons Puzzle assessment. The **Line Based Grader** assessment treats student answers as correct if and only if they match the order and indentation found in **Initial Values**. For incorrect answers, it highlights the lines that were not ordered or indented properly. For more information on **General**, **Metadata** (optional) and **Files** (optional) see :ref:`Assessments `. - - **Name** - Enter a short name that describes the test. This name is displayed in the teacher dashboard so the name should reflect the challenge and thereby be clear when reviewing. - - If you want to hide the name in the challenge text the student sees, toggle the **Show Name** setting to disable it. - - - **Instruction** - Enter the instructions in markdown to be shown to the students. +1. Complete **General**. 2. Click **Execution** in the navigation pane and complete the following information: - .. image:: /img/guides/assessment_parsons_exec.png - :alt: Execution - - - **Code to Become Blocks** - Enter code blocks that make up the initial state of the puzzle for the students. - - **Code to Become Distractor Blocks** - Enter code blocks that serve as distractions. - - **Max Distractors** - Enter the maximum number of distractors allowed. 
- - **Grader** - Choose the appropriate grader for the puzzle from the drop-down list. - - **Show Feedback** - Select to show feedback to student and highlight error in the puzzle. Deselect to hide feedback and not show highlight error in the puzzle. - - **Require Dragging** - If you enter **Code to Become Distractor Blocks**, **Require Dragging** will automatically turn on. Without distractor blocks, you can decide whether or not you want students to drag blocks to a separate area to compose their solution. - - **Disable Indentation** - If you do not want to require indention, check the **Disable Indentation** box. - - **Indent Size** - Each indention defaults to 50 pixels. +.. image:: /img/guides/assessment_parsons_exec.png + :alt: Execution + +.. list-table:: + :widths: 30 70 + :header-rows: 1 + + * - Option + - Description + * - **Code to Become Blocks** + - Enter the code solution in the correct order using proper indentation. + * - **Code to Become Distractor Blocks** + - Enter lines of code that serve as distractions. + * - **Max Distractors** + - Enter the maximum number of distractors allowed. + * - **Grader** + - Choose the appropriate grader for the puzzle from the drop-down list. For more information see :ref:`Grader Options `. + * - **Show Feedback** + - Select to show feedback to students and highlight errors in the puzzle. Deselect to hide feedback and not highlight errors in the puzzle. + * - **Require Dragging** + - If you enter **Code to Become Distractor Blocks**, **Require Dragging** will automatically turn on. Without distractor blocks, you can decide whether you want students to drag blocks to a separate area to compose their solution. + * - **Disable Indentation** + - If you do not want to require indentation, check the **Disable Indentation** box. + * - **Indent Size** + - Each indentation defaults to 50 pixels. 3. Click **Grading** in the navigation pane and complete the following fields: - .. 
image:: /img/guides/Grading-new-feature1.png - :alt: Grading - - - **Points** - Enter the score if the student selects the correct answer. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). - - - **Define Number of Attempts** - enable to allow and set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts. - - **Show Rationale to Students** - Toggle to display the answer, and the rationale for the answer, to the student. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total** or **Always** - - - **Rationale** - Enter guidance for the assessment. This is visible to the teacher when the project is opened in the course or when opening the student's project. This guidance information can also be shown to students after they have submitted their answer and when they reload the assignment after marking it as completed. - - **Use maximum score** - Toggle to enable assessment final score to be the highest score attained of all runs. - -4. Click on the **Parameters** tab if you wish to edit/change **Parameterized Assessments** (deprecated) using a script. See :ref:`Parameterized Assessments ` for more information. New parameterized assessments can no longer be set up. +.. image:: /img/guides/parsonspuzzlegrading.png + :alt: Grading + :width: 450px -5. Click **Metadata** in the left navigation pane and complete the following fields: +- **Points** - Enter the score if the student selects the correct answer. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). - .. 
image:: /img/guides/assessment_metadata.png - :alt: Metadata +- **Define Number of Attempts** - Enable to allow and set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts. +- **Show Rationale to Students** - Toggle to display the answer, and the rationale for the answer, to the student. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this by selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total**, or **Always**. - - **Bloom's Level** - Click the drop-down and choose the level of Bloom's Taxonomy: https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/ for the current assessment. - - **Learning Objectives** specific educational goal of the current assessment. Typically, objectives begin with Students Will Be Able To (SWBAT). For example, if an assessment asks the student to predict the output of a recursive code segment, then its Learning Objectives could be *SWBAT follow the flow of recursive execution*. - - **Tags** - By default, **Content** and **Programming Language** tags are provided and required. To add another tag, click **Add Tag** and enter the name and values. +- **Rationale** - Enter guidance for the assessment. This is visible to the teacher when the project is opened in the course or when opening the student's project. This guidance information can also be shown to students after they have submitted their answer and when they reload the assignment after marking it as completed. +- **Use Maximum Score** - Toggle to enable the final assessment score to be the highest score attained across all runs. -6. Click **Files** in the left navigation pane and check the check boxes for additional external files to be included with the assessment when adding it to an assessment library. 
The files are then included in the **Additional content** list. - .. image:: /img/guides/assessment_files.png - :alt: Files +4. Click **Create** to complete the process. -7. Click **Create** to complete the process. +.. _grader-options: Grader Options -------------- -**VariableCheckGrader** - Executes the code in the order submitted by the student and checks variable values afterwards. +VariableCheckGrader +~~~~~~~~~~~~~~~~~~~ +Executes the code in the order submitted by the student and checks variable values afterwards. .. raw:: html
-Expected and supported options: +**Expected and supported options**: - ``vartests`` (required) array of variable test objects - Each variable test object can/must have the following properties: +Each variable test object can/must have the following properties: - - ``initcode`` - code that will be prepended before the learner solution code - - ``code`` - code that will be appended after the learner solution code - - ``message`` (required) - a textual description of the test, shown to learner +- ``initcode`` - code that will be prepended before the learner solution code +- ``code`` - code that will be appended after the learner solution code +- ``message`` (required) - a textual description of the test, shown to the learner -Properties specifying what is tested: +**Properties specifying what is tested**: - ``variables`` - an object with properties for each variable name to be tested; the value of the property is the expected value @@ -122,33 +100,46 @@ Properties specifying what is tested: - ``variable`` - a variable name to be tested - ``expected`` - expected value of the variable after code execution -**TurtleGrader** - for exercises that draw turtle graphics in Python. Grading is based on comparing the commands executed by the model and student turtle. If the ``executable_code`` option is also specified, the code on each line of that option will be executed instead of the code in the student constructed lines. - .. Note:: Student code should use the variable ``myTurtle`` for commands to control the turtle in order for the grading to work. +UnitTestGrader +~~~~~~~~~~~~~~ + +Executes student code and Skulpt unit tests. This grader is for Python problems where students create functions. Similar to traditional unit tests on code, this grader leverages a unit test framework where you set asserts - meaning this grader checks the functionality of student code. .. raw:: html -
+
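The ``vartests`` options described above can be sketched as a plain configuration object. This is a hedged illustration only: the test messages, init code, and expected variable values below are invented for the example, and in practice the object is passed into a js-parsons widget configuration rather than run on its own.

```javascript
// Hypothetical sketch of vartests options for a VariableCheckGrader.
// The messages, code snippets, and expected values are invented examples.
const graderOptions = {
  vartests: [
    {
      // "message" is required and is shown to the learner.
      message: "Testing that sum holds the total of a and b",
      initcode: "a = 2\nb = 3", // prepended before the learner's solution
      code: "",                 // appended after the learner's solution
      variables: { sum: 5 }     // expected variable values after execution
    },
    {
      message: "Testing myValue after the loop",
      initcode: "",
      code: "",
      variables: { myValue: 10 }
    }
  ]
};

// Quick structural check of the configuration.
console.log(graderOptions.vartests.length); // 2
```

Each entry's ``message`` is required and is the text the learner sees for that test.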
- Required options: +LanguageTranslationGrader +~~~~~~~~~~~~~~~~~~~~~~~~~ -- ``turtleModelCode`` - The code constructing the model drawing. The turtle is initialized to modelTurtle variable, so your code should use that variable. The following options are available: +A code-translating grader where Java or pseudocode blocks map to Python in the background. Selecting the language allows the Parson's problem to check for correct indentation and syntax. - - ``turtlePenDown`` - A boolean specifying whether or not the pen should be put down initially for the student constructed code - - ``turtleModelCanvas`` - ID of the canvas DOM element where the model solution will be drawn. Defaults to `modelCanvas`. - - ``turtleStudentCanvas`` - ID of the canvas DOM element where student turtle will draw. Defaults to `studentCanvas`. +.. raw:: html + +
-**UnitTestGrader** - Executes student code and Skulpt unit tests. This grader is for Python problems where students create functions. Similar to traditional unit tests on code, this grader leverages a unit test framework where you set asserts - meaning this grader checks the functionality of student code. -.. raw:: html -
+TurtleGrader +~~~~~~~~~~~~ -**LanguageTranslationGrader** - Code translating grader where Java or psuedocode blocks map to Python in the background. Selecting the language allows the Parson's problem to check for correct indentation and syntax. +This is for exercises that draw turtle graphics in Python. Grading is based on comparing the commands executed by the model and student turtle. If the ``executable_code`` option is also specified, the code on each line of that option will be executed instead of the code in the student constructed lines. + +.. note:: Student code should use the variable ``myTurtle`` for commands to control the turtle in order for the grading to work. .. raw:: html -
+
+ + Required options: + +- ``turtleModelCode`` - The code constructing the model drawing. The turtle is initialized to the ``modelTurtle`` variable, so your code should use that variable. The following options are available: + + - ``turtlePenDown`` - A boolean specifying whether the pen should be put down initially for the student-constructed code + - ``turtleModelCanvas`` - ID of the canvas DOM element where the model solution will be drawn. Defaults to `modelCanvas`. + - ``turtleStudentCanvas`` - ID of the canvas DOM element where the student turtle will draw. Defaults to `studentCanvas`. + Sample Starter Pack ------------------- diff --git a/source/instructors/authoring/assessments/partial-points.rst b/source/instructors/authoring/assessments/partial-points.rst index 1bd16281..31bb8629 100644 --- a/source/instructors/authoring/assessments/partial-points.rst +++ b/source/instructors/authoring/assessments/partial-points.rst @@ -9,12 +9,12 @@ You can award partial points for student assessments in your testing scripts. Us -Autograding enhancements for partial points +Autograding Enhancements for Partial Points ------------------------------------------- -To provide you with more robust auto-grade scripts, you can send back feedback in different formats HTML, Markdown, or plaintext and a URL is passed as an environment variable ```CODIO_PARTIAL_POINTS_V2_URL```. These variables allow POST and GET requests with the following parameters: +To provide you with more robust auto-grade scripts, you can send back feedback in different formats (HTML, Markdown, or plaintext), and a URL is passed as the environment variable ``CODIO_PARTIAL_POINTS_V2_URL``. This URL accepts POST and GET requests with the following parameters: -- **Score** (```CODIO_PARTIAL_POINTS_V2_URL```) - 0-100 percent for assessment, should be a percentage of total points possible. +- **Score** (``CODIO_PARTIAL_POINTS_V2_URL``) - a percentage of total points possible, between 0 and 100.
- **Feedback** - text - this is limited to 1 Mb - **Format** - html, md, or txt (default) @@ -22,8 +22,8 @@ If you use Python, you can also use the function ``send_partial_v2``. Also, thro As a general rule, your script should always exit with ``0``; otherwise, the grade will be 0. If the student receives partial points, the results will display an orange percent sign rather than a green check mark or red x. -Example auto-grading scripts with partial points -................................................ +Example Auto-Grading Scripts With Partial Points +------------------------------------------------ The following examples in Python and JavaScript show how you can write your scripts in any language. @@ -113,7 +113,7 @@ The Python script parses the student's file and then assigns points based on spe exit(0 if res else 1) -Example grading script for partial points +Example Grading Script for Partial Points ----------------------------------------- These are examples of the older method of partial points reporting. @@ -145,4 +145,5 @@ These are examples of the older method of partial points reporting. main() -The score you award should be any value between 0 and the maximum score you specified when defining the assessment in the Codio authoring editor. \ No newline at end of file +.. important:: + The score you award should be any value between 0 and the maximum score you specified when defining the assessment in the Codio authoring editor. \ No newline at end of file