Tests are created through the Test Manager tool. There are 17 question types, and most can be marked automatically.
Surveys are similar to tests in most respects. The principal differences are that you cannot view individual answers to surveys (all responses are anonymous) and you cannot assign marks to questions or give detailed feedback.
There are currently a number of known issues affecting tests.
See the Casebook site for examples of use in learning & teaching at Leeds.
In the Control Panel, click Module Tools, then select Tests, Surveys and Pools.
In this example we are going to build a test, so select Tests. (If you need to create a Survey you will find the steps are virtually the same).
Select Build test.
In the Test information section, give your test a name and add any description and instructions as necessary. Click Submit when you are finished.
You’ve just created a test container and will be taken to the Test Canvas page. From here you can review the question settings.
On the Question Settings page you can select whether questions should include feedback for individual answers, and set options for images, files and links. You can also change options for scoring and display.
If you wish to enable negative marking, you need to do so here: select Specify partial credit options for answers, then Specify negative points options for answers. Negative marking is available for Multiple Choice, Multiple Answer and Matching questions.
Make your changes as required and click Submit to return to the Test Canvas page.
In the Test Canvas, select Create Question. Select the question type from the list in the drop-down menu.
In this example, we’ll choose a Multiple Choice question type. Enter the question text.
Next specify your options. If you selected partial/negative marking in your Question settings, you need to enable it per question by selecting Allow Partial Credit and Allow Negative Scores for Incorrect Answers. For Multiple Answer questions, you have the option to 'Allow Negative Overall Score for the Question' (this is not selected by default).
In the Answers section, specify the number of answers, enter the text for each answer, and select the radio button labelled 'Correct' next to the correct answer.
Options for partial credit will then appear under answers where relevant. Enter the score as a percentage, starting with a minus sign for negative marking, e.g. -20.
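To illustrate how these percentages translate into points, here is a small sketch. The question value and percentages below are invented examples, not values taken from Minerva, and the calculation is simply the standard "percentage of the question's points" interpretation of partial credit.

```python
# Illustrative only: how a percentage-based partial/negative credit value
# maps to points on a question. The numbers here are made-up examples.

def answer_points(question_value, percent_credit):
    """Convert a partial-credit percentage (e.g. -20) into points awarded."""
    return question_value * percent_credit / 100

# A 10-point question: full credit, 50% partial credit, -20% negative marking.
print(answer_points(10, 100))  # 10.0
print(answer_points(10, 50))   # 5.0
print(answer_points(10, -20))  # -2.0
```

So with -20 entered against a wrong answer on a 10-point question, a student selecting that answer would lose 2 points.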
You now have an opportunity to add positive and negative feedback, categorise your question with meaningful keywords and add notes for instructors' use. Once you are satisfied with your question, select Submit, or Submit and Create Another to add another question of the same type.
Once you've created all your questions, you are ready to deploy your test.
In your module, go to where you want to add your test and select the Add Assessments button, then select Test from the drop-down menu. Select the name of your test from the Add an Existing Test list, and select Submit.
You can edit the Test name and include a description before you deploy it. There is also an option to show students the Test Description you added when creating the test.
Under Test availability, make the test Available if you want students to see it now. You may choose to leave the link unavailable and preview the deployed test before you release it to students.
In this section you can also add an Announcement and select options for multiple attempts, date availability, force completion, timers, auto-submit and passwords.
In our experience it is best to limit the number of restrictions you place on Tests and Surveys, to aid successful completion. If you restrict attempts, enforce timers and so on, you are likely to have issues with students who could not take the test properly. Leave formative tests as easy to access as possible. For summative tests, give students appropriate instructions so that they know what to expect from the test conditions.
Next you can choose to add an exception for your deployed test. For example, if you have a student who needs an extra attempt, you can specify this here. You can also create exceptions for availability/removal of Force Completion (if enabled).
Then specify any due dates for completion, Grade Centre and feedback options.
For feedback options, you can choose what students see after submitting, such as their score, the answers they submitted, the correct answers and any question feedback.
Finally select how to present your test to students.
Remember to click Submit when you are finished.
We do not recommend using Minerva tests for high stakes summative assessments.
If you choose to do a Minerva test in a controlled classroom environment, stagger the start and student submission times to reduce the chance of work being lost.
In general we recommend you do not use the Force Completion option (in case of technical problems).
Use the test presentation setting 'One at a Time' to help prevent session timeout (unless you are using any form of randomisation, in which case you should use 'All at Once').
If possible, avoid the use of randomised questions in a test - see the section on randomisation below.
Bear in mind that if the student stays on the same page of a test for over one hour, they will lose their Minerva session. This is especially relevant if your test contains work done outside of Minerva.
Tell students to use a wired connection for important tests. Using a wireless connection increases the chances of students losing work. If their wireless connection is lost, any work since the last save will be lost too.
For important tests, you should try your test with the Student Preview functionality. Also check how the results are stored in the Grade Centre. Remember you might want to adaptively release the link to the test so students don't see it while you're testing!
Force Completion: Be aware that simply opening the test registers an attempt. If a student quits the test, abandons the online session or has a technical problem, they will not be able to complete the attempt. Make sure students are aware if you enable this setting.
Random Block/Randomise Questions:
Saving Work: There is an autosave function which limits the amount of work lost if a technical problem occurs. This is enabled automatically. However, for an important test we recommend manually saving every 10-15 minutes as well.
Self Assessment Options: If you select the 'Include this Test in Grade Centre Score Calculations' option, the score achieved will flow into the Grade Centre. Be careful with the 'Hide' results option: you will not be able to retrieve any of the results if you enable it, and attempts will have to be deleted to undo this setting. Contact us via the IT Services Desk if you have enabled this setting in error and we will see if we can help.
Test length: Blackboard suggests keeping tests short (under 50 questions), especially when using randomised questions; if you use a random selection of questions from pools, keep the test shorter still. Consider breaking a long test into a series of short tests, as otherwise there is a particularly high risk of students losing work.
Test timeout: If the student stays on the same page of a test for over one hour, they will lose their Minerva session. If Force Completion is enabled, they will not be able to resume the test. When using any form of randomisation, note that the test cannot run for longer than one hour.
Test attempts are accessible via the Grade Centre (unless you elected to hide results in the deployment options.)
Students can see their scores by returning to the assessment area or by going to My Grades, if you have added this to the module menu.
If you need to mark answers, go to the Grade Centre in the Control Panel and select Needs Grading.
You should see attempts for the test. Click on a student's name to open an attempt. You will see the student's answers on this page, along with options to enter marks and return feedback.
You can navigate to other submissions from this page by using the arrows towards the right of the screen:
If you just need to view student marks, go to the Grade Centre in the Control Panel and this time click Tests. You can also download a spreadsheet of entries if needed (this method also works for Surveys).
If you find a question had the wrong mark assigned to it, you can return to the test canvas and change the mark. This will then update the marks given to students.
There is an Item Analysis feature for any test with submitted answers (this feature is not available for surveys). It provides statistics on overall test performance and on individual test questions, helping to highlight questions that may be poor discriminators of student performance.
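To give a feel for what "discrimination" means here, one common measure is the point-biserial correlation between how students scored on a single question and their total test score: a low or negative value suggests the question does not separate stronger from weaker performers. This sketch is purely illustrative; Blackboard's Item Analysis may compute its statistics differently, and the student data below are invented.

```python
# A sketch of one common discrimination measure: the point-biserial
# correlation between per-question scores and total test scores.
# Blackboard's Item Analysis may use different formulas; data are invented.
from statistics import mean, pstdev

def point_biserial(item_scores, total_scores):
    """Correlation between item score (1/0) and total score."""
    mx, my = mean(item_scores), mean(total_scores)
    cov = mean((x - mx) * (y - my) for x, y in zip(item_scores, total_scores))
    return cov / (pstdev(item_scores) * pstdev(total_scores))

# 1 = correct, 0 = incorrect on one question, alongside each student's total.
item = [1, 1, 0, 1, 0, 0]
total = [9, 8, 4, 7, 5, 3]
print(round(point_biserial(item, total), 2))  # 0.93 - a strong discriminator
```

Here students who answered the question correctly also tended to score well overall, so the question discriminates well.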
You can run Item Analysis from the Tests section of the Control Panel, or by returning to the deployed test in your module. Select the drop-down menu next to the test, then Item Analysis.
On the Item Analysis page select Run to view the analysis.
Once the report is generated, you will see a link which will take you to the results.
To export score data from Minerva tests and anonymous results from surveys, you will need to use the Grade Centre.
Each test or survey will have its own column.
Go to the Grade Centre and click the chevron icon that appears adjacent to the column title for the Grade Centre item you want to export:
Then select Download Results from the context menu.
On the Download Results screen, choose the delimiter type, the format of the results and which attempts to include, then confirm the download.
The file will be downloaded to your computer as a .XLS file which can be opened in Microsoft Excel.
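If you want to process the results outside Excel, note that this kind of Grade Centre download is typically a delimited text file rather than a true Excel workbook, so it can be parsed with standard tools. The column names and values below are invented for illustration; check them against your own download.

```python
# Reading a tab-delimited Grade Centre download. The header names and the
# sample row here are hypothetical examples, not guaranteed Minerva output.
import csv
import io

# Stand-in for the downloaded file's contents.
sample = "Last Name\tFirst Name\tUsername\tWeek 3 Test\nSmith\tAlex\tab12cd\t17\n"

rows = list(csv.reader(io.StringIO(sample), delimiter="\t"))
header, first = rows[0], rows[1]
record = dict(zip(header, first))
print(record["Username"], record["Week 3 Test"])  # ab12cd 17
```

For a real file, replace `io.StringIO(sample)` with `open("results.xls", newline="")`, and use a comma delimiter if you chose comma-delimited output when downloading.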
You can copy a single test by exporting it from one module and importing it into another. In the source module, go to Tests (steps 1 and 2 above), open the drop-down menu next to the test and select Export to Local Computer to save the test file. Then navigate to the module you want to add the test to, follow steps 1 and 2 above, select Import Test, browse your computer for the exported file and click Submit.
Once your test has been copied you will need to deploy it (see Step 4 in the Creating a test instructions).
See Copying Module/Organisation and follow the steps. Choose only Tests, Surveys and Pools from the Select Copy Options section.