

Tests and Surveys

Tests are created through the Test Manager tool. There are 17 types of question, and most can be marked automatically.

Surveys are similar to tests in most respects. The principal differences are that you cannot view individual answers to surveys (all responses are anonymous) and you cannot assign marks to questions or give detailed feedback.

Examples at Leeds

See the Casebook site for examples of use in learning & teaching at Leeds.

Creating a test

Step 1 – Create the test container

In the Control Panel, click Module Tools, then select Tests, Surveys and Pools.

Tests, Surveys and Pools option in the Control Panel

In this example we are going to build a test, so select Tests. (If you need to create a Survey you will find the steps are virtually the same).

Tests option

Select Build test.

Build test option

In the Test information section, give your test a name and add any description and instructions as necessary. Click Submit when you are finished.

Test information page

You’ve just created a test container – next we'll check the question settings.

Step 2 – Check the settings

After creating your test, you'll be taken to the Test Canvas page. From here you can review the question settings.

Question Settings option

On the question settings page you can choose whether questions should include feedback for individual answers, and set options for images, files and links. You can also change options for scoring and display.

If you wish to enable negative marking, you must do so here: select Specify partial credit options for answers and then Specify negative points options for answers. Negative marking is available for Multiple Choice, Multiple Answer and Matching questions.

Settings required for negative marking

Make your changes as required and click Submit to return to the Test Canvas page.

Step 3 – Create a question

In the Test Canvas, select Create Question. Select the question type from the list in the drop-down menu.

Select the question type you require

In this example, we’ll choose a Multiple Choice question type. Enter the question text.

Entering question text

Next specify your options. If you selected partial/negative marking in your Question settings, you need to enable it per question by selecting Allow Partial Credit and Allow Negative Scores for Incorrect Answers. For Multiple Answer questions, you have the option to 'Allow Negative Overall Score for the Question' (this is not selected by default).

Options for partial credit within Question

In the Answers section, specify the number of answers, enter the text for each answer, and select the radio button labelled 'Correct' next to the correct answer.

Enter the answers

Options for partial credit will then appear under the relevant answers. Enter the score as a percentage; for negative marking, start with a minus sign, e.g. -20.
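As a rough illustration of how percentage-based partial and negative credit might combine into a question score, here is a sketch. The function name and the flooring behaviour for the 'Allow Negative Overall Score for the Question' option are our own illustrative assumptions, not Blackboard's documented algorithm.

```python
# Hypothetical sketch of percentage-based partial/negative credit.
# question_score and allow_negative_total are illustrative names, not Blackboard's.

def question_score(points_possible, selected_percentages, allow_negative_total=False):
    """Sum the percentage credits for the answers a student selected
    (e.g. 50 for half credit, -20 for a penalty) and convert to points."""
    pct = sum(selected_percentages)
    if not allow_negative_total:
        pct = max(pct, 0)  # assumed: without the negative-overall option, floor at zero
    return points_possible * pct / 100.0

# A 10-point question: one selection worth 50%, one penalised at -20%.
print(question_score(10, [50, -20]))         # 3.0
print(question_score(10, [-20, -20]))        # 0.0 (floored at zero)
print(question_score(10, [-20, -20], True))  # -4.0 (negative overall allowed)
```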

You now have the opportunity to add positive and negative feedback, categorise your question with meaningful keywords and add notes for instructors' use. Once you are satisfied with your question, press Submit, or Submit and Create Another to create another question of the same type.

Once you've created all your questions, you are ready to deploy your test.

Step 4 - Deploying a Test/Survey

In your module, go to where you want to add your test and select the Add Assessments button, then select Test from the drop-down menu. Select the name of your test from the Add an Existing Test list, and select Submit.

Select Test from Assessments drop down menu

You can edit the Test name and include a description before you deploy it - there is also an option to show students the Test Description you added when creating the test.

Under Test availability, make the test Available if you want students to see it now. You may choose to leave the link unavailable and preview the deployed test before you release it to students.

In this section you can also add an Announcement and select options for multiple attempts, date availability, force completion, timers, auto-submit and passwords.

How to make the test available

Next you can choose to add an exception for your deployed test. For example, if you have a student who needs an extra attempt, you can specify this here. You can also create exceptions for availability/removal of Force Completion (if enabled).

Adding exceptions to your deployment

Then specify any due dates for completion, Grade Centre and feedback options.

For feedback options, you can:

  • Select 'Score per Question' to show the student the score they achieved for each question.
  • Select 'All Answers' so that the student can see questions with all the possible answers. (Note, for automatically marked question types where there is only one answer, this can reveal the correct answer).
  • Select 'Correct' to show the student the question text with the correct answer(s).
  • Select 'Submitted' to show the student what response they gave.
  • Select 'Show Incorrect Questions' to highlight which questions the student answered incorrectly.

Finally select how to present your test to students.

Remember to click Submit when you are finished.


Pitfalls

General Recommendations

We do not recommend using Minerva tests for high stakes summative assessments.

If you choose to run a Minerva test in a controlled classroom environment, stagger the start and submission times to reduce the chance of work being lost.

In general we recommend you do not use the Force Completion option (in case of technical problems).

Use the test presentation setting 'One at a Time' to help prevent session timeout (unless you are using any form of randomisation, in which case you should use 'All at Once').

If possible, avoid the use of randomised questions in a test - see the section on randomisation below.

Bear in mind that if the student stays on the same page of a test for over one hour, they will lose their Minerva session. This is especially relevant if your test contains work done outside of Minerva.

Tell students to use a wired connection for important tests. Using a wireless connection increases the chances of students losing work. If their wireless connection is lost, any work since the last save will be lost too.

For important tests, you should try your test with the Student Preview functionality. Also check how the results are stored in the Grade Centre. Remember you might want to adaptively release the link to the test so students don't see it while you're testing!

 

Information re Test Settings

Force Completion: Be aware that by simply opening the test, the student will register an attempt. If they quit the test, abandon the online session or have a technical problem, they will not be able to complete. Make sure students are aware if you enable this setting.

Random Block/Randomise Questions:

  1. If you are using a random block of questions you should not select Randomise Questions in the Test Deployment settings - this can cause problems.
  2. If using any form of randomisation:
    • do not combine this with the test presentation setting 'One at a Time.'
    • the test cannot run for longer than one hour.
    • the test should have fewer than 50 questions; a series of short tests is safer than one long test.

Saving Work: There is an autosave function which limits the amount of work lost if a technical problem occurs. This is enabled automatically. However, for an important test we recommend manually saving every 10-15 minutes as well.

Self Assessment Options: If you select the 'Include this Test in Grade Centre Score Calculations' option, the score achieved will flow into the Grade Centre. Be careful with the 'Hide' results option: you will not be able to retrieve any of the results if you enable it, and attempts will have to be deleted to undo this setting. Contact us via the IT Services Desk if you have enabled this setting in error and we will see if we can help.

Test length: Blackboard suggest keeping tests short (under 50 questions), especially when using randomised questions. If using a random selection of questions from pools, they recommend keeping the test even shorter. Consider breaking tests into a series of short tests if you have lots of questions. Otherwise, there is a particularly high risk of students losing work.

Test timeout: If the student stays on the same page of a test for over one hour, they will lose their Minerva session. If Force Completion is enabled, they will not be able to resume the test. When using any form of randomisation, note that the test cannot run for longer than one hour.

Timer:

  1. Although it is technically possible to set a timer for more than one hour, bear in mind tests using random blocks/randomised questions/presented on one page will time out after an hour.
  2. There is now an auto-submit option which means the student's work will be automatically saved and submitted when the timer expires.

Results

Test attempts are accessible via the Grade Centre (unless you elected to hide results in the deployment options).

Students can see their scores by returning to the assessment area or by going to My Grades, if you have added this to the module menu.

If you need to mark answers, go to the Grade Centre in the Control Panel and select Needs Grading.

Needs Grading in the Grade Centre

You should see attempts for the test. Click on a student's name to open an attempt. You will see the student's answers on this page; there are also options to enter marks and return feedback.

You can navigate to other submissions from this page by using the arrows towards the right of the screen:

The grading interface

If you just need to view student marks, go to the Grade Centre in the Control Panel and this time click Tests. You can also download a spreadsheet of entries if needed (this method also works for Surveys).

Analysis

There is an Item Analysis feature for any test with submitted answers (this feature is not available for surveys).

This provides statistics for overall test performance and for individual questions, helping to highlight questions that might be poor discriminators of student performance.
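To illustrate what "poor discriminator" means, here is a sketch of one standard psychometric statistic, the upper-lower discrimination index (proportion correct in the top-scoring group minus the bottom-scoring group). This is a common textbook calculation, not necessarily the exact statistic Blackboard's Item Analysis computes; the function name and sample data are ours.

```python
# Upper-lower discrimination index: a common item statistic.
# Values near 1 mean strong students answer the item correctly and weak
# students do not; values near 0 mean the item tells you nothing.

def discrimination_index(total_scores, item_correct):
    """Proportion correct in the top ~27% of scorers minus the bottom ~27%."""
    ranked = sorted(zip(total_scores, item_correct), key=lambda pair: pair[0])
    k = max(1, round(len(ranked) * 0.27))
    p_bottom = sum(c for _, c in ranked[:k]) / k
    p_top = sum(c for _, c in ranked[-k:]) / k
    return p_top - p_bottom

scores = [90, 85, 80, 70, 60, 55, 40, 30]  # overall test scores (illustrative)
good   = [1, 1, 1, 1, 0, 1, 0, 0]          # 1 = correct; tracks overall ability
poor   = [1, 0, 1, 0, 1, 0, 1, 0]          # no relationship to overall ability

print(discrimination_index(scores, good))  # 1.0 - discriminates well
print(discrimination_index(scores, poor))  # 0.0 - a poor discriminator
```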

You can run Item Analysis from the Tests section of the Control Panel, or by returning to the deployed test in your module. Select the drop-down menu next to the test and then Item Analysis.

The test analysis option

On the Item Analysis page select Run to view the analysis.

The test analysis page

Once the report is generated, you will see a link which will take you to the results.

Exporting data

To export score data from Minerva tests and anonymous results from surveys, you will need to use the Grade Centre.

Each test or survey will have its own column.

Go to the Grade Centre and click the chevron icon that appears adjacent to the column title for the Grade Centre item you want to export:

Screenshot showing how to find the column context menu

Then select Download Results from the context menu.

On the Download Results screen:

  • Leave the file type as Tab
  • Leave the format of results as By User
  • Leave the Attempts setting at Only Valid Attempts (unless you need to see all attempts for each user)

The file will be downloaded to your computer as a .xls file, which can be opened in Microsoft Excel.
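Because the download is tab-delimited text (despite the .xls extension), it can also be processed programmatically. A minimal sketch; the sample data and column names here are illustrative assumptions, since the exact columns depend on your Grade Centre:

```python
# Parsing a tab-delimited Grade Centre export. The sample string stands in
# for the downloaded file's contents; real exports may have more columns.
import csv
import io

sample = (
    "Last Name\tFirst Name\tUsername\tGrade\n"
    "Smith\tAlex\tab12cd\t8\n"
    "Jones\tSam\tcd34ef\t6\n"
)

# For a real file, replace io.StringIO(sample) with
# open("downloaded_results.xls", newline="", encoding="utf-8-sig").
rows = list(csv.DictReader(io.StringIO(sample), delimiter="\t"))

for row in rows:
    print(row["Username"], row["Grade"])
```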

Copying tests

You can copy a single test by following the steps below:

  1. In the module that contains the test you would like to copy, go to the Control Panel, Module Tools and then Tests, Surveys and Pools.
  2. Click Tests.
  3. You should see a list of your tests on the module.
  4. Hover over the relevant test and click the action link.
  5. Select Export to Local Computer.
  6. Save the exported test on your computer.

Navigate to the module you want to add the test to, follow steps 1 and 2 above then select Import Test, browse your computer for the test and click Submit.

Once your test has been copied you will need to deploy it (see Step 4 in the Creating a test instructions).