RCI by OU

Version 8.1 by Davide Bonicelli on 2014/08/21 15:56

The Reliable Change Index (RCI) Report by OU lets you assess whether a statistically significant change occurred in a group or population across different organizational units (OUs), based on their scores for several domains and their items as they progressed through their various assessments.
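This guide does not spell out how the application computes the index itself. For orientation, the standard Jacobson-Truax formulation of the RCI is sketched below; the function name, the example numbers, and the choice of this particular formulation are illustrative assumptions, not a description of this application's internals.

```python
import math

def reliable_change_index(pre, post, sd_pre, reliability):
    """Jacobson-Truax Reliable Change Index (illustrative sketch).

    pre / post  -- a client's scores on the two assessments being compared
    sd_pre      -- standard deviation of pre-test scores in the norm group
    reliability -- reliability coefficient of the instrument (0..1)
    """
    sem = sd_pre * math.sqrt(1 - reliability)   # standard error of measurement
    se_diff = math.sqrt(2 * sem ** 2)           # standard error of the difference
    return (post - pre) / se_diff

# By convention, |RCI| > 1.96 is treated as reliable change (p < .05).
# Hypothetical numbers: a client goes from 3 to 1 between assessments.
rci = reliable_change_index(pre=3, post=1, sd_pre=1.2, reliability=0.85)
```

On instruments like CANS, where lower scores indicate less need, a large negative RCI would correspond to reliable improvement.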

CREATING A NEW RELIABLE CHANGE INDEX REPORT - by OU

1. To create an RCI report by organizational unit, first open the application and select the "Reports and Analytics" tab on the top menu bar. When the pull-down menu appears, select the third option, marked "Reports."

new reports option - for all reports.PNG

2. Select the "Personal Reports" tab under Saved and Named Reports.

2.1 Next, to create a new report configuration, click on "New Saved Personal Report," represented by the green plus icon.

2.2 NOTE: If you simply plan on working with an existing RCI report, you may use the steps outlined above, but now proceed to use one of the many menu options located on the top row of the left pane. These options include the ability to filter your reports by name, copy, delete, and re-run them.

progression report - new report.PNG

3. A list of report options should subsequently come up.

3.1 Proceed to page 2 of this list (by clicking the "next page" arrow towards the bottom of the window) to select RCI by OU.

3.2 If you are having trouble finding the report type, you can use the filter tool located at the top of the window (denoted by the gray filter icon) and simply search "RCI by OU."

rci by ou - report types.PNG

4. A new report, generically titled RCI by OU, should appear in your list of reports.

4.1 Additionally, a pane should appear to the right of your screen, allowing you to input specific information about the parameters of your report.

4.2 Initially, you can use this new pane to change the name of your report from the generically assigned "RCI by OU," as well as provide a different or more robust description for your report.

RCI BY OU - NAME.PNG

5. Next, we will discuss the various parameters you can change in running your report.

5.1 Note: you can choose to review each parameter individually by clicking on the down arrow next to each parameter (e.g. "Start Date"), or you may alternatively elect to expand/collapse all the parameters simultaneously with the "-/+" options on the top left of the right pane.

  • "Client Status": you can select whether to incorporate data from only active or inactive clients, or both
  • "Sliding Date Range": allows you to choose a "sliding range" of time (e.g. last 6 months). This is called "sliding" because it is not affiliated with particular dates and can change based on when you choose to run the report. Many options are self-explanatory, but see below for some clarifications.
    • "Specified Date Range" - choose this option if you would prefer to manually enter specific start and end dates (see next two parameters)
    • "Year to Date" - this will run a report covering data from January 1 of the current year through the day the report is run
    • "Last 30 Days" - data from the last 30 calendar days will be included and tabulated
    • "Last Year" - this essentially means that data from the last 365 days will be taken into account. Note that this option does not mean the last calendar year.
  • "Start Date": You can choose the specific date on which you would like the report to begin giving information. You can either click the small calendar icon on the right of the Start Date entry line to use a pop-up calendar, or enter the date in the space provided in MM/DD/YYYY format. NOTE: You will also see an "Allow Run Override" option in this parameter. This option is especially helpful if you plan on running the same style of report again at a later time with all the same parameters, but with different date ranges. Checking the run override box will allow you to simply enter a new start and end date when running your report instead of re-entering all the parameters.
  • "End Date": You can choose the specific date on which you would like the report to stop giving information. The entry options (calendar icon or MM/DD/YYYY format) and the "Allow Run Override" option work exactly as they do for the Start Date.
  • "Instruments": The instrument option allows you to choose the specific assessment tool the report should be run on. Upon clicking the down arrow, a list of available options appears, from which you can choose your preferred instrument (e.g. "ANSA" or "CANS")
  • "Assessment Domain(s)": you can elect to choose which domains from the assessment you are working with you would like to include in the report (e.g. "Culture")
  • "From Assessment type": select which assessment you want to include in the report as your "point A"; note that these options are divided based on the juncture of the assessment and will form the basis of comparison with the "to assessment type"
  • "To Assessment type": select which assessment you want to include in the report as your "point B"; note that these options are divided based on the juncture of the assessment and will form the basis of comparison with the "from assessment type"
  • "Assessment Status(es)": decide what sorts of assessments you would like included in the report (e.g. only ones marked "approved")
  • "Organizational Units": The organizational unit represents the geographic or other division of systems. In order to select your preferred organizational unit, simply click the down arrow to expand the options, and then proceed to check the boxes for all locations (e.g. "Sydney") you are interested in for your report.
  • "Reporting Units": The reporting unit represents the classification of clients based on various factors through the system of tagging. You can select all reporting units of interest by checking all appropriate boxes (and sub-boxes) you want your report to incorporate. Note: this is an optional parameter (the only one), and you may leave this blank and still generate a report.
  • "Gender": allows you to choose a specific gender to analyze in your report (e.g. "male" or "female")
  • "Age Ranges": allows you to select which group of people, based on age, to include in your data
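The sliding date-range options above can be paraphrased in code. The sketch below is only an illustration of the semantics described (note especially that "Last Year" means the last 365 days, not the last calendar year); the function name and exact boundary handling are assumptions, not the application's implementation.

```python
from datetime import date, timedelta

def sliding_range(option, today=None):
    """Resolve a sliding date-range option to a (start, end) pair.

    'Sliding' means the dates are computed relative to the day the
    report is run, rather than being fixed in the configuration.
    """
    today = today or date.today()
    if option == "Year to Date":
        # January 1 of the current year through today
        return date(today.year, 1, 1), today
    if option == "Last 30 Days":
        return today - timedelta(days=30), today
    if option == "Last Year":
        # The last 365 days -- NOT the previous calendar year
        return today - timedelta(days=365), today
    raise ValueError(f"Unknown sliding range option: {option}")
```

"Specified Date Range" is the one option that bypasses this logic entirely, deferring to the manually entered Start Date and End Date parameters.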

RCI BY OU - PARAMETERS.PNG

6. After you have made these changes and included all this new information, you should save these parameter inputs using the "Save" button (represented by the floppy disc icon) located on the top of the right pane.

6.1 Hitting the save button should update the report displayed on the left pane and also provide that report with a unique ID Number (which should be listed in the far left column of the left pane).

6.2 Congratulations, your report is now ready to run! To do this, select your new report and click the "Run Report" option on the bar of options above.

tickler 7.PNG

7. When you run your report (by clicking on the "Run Report" button), your new RCI by OU report should open in a new tab on your computer.
7.1 Note: your report will vary in length depending on what organizational unit you chose, as well as how many sub-groups were part of the larger unit.
7.2 If you would prefer a paper copy of the report, you can easily obtain one by choosing the "Print Report" option (represented by the printer icon) in the menu bar towards the top of the screen.

7.3 There are also options in this space for you to Export the Data from this report or Save the Report (as a PDF, for example) to your computer.

8. If you want to delete the report you have created (or any report that is saved with an ID number), you can do so by selecting the report (causing it to highlight) and clicking on the "Delete Report" option represented by the red minus sign.

assessment aging - toolbar.PNG

9. TOOLBAR OPTIONS

  • Refresh: allows you to check for any new or updated reports
  • Open Selected: opens up the information and parameters for the selected report
  • Show Filter: allows you to filter your reports by name
  • New Saved Personal Report: allows you to create a new report (see above)
  • Delete Report: allows you to delete an existing report (see above)
  • Copy: allows you to copy the exact report type and its parameters for future use
  • Run Report: allows you to run the report and print it out (see above)

UNDERSTANDING YOUR RCI BY OU REPORT

Next, we will move on to looking at the report and the information it conveys. Please see the sample report below for reference.

rci report - by ou.PNG

Note initially that this is only the first page of the report. Because we selected Melbourne as an OU of interest when entering our parameters (and because it comes first alphabetically), it is elaborated on on this first page. Each individual OU will occupy its own page in your report.

Note also that the instrument of our choice was CANS, but this report can apply to other instruments as well. 

Let's begin dissecting the report and the data it provides. Remember, again, that the RCI by OU Report enables you to track and understand the progress of your clients with regard to their scores on assessment domains and items in particular OUs across two assessment reasons (essentially over time).

Top

  • the title of the report is featured prominently at the center of the first page
  • many of the parameters you entered are also included for your ease of remembrance (this includes the period of time you indicated was of interest and the OUs you selected).
  • also vital here are the two assessment reasons listed, as these provide the basis for any comparison.

Bottom

  • your report includes a time stamp of the exact date and time the report was ordered (not pictured above).

Middle (DATA)

  • Because we selected Melbourne as an OU of interest, its data is provided for us
  • We can see that, in Melbourne, there were 12 people who met all the criteria we set out in our parameters. This is because each row below the graph (representing a single domain or item) adds up to 12, implying 12 clients.
  • There will always be three columns on the bottom: one for decline, one for no change, and one for improvement
  • The domains and items you choose will be the ones that will be considered (e.g. Child Strengths and Culture) in relation to progress
  • The graph above and the numerical data below show the same data, except that the numerical information provides absolute quantities (i.e. the exact number of clients) rather than percentages
  • However, the bar graph is often easier to understand and serves as a visual guide to your results. In fact, it has been color-coded to make the process easier:
    • RED: the red bar represents the percentage of clients whose performance in relation to a specific domain/item declined between the two assessments, meaning their scores went up (e.g. someone could have gotten a 1 on their initial assessment and then a 3 on their scheduled update). 
    • YELLOW: the yellow bar represents the percentage of clients whose performance stayed the same over the two assessments, meaning their scores remain unchanged
    • GREEN: the green bar represents the percentage of clients whose performance improved between the two assessments, meaning their scores went down (e.g. someone could have gotten a 3 on their initial assessment and then a 1 on their scheduled update). 
  • Clearly, such data proves extremely useful when looking at a specific OU and its ability to meet the needs of its clients. Looking at the data, someone in a supervisory capacity might ask, for a given OU:
    • are most clients improving, or at least staying the same?
    • if one OU has a 90% green bar for Culture (i.e. lots of improvement is happening) and another one has 70% red, what differences exist between these two environments? Also, what can we do at the second OU to better emulate the practices and thus success of the first OU?
    • what about certain OUs might make them better adept at improving one domain and another OU better at improving a different one?
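The tabulation described above can be sketched as follows. This is a hypothetical illustration of the decline / no change / improvement breakdown and its percentages, assuming CANS-style scoring where lower is better; the real report presumably also applies the reliability threshold before counting a change, which this simple directional version omits.

```python
from collections import Counter

def classify(pre, post):
    """CANS-style scoring: lower is better, so a score drop is improvement."""
    if post > pre:
        return "decline"       # the red bar
    if post < pre:
        return "improvement"   # the green bar
    return "no change"         # the yellow bar

def domain_summary(score_pairs):
    """For one domain/item, return the absolute counts (the table rows)
    and the percentages (the color-coded bars) for a list of
    (pre, post) score pairs, one pair per client."""
    counts = Counter(classify(pre, post) for pre, post in score_pairs)
    total = len(score_pairs)
    pct = {k: 100 * counts.get(k, 0) / total
           for k in ("decline", "no change", "improvement")}
    return counts, pct

# Hypothetical data: four clients' (initial, update) scores on one domain
counts, pct = domain_summary([(3, 1), (2, 2), (1, 3), (2, 1)])
```

Since every client falls into exactly one of the three categories, the counts in each row sum to the total number of clients, which is how the example report's total of 12 can be read off any row.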

The RCI by OU report's support for such questioning and analysis (and its ability to compare and screen data across OUs) makes it an obviously powerful tool.