
Data Quality

1. Remote Access Application

1.1. TeamViewer Application

This free application can be used as a tool to connect to another person's computer remotely. It is compatible with Windows and Apple OS.



For more information about the application or the company, go to https://www.teamviewer.us.

2. Data Quality Reports - Presentations

2.1. How to run and use Data Quality Reports in HMIS

Major changes since 2015: the report can now be run on multiple providers; the Summary tab has been removed; the Households data quality has been combined with the Entry Exits and Assessments data quality into what is now called the Data Quality - All Workflows report; and the Data Quality - Services report is ONLY necessary for those projects that enter Service Transactions.

3. Which Reports Should I Run and How Often?

3.1. About Correcting Your Data

Throughout the Data Quality reports, you will notice various words used to describe your data: words like "Missing", "Incorrect", "Questionable", etc. Where there is something to be done about it, users are expected to collect and correctly enter all client data. But sometimes there is no way to collect a piece of data, or there is an exceptional situation, and that is expected and ok, to a degree. Above everything, we want the data that goes into HMIS to be true. If a Destination either wasn't collected, or the client didn't know or refused, then your data should reflect that. It will show up on your Data Quality report, but not everything on the Data Quality report needs to be fixed. Thus, the clients listed in your reports do not necessarily need to be corrected in every case.

On the other side, HUD counts any "Data not collected", "Client doesn't know", and "Client refused" data as "Missing". HUD does set thresholds for missing data because they expect a certain amount of it, but the thresholds are low enough that, where possible, every data element should be attempted for every client.
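As a rough illustration of how HUD's counting works, here is a hypothetical sketch that treats "Client doesn't know", "Client refused", and "Data not collected" the same as truly missing data. The function and value strings are invented for illustration, not taken from any HMIS tool:

```python
# Values HUD counts as "Missing" (illustrative list, not an official enumeration)
MISSING_VALUES = {
    "", None,
    "Data not collected",
    "Client doesn't know",
    "Client refused",
}

def missing_rate(values):
    """Fraction of responses that would count as missing under HUD's convention."""
    if not values:
        return 0.0
    missing = sum(1 for v in values if v in MISSING_VALUES)
    return missing / len(values)

destinations = [
    "Rental by client, no ongoing housing subsidy",
    "Client refused",
    "Data not collected",
    "Staying or living with family, permanent tenure",
]
print(f"{missing_rate(destinations):.0%}")  # 2 of 4 responses count as missing
```

The point of the sketch: even truthful "Client refused" answers still push a project toward HUD's missing-data threshold, which is why every data element should be attempted for every client.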

For the Balance of State, when we are assessing Data Quality "points" on your CoC competition or for the quarterly Top Ten Perfect Data Quality list, we only count "Missing" and "Incorrect" data as being a negative (excluding Destination). Questionable data is just that: we're questioning it, realizing that it could be accurate.

If there are any questions about this balance between truth in your data and aiming at perfection, please give us a call so we can talk about it! Above all, we want your data to reflect the truth.

3.2. Data Quality Monitoring Process

During the summer of 2016, the Balance of State HMIS Department presented a webinar on a new Data Quality Monitoring process. You can find the slides here. Following is the written process which will remain updated as things change over time.

Monthly, the Balance of State HMIS team will run a version of the Data Quality report on ALL projects. This report will return the number of Clients with Missing or Not Correct data, and the number of Households in an error state. The providers will then be ordered by the number of clients and households in error. This list will be reviewed by the HMIS team and CoC staff monthly. The five or six providers with the highest number of clients and households in error (referred to as "The Last Five" in the slides) will be contacted by the HMIS team so that we can work through the issues over the next month or so.
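The ranking step described above could be sketched like this. This is purely illustrative: the provider names, numbers, and function are made up, and this is not the HMIS team's actual tooling:

```python
def last_five(provider_errors, n=5):
    """Return the n providers with the most clients + households in error.

    provider_errors maps provider name -> {"clients": ..., "households": ...}.
    """
    ranked = sorted(
        provider_errors.items(),
        key=lambda item: item[1]["clients"] + item[1]["households"],
        reverse=True,  # most errors first
    )
    return [name for name, _ in ranked[:n]]

# Made-up example data
errors = {
    "Provider A": {"clients": 42, "households": 10},
    "Provider B": {"clients": 3, "households": 1},
    "Provider C": {"clients": 17, "households": 6},
}
print(last_five(errors, n=2))  # the two providers with the most errors
```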

Also monthly, the Balance of State HMIS team will run a CoC-wide Desk Time report on the past year. The providers with the highest median Desk Times will be contacted by the HMIS team so that we can work through the issues.

Quarterly, the Balance of State HMIS team will look at the data from the report described above, but instead of looking at those providers with a lot of issues, we will be looking at those with perfect Data Quality. Since there are so many providers with perfect Data Quality, we will decide how to organize our Top Ten list each quarter. Examples may be HMIS Top Ten by Most Clients Served or HMIS Top Ten Transitional Housing Projects. We will check the providers that land on this list against the Desk Time report to be sure that none of them have excessive Desk Times. Once the group has agreed who should be in the Top Ten list, an email will be prepared and sent out to the listserv with the announcement.

Should a provider not show improvement on the Data Quality report from one month to the next, the HMIS team may recommend that the provider be placed on a Quality Improvement Plan (QIP) through the Balance of State CoC. Since the Desk Time median will not show immediate improvement even if the provider immediately takes all necessary steps to enter their clients within 5 days of program entry, we will keep up with who we have contacted already and notice whose Desk Times are not improving. These providers may also be recommended for a QIP.



3.3. Data Quality 1- Client Data

Before diving into the Data Quality reports, it is highly recommended that users read the Data Quality Standards document that went into effect November 1st, 2013. The "Data Quality - All Workflows" report should be run MONTHLY.

TO RUN THE "Data Quality - All Workflows" REPORT:

  1. Click the "Connect to ART" link at the top right of your screen.
  2. Navigate the folders to the report: for Balance of State, go to Public > Balance of State HMIS > Data Quality and Performance > Monthly. For Youngstown, go to Public > Data Quality.
  3. Click the magnifying glass next to the "Data Quality - All Workflows" report and click "View".
  4. Only answer the Provider: prompt (NOT the "EDA Provider").
  5. Run it from the beginning of your fiscal year or from the most recent October 1st, whichever comes first, to the current date. Be sure to answer the Effective Date prompt as well. It should match your End Date.
  6. Click "Refresh Data".


3.4. Data Quality 2- Needs Services Referrals

Before diving into the Data Quality reports, it is highly recommended that users read the Data Quality Standards document that went into effect November 1st, 2013. The "Data Quality 2- Needs Services Referrals" report should be run MONTHLY.

TO RUN THE "Data Quality 2- Needs Services Referrals" REPORT:

  1. Click the "Connect to ART" link at the top right of your screen.
  2. Navigate the folders to the report: Public > Balance of State HMIS > Data Quality > Monthly.
  3. Click the magnifying glass next to the "Data Quality 2- Needs Services Referrals" report and click "View".
  4. Be sure to answer only the Provider prompt and NOT EDA Provider.
  5. Run it from the beginning of your fiscal year or from the most recent October 1st, whichever comes first, to the current date.
  6. Also, it is important that your "Report End Date Plus One" is the same date as your "Effective Date". If you are not running it on a Rapid Rehousing or PATH or RHY project, you may find it makes the most sense to only run it back to January of 2018 when your region implemented its Coordinated Entry plan.
  7. Click "Run Query".
NOTE: Client data that shows in this report as "Questionable" is not necessarily meant to be corrected: correct it only if you see that the data reflects something that is not true. This gives you the chance to double-check this kind of data so that you are not reporting incorrect data. As the description states: Do not change data that reflects reality!

The prompts for this report are very basic: Provider and a date range. As always, your Report End Date should equal your Effective Date.

Below, you will find a screenshot of each tab in the report along with its explanation at the top.

The following two images are from the same tab. If you run the report on an Access Point that has been entering Referrals, you will see the first one with "Yay!" at the end. If you run it on an Access Point that does not seem to have begun using Referrals, you will see the name of that Access Point listed.

The following tab is VERY MUCH up for discussion. If you find a client where your Referral Outcome/Reason and Needs Status/Outcome/Reason are true but not ideal, we *do* want to allow that to be recorded and it should not show here. Please send an email to hmis@cohhio.org if you find this kind of issue.


The final page of the report summarizes the prompts you ran the report with and how many total clients should be showing on the Summary page.

Please report any problems with this report to hmis@cohhio.org or call 614.280.1984 extension 123.

3.5. Data Quality Reports Changelog

Version 5.9

10/13/2017: Updated to new Data Standards

  1. Fixed Residence Prior variable to check for all the new picklist values.
  2. Fixed Move-In Date to include PSH projects' data and to account for the 10/1/2017 cutoff date.
  3. Added a check that any PSH Entries prior to 10/1/2017 must equal the Move In Date.
  4. Removed the requirement to answer the other If Yes questions on Disability.

Version 5.8

9/13/2017: Updated the Missing HoH variable, also the Questionable Housing Data tab

  1. The Missing HoH variable had been looking at whether the Household was built correctly in Households AND whether the Relationship to Head of Household was correct. The only one that matters for reporting, however, is the Relationship to Head of Household, so I removed the other condition so that there would be fewer Household errors.
  2. The Questionable Housing Data tab wasn't checking whether institutional stays were less than 90 days with the client Literally Homeless prior; I fixed it so that those clients will NOT show as questionable.

Version 5.7

8/4/2017: Updated the PATH tab

  1. Made it so that the Contacts data block only checks adults and heads of household.

Version 5.6

5/4/2017: Updates to make report align more closely with the Data Quality Framework HUD released in April plus minor fixes.

  1. Moved the Duplicate Entry Exits, Questionable Housing Data, and Household tabs to the front because projects with these types of errors should fix these first.
  2. Removed RHY project data from the Income, Non-cash, and Children Only Households tabs.
  3. Adjusted Missing SSN logic to detect fake SSNs more accurately.
  4. Adjusted Missing Name logic to more closely align with the DQ Framework.
  5. Replaced the old "Duplicate Entry Exit" logic with the logic in the DQ Framework for "Project Entry" errors. (Still named the same thing, but uses more elegant coding.)
  6. Adjusted the income logic to more closely align with the DQ Framework. Still not exact.
  7. Fixed the Total Clients formula on the Additional Information tab.

Version 5.5

3/27/2017: Removed the Entry Exit Type prompt; moved the look-back date on County of Residence Prior back to April to match County Where Served's date.

Version 5.4

11/8/2016: Finishing touches to the new 2016 changes to the 2014 Data Standards

  1. Fixed some tabs where it wasn't looking at any data if it was entered since 10/1/2016. (This means you may have errors today that you didn't see yesterday.)
  2. Fixed the Domestic Violence column to catch missings for SSVF and PATH clients that entered projects since 10/1/2016 but not before.
  3. Added checks for the new SSVF data elements.
  4. Fixed the Disabilities tab to not pull in clients that look (and are) fine.

Version 5.3

10/16/2016: Accounting for the new 2016 changes to the 2014 Data Standards.

  1. Added logic to Residence Prior that will return "Incorrect" for any client with "Interim Housing" where the Program Type is not equal to PSH and the client has no disability.
  2. Included SSVF in data checks on disability subs, Domestic Violence, and the County fields for clients who entered on or after 10/1/2016.
  3. Removed references to Other Gender and Other Residence Prior.
  4. Added "On the night before, literally homeless?" and adjusted logic on Homeless Start Date, Times Homeless, and Months Homeless to account for Residence Prior, Program Type, and other answers to check accuracy.
  5. Added a "2016" layer to the different client groupings as to which clients need which data elements and used that to adjust which clients are being checked for what.


10/3/2016: Due to the ServicePoint upgrade, made corrections on the Data Quality - All Workflows report, Missing & DKR 1 and Questionable Housing Data tabs.

  1. Bowman changed the Residence Prior picklist value for "Emergency shelter..." by adding a space, so any variables referencing Residence Prior were broken.


9/29/2016: Correction on the Data Quality - All Workflows report, Disabilities tab.

  1. The number of Disabilities was not calculating correctly. Moved parenthesis in the How Many Disabilities variable and it tested accurately.

Version 5.2

8/24/2016: Inclusion of PATH data elements, County fields, change to Future EEs logic, removal of old Length of Time Homeless questions.

  1. Removed all the old Length of Time Homeless questions from the Missing & DKR2 tab.
  2. Modified the DV variable to ignore PATH clients since DV isn't required for PATH providers.
  3. Added a new tab for PATH that includes all PATH-specific data elements.
  4. Added logic to Questionable Housing Data tab that flags Street Outreach clients with a Residence Prior of anything other than "Place not meant for habitation".
  5. Added "County in which client is being served" and "County of Residence Prior" to Missing & DKR 2. For "County in which client is being served", it will only flag clients that entered on or after 5/6/2016 since that's when we added the field. For "County of Residence Prior", it will only flag clients that entered on or after 9/1/2016.
  6. Changed the logic that pulls Future Entry Exits, adding a condition that checks whether the Date Added of the Entry Exit is earlier than the Entry Date.
  7. Corrected a problem on the Disabilities tab where it was not picking up all the clients with missing "If Yes" questions.
  8. Adjusted some of the header narratives to account for the fact that we have removed access to the Assessments tab.
  9. Removed text boxes in the headers that said "This report indicates a definite data quality issue." because it was confusing people. In general, if something says Missing or Incorrect, it likely needs to be corrected. If it says Questionable or DKR, it is only expected that the user look at the issue to be sure the data is correct and if not, then make corrections.


6/23/2016: Correction on the Data Quality - All Workflows report.

  1. The DV Questions column on the Missing & DKR2 tab was coded so that if the Currently Fleeing question wasn't answered, the Client ID would show in the grid, but it wasn't clear why. Adjusted the [DQ DV] formula to only check the main question, and not the "if" questions. Then coded the field to look at [DQ DV] plus the other two DQ DV variables for either Missing or Not Correct. Then created an alerter so that the text returned would make sense.


5/27/2016: Minor change to Data Quality - Services report.

  1. The client count on the Additional Information tab was incorrect. Corrected the formula.
  2. Changed the name of the report in all the footers to reflect that there is no Household data quality data in the Data Quality - Services report anymore.

Version 5.1

5/18/2016: Necessary adjustments, improvements.

  1. Renamed the reports. What was the Data Quality: Assessments and Entry Exits report is now the Data Quality - All Workflows and what was the Data Quality: Households and Services report is now the Data Quality - Services. Projects that enter Service Transactions still need to run both reports, but projects that do not enter Service Transactions can just use the Data Quality - All Workflows for a full data quality check.
  2. Added ability to run the Data Quality - Services report on multiple providers.
  3. Cut Summary tab on both reports because now that the reports can be run on multiple providers, the formulas did not return the correct numbers. Moved percentages out to each individual tab.
  4. Added three new tabs to the Data Quality - All Workflows report: Missing Head of Household, Multiple Heads of Household (new), and Children Only Households.
  5. Cut the Household tabs from the Data Quality: Households and Services report so that projects with workflows that do not include Service Transactions will only have to run one Data Quality report from now on.
  6. Adjusted some of the header narratives about the various issues.
  7. Added Adult or Child column to the relevant tabs so that users can tell if a client is an adult or child. This was a user request.
  8. Added grouping to list client IDs who are in the same household together. This was also a user request! (Thank your friendly RROhio HMIS person for #7 and for this one.)
  9. Removed Interims tab for now and made it its own report, available here: Public Folder > Balance of State HMIS > As Needed and Custom Reports > Interims. The plan is to add functionality back in later that will show "Missing Interims" as reported on the APR. Since the Interims tab as it is serves less of a data quality role, and more of a way of knowing who needs Interims, we decided it fit best in its own report under the "As Needed" folder.
  10. Cut the "Different Housing Statuses in Household" from the Data Quality - Services report because Housing Status is not collected anymore so this cannot be an issue.

Version 5.0

3/29/2016: Change to HUD's guidance on how to enter babies who are born during a program stay and other realizations.

  1. Cut "Date of Birth = Entry Date" tab because the new guidance on how to enter babies born into a program makes it likely (and correct) that a client can have a Date of Birth equaling their Entry Date.
  2. Added the ability to run the Data Quality: Assessments and Entry Exits report on multiple providers.
  3. Adjusted some of the header narratives about the various issues.

Version 4.2

12/30/2014: After all the dust settled from the initial changes to this report after the new Data Standards came out, the HMIS Department met and decided to proceed with the following changes, which are all reflected in this version.

  1. Stop including the version number in the report title because it messes up people's report scheduling. Instead record the version number in the report's "Description" field.
  2. Cut "Non-DV Anonymous" because it is not a large issue in the Balance of State.
  3. Cut "Income Decrease" because the workflow has changed such that users are not likely to create this as a data quality error anymore. Further, when this is an issue, it is usually not a data quality problem, and it shows on the Quarterly Performance Report (QPR) where the user has the chance to verify the accuracy of the data.
  4. Cut "Housing Status" because the Housing Status field was cut from the ESG and CoC assessment and is, in general, not used anymore. The program types that will be using it will have their own supplemental Data Quality reports.
  5. Change "Questionable Housing Data" from comparing Housing Status, Residence Prior to Entry, and Program Type to comparing only Residence Prior to Entry and Program Type. The report will now check only that, if your program type has certain eligibility restrictions, an adult client's Residence Prior to Entry matches them; mismatches will show here. For example, a client with a Residence Prior to Entry of "Rental No Subsidy" but a Program Type of Rapid Rehousing would show up on this tab.
  6. Move "Missing Entry Exit" from "Data Quality: Entry Exits and Assessments" to "Data Quality: Households and Services" because the logic relies on the agency using Service Transactions and the "Data Quality: Entry Exits and Assessments" report is the larger/slower report.
  7. Clean up the report, eliminating variables and data that are not being used in the final report.
  8. The Summary Page is still broken!! This will be fixed soon.
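The revised "Questionable Housing Data" comparison in item 5 amounts to a simple lookup of Residence Prior to Entry against Program Type. Here is a minimal sketch; the mismatch table is made up for illustration and contains only the example given in the changelog, not the report's actual logic:

```python
# Illustrative (incomplete) table of combinations the report would question.
QUESTIONABLE_COMBINATIONS = {
    ("Rental No Subsidy", "Rapid Rehousing"),
}

def questionable_housing(residence_prior, program_type):
    """True if an adult with this combination would show on the tab."""
    return (residence_prior, program_type) in QUESTIONABLE_COMBINATIONS

print(questionable_housing("Rental No Subsidy", "Rapid Rehousing"))  # True
```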

Versions 3.1-4.0

10/1/2014-12/15/2014: Once HUD released the new Data Standards, this report underwent many changes, including adding functionality to check completeness on all of the new data elements. Other changes are:

  1. Added ability for each report to distinguish between data that falls under the old data collection rules and the new ones. This ensures that all errors on the report only flag for those clients that did not exit prior to 10/1/2014.
  2. Added tabs that check Health Insurance and Disability questions for completion and accuracy.
  3. Split the tabs that check for completion into "At Entry" and "At Exit" sections since some of the data elements are required at Entry and at Exit for certain program types.
  4. Included SSVF data elements and logic in all tabs.
  5. Expanded the "Missing and DKR" report to cover three tabs. The second one (called "Missing & DKR 2") includes new data elements from the new Data Standards. The third one (called "Missing & DKR SSVF Only") checks only data elements specific to the SSVF program.
  6. Added logic to the Missing & DKR tabs that check for accuracy, finding clients with combinations of answers that would be impossible or highly unlikely.
  7. The Summary Page no longer works.

Version 2.2

8/1/2014: Added Duplicate Entry Exits and Incorrect Entry Exit Type to the Summary page. Also added verbiage to Tab B explaining what "Unlikely" means in relation to data being complete and correct.

Version 2

7/22/2014: Corrected the Missing SSN variable to include those clients with the "SSN Data Quality" field not answered. Also added logic to match the Completion Summary that causes Dates of Birth that are over 0.75 years prior to the Entry Date or where the client age is over 100 to be flagged as "Unlikely". Also added logic to the Missing SSN variable that flags SSNs of 000-00-0000, 111-11-1111, 999-99-9999, or 123-45-6789 as "Unlikely". Data elements flagged as "Unlikely" will be counted the same as actual "Missing" data, since they are not valid data.
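The fake-SSN logic described above can be sketched as follows. The function name and labels are illustrative, not the report's internal variable names:

```python
# Known placeholder SSNs flagged as "Unlikely" per the changelog entry above.
FAKE_SSNS = {"000-00-0000", "111-11-1111", "999-99-9999", "123-45-6789"}

def ssn_status(ssn):
    """Classify an SSN as 'Missing', 'Unlikely', or 'OK'."""
    if not ssn:
        return "Missing"
    if ssn in FAKE_SSNS:
        return "Unlikely"  # counted the same as Missing, since it is not valid data
    return "OK"

for value in (None, "123-45-6789", "219-09-9999"):
    print(ssn_status(value))
```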







3.6. Bed and Unit Utilization by Provider

**This guidance applies to Emergency Shelter, Transitional Housing, Rapid Rehousing, and Permanent Supportive Housing providers only.**

HOW TO RUN THE "Bed and Unit Utilization by Provider" REPORT:

1. Log into ServicePoint.

2. Click "Connect to ART" at the top right of your screen.

3. Navigate the ART folders: click the black triangle next to "Public", then "Balance of State HMIS", then "Data Quality and Performance", then "Monthly". For Youngstown, go to Public > Performance.

4. Click the magnifying glass next to the "Bed and Unit Utilization by Provider".

5. Click View Report. The image below shows the prompts that appear.
bed utilization prompts

6. The Month PLUS 1 prompts: Each of these is preset to the last Wednesday of the month PLUS one day. They can be changed, but in general there is no need.

7. The Provider prompt: Users can select one or more providers at a time. The report will show bed and unit utilization separately for each provider selected.

8. The EDA Provider prompt: This should be left as "-Default Provider-" when running this report.

9. Once all the prompts have been answered, click "Run Query".

Interpreting the Bed and Unit Utilization Report

To understand why the Bed and Unit Utilization Report is important, it helps to know about the Annual Homelessness Assessment Report (AHAR), a report that HUD submits to Congress every year. All data that we upload to the AHAR comes from your HMIS data. HUD expects the data to fall within boundaries they consider reasonable. Any data that does not fit within their specifications causes an error and requires either a change to the data or an explanation. An area that typically causes great difficulties is HUD's expectation that Bed Utilization rates fall between 65% and 105%. Your Bed Utilization Rate gives an idea of how full your program is on a given night.

Bed Utilization:

  • For the AHAR, Bed Utilization is measured the last Wednesday of every month. AHAR reporting begins on 10/1 and goes to 9/30.
  • Bed Utilization is the number of clients who were in your program on a night divided by the total number of beds reported on the Housing Inventory Chart (HIC).

If the Bed Utilization percentage is too high, either there were too many clients in the program on the night of the measurement or the number of beds reported on the HIC is too low. The most likely scenario is that clients who show as enrolled in the program that night were not actually there because they were never exited.

If the Bed Utilization percentage is too low, there were either too few clients in the program the night of the measurement or the number of beds reported on the HIC is too high. If you are unsure of which is the case, please contact the HMIS Department at COHHIO for help.
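The bed utilization arithmetic and HUD's 65%-105% window described above can be sketched as follows. This is illustrative only; the actual report is run in ART:

```python
def bed_utilization(clients_in_program, beds_on_hic):
    """Bed utilization for one night, as a percentage of HIC beds."""
    return 100 * clients_in_program / beds_on_hic

def within_hud_window(rate, low=65, high=105):
    """True if the rate falls inside HUD's expected 65%-105% window."""
    return low <= rate <= high

# Example: 10 clients in a 16-bed shelter on the measurement night
rate = bed_utilization(clients_in_program=10, beds_on_hic=16)
print(rate, within_hud_window(rate))  # 62.5 False: below the 65% floor
```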

Unit Utilization:

  • Unit Utilization is not currently used in any HUD reporting or in any CoC-level scoring or monitoring. This could change, however.
  • For right now, the Unit Utilization is being included along with your Bed Utilization data so that it can serve as another way of getting at how full a project is on a given night.
  • Unit Utilization is the number of households who were in your project on a night, divided by the number of units reported on the HIC. In this calculation, a household can be a single individual or multiple family members. To get your total Unit Count, we add together the number of units reported for your family beds and the number of beds reported for your individual beds.

Since there is currently no window or threshold for unit utilization, the report does not highlight anything with a red font; it is purely informational at this time.
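The unit utilization arithmetic described above, including how the total Unit Count combines family units and individual beds, can be sketched as:

```python
def total_units(family_units, individual_beds):
    """Total Unit Count: family units plus individual beds (each individual
    bed counts as its own unit)."""
    return family_units + individual_beds

def unit_utilization(households_in_project, units):
    """Unit utilization for one night, as a percentage of total units."""
    return 100 * households_in_project / units

# Example: 4 family units and 4 individual beds make 8 units; 7 households present
units = total_units(family_units=4, individual_beds=4)
print(unit_utilization(households_in_project=7, units=units))  # 87.5
```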

Looking at the image above, the data blocks are split into Bed Utilization on the left and Unit Utilization on the right. On the 7/27/2016 row, you can see the Bed Utilization data seems to be indicating that the shelter was underutilized on that night, with only 63% of its beds in use. That is below the threshold, and you'll see it's in a red font because of that. But if you look across at the Unit Utilization for that same day, you can see that while they only filled 10 of their 16 beds, they were actually filling 7 of their 8 units!

Unit utilization definitely has its advantages, but it is not perfect either. If you look at the 2/24/2016 row, you can see that the Bed Utilization looks pretty reasonable at 75%, but the Unit Utilization on that night is extremely high, perhaps because shelter staff placed two individuals as roommates in each unit. It could also be that a household was entered incorrectly as two singles.

At this point, you may wonder how you are supposed to find the exact Client IDs and Households that make up the "# Beds Used" or "# Units Used" of the percentages so that you can find which clients may need to be corrected. For this, you will use the Detail tabs that follow the main Bed Utilization page.

Bed and Unit Utilization Detail:

There is one Detail tab for each month of the calendar year, labeled accordingly.

The image above shows the Detail tab for the last Wednesday in March. The different elements of this report are the title, provider, recap, and detail data.

  • Title: The title tells you which date you are looking at.
  • Provider Name: shows the Provider the data is for, which is redacted (mostly) in this document. If you run the report on multiple providers, it will separate each provider's data into separate sections.
  • Recap: the next table is the Bed and Unit Utilization data from the row on the main Summary page that matches the date in the title.
  • Client Detail: the client detail gives you the Client IDs being counted. Households are grouped together with bolder gray lines between households.

Use this detail data to figure out who (according to HMIS) was in your program for that night and compare it to your paper files or other documentation you might have. Make corrections as necessary.

If the client data is correct, but your bed or unit count is wrong, please run your HIC Verification Report and follow the instructions on the cover sheet to get it to the CoC so they can approve it and we can make the corrections.

If both your client data and bed data are correct, but your utilization is still outside the acceptable zone, please work with your region to discuss the reasons and address them. Figuring out why may involve asking questions at meetings or simply communicating with management what the data is showing and discussing possible reasons. Examples are:

  • utilization too low: facility had to temporarily close or greatly decrease capacity due to water damage (nothing to address except the state of the facility)
  • utilization too low: facility is targeting a population not in great need of services in the community (consider broadening target population or geography)
  • utilization too low: facility is barring clients in need from its services by requiring drug testing or employment or is in general not implementing Housing First practices. (remove barriers to entry)
  • utilization too high: facility had large families for a few months (nothing to address unless this is ongoing, then consider maybe raising the bed count)
  • utilization too high: facility is not exiting leavers in a timely manner (assist with HMIS data entry)
  • utilization too high: extreme cold (nothing to address, except to follow up with the folks who had been unsheltered and get them into housing as soon as possible)

The Bed and Unit Utilization Report should be run and examined for each of your agencies monthly. Grant managers will use this report as a way of understanding to what extent your beds are being used to fill needs in the community.

Comments, questions, and feedback are welcome. Send to hmis@cohhio.org.

4. Timeliness of Data

4.1. Desk Time Report

In the Data Quality Standards, the timeliness standards for data entry are clearly laid out as follows: 

  • From the day the client enters your program, the agency has five days to get the client entered into HMIS.
  • From the day the case manager learns of a change in circumstance (increase/decrease in income, non-cash, or health insurance or other change) the agency has five days to get the interim entered into HMIS.
  • From the day the client exits your program, the agency has five days to exit the client in HMIS.

This report only looks at the span of time between the Client's Entry Date and the date their Entry/Exit was entered into HMIS. Though it is equally important that clients are exited within 5 days, ServicePoint does not currently have a way of tracking that. Generally it is the "median" that is used in performance measures.
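As a sketch of the measurement (dates are invented; ServicePoint computes this inside the ART report), the desk time for each client is the number of days between the Entry Date and the date the Entry/Exit was added to HMIS, summarized by the median:

```python
from datetime import date
from statistics import median

def desk_times(entries):
    """entries: iterable of (entry_date, date_added) pairs.
    Returns the number of days each entry took to reach HMIS."""
    return [(added - entered).days for entered, added in entries]

# Made-up example: three clients entered into HMIS 1, 7, and 4 days after entry
entries = [
    (date(2017, 3, 1), date(2017, 3, 2)),
    (date(2017, 3, 1), date(2017, 3, 8)),
    (date(2017, 3, 5), date(2017, 3, 9)),
]
days = desk_times(entries)
print(median(days), median(days) <= 5)  # 4 True: within the 5-day standard
```

Note that one slow entry (7 days) does not push the median over the standard, which is one reason the median, rather than the average, is used in performance measures.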

To run this report, you must have an ART license.

Public > Balance of State HMIS > Data Quality and Performance > Monthly > "Desk Time".

Users can select multiple providers. Use a date range of a year ago to the current date. The report will show stats for each provider selected. See below:

5. How to Make Corrections

5.1. I Entered a Client into the Wrong Provider!

See the page on EDA for information about Enter Data As, why it is so important to enter your data under the correct provider, and how to avoid this problem in the future.

To Correct:

  1. Click Enter Data As and choose the provider that the household should have been entered under.
  2. Open the client record and click their Entry pencil.
  3. Click the Save & Continue button.
  4. Change the Provider Name to the correct Provider and click "Update".
  5. Click Save & Exit.
  6. Go to the ROI and switch the provider name there as well.

Then delete all services and needs (if applicable):

  1. Go to the Service Transactions tab.
  2. Click View Entire Service History.
  3. Click the Needs tab. Write down and then delete any services you created by clicking all the trash cans next to the Needs. To delete the Needs for each household member, click the "Switch to Another Household Member" dropdown just above the tabs, choose a different household member, then click Submit. Repeat Steps 1 and 2 until all the Needs have been deleted from all the household members.

5.2. I need to move an Entry Date back in time. How do I do that without losing my assessment data?

When a user creates an Entry Exit and saves Assessment data, that Assessment data is only attached to that program stay by its Effective Date which is saved to each data element to match the Entry Date. When you move an Entry Date back in time, those Effective Dates on the assessment data remain unchanged, thus breaking the connection between that assessment data and the Entry. This is why when you move an Entry Date back in time and go to view your Assessment data through the Entry pencil, all the data seems to have disappeared. (It is still there, it's just dated incorrectly.)

If you need to move an Entry Date back in time, follow these instructions.

1. Run the "Client Data at Entry" report in ART. If you don't have ART, ask the user at your agency with an ART license to run it for you.

  • Click ART: Connected at the top corner of ServicePoint.
  • Click the triangles next to Public > Balance of State HMIS > As Needed and Custom Reports.
  • Click the magnifying glass on the "Client Data at Entry" report and click "View".
  • Enter the provider and Client ID into the prompt. You can only run this report on one client at a time. It runs quickly.
  • Export the report either to Excel or .pdf. 
  • For help using ART, refer to the links below.

2. Make corrections in ServicePoint.

  • Log into ServicePoint.
  • Open the client record and click their Entry pencil.
  • Move the Entry Date to the correct date, click Save & Continue.
  • The assessment data will have disappeared. Re-enter all the data from your copy of the "Client Data at Entry" report that was run on the client you are correcting. 
  • Destroy the paper or digital copy of the assessment data, or store it in a locked and secure location.