SIS is designed to support many types of evaluations. We tried to make it simple enough for immediate use by small non-profits, yet powerful enough to serve large networks and major research projects.
That means that there is a lot of technical detail that users will rarely see. This page describes some of the principles and technical design that underlie SIS.
No evaluation is completely objective. They all embed values about who is important (like who should be consulted, who makes decisions and who is a passive consumer of services) and what is important (like what outcomes and outputs are essential and what are optional or unnecessary). Like any evaluation platform, SIS is not value-neutral. LogicalOutcomes has adopted the Principles for Digital Development as design principles for SIS:
- Design With the User
- Understand the Existing Ecosystem
- Design for Scale
- Build for Sustainability
- Be Data Driven
- Use Open Standards, Open Data, Open Source, and Open Innovation
- Reuse and Improve
- Address Privacy & Security
- Be Collaborative
Organizations must take measures to minimize collection and to protect the confidential information and identities of individuals represented in data sets from unauthorized access and manipulation by third parties. Responsible practices for organizations collecting and using individual data include:
- considering the sensitivities around the data they have collected;
- being transparent about how data will be collected and used;
- minimizing the amount of personally identifiable and sensitive information collected;
- creating and implementing security policies that protect data and uphold individuals’ privacy and dignity; and
- creating an end-of-life policy for post-project data management.
Beyond the Principles for Digital Development, SIS design priorities can be summarized as follows:
- The purpose of evaluation is to improve services for the communities who are intended to benefit. A secondary but important objective is to report to funders, donors and managers because without accountability it is not possible to get adequate resources.
- Evaluation information should be used by service providers in improving services. That means that the data should be relevant and credible to the providers, and they should get results in a timely and accessible manner.
- Service recipients should have an opportunity to communicate their priorities and requests and have them incorporated into planning and decision-making. Priorities often include qualities like responsiveness and courtesy as well as the outcomes of the program.
- The burden of evaluation should be kept as low as possible. That includes respondents’ time for filling out surveys as well as the organization’s time for managing evaluation projects.
- Evaluation tools, measures and indicators should be technically sound, based on evidence, open to criticism and continuously improved.
- Data collection techniques should be multilingual, accessible to people with disabilities, and informed by issues of equity and diversity.
- High quality evaluations should be financially feasible for small nonprofit organizations.
- Evaluation metadata is a common good. Rigorous informatics tools are essential resources for achieving broad social goals and should be built collaboratively and shared freely. Our many inspirations include the United Nations Sustainable Development Goals and the World Health Organization’s International Classification of Health Interventions.
Any specific evaluation will need to balance these principles in its own design. For example, although SIS supports multiple languages (even the name ‘SIS’ works in French, Spanish, Portuguese, Italian, Romanian, etc.), resources are required to translate instruments, test the translations for accuracy and cultural appropriateness and enter the translated text into the system. That means that there will often be trade-offs between accessibility and financial feasibility.
As another example, user engagement in every stage of digital development is required under the Principles for Digital Development but is expensive to do well. Whenever we can, we select measures that have already been tested and validated by users.
Every survey and interview instrument that collects personal data requires informed consent. Personal data includes opinions and demographic information – anything that can be used to identify someone or link their identity to a response.
Consent processes can be detailed and complex, or quick and simple, depending on the evaluation design and the type of risk that is involved. For example, children under 16 should have consent from parents or guardians for sensitive questions, but it may be acceptable to ask older children for simple anonymous feedback without contacting guardians, assuming the raw data is used only to improve services, is deleted shortly afterward, and is not re-used in future studies. Surveys that ask highly stigmatizing information need more elaborate consent questions, which may reduce response rates.
We have created a few consent templates that should cover most uses in SIS, but you can write your own if you wish. The SIS consent statements assume that:
- The data is used for quality improvement or program evaluation.
- If the data is used for an academic research study or for some other purpose, that use must be stated in the consent form, and the project should be approved by an Ethics Review Board.
- Responses to the surveys or interviews are completely voluntary. If clients’ access to services depends in any way on whether they respond to the survey, you should obtain an external ethics review from, for example, the Community Research Ethics Office.
- The services provided to clients are safe and ethical, and appropriate consent has already been obtained.
- This consent form does not cover consent to the services themselves, only to the survey or interview.
Given those assumptions, your consent questions should include:
- A statement that participation is voluntary and will not affect the services that the respondent receives.
- An explanation about how the data will be used, and who will have access to it.
- A description of how long it will take to answer (unless it’s obvious, like a brief poll).
- A description of the risks, for example, that someone might figure out the identity of the respondent.
- A way to ask for more information, including contact information for a person who can answer questions about privacy.
Consent forms often also include a detailed description of the purpose of the evaluation and contact details for the evaluators as well as organizational staff. Best practice in information privacy protection also calls for stating how long the data will be retained.
We have tried to streamline our default consent questions, but we realize that your organization or its stakeholders may prefer different wording. Ask us for pricing if you want a customized consent process that differs from our default.
Privacy and security
1. Usability and threat models
Usable policies are policies embedded into the system itself, minimizing the willpower and ad hoc ethical decision-making required of users. As much as possible, we have simplified procedures and made them largely automatic through tightly managed access controls.
Secure systems demand a sophisticated and detailed understanding of risks relating to privacy, data integrity and availability. You can’t create effective protection procedures without knowing what you are protecting against. In the case of nonprofits and health-related services, threats to privacy come from both external actors (e.g., hackers) and internal actors (staff). In fact, the biggest privacy risks in health systems are staff fraud, curiosity and sloppiness, not external hackers. We need to protect the privacy of respondents in situations where our own contractors, or staff at nonprofits that subscribe to SIS, may share their computers with coworkers, neglect to patch their operating systems, click on infected email links, email sensitive information by mistake because they are in a hurry, have their laptops stolen, and so on.
As part of our policy development we analyzed every database and every information asset in LogicalOutcomes to identify where we might be storing personal data, and where potential risks might arise. Not only emails and evaluation spreadsheets, but also IP addresses collected from web visitors, contracts from our consultants, log-in info from third-party services like Skype – everything. Then we radically reduced our exposure to privacy breaches by consolidating most of our work into Microsoft 365 hosted on Canadian servers, using Microsoft’s increasingly rigorous privacy and security resources. Just as one example, Microsoft’s Compliance Manager for Office 365 GDPR, a free service for Office 365 subscribers, tracks our GDPR policies and provides a way to report compliance to our Board of Directors. We can prevent downloads, enforce encryption, require multi-factor authentication and set automatic alerts for unexpected behaviour, among other tools.
For the privacy risks that Microsoft cannot prevent, such as government warrants and subpoenas, we use KoNote, an open source Electronic Health Record software program, and GDPR-compliant third-party hosting for survey data managed by a European company on Canadian servers.
2. European Union General Data Protection Regulation (GDPR)
The GDPR came into force in May 2018, and affects any organization offering services to European Union or United Kingdom residents or citizens, or that collects personal data from residents or citizens of the EU or UK, or has an office located within the EU or the UK.
LogicalOutcomes has adopted the goal of GDPR compliance for our whole organization, including SIS as well as our consulting projects in evaluation and stakeholder engagement. By complying with GDPR, we will also satisfy the requirements of Canadian privacy regulations, including the Personal Information Protection and Electronic Documents Act (PIPEDA) and the Ontario Personal Health Information Protection Act (PHIPA). Complying with GDPR also simplifies discussions about how we protect the privacy of personal information we collect – we just refer to GDPR regulations and ask ‘What do we need to do?’.
There is a wealth of online resources regarding the GDPR. If you want to learn more, we suggest starting with the UK’s Information Commissioner’s Office at https://ico.org.uk/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/
The UK National Health Service has recently released guidance on criteria for cloud hosting of highly sensitive personal health information, posted at https://digital.nhs.uk/Offshoring-and-the-use-of-public-cloud-services. We relied heavily on those documents in designing our processes. They are:
- The health and social care cloud security good practice guide.
- The health and social care cloud risk framework.
- The health and social care data risk model.
- The health and social care cloud security – one page overview.
The LogicalOutcomes Information Security Policy is supported by detailed procedures which are reviewed and audited by an external Information Security professional (with current CISM, CISA, CRISC qualifications) acting as Data Protection Officer.
Interoperability with other information systems
When people talk about interoperability, they are really talking about common standards for metadata definitions. In other words, if one database defines a date as DD-MM-YY and another database defines it as MM-DD-YYYY, you’ll have a problem unless you convert from one format to another.
Dates and times are easy in comparison with social and health indicators. How do you define well-being, or full-time employment? How can those indicators be compared across programs or organizations?
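The date-format problem above can be sketched in a few lines. This is an illustrative example, not part of SIS itself: it assumes each source database's format is known, and normalizes everything to a single standard (ISO 8601) before comparison.

```python
from datetime import datetime

def normalize_date(value: str, source_format: str) -> str:
    """Parse a date in a known source format and emit ISO 8601 (YYYY-MM-DD)."""
    return datetime.strptime(value, source_format).strftime("%Y-%m-%d")

# The same date, stored two different ways by two databases:
print(normalize_date("07-05-18", "%d-%m-%y"))    # DD-MM-YY   → 2018-05-07
print(normalize_date("05-07-2018", "%m-%d-%Y"))  # MM-DD-YYYY → 2018-05-07
```

Without an agreed target format, "07-05-18" is ambiguous; with one, both records resolve to the same date. Social and health indicators need the same move, just with far more elaborate definitions.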
SIS defines metadata using formats and codes that can be used across international and sectoral boundaries. Measure definitions are consistent with ISO 11179, the international standard for data elements. That means they can be used with FHIR, the leading HL7 standard for health care data exchange. We code measures to Sustainable Development Goal 2030 Targets to make it easier for funders and networks to aggregate and compare data. And we maintain our metadata registry in simple non-proprietary SharePoint lists using custom content types so that organizations can incorporate measures into their own systems.