Introduction
Welcome to the DEEP guidance on the Assessment Registry. The Assessment Registry is a tool for processing and analyzing existing assessment reports, making needs-related information easier to retrieve and understand and helping to identify gaps. Unlike DEEP entries, which may draw on any document useful for understanding a country's humanitarian context, the registry covers only documents produced through systematic and methodologically robust data collection.
These reports are crucial in managing and responding to humanitarian crises. The DEEP Assessment Registry is a database of the needs assessments conducted in a given country or crisis. It is primarily concerned with identifying what is being assessed, who is being assessed, and when and where. It also captures the methodology, quality, and analytical value of the information contained in the assessments, in order to map the extent to which decision-making is possible and to identify information gaps.
Using the DEEP Assessment Registry
Before using the Assessment Registry
The preliminary step is to evaluate whether a document added to your project in DEEP is a needs assessment or not. To do this, consult the definition of a needs assessment and the criteria that determine whether a document qualifies as one.
So, what is a needs assessment? A needs assessment is a process intended to estimate or provide informed opinions about deficiencies and their consequences. It entails a systematic set of procedures undertaken for the purposes of setting priorities and making decisions about programs, system improvement and allocation of resources (Witkin & Altschuld, 1995).
How do we recognize an assessment report to upload in the DEEP Assessment registry?
Considering the definition of a needs assessment, the document must comply with the following selection criteria (an illustrative checklist follows the two lists below):
- A deliberate effort to collect and/or collate data on specific humanitarian topics contained in the Analysis Framework.
- Focused on identifying status, needs, problems, deficiencies, etc.
- Conducted using data collection techniques specially designed for humanitarian settings: key informant interviews, direct observation, head-of-household interviews, focus group discussions, community group discussions, etc.
- Requiring specialized or trained humanitarian staff
- Producing information with a lifespan of more than a month
Conversely, what is not a needs assessment?
- Flash Appeal
- Humanitarian response plan
- UN agency SitReps (the lifespan of the information contained in them is too short and general)
- A survey of surveys report (instead, try to identify the assessment reports it covers and check whether they are already available in the Assessment Registry)
- A 3W (Who does what and where?)
- A response monitoring report
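If you pre-screen large batches of documents before adding them to DEEP, the criteria above can be expressed as a simple checklist. The following is a minimal sketch, not a DEEP feature; the field names and the all-criteria-must-hold rule are assumptions made for illustration only.

```python
from dataclasses import dataclass


@dataclass
class DocumentScreening:
    """Illustrative checklist mirroring the selection criteria above (not part of DEEP)."""
    collects_humanitarian_data: bool    # deliberate effort to collect/collate data on Analysis Framework topics
    identifies_needs_or_problems: bool  # focused on status, needs, problems, deficiencies
    uses_humanitarian_techniques: bool  # KIIs, direct observation, household interviews, FGDs, etc.
    requires_trained_staff: bool        # specialized or trained humanitarian staff involved
    info_lifespan_over_one_month: bool  # information stays relevant for more than a month
    is_excluded_type: bool              # Flash Appeal, HRP, SitRep, survey of surveys, 3W, response monitoring


def looks_like_needs_assessment(doc: DocumentScreening) -> bool:
    """All criteria must hold and the document must not be one of the excluded types."""
    return (
        doc.collects_humanitarian_data
        and doc.identifies_needs_or_problems
        and doc.uses_humanitarian_techniques
        and doc.requires_trained_staff
        and doc.info_lifespan_over_one_month
        and not doc.is_excluded_type
    )
```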
Once you have verified that the document meets these requirements, check the “Is assessment” box in the menu that appears when adding a document to DEEP.
If you want to check this box on a document already existing in your DEEP project, click on the pencil icon to the right of your “Sources” menu, where you can mark “Is assessment”.
To start the registration of your assessment, click on the “ASSESSMENT” button on the right side of the “Sources” screen.
Metadata
In the Metadata tab of the DEEP Assessment Registry, you'll find a concise yet comprehensive set of categories that are key to contextualizing each assessment. From pinpointing the countries involved to defining the crisis type and date, you'll lay the groundwork of your assessment's context. You'll note the level of preparedness and any external support received, classify the coordination efforts, and detail the type of assessment conducted. Additionally, you'll link the assessment to established guidelines, specify the frequency of reporting, ensure confidentiality, list available languages, and note the report's length. Finally, documenting the stakeholders involved paints a complete picture of the collaborative effort behind each assessment.
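If you keep a parallel record of your registrations outside DEEP, for example for quality control, the metadata described below can be modelled as a simple record. The sketch below is purely illustrative: the field names and values mirror this guide, not DEEP's internal schema.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional


@dataclass
class AssessmentMetadata:
    """Illustrative record of the Metadata tab; field names are assumptions, not DEEP's schema."""
    countries: List[str]            # multiple choice
    crisis_type: str                # single choice, e.g. "Flood" or "Conflict"
    crisis_start_date: date
    preparedness: str               # with or without preparedness
    external_support: List[str]     # e.g. ["ACAPS", "MapAction"]
    coordination: str               # "Uncoordinated", "Coordinated - Joint", "Coordinated - Harmonized"
    assessment_type: str            # "Initial", "Rapid", "In-Depth", "Monitoring", "Registration", "Other"
    family: Optional[str] = None    # e.g. "MIRA", "DTM"
    frequency: str = "One-off"      # "Regular" or "One-off"
    confidential: bool = False
    languages: List[str] = field(default_factory=list)
    number_of_pages: Optional[int] = None
```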
Assessment Background
Country
- All countries where the assessment has taken place (field is multiple choice).
- If one of the countries covered by the assessment is not available in your project, you can send the report to the relevant project from the ‘Sources’ menu.
Crisis Type
- In this field, specify the main type of crisis that the assessment addresses. Options include natural disasters like floods, earthquakes, landslides, and tropical cyclones, as well as conflict-related crises.
- This is a single-choice field, ensuring focused classification of the assessment based on the primary crisis.
Crisis Start Date
Estimated date on which the crisis was declared. In the absence of clear data, use the date of the first report available in GLIDE.
Preparedness
- Preparedness refers to the use or adaptation of protocols, tools, skills, and templates by assessment teams that were prearranged before a disaster in the relevant country. Depending on this preparedness, you must select one of the two options displayed.
- This typically occurs when an assessment working group, an interagency body, a specific coordination mechanism, or a specialized team is established in-country to conduct assessments post-crisis or shock.
- Commonly, needs monitoring systems, such as protection or market monitoring, involve specialized teams (e.g., IOM DTM). These are regarded as 'With preparedness', except for their initial rounds.
External Support
- This category captures support received from external organizations or actors in conducting the assessment, such as OCHA NAAS, ACAPS, MapAction, UNDAC, and others.
- Tip: When reviewing assessment reports, check for logos or mentions of external actors that have provided support, especially in response to sudden crises. Organizations like MapAction, ACAPS, REACH, and UNDAC are often involved in such situations.
Coordination
The 'Assessment Coordination' category is pivotal for understanding the collaborative nature and scope of each assessment. It categorizes the assessments based on how they were implemented and the extent of coordination involved.
Here’s a detailed guide:
- Uncoordinated Assessments: These are conducted by a single agency or organization. Typically, the cover page of such an assessment report will display only one logo. These assessments are standalone and don't involve collaboration with other entities.
- Coordinated - Joint Assessments: When an assessment is carried out by multiple organizations or partners, it falls under this category. These assessments are marked by several logos on the cover page, indicating collaboration. They are unified by a single methodology, questionnaire, and result in a joint final report.
- Coordinated - Harmonized Assessments: This category is for assessments that are part of a larger constellation of assessments, where one assessment may use data from another. For example, a health assessment might incorporate data from a WASH assessment. These involve multiple reports and methodologies, but each one is linked by the shared use of data.
Assessment Details
Assessment Type
- Initial: These are conducted at the onset of a crisis and typically last a few hours to days. Initial Assessments primarily employ qualitative research methods, often utilizing open questionnaires. The sampling strategy is usually convenience or purposive, aimed at gathering quick, initial insights into the crisis situation.
- Rapid: More structured than Initial Assessments, Rapid Assessments generally span one to two weeks. They use qualitative research methods, including key informant interviews, direct observation, and community group discussions. The sampling in Rapid Assessments is purposive, designed to provide a more comprehensive understanding of the situation in a short timeframe.
- In-Depth: These are extensive assessments, often taking more than a month to implement. In-Depth Assessments are typically representative, using quantitative research methods or a mix of quantitative and qualitative approaches. Common methods include household surveys combined with focus group discussions. The sampling method is representative, with random selection of respondents, ensuring a comprehensive and detailed analysis of the crisis.
- Monitoring: This type refers to assessments that are conducted on a recurring basis, such as Displacement Tracking Matrix (DTM) and protection monitoring. Monitoring assessments are essential for tracking changes and trends over time.
- Registration: Similar to a census, Registration involves the systematic recording and registration of individuals. It is exhaustive and systematic, gathering data for the entire population of concern. This method is frequently used by organizations like UNHCR to register refugees or asylum seekers.
- Other: For any assessment that does not fit into the above categories or if there is any doubt about its classification, it should be categorized as 'Other'. This allows for flexibility in accommodating a wide range of assessment types.
Assessment Family
- This category refers to a predefined set of assessment families, each with its established methodology and framework. Examples include HeRAMS, EFSA, MIRA, DTM, among others. These families are recognized by their adherence to specific assessment protocols and procedures.
- Recognizing the correct family for an assessment may initially pose a challenge, especially for users new to the system. However, with experience and exposure to various assessment reports, identifying the right family becomes more intuitive. The name of the family often appears within the document itself, making it easier to categorize the assessment accurately.
- It's important to note that 'Family' is a closed list, meaning it includes only those families predefined in the system. If there's any uncertainty regarding the appropriate family for an assessment, it's advisable to consult with colleagues who might have more experience or insight.
Assessment Frequency
This information provides insight into the periodicity and timeliness of the data being analyzed.
- Regular: This classification is used for assessments that are published or conducted with high frequency, at least twice a year. If you have selected 'Monitoring' under the 'Type' category, the frequency should automatically be classified as 'Regular' (see the sketch after this list). Assessments that fall into this category typically include reports that are updated weekly or monthly, such as the Displacement Tracking Matrix (DTM) or REACH assessments.
- One-off: For assessments that are published less frequently, typically once a year, the 'One-off' category is used. This includes reports like Humanitarian Needs Overviews (HNO) or Multi-Sector Needs Assessments (MSNA), which provide annual insights but do not have the regular updates characteristic of 'Regular' assessments.
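The one explicit rule here, that 'Monitoring' assessments are always 'Regular', can be written down as a small helper. This is a sketch of the rule as described above, not DEEP's implementation.

```python
def assessment_frequency(assessment_type: str, published_at_least_twice_a_year: bool) -> str:
    """Monitoring assessments are always 'Regular'; otherwise frequency depends on publication cadence."""
    if assessment_type == "Monitoring":
        return "Regular"
    return "Regular" if published_at_least_twice_a_year else "One-off"
```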
Assessment Confidentiality
Confidentiality relates to whether the information in an assessment can be shared widely or if it has restrictions due to sensitive content.
- Typically, the confidentiality of an assessment will be communicated to the analyst handling it. If an assessment is marked as confidential, it implies that there are specific restrictions on its distribution and use, often to protect sensitive data or the privacy of individuals involved. These assessments require careful handling to ensure that confidentiality protocols are maintained.
- On the other hand, if an assessment is unprotected or not marked as confidential, it suggests that the information can be shared more freely, subject to standard data sharing and usage policies. It's important to always verify the confidentiality status as communicated and adhere to the guidelines provided to maintain the integrity and security of sensitive information.
Language
- This field is designed to record all the languages in which the report is available. It is a multiple-choice field, allowing you to select more than one language if the report has been translated or is available in multiple linguistic versions.
- If the same report is available in different languages, each version should be uploaded separately. These translations or additional language versions should be added in the 'Additional Documents' tab.
Number of pages
Indicate the number of pages in the main document of your assessment.
Key Dates
The next section within the ‘Metadata’ tab refers to the key dates of the assessment. Analysts need to be aware that the publication date of a report can sometimes differ significantly from the data collection dates. This gap matters because it may affect the relevance and applicability of the data, especially in rapidly changing humanitarian contexts (see the sketch after the list below).
- Data Collection Start Date: This is the date marking the commencement of field data collection or the Secondary Data Review (SDR) for the assessment. It should be clearly stated within the methodology section of the assessment report.
- Data Collection End Date: This date indicates the conclusion of field data collection or the SDR. Like the start date, it is essential to refer to the methodology section of the assessment report to find this information.
- Publication Date: Note the date when the assessment report was published on a web platform, such as HR.info or ReliefWeb. This indicates when the findings were made publicly available.
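Because the gap between data collection and publication affects how current the findings are, it can be worth computing it explicitly when reviewing a report. A minimal sketch, assuming the dates are held as `datetime.date` values:

```python
from datetime import date


def publication_lag_days(collection_end: date, publication: date) -> int:
    """Days elapsed between the end of data collection and the publication of the report."""
    return (publication - collection_end).days


# Example: data collection ended on 15 March 2023, report published on 2 May 2023
print(publication_lag_days(date(2023, 3, 15), date(2023, 5, 2)))  # 48
```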
Stakeholders
In the DEEP Assessment Registry's Stakeholders section, it's essential to record all entities involved in the assessment. This includes the lead organizations, which can be either local or international NGOs, UN agencies, or other entities, as well as other international partners participating in the initiative. National organizations, including local NGOs and relevant government ministries, should also be documented. Additionally, acknowledging both international and local donors who have contributed funding is important. To add the different stakeholders:
- Click on the 'Add' button in the lower right side of the screen.
- Search for the organization or actor you wish to add. You can search by the full name or acronym. DEEP has a repository of verified organizations; choose the 'Verified' one if in doubt.
- Locate the stakeholder and drag and drop it into the appropriate category based on its role in the assessment.
- If the organization or stakeholder isn't in DEEP's registry, you can add it by clicking the 'Add' button above the search bar. Then, follow the process outlined in the 'Data Collection' section of this User Guide to create a new entry.
Additional Documents
In the 'Additional Documents' tab of the DEEP Assessment Registry, you can enhance your assessment with key supplementary materials. You can insert the text of the executive summary, upload datasets (or provide a link to where they are hosted) and questionnaires, and attach any other relevant files, such as maps or versions of the assessment in other languages and formats.
Focus
In the 'Focus' tab, you can categorize your assessment data according to several metrics depending on its content. You will tick the boxes displayed if the assessment includes any information about specific framework pillars, affected groups like refugees or internally displaced persons, geographic locations covered, and humanitarian sectors addressed (remember, there must be at least a paragraph dedicated to a specific sector to mark it!). If your assessment involves the Protection sector, the 'Protection Information Management Systems' section will be enabled, allowing you to specify Protection-related metrics. This organization ensures your data is aligned with key areas of interest and accurately represents the scope of your assessment.
Methodology
In the 'Methodology' section, you'll explore the procedures and approaches used in your assessment. This crucial step sheds light on the validity of your findings. It requires detailing the specific objectives, describing the data collection techniques used, outlining the sampling methods, and acknowledging any limitations or biases present in the study. Grasping the methodology gives you a clear view of the assessment's reliability and the level of confidence you can have in its conclusions.
Objectives and Limitations of the Assessment
In the upper part of this screen, you have two textboxes: ‘Objectives’ and ‘Limitations’, both of which must be filled out meticulously:
- Objectives: Here, insert the text identified in the assessment that defines its objectives. These objectives must be explicitly stated in the assessment document. Carefully select the relevant portions of text that outline the goals and aims of the assessment and drag and drop them into this textbox.
- Limitations: In this textbox, document any limitations or constraints identified in the assessment. This includes challenges faced during the assessment, potential biases in data collection or analysis, and any factors that might impact the interpretation of the findings. Again, select the appropriate text from the assessment that discusses these limitations and drag and drop it into the textbox.
The lower part of the Methodology menu allows you to enter details about the methodologies used in the assessment, creating as many entries as there are methods used (see the sketch after this list). The fields to be filled in are:
- Data Collection Technique: Select from a closed list of techniques, including Secondary Data Review, Key Informant Interview, Direct Observation, Community Group Discussion, Focus Group Discussion, Household Interview, Individual Interview, and Satellite Imagery.
- Sampling Size: Specify the number of people or units assessed.
- Sampling Approach: Choose between Non-Random Selection, Random Selection, or Full Enumeration.
- Proximity: Indicate whether the data collection was Face to Face, Remote, or Mixed.
- Unit of Analysis: Define the level at which data is collected.
- Unit of Reporting: State the level at which data/results are aggregated and displayed, such as in chapters, tables, graphs, charts, maps, etc.
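Each method used in the assessment combines these six fields, so an assessment is described by a list of such entries. The record below is an illustrative sketch for anyone keeping notes outside DEEP; it mirrors the closed lists above but is not DEEP's schema.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class MethodologyEntry:
    """One data collection method used in the assessment (illustrative only)."""
    technique: str                 # e.g. "Key Informant Interview", "Household Interview"
    sampling_size: Optional[int]   # number of people or units assessed, if reported
    sampling_approach: str         # "Non-Random Selection", "Random Selection", "Full Enumeration"
    proximity: str                 # "Face to Face", "Remote", "Mixed"
    unit_of_analysis: str          # level at which data is collected, e.g. "Household"
    unit_of_reporting: str         # level at which results are aggregated, e.g. "District"


# An assessment can have as many entries as methods used:
methods = [
    MethodologyEntry("Key Informant Interview", 120, "Non-Random Selection", "Face to Face", "Community", "District"),
    MethodologyEntry("Secondary Data Review", None, "Non-Random Selection", "Remote", "Document", "Country"),
]
```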
Summary
So far, the first four tabs of the Assessment Registry have contained metadata from the assessments added to it. However, the Summary tab contains specific information for quantifying, identifying relevant data, and beginning to condense the assessment's analytically valuable content. In this tab, you can assign various categories and subcategories of information related to the different pillars of DEEP's generic analytical framework. As a user of the Assessment Registry, you will need to assign different text excerpts that summarize the main issues associated with each information category. In addition to this qualitative component, there are several spaces where you can enter key figures identified in your document.
The Operational Environment section will always be active, while the appearance of the sectors at the bottom of the Summary menu will depend on having checked these sectors in the 'Focus' tab. At the end of the 'Summary' menu, you will see a series of charts automatically generated from the figures you have entered in the different categories of the menu.
Score
Finally, you evaluate the quality and usability of each assessment. Here, the depth, clarity, and credibility of the reports are critically analyzed. This involves evaluating the evidence used, the thoroughness of the analysis, and the relevance of the findings, a crucial step in determining the assessment's usefulness for humanitarian action and decision-making. Unlike the 'Score' tab in the previous version of the Assessment Registry, Quality and Analytical Density are now scored in two separate sub-tabs.
Alongside the quality scoring scale, you can now write a justification for the chosen score, ensuring a record of the process followed to reach these conclusions.
Analytical Density is scored in a more intuitive and documented way than in the previous version. For each sector covered by the assessment, you will mark the boxes assigned for different Analysis Levels and Figures, allowing DEEP to automatically calculate the score based on the number of boxes marked.
Lastly, the 'CNA' tab will appear in your registry if the assessment has been categorized as Coordinated in the Metadata tab. Similar to the previous version of the Assessment Registry in DEEP, you will answer YES or NO to various questions designed by the Global Public Policy Institute to evaluate Coordinated, Multi-Sector Needs Assessments. Similar to the quality and analytical value scores, DEEP will aggregate the scores from the different indicators to obtain the CNA score.
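Neither the exact Analytical Density formula nor the CNA weighting is documented here, but the aggregation logic described above, a score derived from the number of boxes marked and a score aggregated from YES/NO answers, can be sketched as follows. Treat this strictly as an illustration; DEEP's actual calculations may differ.

```python
from typing import Dict


def analytical_density(boxes_marked: int, boxes_available: int) -> float:
    """Illustrative density score: share of Analysis Level / Figures boxes marked for a sector."""
    return boxes_marked / boxes_available if boxes_available else 0.0


def cna_score(answers: Dict[str, bool]) -> float:
    """Illustrative CNA score: share of indicators answered YES."""
    return sum(answers.values()) / len(answers) if answers else 0.0


print(analytical_density(boxes_marked=4, boxes_available=10))   # 0.4
print(cna_score({"q1": True, "q2": False, "q3": True}))         # approximately 0.67
```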
Exporting your Assessment Registry
Similar to the process for exporting your Entries, you can also export the content of your Assessment Registry in DEEP. To do this, click on the same Export menu you use for exporting your entries and select New Assessment Export. Once DEEP has processed the export, you will receive an Excel file containing a record of all the needs assessments processed in the Assessment Registry, with all the details and metadata entered throughout the steps explained above. The export can easily be shared with other stakeholders, and its format will be familiar, as it follows the typical layout of the assessment registries created by the humanitarian community.
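Because the export is a standard Excel file, it can also be opened programmatically for further analysis, for example with pandas. The file name below is a placeholder, and the column layout depends on the export itself, so inspect it rather than assuming a fixed schema.

```python
import pandas as pd

# Placeholder file name; use the name of the file DEEP generates for you.
registry = pd.read_excel("assessment_registry_export.xlsx")

print(registry.shape)    # number of registered assessments and exported columns
print(registry.columns)  # inspect which metadata fields the export includes
print(registry.head())   # preview the first few assessments
```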
Thank you for joining this tutorial on the DEEP Assessment Registry. We hope you found the information helpful and feel more confident in navigating and utilizing this crucial tool.