Updates to the 2019 Survey

The deadline for organizations to complete the 2019 OCI and GMI surveys is nearing. Whether your organization completed the surveys in previous years or is filling them out for the first time, PSD provides a synopsis below of how the surveys have been enhanced to better ensure quality findings, which will be announced in November!

2019 GMI Survey

The 2019 Geospatial Maturity Index Survey has launched! While this may already be something your organization is aware of, we here at PSD wanted to take the time to outline some of the changes that we have made to this year’s survey.

In total, the 2019 survey contains a possible 85 questions (compared to 74 in 2018). 35 of the questions remain the same as last year, 28 are essentially the same with some modifications to the answers, and 11 questions that appeared last year have been removed. That means there are 22 new questions for the 2019 survey. For a full list of the new survey questions click here.
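As a quick sanity check, the counts above reconcile exactly. The sketch below is purely illustrative (the variable names are ours, not part of the survey):

```python
# Reconciling the 2018 and 2019 GMI question counts reported above.
carried_over = 35 + 28   # questions retained from 2018 (unchanged + modified)
removed = 11             # 2018 questions dropped for 2019
new = 22                 # questions introduced in 2019

total_2018 = carried_over + removed  # questions in the 2018 survey
total_2019 = carried_over + new      # questions in the 2019 survey

print(total_2018, total_2019)  # 74 85
```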

When digging a little deeper into why so much of the survey has changed since 2018, we find that a large portion of the changed questions were open-ended questions last year. In fact, the number of open-ended questions has been reduced by 46% from the 2018 survey (from 24 to 13). Most of these have been replaced with “select all that apply” questions based on the top answers from last year. Select-all-that-apply questions have increased 93%, from 15 to 29, for the 2019 survey; this will make it significantly easier to provide trend analysis in future years of the survey.
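The percentages quoted above follow directly from the raw counts; a minimal illustrative computation:

```python
# Percentage changes in question types between the 2018 and 2019 surveys,
# computed from the counts cited above.
open_ended_2018, open_ended_2019 = 24, 13
select_all_2018, select_all_2019 = 15, 29

open_ended_drop = (open_ended_2018 - open_ended_2019) / open_ended_2018
select_all_rise = (select_all_2019 - select_all_2018) / select_all_2018

print(f"{open_ended_drop:.0%} fewer open-ended questions")       # 46% fewer open-ended questions
print(f"{select_all_rise:.0%} more select-all-that-apply questions")  # 93% more select-all-that-apply questions
```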

Ultimately, the modification of questions in the 2019 survey was driven by three key factors. First, we wanted to incorporate as much of the feedback we received on the 2018 version as possible. Respondents told us that some questions could be made clearer, and that specific aspects of geospatial maturity, such as data security protocols, were missing from the previous survey.

Second, the 2019 version has a greater focus on the outcomes of the questions. When revising the survey, we tried to establish the purpose of each question, what it was trying to measure, and whether it had any value to survey respondents. For example, in the 2018 survey we asked whether GIS data interoperability existed to enable integration of spatial data from external or internal sources. Responses to this question provided limited insight both to us as the host of the survey and to respondents, so it was ultimately cut. Additionally, some questions whose results could be explained by factors not recorded in the question were excluded from the 2019 survey. In many of these cases, the standalone questions were cut and added as additional options within other questions, so their results are still captured but carry less weight when measuring overall maturity.

Finally, we have modified some questions to include an expanded range of possible outcomes, to better reflect the various ways in which an organization can achieve geospatial maturity. For example, in 2018 the survey asked respondents whether they had Standard Operating Procedures for data maintenance and quality. While 36% of respondents indicated that they had SOPs, 48% indicated that they had other methods to maintain data quality. These results showed that the question could be worded or framed to capture more of that 48%. In 2019, the question was modified to ask how organizations maintain data quality, with “Formal SOPs” provided as one option.

We believe the changes made to the GMI survey for 2019 enhance its robustness and provide better best-practice indicators to respondents, without rigid parameters that can penalize organizations based on their activities. We are excited for organizations to complete the 2019 survey, as we believe it will offer a reliable representation of the state of geospatial maturity in North America and serve as a health check that helps all participants set goals for the future development of their GIS initiatives.

2019 OCI Survey

This year, the 2019 Open Cities Index has a total of 71 questions. Of those, 15 are new, 7 are modified, and we have also added 7 new data sets. As in previous years, the survey is accompanied by a walk-through guide, accessible via a link in the first part of the survey, which includes more in-depth descriptions of the questions and data sets posed in the survey.

To provide an example of the 15 newly added questions: Q22, a new question from the Readiness section, reads, “Does your open data policy align with the 6 principles of the International Open Data Charter?” The Charter is becoming the international standard of best practice for open data; when organizations model their open data work on the Charter’s shared set of principles, it facilitates better benchmarking and consistency across organizations.

Q32 of the Implementation section reads, “Have you introduced software in order to help automate the open data publishing process?” The rationale behind this new question is that sustainability is a barrier for many organizations looking to grow their open data programs; software helps achieve greater efficiency by updating data sets automatically.

One of the new questions added to the Impact section is Q41, which asks, “Does your organization have a framework in place to measure the quality of published datasets with key indicators?” This question was added to bring more measurement into the Impact section, capturing quantitative rather than purely qualitative results. Measuring the quality of published datasets strengthens the legitimacy of, and rationale for, an organization’s open data program.

We have also modified some questions from the 2017 survey, both for clarity and to make the results more specific and measurable. One example is the question, “Has your organization held an app competition (a hackathon), or partnered with an organization to host an app competition, within the past 2 years?”, with possible answers “No”, “No, but scheduled”, and “Yes”. It is followed by a subsequent question, “Check all that apply in relation to your app competition”, with possible answers:

      • Attendance goals were achieved
      • Community diversity was represented across attendees
      • Competition hosts measured the outcomes of the competition (i.e. number of ideas generated, etc.)

The corresponding 2017 OCI survey question asked, “Has your municipality held an app competition within the past 2 years?”, and its possible answers attempted to determine the success of the competition. However, feedback from respondents showed that the success of a hackathon can be defined in different ways, which is why we now provide a “check all that apply” question covering different success factors, including attendance, diversity, and measurement.

A modified question from the Impact section is Q39, which asks, “Does your organization measure the extent to which staff have accessed open data for the purposes of their job duties (i.e. for research, to contribute to a report, etc.)?” A subsequent question asks respondents to describe the method of measurement. In the 2017 survey, the question was more ambiguous, asking, “To what extent has open data had a noticeable impact on increasing government efficiency?” With the new question, we wanted to be more definite in understanding how many staff access open data for their job duties, which, by extension, should correlate with increased government efficiency.

In addition to the new and modified questions, we also added 7 new dataset options to the Implementation section. The definitions of all datasets are included in the Walk-through guide, but to provide an overview of the new datasets:

Local Election Campaign Contributions

  • Campaign contributions data for candidates of the most recent local election.

Council Attendance

  • This dataset may include information such as councillor names, type of meeting, meeting date, etc.

Council Declared Conflict of Interest

  • List of declared conflict of interest(s) by councillor.

Parking Citations

  • Data that may include time, location, and bylaw violation associated with the parking citation.

Energy Consumption and Efficiency Data

  • This data may include energy consumption at city facilities, renewable energy production, etc.

GHG Emissions

  • This data may include community-wide emissions over time, broken down by sector or may just focus on municipal emissions.

Affordable Housing

  • Datasets related to income-restricted affordable housing listings.

Of interest: among the organizations that have completed the survey so far, the data sets reported as present or available include local election campaign contributions, council attendance, and energy consumption.