Unacast Updates Social Distancing Scoreboard

Try the updated Social Distancing Scoreboard here.

In the age of COVID-19, reducing visits to non-essential businesses and limiting social interactions is an important and oft-repeated message coming from municipalities and government agencies. But equally important is knowing whether or not citizens are responding to that message. This is the core value of our Social Distancing Scoreboard: enabling policy makers, researchers, and health officials to measure the efficacy of their efforts.

And so far, we are seeing our efforts bear fruit — we’ve been contacted by government officials and health policy experts at the local, state, and federal levels expressing great enthusiasm for our aggregated analysis and partnering with us to make their response to COVID-19 even better and smarter.

And we’re sharing here what we’re sharing with our public partners: our Scoreboard will see continuous improvements as long as the virus continues to spread.

As of today, March 31st, our Scoreboard features three major improvements:

  1. A more intuitive interface to help users more easily and quickly navigate the data
  2. An updated grading system that’s more in line with what’s been learned about the virus since our initial launch
  3. A scoring methodology that now incorporates the reduction in non-essential visitation (in addition to our previous metric, reduction in distance traveled)

A New Look & More Muscle

Existing users will notice that our dashboard is now sleeker, faster, and much more responsive. Our team spent the last week developing a native front-end that we hope will reduce loading time for all the users of our tool (2 million and counting) and provide a more enjoyable experience.

An Updated Grading System: Stricter Requirements for Flattening the Curve

We’re partnering with epidemiologists and public health academics to help us make our Scoreboard as useful as possible. These specialists are advising us on our methodology, and on the shortcomings and advantages for each potential approach. With their ongoing guidance, we are building the most useful models for depicting social distancing behavior.

In our initial phase, we used the US county with the strongest reduction in distance traveled — our original metric — as a baseline for setting the grading system. However, this soon proved problematic: travel distances were still decreasing across the US, and experts argued that further reductions were needed.

The challenge is that we cannot look to history to guide us forward. Social distancing has not been implemented and measured at this scale in modern times, so there is simply no prior experience with measuring social distancing proxies. Instead, we look to Italy for inspiration: it is ahead of the curve compared to the US and strict in its policy measures. Rather than chasing moving targets within the US, Italy provides a more informative baseline for what the bottom in the US could look like.

In Italy, travel has decreased by 70% - 80%. We thus chose a 70% reduction in distance traveled as a model for what can be expected under a total shut-down. As of our recalibration, no US state has yet reached the new cut-off, but then no state has chosen to go into a full quarantine as Italy has.

As an additional cautionary measure, we defined steps for each grade so that F has a larger range compared to the other grades. 

  • A: >70% decrease
  • B: 55-70% decrease
  • C: 40-55% decrease
  • D: 25-40% decrease
  • F: <25% decrease
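As a rough sketch, the distance-traveled scale above can be expressed as a simple threshold function. The cut-offs are the published ones; the function name and the handling of exact boundary values are illustrative:

```python
def distance_grade(pct_decrease: float) -> str:
    """Map a percentage reduction in distance traveled to a letter grade,
    using the published distance-traveled cut-offs."""
    if pct_decrease > 70:
        return "A"
    elif pct_decrease >= 55:
        return "B"
    elif pct_decrease >= 40:
        return "C"
    elif pct_decrease >= 25:
        return "D"
    return "F"
```

The non-essential-visitation metric described below uses the same shape of conversion, only with its own, stricter cut-offs.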

Expanded Metrics for a More Accurate Methodology

When we launched the first iteration of Social Distancing Scoreboard, speed was of the essence. We wanted to get some critical information into the hands of relevant parties as quickly as possible, beginning with one simple metric — reduction in distance traveled — for measuring social distancing.

Being a data-science-driven company, we recognize that a complex multitude of behaviors related to social distancing cannot truly be captured by one simple metric. As such, our aim since the beginning has been to develop a more comprehensive score underlaid with multiple metrics, each describing one facet of social distancing.

That’s why, in addition to reduction in distance traveled, we’re now factoring a second metric into social distancing scoring: reduction in visits to non-essential venues.

Based on the guidelines issued by various state governments and policy makers, we categorized venues into essential vs. non-essential. Essential locations include venues like food stores, pet stores, and pharmacies — visits to which, ideally, shouldn’t count toward social distancing measurements. 

Non-essential venues include (but are not limited to):

  • Restaurants (multiple kinds)
  • Department and clothing stores
  • Jewelers
  • Consumer electronics stores
  • Cinemas and theaters
  • Office supply stores
  • Spas and hair salons
  • Gyms and fitness/recreation facilities
  • Car dealerships
  • Hotels
  • Craft, toy, and hobby shops

Once we exclude the essential venues from the data, we calculate the average visitation for each day of the week prior to the COVID-19 outbreak (defined as March 8th and earlier) as a baseline. We then compare those baselines to visits on the corresponding days of the week post-outbreak (March 9th to the present). By always comparing Saturdays to Saturdays, Tuesdays to Tuesdays, and so on, we measure social distancing more accurately, because we keep it in the context of the normal visitation rhythm of the 7-day week.
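The weekday-matched comparison can be sketched in a few lines. The data structures and function names here are hypothetical, assuming daily visit counts for a single county:

```python
from collections import defaultdict
from datetime import date

# Outbreak cut-off as defined above: March 8th and earlier is pre-outbreak.
CUTOFF = date(2020, 3, 8)

def weekday_baselines(daily_visits):
    """Average pre-outbreak visits for each day of the week.
    daily_visits: dict mapping a date to that day's visit count."""
    totals, counts = defaultdict(float), defaultdict(int)
    for day, visits in daily_visits.items():
        if day <= CUTOFF:
            totals[day.weekday()] += visits
            counts[day.weekday()] += 1
    return {wd: totals[wd] / counts[wd] for wd in totals}

def pct_change(daily_visits):
    """Percent change of each post-outbreak day versus the pre-outbreak
    baseline for the same day of the week (Mondays vs. Mondays, etc.)."""
    base = weekday_baselines(daily_visits)
    return {
        day: 100.0 * (visits - base[day.weekday()]) / base[day.weekday()]
        for day, visits in daily_visits.items()
        if day > CUTOFF and day.weekday() in base
    }
```

For example, if pre-outbreak Mondays averaged 110 visits and a post-outbreak Monday sees 55, that day scores a 50% decrease.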

Similar to the grading system for distance traveled described above, we translate the percentage change into a letter grade using the following conversion:

  • A: >70% decrease
  • B: 65-70% decrease
  • C: 60-65% decrease
  • D: 55-60% decrease
  • F: <55% decrease

In a full quarantine there would ideally be no travel at all to non-essential venues — but we know that our grading shouldn’t be so strict that it doesn’t account for gray areas. For example, in certain states and counties, many restaurants are maintaining their services as purely take-out or drive-thru.

Finally, as we do with all of our products, we are actively rooting out bias in our data. In improving our social distancing methodology, we uncovered some false positives in our visitation algorithm — especially in cases where a non-essential business is located next to an essential business but the two are unlikely to be connected. We have taken steps to correct for this bias and factor it into the scoring methodology. 

Frequently Asked Questions

How does Unacast combine the Distance Traveled metric and Visits to Non-Essential Venues metric to create an overall score?

We treat the two metrics like grades in a school report card: the overall score is the average of the individual grades, for example:

  • A & B = A-
  • A & C = B
  • F & D = D-

In addition to the five letter grades, we introduced minus grades to give the scoring more nuance. We use this method because the two metrics have different cut-offs, and because some counties lack enough measurable venues for a reliable visitation metric and so have only one grade (in those cases, the overall score is simply the travel-distance-reduction grade). Because the two metrics capture related but not completely overlapping facets of social distancing, they are weighted equally in the final score.
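One way the report-card averaging could work, consistent with the examples above; the grade-point mapping itself is our illustration, not a published detail:

```python
import math

# Hypothetical grade-point mapping: A=4 down to F=0.
POINTS = {"A": 4, "B": 3, "C": 2, "D": 1, "F": 0}
LETTERS = {v: k for k, v in POINTS.items()}

def overall_grade(g1: str, g2: str) -> str:
    """Average two letter grades like a report card. A whole-number average
    maps straight back to a letter; an average halfway between two letters
    becomes the higher letter with a minus (e.g. A & B -> A-)."""
    avg = (POINTS[g1] + POINTS[g2]) / 2
    if avg == int(avg):
        return LETTERS[int(avg)]
    return LETTERS[math.ceil(avg)] + "-"
```

This reproduces the examples given: A & B yields A-, A & C yields B, and F & D yields D-.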

Why is my state’s / county’s score so different than it was last week?

We, both at Unacast and as a larger collective community, have learned a lot about COVID-19, and the human behavioral responses to it, in the past week. Two of our Scoreboard updates — the new grading system and the improved methodology that layers in reduction in visits to non-essential venues — combine to paint a more accurate and up-to-date picture of what’s happening in the real world, based on what we know as of this publication. To compare your score from the previous iteration of the Scoreboard to the current one would be misleading, and we encourage you to consult the line graph below the map, as well as the arrows located next to the letter grades, for a sense of how social distancing is trending in your community.

If the overall score is lower (such as a D or an F), is Unacast implying that the threat of virus spreading is also lower?

Not at all. The metric is a proxy only for changes in the behavior of people — not for the travel path of the virus. Since behavior change is the intention behind social distancing, we can provide direct aggregated feedback to policy makers and community leaders on how well their social distancing measures are being adopted by the general public, and on whether more severe restrictions lead to a reduction in the number of reported cases of COVID-19.

By not comparing absolute numbers in visitations, aren’t you still favoring urban areas over rural ones?

Rural and urban areas are different in a multitude of ways, and as such, no single metric fairly measures both. We explored several options to get as close as possible to fairness and accuracy, and settled on using a rate of change over absolute numbers in visitation — for both the distance-traveled metric and the non-essential-venues-visited metric. For example, if our data set can measure 50% of all available venues in an urban county, but only 30% of available venues in a rural one, the absolute number of visitations would show wildly biased results toward the urban county. Rates of change are not perfect, but much more apples-to-apples.
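A toy calculation makes the coverage argument concrete; all numbers are hypothetical:

```python
def observed_visits(true_visits: float, coverage: float) -> float:
    """Visits our data set actually observes, given partial venue coverage."""
    return true_visits * coverage

# Both counties cut visits in half, but venue coverage differs.
urban_before = observed_visits(10_000, 0.50)
urban_after = observed_visits(5_000, 0.50)
rural_before = observed_visits(1_000, 0.30)
rural_after = observed_visits(500, 0.30)

# Absolute drops are wildly skewed toward the urban county...
urban_drop = urban_before - urban_after
rural_drop = rural_before - rural_after

# ...but in the rate of change, the coverage factor cancels out,
# and both counties show the same 50% reduction.
urban_rate = (urban_after - urban_before) / urban_before
rural_rate = (rural_after - rural_before) / rural_before
```

Because coverage appears in both the numerator and denominator of the rate, it cancels, which is why rates of change are the more apples-to-apples comparison.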

Additionally, working directly with policy makers has taught us that, by and large, they are most interested in how their own constituencies are performing and comparing that performance to similar areas.

What's Next?

When it comes to the fight against COVID-19, we’re in it for the long haul. The more we partner with industry experts in epidemiology and public health, the more we’ll learn about the virus and its broader impact on human mobility. We’ll continue refining our thresholds and methodology to bring us closer and closer to accurately reflecting the real world.

In that vein, our next major update to the Social Distancing Scoreboard will include:

  • Our third metric: the rate of change in the number of person-to-person encounters for a given area
  • A reduced time lag, making metrics available by the evening of the next day
  • And more!

We’re thrilled that our Social Distancing Scoreboard is helping community leaders make decisions that will ultimately save lives, and that over 2 million people have used it so far. Please get in touch at dataforgood@unacast.com if you have questions, want to embed the Scoreboard on your site, or wish to access the underlying data.
