Members of our Cantina Data Privacy and Security Working Group¹ regularly discuss what we’re reading about, both current events and emerging tools and methodologies. Occasionally we like to share these in-house observations with a larger audience here on the blog.
This particular post shares reactions to the recent app debacle in the Iowa Democratic caucuses. The episode highlights how little the public knows about how (and why) apps come to be, and the dangers of using them in elections given the underlying security hazards. The group discusses the role design and development play in building and deploying apps responsibly, and how education and awareness about technology and security are key to preventing future security lapses and breaches of public trust.
Kimberlee Cloutier-Blazzard, Marketing Specialist
Since joining our Working Group, I’ve been reading Bruce Schneier’s book Secrets and Lies: Digital Security in a Networked World (2004). While it’s of course slightly dated, to a layperson like myself it’s a great primer on security terminology, written in a very accessible, conversational way. I happened (luckily?) to be reading Chapter 13, on Software Reliability, as the Iowa caucus debacle was unfolding. Reading this week’s news reports of precinct leaders’ unfamiliarity with the new app, and of the many ways the voting app could have been hacked, it occurred to me that this was a great opportunity to educate myself about how (and why) seemingly unnecessary apps are developed and deployed, and just how vulnerable our election infrastructure becomes when we open it to the internet.
This reminds me of the “Internet of Fails” concept. Our colleague, Lead Designer Corey Roth, a member of Cantina’s Internet of Things working group, recently wrote about the importance of fully understanding the impact of IoT technologies in his post “Considerations for Prototyping and Scaling IoT Solutions”. He writes that doing your homework requires experience strategy, foundational research, and workshops to avoid falling into the IoT “line of insanity” and creating unnecessary security vulnerabilities, or, in the case of Iowa, even undermining the democratic process in an election. As Corey wrote, “The data-oriented nature of IoT means true value only emerges when companies have: security and privacy maturity to protect data…” Or, as design guru Mike Monteiro writes on the concept of licensing in his book Ruined by Design, “Do I care if you have a license when you’re designing data collection for two billion people? Yes, I do. Actually, it’s not even the license that I care about. I care that you know what you’re doing. I care that you understand the job and the ethics behind the job. I care that someone has tested you on this knowledge and you’ve passed those tests.” By that standard, you would not take on a high-risk, time-sensitive project with a two-month timeline, a small budget, a brand-new company, and inadequate testing, let alone use unnecessary tech to do it.
Matthias Ferber, Principal Engineer
One of the interesting things, to me, about the Iowa crisis, apart from the security aspects, is that it was not just avoidable but entirely unnecessary. Speaking as an application developer, I could talk about the comically short development time and gross underfunding of the project; but speaking as a consultant, it’s fascinating that there was no real need for an app at all.
As David Dayen recently wrote in The American Prospect, caucus reporting worked perfectly well the old-fashioned way; the Shadow app was “a tech solution to a nonexistent problem.… Somehow, the Iowa Democratic Party got sold that they needed to improve upon this, to ‘disrupt’ the caucus reporting.” The end result was a critically important app that no one really needed, built without a good plan, on a shoestring, in a few weeks. That was never going to end well.
Part of what makes a technology solution work is understanding the tradeoffs involved (there are always tradeoffs) and choosing your path wisely. Because technology has transformed every aspect of our lives, in some ways vastly for the better, we’ve developed a blind, knee-jerk assumption that any technology is always better than no technology.
Dayen puts this episode in a larger context. He argues that this wasn’t an isolated case of bad judgment, but a symptom of a larger trend in our economy and society. To tie it back to security, it’s not all that different from the plague of insecure network-enabled smart devices that are flooding the marketplace. A lot of these, like the Iowa app, are solving problems that don’t exist (are smart refrigerators really an improvement?) and causing huge new problems that didn’t have to exist.
Katrina Stropkay, Service Designer
We all have people in our lives who carry on unperturbed by mounting cautionary tales about personal data security. While some are overwhelmed by the daunting complexity of it all, others fall back on the common refrain, “Why should I be worried if I have nothing to hide?” Election security puts a fine point on the naiveté of the “nothing to hide” argument. While it’s easy to see the risk posed by, for example, a breach of your credit card information, it’s harder to see how a lack of personal responsibility for data privacy increases the vulnerabilities of the larger systems and institutions we are all a part of. Take, for example, the Cambridge Analytica scandal. Though you may not feel you’ve shared anything online that’s worth hiding, it’s almost guaranteed you have offered just enough information for third parties to assemble a comprehensive, predictive profile of you, which enables targeted messaging or, worse, manipulation.
In the case of the current Democratic primary, a crowded field means that public opinion is openly shaped by the perceived successes and predictive indicators gleaned from available data. The unpredictable nature of technology introduces risk and instills doubt in an already highly contested, high-stakes voting system. All that said, as a Service Designer, I don’t think we can allow risk to force us to shy away from opportunity. Lackluster voter turnout, arcane voter registration laws, and even poor ballot design pose enormous threats to a fair process and a representative electorate, distorting the will of the people.
Technology is a double-edged sword, with the power to augment both access and risk. To instill confidence that we are creating free and fair elections in the United States, we need well-designed systems at every touchpoint of the voter experience. I’m reminded of a quote attributed to Henry Ford: “If I had asked people what they wanted, they would have said faster horses.” If you’d asked participants in the Iowa caucuses what they wanted, maybe they would have said better voter applications. But history has taught us that if we remain innovation-minded, in the long run we might be able to create something much better.
Our Data Privacy and Security Working Group engages in continuous exploration, content creation, and internal education within the security space. Privacy and security are important aspects of the work Cantina does and the services we provide, and the working group helps inform the company’s evolving perspective on these topics.
Our working group would love to hear your thoughts and challenges around data privacy and security. Please reach out and keep the conversation going.
-
¹ Working Groups are small groups within Cantina organized by theme or goal; for example, IoT & Connected Products or AR/VR. Their purpose is to stay on top of emerging trends and to function as an R&D engine for exploring and capturing new skills and capabilities.