The Jurassic Park Problem & Software Development (Part 3)

How you can (should?) use them to guide your work

In parts one and two we covered the basics of what possible, ethical and legal look like in software development and looked at four challenging scenarios. In this final part we will wrap everything up with some suggestions for how you can work on making “is it legal to build this?” and “should we build this?” just as important questions as “can we build this?” and “will this make us money?”.

Before getting into the separate tips, we would like to give a big shoutout to Fiona Charles, who will be heavily featured in this post. Her keynote “10 commandments of an ethical software tester” was a big reason Lena got invested in becoming more conscious and deliberate about the moral challenges of building software.


Suggestion #1: Become legally literate

Legal literacy can be described as the ability to connect relevant legal rules with the professional priorities and objectives of ethical practice. No one can be an expert in every new (or existing…) rule and regulation out there. Even people working full time with law choose to focus on some area(s), so we really don’t expect anyone outside of law to be an expert. But knowing just a bit about what could be important and affect us is a great way to avoid ending up on the front page or getting sued into oblivion. A lot of the time it is enough to know which regulations could potentially be relevant to read up on, and when you can manage yourself vs. when it is time to call in a professional.


GDPR

Know your data and how you use it. And then limit that.

Privacy is all about being aware of what data you are collecting and why. It is about avoiding saving any data you don’t need just because it could be useful. Don’t use the data for things outside of what you need or have permission for, make sure your customers are clearly informed about all of the above, and make sure you have everything clearly documented.
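
To make that concrete, here is a minimal, hypothetical sketch of “collect only what you need, and document why”. The field names, the purpose label and the signup form are all made up for illustration:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical example: keep only the fields we actually need for a stated
# purpose, and record that purpose together with when consent was given.
ALLOWED_FIELDS = {"email", "delivery_address"}   # assumed: needed to ship orders
PURPOSE = "order_fulfilment"                     # assumed, documented purpose

@dataclass
class CustomerRecord:
    email: str
    delivery_address: str
    purpose: str
    consent_given_at: datetime

def minimise(raw_signup_form: dict) -> CustomerRecord:
    """Drop everything the signup form collected that we have no use for."""
    kept = {k: v for k, v in raw_signup_form.items() if k in ALLOWED_FIELDS}
    return CustomerRecord(
        email=kept["email"],
        delivery_address=kept["delivery_address"],
        purpose=PURPOSE,
        consent_given_at=datetime.now(timezone.utc),
    )

record = minimise({
    "email": "ada@example.com",
    "delivery_address": "1 Example Street",
    "birthday": "1990-01-01",   # "could be useful", but not needed, so it is dropped
    "gender": "prefer not to say",
})
print(record)
```

The point is not this specific structure, but that dropping the “could be useful” fields and recording the purpose happens in code, where it can be reviewed and tested.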

Accessibility Act

Make sure all of your users can use your application/service.

The European Accessibility Act and the Web Accessibility Directive aim to enable people with disabilities to take part in society on an equal basis with others.

Disabilities come in many shapes and forms, and we cannot forget that they can be permanent, temporary or situational. Examples could be missing an arm (permanent), having a broken arm (temporary) or carrying a child (situational): all of these users will need help navigating your software.

It’s also not enough to think of navigation without a mouse, making sure your software works with screen readers and having good contrast. You need to think about things like how easy your software is to understand from a language point of view, whether your session handling is causing problems for users who need more time to complete a task, and so much more.
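
That last point about session handling is easy to overlook, so here is a small, hypothetical sketch of the underlying idea (the class, method names and timeout values are made up): warn the user before a timeout and let them ask for more time, rather than logging them out mid-task.

```python
import time

# Hypothetical sketch: instead of silently logging users out after a fixed
# timeout, warn them and let them extend the session. Users who need more
# time to complete a task should not lose their work.

SESSION_TIMEOUT_SECONDS = 20 * 60   # assumed default, adjust to your context
WARNING_WINDOW_SECONDS = 2 * 60     # warn this long before expiry

class Session:
    def __init__(self) -> None:
        self.expires_at = time.time() + SESSION_TIMEOUT_SECONDS

    def needs_warning(self) -> bool:
        """True when the UI should show a 'your session is about to expire' prompt."""
        return time.time() > self.expires_at - WARNING_WINDOW_SECONDS

    def extend(self) -> None:
        """Called when the user asks for more time (e.g. clicks 'keep me signed in')."""
        self.expires_at = time.time() + SESSION_TIMEOUT_SECONDS
```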

DORA

Manage and mitigate your operational risk. 

DORA (the Digital Operational Resilience Act) is specifically for financial entities and their third-party providers. It deals with areas like risk management and governance, incident reporting, resilience testing, third-party risk management, and how to share information and learn from each other.

To simplify to an extreme, if you are a financial entity, you need to 

  • Have a framework for risk management
  • Have a process for incident response (a minimal sketch of what an incident record could look like follows this list)
  • Do regular resilience and security testing
  • Have all of your third-party risks mapped out
  • Share information about threats with others
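
As promised above, here is a hypothetical sketch of an incident record and a toy classification rule. The fields, the threshold and the “major incident” logic are assumptions made up for illustration; the real classification criteria come from DORA and its technical standards, not from this snippet.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical incident record: the fields and the threshold below are
# illustrative assumptions, not DORA's actual classification criteria.

@dataclass
class Incident:
    detected_at: datetime
    description: str
    affected_services: list[str]
    clients_affected: int
    classified_as_major: bool = False
    reported_to_authority_at: Optional[datetime] = None

def classify(incident: Incident, major_client_threshold: int = 1000) -> Incident:
    """Toy rule of thumb: a large client impact makes the incident 'major'."""
    incident.classified_as_major = incident.clients_affected >= major_client_threshold
    return incident

incident = classify(Incident(
    detected_at=datetime.now(timezone.utc),
    description="Payment API unavailable",
    affected_services=["payments"],
    clients_affected=2500,
))
print(incident.classified_as_major)  # True with the assumed threshold
```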

AI Act

Don’t use AI in ways that cause harm.

The AI Act classifies AI applications into four risk levels, from unacceptable down to minimal, plus separate rules for general-purpose AI. A good place to start is to identify the risk level of your intended use and then look into what is required of you to be able to do that.

For each risk level there are classification rules as well as a summary of the obligations that go with it. Not only is this useful for making sure you don’t break the law, you can also use it to refine your use case. If the intended solution lands at a high risk level, maybe there are ways you can change and limit the use of AI, thus reducing the risk level.
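
As a purely illustrative sketch (not legal advice), you could imagine the risk levels and a few commonly cited example use cases like this. The mappings and obligation summaries below are assumptions made for the example; the authoritative classification rules are in the Act itself.

```python
from enum import Enum

# Illustrative only: the real classification rules are in the AI Act itself,
# and the example use cases and mappings below are assumptions, not legal advice.

class RiskLevel(Enum):
    UNACCEPTABLE = "prohibited"
    HIGH = "strict obligations (risk management, documentation, human oversight, ...)"
    LIMITED = "transparency obligations"
    MINIMAL = "no specific obligations"

EXAMPLE_USE_CASES = {
    "social scoring of citizens": RiskLevel.UNACCEPTABLE,
    "cv screening for recruitment": RiskLevel.HIGH,
    "customer service chatbot": RiskLevel.LIMITED,
    "spam filter": RiskLevel.MINIMAL,
}

def obligations_for(use_case: str) -> str:
    # Assume high risk until a proper assessment says otherwise.
    level = EXAMPLE_USE_CASES.get(use_case, RiskLevel.HIGH)
    return f"{use_case}: {level.name} -> {level.value}"

for case in EXAMPLE_USE_CASES:
    print(obligations_for(case))
```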


Suggestion #2: Strive for Exploitation Consciousness

Instead of just assuming your application is purely beneficial, or at least harmless, try to consider what the worst thing that could happen would be, and whether you are OK with that. Then you find yourself a risk level you are comfortable with, and work there. This comes down to your own core values, your moral rule set and maybe your own passions.

When figuring this out, you can think about the worst harm it can do, but also the likely harm it can do. Classical risk management! If you build … an X-ray machine, the worst thing could be someone dying from a malfunction where the machine suddenly hits them with a deadly amount of radiation. The likely harm might be something a lot smaller, like a burn due to slightly too high levels, or a missed tumor due to the calibration of the machine being off.
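
A very simplified way to reason about this is the classic risk matrix: score each scenario by likelihood and impact. The scales and the example scores below are made up for illustration:

```python
# A very simplified risk-matrix sketch: score = likelihood x impact.
# The scales and the example scores for the X-ray scenarios are assumptions.

LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4}
IMPACT = {"minor": 1, "moderate": 2, "major": 3, "catastrophic": 4}

def risk_score(likelihood: str, impact: str) -> int:
    return LIKELIHOOD[likelihood] * IMPACT[impact]

scenarios = {
    "lethal radiation dose from a malfunction": ("rare", "catastrophic"),
    "burn from slightly too high levels": ("possible", "moderate"),
    "missed tumor due to calibration drift": ("possible", "major"),
}

for name, (likelihood, impact) in scenarios.items():
    print(f"{name}: {risk_score(likelihood, impact)}")
```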

These considerations get harder and harder. When Lena started programming in the late 1990s, most of the code used was written by the teams themselves. Need an Easter day calculation? Write the code. Need to deal with multi-threading? Write the code. With the increased use of commodity tools, frameworks and libraries, less of the code is actually our own and more of it becomes black boxes that we cannot control or spend time understanding. Our ecosystems are also growing: we collect and spread more data and we integrate with more things outside of our control. We need to understand how something happening in our system can be misused by someone completely outside of our sphere of influence. Think about Facebook and Cambridge Analytica.

An update on AirTag and unwanted tracking: https://www.apple.com/newsroom/2022/02/an-update-on-airtag-and-unwanted-tracking/

I am not sure whether I hope that the original AirTag inventor(s) were unaware of the ways it could be misused, or whether they understood and just didn’t bother. If they were unaware, maybe they should take a look at who was on that team and reflect on the importance of diversity? So if you are building something new and innovative: in what ways can your service be misused? And if you personally can’t think of any ways, look at which groups are missing from your team and ask for their input.

Identity theft: How scammers exploit personal information online: https://www.cyberinfoblog.com/blog/identity-theft-how-scammers-exploit-personal-information-online

And in the current technological landscape, data is key! Make sure you model, build and protect your data properly, because identity theft is the bread and butter of the cybercriminal.

Consider what data you actually need and whether there are ways to limit it so it is of as little use as possible to someone with malicious intent. Stay up to date with the current threat landscape and do your best to make sure that other people cannot get access to that data.

What login and authentication methods will you choose? How do you make sure you are safe from different injection methods? How do you handle sessions and encoding algorithms? 
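
None of those questions have a single right answer, but as a small, hypothetical illustration of two of the basics, here is a sketch of parameterised queries (so user input cannot rewrite your SQL) and salted password hashing (so a leaked database does not leak passwords). The table layout, iteration count and helper names are assumptions, and a real system would normally lean on a vetted authentication library:

```python
import hashlib
import hmac
import os
import sqlite3

# Two generic sketches (not a complete security design):
# 1) parameterised queries, so user input cannot rewrite the SQL, and
# 2) salted password hashing, so passwords are never stored in plain text.

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (email TEXT PRIMARY KEY, pw_hash BLOB, salt BLOB)")

def hash_password(password: str, salt: bytes) -> bytes:
    # PBKDF2 from the standard library; the iteration count is an assumption,
    # tune it (or use a dedicated password-hashing library) for production.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)

def register(email: str, password: str) -> None:
    salt = os.urandom(16)
    conn.execute(
        "INSERT INTO users (email, pw_hash, salt) VALUES (?, ?, ?)",  # placeholders, never string concatenation
        (email, hash_password(password, salt), salt),
    )

def login(email: str, password: str) -> bool:
    row = conn.execute("SELECT pw_hash, salt FROM users WHERE email = ?", (email,)).fetchone()
    return row is not None and hmac.compare_digest(row[0], hash_password(password, row[1]))

register("ada@example.com", "correct horse battery staple")
print(login("ada@example.com", "correct horse battery staple"))  # True
print(login("ada@example.com", "wrong password"))                # False
```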


Suggestion #3: Maintain Societal Awareness

Just as law is shaped by trends and politics, so too does the threat landscape shift.

What are the current political trends? What is currently being talked about? How could that affect you in the future? Knowing more is a great way of forming stable opinions based on facts instead of emotion (well… more based on facts, at least; we are still emotional creatures).

If you know the major sociological and political trends in society – you can extrapolate and make reasonable assumptions about how your piece of technology might be abused in the future, even if it doesn’t seem likely in the short term.

And sometimes the worst thing that can happen is something taken straight out of The Handmaid’s Tale, something we cannot even comprehend being possible at the time. Be aware of societal shifts. What is harmless today might not be tomorrow.

Why US women are deleting their period tracking apps: https://www.theguardian.com/world/2022/jun/28/why-us-woman-are-deleting-their-period-tracking-apps

Suggestion #4: Stay Professionally Ethical

What are the lines you are not willing to cross? What is important to you? What are you prepared to do, and what are you prepared to risk anything to make sure never happens? This differs from person to person, and the only person who can set your lines is you. Form an opinion before you are forced to, and have a plan for what you will do if you are ever asked to cross them. Having to choose between crossing ethical lines and making rent this month is a bad place to be.

The Environmental Protection Agency (EPA) found that many VW cars being sold in America had a ”defeat device” – or software – in diesel engines that could detect when they were being tested. It would change the performance accordingly to improve results. The German car giant has since admitted cheating emissions tests in the US. To repeat that: This was deliberately and consciously programmed, not a mistake or a bug. 

EPA - Learn about Volkswagen Violations: https://www.epa.gov/vw/learn-about-volkswagen-violations

A lot can be said about Boeing, and not all of the things that have happened in the big scandals have been deliberate, but there have definitely been people getting pressured into making unethical decisions. Regardless of what honest mistakes were made along the way, there is definitely a thread of greed running through it all, with all of the regulatory corner-cutting and lobbying to get things through safety processes prematurely. A lot of deliberately bad decisions. But also: a lot of regular Joes must have chosen to, or been bullied into, turning a blind eye.

Boeing Charged with 737 Max Fraud Conspiracy and Agrees to Pay over $2.5 Billion: https://www.justice.gov/opa/pr/boeing-charged-737-max-fraud-conspiracy-and-agrees-pay-over-25-billion

Final wrap up

There are bad actors in every profession, but most of us actually want to do a good job. Lawyers like to cooperate, to create a good result and get a good product to market, just as much as engineers do. So even if the lawyer technically gets eaten first in Jurassic Park: go make a friend. Invite them to the table from the start so you don’t have to get annoyed when they interrupt you just as you are reaching the finish line.

Reading list

Why US women are deleting their period tracking apps: https://www.theguardian.com/world/2022/jun/28/why-us-woman-are-deleting-their-period-tracking-apps

An update on AirTag and unwanted tracking: https://www.apple.com/newsroom/2022/02/an-update-on-airtag-and-unwanted-tracking/

Identity theft: How scammers exploit personal information online: https://www.cyberinfoblog.com/blog/identity-theft-how-scammers-exploit-personal-information-online

EPA – Learn about Volkswagen Violations: https://www.epa.gov/vw/learn-about-volkswagen-violations

Boeing Charged with 737 Max Fraud Conspiracy and Agrees to Pay over $2.5 Billion: https://www.justice.gov/opa/pr/boeing-charged-737-max-fraud-conspiracy-and-agrees-pay-over-25-billion

Co-Authors

  • Lena Pejgan Nyström

    Lena has been building software in one shape or form since 1999, when she started out as a developer building Windows desktop applications (both impressive and scary: they are still alive and kicking). She later found her passion for testing, and even though her focus has shifted to building organizations and growing people, she is still an active voice in the testing community. Her core drive is continuous improvement, and she strongly believes we all should strive to challenge ourselves, our assumptions and the way things are done. Lena is the author and creator of “Would Heu-risk it?” (card deck and book), an avid blogger, international keynote speaker and workshop facilitator.

  • Mathias Jansson

    Mathias Jansson works as a legal counsel at a Swedish government agency. He is currently working in project financing and has long experience in the application of public law, public access to information and secrecy, as well as integrity and personal information regulation. He is a passionate powerlifter and biker, and loves climbing mountains or taking seriously long walks in the woods. Some also say he is fun. Don’t ask him about history unless you are prepared for a lecture.