
Apps: Getting users arrested and spied on since 2016

In the realm of mobile apps, where innovation and technology intersect with daily life, there is ample room for the unexpected. While most apps are aimed at entertainment or communication, some have led to hilarious, and sometimes horrifying, consequences:

1. Pokémon GO's Unintentional Adventures:

Pokémon GO, the augmented reality game that swept the globe in 2016, led players on quests to capture virtual creatures in the real world. However, some players found themselves in amusing predicaments as they pursued elusive Pokémon.

In 2016, a police station in Australia had to issue a public warning after multiple players attempted to enter the station while playing Pokémon GO, drawn there by a designated in-game location.

2. ChatGPT ruins education:

With the advent of AI and its availability to the wider public, a major drawcard has been that AI democratises information that could previously be accessed only through educational institutions or costly textbooks. Its makers regularly touted ChatGPT as the next evolution in education and as a supplementary tool for educators. Just days after OpenAI released ChatGPT in late November 2022, however, the chatbot was widely denounced as a free essay-writing, test-taking tool that made it laughably easy to cheat on assignments.

Los Angeles Unified, the second-largest school district in the US, and others have blocked access to OpenAI’s website from school networks. By January 2024, school districts across the English-speaking world had started banning the software, from Washington, New York, Alabama, and Virginia in the United States to Queensland and New South Wales in Australia. Several of the world’s leading universities, including Imperial College London and the University of Cambridge, have since issued statements warning students against using ChatGPT to cheat.

Instead of using ChatGPT as a supplementary learning tool, students are now using the AI to do their assignments and essays for them. Rather than empowering educators, ChatGPT seems to be ruining education altogether.

3. Google Maps Outs Criminal:

Google Maps has become an integral part of how we use our mobile phones to get around and how we get information on businesses and locations around the world without ever setting foot there. The Google camera vehicles that roam the planet’s streets, snapping photos of the places we later search for, have had some hilarious and often unintended consequences.

In 2022, an Italian mafia boss who had been in hiding for over 20 years was arrested in Spain after he was spotted in a Google Street View photo ordering sausage from a local deli. Police used facial recognition to track him down.

4. Filter Apps turn into porn-generating engines:

It started with Snapchat filters that enhance or comically alter the user’s appearance, but lawmakers are now warning against the legal and social implications of apps that use AI to turn found images into something more sinister: porn.

Apps that use artificial intelligence to undress people in photos are surging in popularity, according to new social media research. These programs manipulate existing pictures and videos of real individuals to make them appear nude without consent. Many of these “nudifying” apps work only on images of women.

A recent study by the social media analytics firm Graphika analyzed 34 companies offering this service, which it calls non-consensual intimate imagery (NCII). It found that these websites received 24 million unique visits in September alone.

Recently, pop star Taylor Swift had to turn to the courts to stop images of her from being turned into pornographic content and sold on the internet.

5. Stand back, FBI – the apps are the spies now

Have you ever seen one of those videos on Facebook that show a “flashback” of posts, likes, or images, like the ones you might see on your birthday or on the anniversary of becoming friends with someone? If so, you have seen an example of how Facebook and other apps use Big Data.

A report from McKinsey & Co. stated that by 2009, companies with more than 1,000 employees already had more than 200 terabytes of data about their customers’ lives stored. Since then, trillions of interactions by real people have taken place on apps in real time. Each time you download an app or open it on your smartphone, you are in fact giving away crucial details about your private life, including where you shop, where you go, who you spend time with, and your political affiliations.

App makers will of course argue that this data is used for commercial purposes: selling advertisers’ products to you or improving the functioning of the app. But there is a real concern that the data harvested from users could be exploited by terrorist organisations and political leaders alike.

While apps certainly aren’t going away any time soon, at best they cause temporary embarrassment or confusion; in their most dubious guise, they point to an unpredictable future in which apps can alter the very fabric of human society, including its security.

Recent Articles


From the CTO’s desk: 7 Reasons why we didn’t outsource our Custom Software Development

While the idea of outsourcing our custom software development seems like a cost-effective, logical decision, you might want to consider the following reasons why you should rather not. This is what we did:

1. We are a Super Company and equipped with all the skills

Why depend on others when you can just depend on yourself, right? Our company already has a rich pool of specialized talent, ensuring that our project is handled by our own experts, who are skilled in all technologies and methodologies, without a hitch. This means we don’t need to access a breadth of skills and know-how, because they are all already available to us, and we know exactly how and what to do to make our custom software a success.

2. We don’t need to save money over the long haul

It’s cost-effective for us to rely solely on our in-house team. We are OK with the huge overheads and the development death-spiral of figuring stuff out for months on end, payslip after payslip. It’s not like we are competing against anyone! Deadline, shmeadline. Also: we love recruiting, we love paying more and more to retain specialized talent, and we love hoping that our team magically acquires the skills they don’t already have.

3. We don’t mind being last to market

It’s perfectly acceptable to be the last company to enter the market. Our competitors have always had the edge on us, so why change things now? Rapid development is not needed. We will get there in our own sweet time. The customers will just have to wait while we try to figure things out with our own talent and our own skills. There is absolutely no hurry. And no end to the bank overdraft facility.

4. Scalability is nonsense. We like our shoes 3 sizes too small!

More customers don’t necessarily mean we have to scale now. That’s just tech talk. We can adjust to changing needs and new tech at our own pace. It’s not such a big deal to get an error message or a system-down screen every so often. It’s all about being dependable, and that means not being flexible. We like what we have and will be damned if we have to change it. Growth doesn’t need to be efficient for us. We can scale when things really start going wrong. Customers don’t mind unresponsiveness!

5. We like being strong. Look at us doing everything. All. The. Time.

ADHD in the corporate world is the next big thing. Why focus on our core competencies and business model when there is so much more to get hung up on? Let’s get bogged down in the technical details. Let’s gobble up all the resources meant for running the company. Let’s put innovation on the backburner while we figure out this coding error. We’ve been going at it for months, why give up now? We will just throw more money, more people, and more time at it. It’s crucial that we oversee every minute, painstaking detail. Eventually we will understand where we went wrong.

6. You like to call it a budget. We like to call it magic.

There is no inherent risk in developing custom software and apps. It’s a myth. Besides, we believe in pay-as-you-go-broke. We don’t have the need of bringing in outside expertise and experience to the table, because the bank now owns the table. And the chairs. And the desks. And the IP. And the building. Overrunning on our budget was the best decision we could have made. It’s not like it’s going to run out! Investor? Investor, who?

7. It’ll keep on working if we just don’t touch it. Ever again.

We have a support email address. That’s enough. Joylin in the tearoom gets those. We trust our code. We explained this at the launch: we don’t really pay attention to what is happening outside of our custom software or app, so we will be fine. We don’t believe that things in the tech world change, and we are pretty sure it won’t have an influence on what we have built. That’s just not how software works. See it as a monument of brick and cement. What worked in 1995 will work in 2030 too. Everyone knows that!

Coding gone Bad: 5 Coding Disasters

In an era dominated by technology, the quality of software code can make or break a product, service, or even an entire organization. The past decade has seen several high-profile incidents where bad coding practices led to disastrous outcomes, resulting in financial losses, reputational damage, and sometimes even endangering lives. Here are 5 times that bad coding led to disaster:

1. The Boeing 737 Max Crashes (2018-2019):

The tragic crashes of two Boeing 737 Max aircraft in Indonesia and Ethiopia claimed the lives of 346 people and shook the aviation industry to its core. Investigations revealed that a flawed flight control system, known as the Maneuvering Characteristics Augmentation System (MCAS), played a central role in both accidents. The MCAS relied on faulty sensor data and exhibited aggressive behavior, forcing the aircraft into nosedives. Poor coding practices and insufficient testing of the MCAS software contributed to this deadly flaw, highlighting the grave consequences of software defects in safety-critical systems.

2. The WannaCry Ransomware Attack (2017):

The WannaCry ransomware attack infected hundreds of thousands of computers worldwide, disrupting critical infrastructure, businesses, and government agencies. The ransomware exploited a vulnerability in Microsoft's Windows operating system, known as EternalBlue, which had been patched months earlier. However, many organizations failed to apply the patch due to poor patch management practices or reliance on outdated systems. The widespread impact of WannaCry underscored the importance of timely software updates and robust cybersecurity measures in mitigating the risk of cyber threats.

3. Windows Vista (2007)

Operating system releases are a big deal. Microsoft and Apple put a ton of labor into each iteration of Windows and macOS, and there's a lot riding on each. So it's amazing that Windows Vista was such a horrific blunder. Designed to replace the aging Windows XP in 2007, Vista failed at just about every possible benchmark. It was bloated (50 million lines of code compared with XP's 40 million) and buggy; tons of pre-existing apps didn't even work in it.

4. Tesla’s dream of self-driving cars (2016 to current)

Tesla has been promising self-driving cars since 2016, with mixed results, and social media is full of videos of "autonomous" Teslas doing absurd and dangerous things. Could the technology eventually work? Sure, but in February 2022 the company had to recall nearly 54,000 cars because the self-driving software let them roll past stop signs. The sad thing is, Teslas are incredible cars. The company's insistence on having the cars drive themselves before the systems are anywhere close to ready is dragging it down, not helping it grow. And eventually, someone is going to get hurt. It’s a massive lawsuit waiting to happen.

5. Microsoft’s GitHub Copilot (2024)

It's too soon to trust Microsoft's GitHub Copilot to automatically fix your programming code. Microsoft itself has said that the program, sold as a $10-per-month add-on to GitHub, "does not write perfect code" and "may contain insecure coding patterns, bugs, or references to outdated APIs or idioms." For a business owner or a developer, that simply means it can’t be blindly trusted: you run the risk of bringing your entire app or custom software to a screeching halt due to faulty AI-generated code, and with that, your business.

These examples underscore the far-reaching consequences of bad coding practices and the critical importance of prioritizing software quality, rigorous testing, and ongoing maintenance. As technology continues to play an increasingly central role in our lives, the stakes have never been higher for ensuring that software is developed and deployed responsibly.