2019 Tech Trends

Our top tech trends for 2019

2019-01-29 23:08:16
This first appeared in the Guernsey Press in January-March 2019.

Q: What will be 2019’s big tech trends?

A: It’s common to think of technology trends as discrete items that develop independently of one another. In reality it’s simultaneous advances in many areas that collectively advance the state of the art.

In 2019, we won’t see the emergence of one attention-grabbing new technology, as we did when blockchain burst on to the scene in 2017/18. Instead we’ll see continued development of existing technologies, where the overarching theme will be the connectedness of services and data, and the increasing capability of machines to use them to act more autonomously - in a word, automation.

Automation currently exists in many guises. There is a lot of hype around the buzzword-rich tech du jour such as artificial intelligence, machine learning and robotic process automation.

If we look beyond the monikers and instead analyse what these technologies really enable - more efficient business processes and faster, better-informed decision-making - these areas will see a lot of growth in 2019 as the barriers to adoption continue to drop.

Adoption of automation technologies will be enabled by the continued growth and increasing availability of seamlessly connected cloud services in conjunction with advances in edge computing (and to some extent IoT) and the rollout of 5G.

Healthy competition in these areas should see these services more affordable than ever.

As part and parcel of this, we can expect an increased emphasis on information security: from a privacy standpoint, as GDPR really starts to take hold and businesses are held fully accountable for the ways in which they process data; and from a cyber-security standpoint, where the rate and scale of exploits show no sign of abating.

Nevertheless, 2019 should be a pivotal year for businesses and consumers alike when it comes to the speed, ease and cost of a truly connected user experience.

Q: What’s going on with cryptocurrencies and blockchain? I thought they were the next big thing?

In September 2017, a single bitcoin was worth around $4k. By January 2018, it had skyrocketed to an eye-watering $20k and the whole world was going mad for it. Today, it’s back down to less than $4k and continuing to fall. 

As with most technologies, the concept behind it (the 'distributed ledger', more commonly referred to as blockchain) is not new - it’s up to 20 years old, depending on who you believe; but it was the rise to prominence, around 2013, of its use as a form of currency needing no intermediary (such as a bank) that fired it into the limelight.

Cryptocurrency's use as a form of money will invariably continue to be challenged while it struggles to shake a poor reputation born of a perpetual association with nefarious activities; but the distributed/decentralised ledger still represents many opportunities, wherever the concept of a publicly inspectable, unchangeable record of something is appealing. Global supply chains, asset registers and binding legal agreements are at the forefront of these opportunities - so much so that Guernsey has become the first jurisdiction in the world to draft specific legislation around the use and status of so-called smart contracts.
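The 'unchangeable record' idea can be illustrated with a toy hash chain. This is a simplified sketch for illustration only - real blockchains add distributed consensus, digital signatures and much more - and the records and names here are entirely made up:

```python
import hashlib

def block_hash(index, data, prev_hash):
    """Hash a block's contents together with the previous block's hash,
    chaining the records so any change breaks every later link."""
    payload = f"{index}:{data}:{prev_hash}".encode()
    return hashlib.sha256(payload).hexdigest()

# Build a tiny three-entry chain.
chain = []
prev = "0" * 64  # genesis: no previous block
for i, record in enumerate(["alice pays bob 5", "bob pays carol 2", "carol pays dan 1"]):
    h = block_hash(i, record, prev)
    chain.append({"index": i, "data": record, "prev_hash": prev, "hash": h})
    prev = h

def verify(chain):
    """Re-derive every hash; a single edited record invalidates the chain."""
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev or block["hash"] != block_hash(block["index"], block["data"], prev):
            return False
        prev = block["hash"]
    return True

print(verify(chain))   # True: the chain is intact
chain[1]["data"] = "bob pays carol 200"   # tamper with the middle record
print(verify(chain))   # False: tampering is detected
```

Because each block's hash depends on everything before it, rewriting history means rewriting every subsequent block - which, on a public ledger, everyone can see.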

There are numerous technological hurdles for the blockchain to overcome before it really achieves mainstream adoption, and we should expect 2019 to be mostly about this. One of the big issues of a publicly administered technology is that achieving consensus on exactly what should happen is often incredibly difficult - ironic, given its ambition to do away with a single central ruling master. But if it can do this, without splintering into a myriad of competing entities, then it’s fair to say we have barely scratched the surface of what the blockchain can do.

Q: Is automation going to put us all out of a job?

Of all the recent technology trends, automation remains at the forefront and is our pick for the biggest technology disruptor in 2019. Robotic Process Automation (RPA) is the latest buzzword to garner a lot of interest over the last 18 months, as the toolkits have become considerably more sophisticated and capable whilst making lofty promises such as “no-code” solutions - so that, in theory at least, you don’t need specialist skills or consultants.

Common candidates for automation are the (typically) mundane, repetitive, time-consuming and often error-prone processes that need to be done but don’t add much in the way of value. A common example is the “we’ve always done it that way” business process - nobody can quite remember why they still do it, and it usually involves moving some data from one spreadsheet to another. These processes are easily automated (or eliminated entirely) with something that operates the same way every time, usually in considerably less time than a human would take to complete the same tasks.
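As a flavour of how simple such automation can be, here is a minimal Python sketch of the 'move data from one spreadsheet to another' chore. The file names and the 'Status' column are illustrative assumptions, not a real system:

```python
import csv

def copy_approved_rows(src_path, dst_path):
    """Automate a spreadsheet-to-spreadsheet chore: copy only the rows
    marked 'Approved' from src_path to dst_path, the same way every time.
    Column names and file paths are hypothetical examples."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if row["Status"] == "Approved":
                writer.writerow(row)
```

A dozen lines replaces a manual copy-and-paste routine - and, unlike a person, it never skips a row or pastes into the wrong column.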

Of course, automation can be used in considerably more sophisticated ways and, coupled with the emergence of artificial intelligence, there is a growing fear that this will all result in rising unemployment as the machines take over, Skynet-style. Fortunately, research looking at the last five decades of technological change does not support this fear; in fact, it finds the reverse is mostly true - automation tends to create more jobs, in existing and emerging industries alike. Some diversification of job roles is inevitable, but this should be at the expense of redundant processes and in favour of time spent on more applied work which, for the moment at least, the machines cannot do.

Q: I’ve got an Alexa-enabled device. Is it safe to use?

The popularity of voice-activated devices like the Amazon Echo (Alexa), Apple Siri and the Google Home Assistant - aka personal voice assistants - has risen meteorically in the last couple of years. This was prompted mostly by the novelty factor of having a semi-sentient being to get quick answers from (‘Alexa, what time is it?’) and by their convenience, especially in smart home automation (‘Alexa, turn on the lights and set the TV to BBC1’). The relatively low price-points have made them more affordable than ever - but what is the real price? The cost of the device is not where the companies are making money; the bigger prize is the value in the data, so that they can market to you (i.e., shopping suggestions or adverts) more effectively.

‘Safe to use’ has many different contexts - there are many ways in which devices like these can be considered safe or not - but in simplest terms, there are two key questions to answer.

Firstly, how would you feel if your data escaped from the company? Would you be OK with recordings of your conversations being in the public domain? You would usually be forgiven for assuming that the biggest providers are more resilient to data breaches - because it’s actually mostly true - but even Amazon recently had a very embarrassing data breach involving data captured on Echo devices. Large data breaches are not uncommon and, in reality, there is little you can do about them other than exercising good personal security practices.

Secondly, and more importantly, are you comfortable with devices that are always listening to you? This is effectively your own 'personal information safety’ and is really a personal decision based on various factors, such as how you feel about the company behind it and what they’re likely to do with the data. Some people are OK with it; some aren’t. Mostly, people don’t think about it. 

Whilst the GDPR should provide certain protection to consumers around fair usage, if you feel your conversations are sensitive (think Ashley Madison hack) or you place a higher value on the data that these companies are collecting about you, then it might be time for ‘Alexa, please switch off.'

Q: What technologies should I be working in if I want a career in programming?

It goes without saying that programming has evolved considerably in the last 20 years. Formerly the preserve of computer science graduates or lifelong practitioners of the trade, fluent in languages such as Fortran, C/C++ and Java, programming has never been more accessible.

Accelerated by the growth of web technologies - HTML markup in combination with CSS, client-side technologies like Javascript, server-side languages such as PHP and ASP, and databases like MySQL and PostgreSQL - the 1990s onwards saw rapid development not only of the programming technologies themselves, but of development environments, pre-built frameworks and cheap hosting to which homemade solutions could easily be deployed. Game development environments such as GameMaker and Unity, and the “app-ification” of everything through the likes of Xamarin, Swift and Android, took this further. The industry of programming has become available to more people than ever, without the requirement of a formal education.

Doing it for fun or as a hobby is very different to doing it for a career. The strongest demand for skills is in Python - by no means a new language, but one that covers a staggering number of bases. Once known mainly for scripting and web back-ends, it has expanded considerably into the hugely popular areas of big data and machine learning, and is easily the best tip if you are just getting started. Knowledge of web technologies, especially HTML5, CSS3 and the various Javascript frameworks such as React, Angular and Node, to name a few, remains in high demand.
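As a taste of why Python is such a popular starting point, a few readable lines of standard-library code go a long way. The sales figures below are made up purely for illustration:

```python
# A flavour of Python for a newcomer: readable, concise and
# "batteries included" - no extra libraries needed for basic analysis.
from statistics import mean, median

# Hypothetical monthly sales figures, for illustration only.
sales = {"Jan": 120, "Feb": 95, "Mar": 143, "Apr": 110}

total = sum(sales.values())
best_month = max(sales, key=sales.get)  # month with the highest figure

print(f"Total: {total}")                          # Total: 468
print(f"Average: {mean(sales.values()):.1f}")     # Average: 117.0
print(f"Median: {median(sales.values()):.1f}")    # Median: 115.0
print(f"Best month: {best_month}")                # Best month: Mar
```

The same summary in a compiled, lower-level language would take noticeably more ceremony - which is exactly why Python suits beginners.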

Web technologies, however, will be somewhat niche when it comes to business in the Channel Islands. Broad experience in development technologies such as C# and the .NET framework will be valuable, and can be future-proofed by learning what it means to deploy these solutions to cloud providers such as Microsoft Azure.

Q: Do we really need folding phones?

With the recent announcement of folding-screen smartphones from Samsung and Huawei, many people are asking a simple question: “Why?!” 

The primary selling point of these devices is that they give you the 'smartphone and tablet in one’ experience, so you will no longer have to endure the inconvenience of lugging around two devices. They may have some functional use and a very practical purpose if the technology of the new bendy screens makes them more durable and resistant to accidental breakage. And there’s no denying that they have some grade A, prime "gimmick-value" - these will no doubt make great conversation in the pub when they’re released. But for most of us, the £1500+ price tag is likely to be reason alone not to even consider splashing out, long before you consider whether you really need one.

In reality, these phones represent much more than gimmicky consumer gadgetry. For the manufacturers, there are simple bragging rights at stake - Samsung got there first, beating Huawei and, perhaps more importantly, Apple to the mark. We will no doubt see more varied uses of the technology as it is adapted for installation in non-flat surfaces such as dashboards and headrests, and perhaps most of all in medical and scientific scenarios, where flexible, adjustable screens are likely to serve a very real and valuable purpose.

But above all, it encapsulates the very spirit of innovation - that is, defined in simplest terms, doing things differently. This is mostly looking at something we already do and asking that very same question: why? Why do we do it like that? Oftentimes the answer is underwhelming and, in such circumstances, considering whether a folding screen (or at least its metaphor) would help do things differently is a great way to bring innovation into our daily lives.

Q: I keep hearing innovation being referred to as ‘disruptive’ - what does this mean?

The term ‘disruptive innovation’ was first coined more than 20 years ago as a way of describing ‘innovation-driven growth’ but has become popularised in recent years as technological developments have increased the rate and scale of change.

Innovation is typically disruptive when it causes a radical shift - either in an existing industry or by creating an entirely new industry or market. 

Importantly, though, not all innovation is inherently disruptive; and the tipping point is probably the biggest cause of disagreement. In other words, just because a new way of doing things emerges - even if it causes considerable change - does not automatically mean it is disruptive. The originators of the term often cite Uber as not being disruptive: despite the wide-reaching impact it has had, Uber is nevertheless operating (and innovating) within the well-established taxi industry, not disrupting it.

Perhaps the simplest example is the humble smartphone. The iPhone, when it launched, was undoubtedly revolutionary; but within the mobile phone industry it was not intrinsically disruptive - it was, fundamentally, still a mobile phone. Over time, however, the smartphone was highly disruptive within the portable computing, handheld gaming and digital photography industries.

For industry in Guernsey and for Guernsey as a jurisdiction, innovation should be amongst the top three priorities on the board agenda.

The distinction between innovation and disruption is important, and one we must remain watchful over - either in terms of opportunities for disruption or, conversely, the threat of disruptors entering the market.

One need look little further than disruption in the content delivery industry - the Netflix vs Blockbuster effect - to understand just how important this is.

Q: How can Guernsey make better use of modern technology?

Innovation is not the exclusive preserve of technology startups - all it requires is an idea (or a problem to solve) and a bit of effort. Primary industries, such as finance, retail or tourism, and secondary industries, such as estate agency, recruitment or marketing, can all benefit from innovation.

For example, the new electronic price tags in the Co-Op in St Martins are a good example of an innovation that could, in theory at least, allow them to have dynamic, demand-led pricing as well as improved stock monitoring.

If Guernsey is destined to become the 'innovation island’ of its ambitions, though, it stands to reason that some industries should be leading the way in both the private and public sectors. Like it or not, adoption of new technology in the other island is considerably more advanced. The government there has, for example, embraced IoT to deliver projects such as real-time bus tracking, car park monitoring and air quality monitoring, and is rolling out more and more online services - which are not only of meaningful value to the community but have become part and parcel of the sales pitch of the island as a destination.

Guernsey could do more to embrace this. For example, the ‘please don’t sell your car here’ car park signs could be replaced by cheap IoT cameras and some image processing. Machine intelligence could be applied to traffic control to dynamically help with rush-hour congestion and the approach to road closures. The new electronic contracts legislation could be tested by enabling online trading of vehicle number plates.

The States has excellent infrastructure available through the Digital Greenhouse and the Guernsey Innovation Fund to stimulate innovation, but has had only limited success to date - due mostly, it would seem, to unrealistic expectations.

To lead the way, the States must become more entrepreneurial in its approach to innovation and accept that the chance of success comes bundled with the risk of failure. In other words, and to quote a famous Guernesiais saying: Bonnet de douche (Rodney).