Gatwick Airport embraces IoT and Machine Learning

©JMilstein

As the eighth busiest airport in Europe and the largest single-runway airport in the world, London Gatwick Airport is an essential fixture for international travellers

In an effort to keep up with the demands of the digital world, Gatwick has recently announced the modernization of its IT infrastructure, in partnership with Hewlett Packard Enterprise and Aruba.

Even though typical IT upgrades in airports take four years, Gatwick’s network was upgraded in just 18 months, all while avoiding downtime and instability. Work was completed overnight, with just a two-hour window for upgrades and two hours to roll back to the legacy network. Data links were limited in Gatwick’s old IT infrastructure, but the new network uses a cleaner meshed design providing up to 10 times more data connections. As new technologies continue emerging for consumers, the airport’s management, the airlines, and the businesses in the airport that rely on its infrastructure, Gatwick will provide a robust backbone.

We speak to its CIO, Cathal Corcoran, and Hewlett Packard Enterprise UK&I MD Marc Waters below.

Most busy international airports have several runways and ample real estate. Gatwick, on the other hand, must operate with a single runway and limited space. Maximizing efficiency is key to ensuring the airport is able to serve the needs of the UK. IoT-enabled heat sensors will track movement and how busy the airport is, allowing management to better utilize their resources and improve the passenger journey through the airport. Tracking data lets the airport handle logistical issues that can’t be solved through expansion, ensuring a smoother and more efficient experience for customers and a better business foundation for the airlines that operate at Gatwick.
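The aggregation behind that kind of occupancy tracking can be sketched very simply: count unique devices seen per sensor zone. The zone names and ping format below are invented for illustration, not Gatwick’s actual system:

```python
from collections import Counter

def busiest_zones(pings, top_n=3):
    """Rank zones by how many unique devices their sensors have seen.

    `pings` is a list of (zone, device_id) tuples; each device counts
    once per zone, approximating occupancy rather than raw traffic.
    """
    seen = {(zone, dev) for zone, dev in pings}   # de-duplicate repeat pings
    counts = Counter(zone for zone, _ in seen)
    return counts.most_common(top_n)

# Hypothetical readings from three terminal zones
pings = [("security", "a"), ("security", "b"), ("security", "a"),
         ("gate-12", "c"), ("duty-free", "d"), ("security", "e")]
print(busiest_zones(pings))  # security is busiest: 3 unique devices
```

A real deployment would add time windows and dwell-time estimates, but the core signal is exactly this per-zone count.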

World-Class WiFi

Smartphones, laptops, and entertainment devices have made the time-consuming process of air travel more tolerable and more productive, but serving such a large number of travellers in small spaces is a major challenge. Those in Gatwick can expect typical speeds of 30Mbps, providing plenty of bandwidth for working online or streaming video while waiting. Fast and stable WiFi also provides smoother operations for airlines and other companies, enabling them to focus on offering excellent and affordable service without having to worry about outages or sluggish speeds.

Experts are good at finding great ways to utilize limited resources, which is particularly important at Gatwick. When aided by IT, however, they can do even more. Machine learning can detect busy areas in the airport through smartphones, and tracking these results over the long term can provide key insights into optimizing day-to-day operations. When making decisions, Gatwick’s management will be aided by powerful data that can provide insights not attainable with more traditional technologies, and the new IT infrastructure will be key to this analysis. Facial recognition technology will boost security as well as track late passengers, and personalized services based on smartphones or wearable technology can provide valuable updates to travellers on a personal level.

©Paul Prescott

Dealing with lost baggage can be a time-consuming and often stressful process. Armed with its new IT infrastructure, Gatwick and its airline operators are poised to offer a better alternative. Being able to track luggage and its owners creates new opportunities for simplifying the check-in and baggage claim process, helping get travellers in and out of the airport in a prompt and seamless manner.

Around 45 million people travel through Gatwick each year, and the airport’s unique constraints make operation an ongoing challenge. However, new technology offers tremendous promise that will serve Gatwick well for passengers today, and robust infrastructure provides a solid foundation for testing and implementing new technology for years to come.

Just announced! Latest addition to HPE SimpliVity provides support for Microsoft Hyper-V


HPE recently announced the latest addition to the HPE SimpliVity hyperconverged portfolio — HPE SimpliVity 380 with Microsoft Hyper-V.

This new addition allows HPE to offer a wider range of options for customers’ multi-hypervisor environments — delivering a more holistic hyperconverged offering with added management, optimization, and intelligence options, and greater flexibility.

HPE SimpliVity 380 with Microsoft Hyper-V provides businesses with an easier IT infrastructure solution, simplifying the data center by converging servers, storage, and storage networking into one simple to manage, software-defined platform. The result is the increased business agility and economics of the cloud in an on-premises solution. The pre-integrated, all-flash, hyperconverged building block combines all infrastructure and advanced data services for virtualized workloads—including VM-centric management and mobility, data protection, and guaranteed data efficiency.

Below, HPE’s Thomas Goepel looks at the architecture that’s bringing Hyper-V to HPE SimpliVity.

HPE SimpliVity 380 now enables customers to deploy the industry’s most powerful hyperconverged platform with either Microsoft Hyper-V or VMware vSphere private clouds. The latest update was designed to provide customers with more multi-hypervisor support choices together with the benefits of HPE SimpliVity:

VM-centric management and mobility: HPE SimpliVity hyperconvergence enables policy-based, VM-centric management abstracted from the underlying hardware to simplify day-to-day operations and enable the secure digital workspace. It also provides seamless application and data mobility, empowering end users and driving increased productivity while increasing efficiencies and reducing costs.

Data protection: The HPE SimpliVity hyperconverged solution delivers built-in backup and recovery at no additional cost. These data protection features include the resilience, built-in backup, and bandwidth-efficient replication needed to ensure the highest levels of data integrity and availability, eliminating the need for legacy data protection products.

Data efficiency: HPE SimpliVity RapidDR builds on the inherent data efficiencies of HPE SimpliVity hyperconverged infrastructure, slashing recovery point objectives (RPOs) and recovery time objectives (RTOs) from days or hours to seconds, with a guaranteed 60-second restore for a 1TB VM.

Below, HPE’s Stuart Gilks explains why the move expands hypervisor choice for customer use-cases.

Citrix Ready HCI Workspace Appliance Program

In addition, HPE also extended its partnership with Citrix by integrating the HPE SimpliVity portfolio, including the new HPE SimpliVity 380 with Microsoft Hyper-V, into the Citrix Ready HCI Workspace Appliance Program. The program allows HPE and Citrix to set the standard for customers to easily deploy digital workspaces with multi-hypervisor, multi-cloud flexibility, resulting in world-class digital collaboration and a borderless, productive workplace.

About Chris Purcell

Chris Purcell drives analyst relations for the Software-Defined and Cloud Group at Hewlett Packard Enterprise. For more information about HPE SimpliVity 380 with Microsoft Hyper-V, click here. To learn more about how hyperconvergence can simplify your datacenter, download the free e-book, Hyperconvergence for Dummies.

To read more articles from Chris Purcell, check out the HPE Converged Data Center Infrastructure blog.


Blockchain: A Chaotic Ecosystem?


After a couple of years of speculation, Blockchain and cryptotechnology finally received substantial mainstream attention in 2017

Governments got on board, ICOs progressed at a breakneck speed, and the market cap of the entire industry skyrocketed accordingly. The keyword now driving many discussions in the domain is “mainstream adoption”, with investors, blockchain entrepreneurs, and mainstream tech companies trying to predict where the market is going to go.

It is an important time for the industry, since first-mover advantage is likely to set the scene for blockchain platform selection over the next 10 years. Just as Microsoft staked its claim to a massive user base with Windows, and Google amassed a fortune off the popularity of its search engine, the blockchain solutions that get a foot in the door of the new tech ecosystem will be set up for future dominance.

However, it is anyone’s guess how this will shake out. The industry now is quite frankly a mess, with tech incumbents slow to get up to speed while the blockchain-native projects are beset by controversy and uncertainty as well as a seemingly never-ending proliferation of new competitors holding ICOs every week. Here we will briefly outline the major platforms and protocols in the oncoming scrap to be the king of the blockchain jungle.

The crypto-native heavyweights: Ethereum, Ripple, Stellar, Hyperledger, (R3) Corda and others
Given the seemingly endless applications of blockchain technology, there is a broad and blurry bracket of crypto protocols offering similar solutions, ranging from smart contract development (Ethereum) to blockchain architecture (Hyperledger) to payment protocols (Ripple). Here we’ll sketch out what each does.


Ethereum has usurped Bitcoin in many ways as the flagship of the crypto industry. Ethereum uses protocols and a blockchain to enable users to create complex transactions and smart contracts, using a Virtual Machine to automate in a transparent and decentralised way much of what is run on proprietary servers now. With the launch of the Enterprise Ethereum Alliance, the world’s largest open source blockchain initiative, Ethereum is pulling ahead of the pack in the enterprise-friendly stakes.


One of the most notably successful platforms of late, Ripple have developed a cryptographic protocol to enable superfast settlement of cross-border financial transactions. It is in many ways a halfway house between crypto and fiat, using cryptographic protocols to enable fast international fiat money transfers. On 13 April, Santander released a money transfer app that uses the Ripple protocol, and the platform allows third-party developers to use its open source framework and blockchain to integrate fast payments into their solutions.

The Ripple platform also includes a token (XRP), which doesn’t actually have to be used in Ripple protocol based apps.


The Stellar protocol grew out of Ripple and is run by the Stellar Foundation. Like Ripple, it primarily targets international value transfers, except it is more focussed on transfers using its own token and on allowing third parties to process payments quickly and cheaply. Early users being targeted are non-profits, third-world organisations, and ecommerce. One interesting thing about the platform is IBM’s involvement, and it seems IBM will lean on Stellar for many of its blockchain solutions in the future.


The Hyperledger project created by the Linux Foundation is focussed more on the building blocks (so to speak) of ledger technology. Companies intending to create their own crypto solutions or private blockchains can utilise the Hyperledger protocol and open source stack as tools. Intel are a notable partner, having developed the Hyperledger Sawtooth protocol to help speed up blockchain transactions.


Created by R3 (a consortium of major financial institutions), Corda aims to provide a range of business-ready blockchain solutions mainly focussed on financial transactions but also applicable to supply-chain transparency and securities.

Tech companies getting in on the action

Notably absent, Google has had little involvement in blockchain, and neither has Facebook. While some skeptics consider this a negative sign for blockchain technology, others would say that the transparent nature of blockchain is not in the best interests of these companies. Regardless, there are some tech incumbents weighing into the domain.

Microsoft, meanwhile, are ramping up for the release of their Coco Framework on the Azure platform, which has garnered a lot of positive press for leveraging existing blockchain-focused consensus mechanisms and ticking all the control boxes enterprises require. Just after it was announced last year, Azure CTO Mark Russinovich spoke to TechNative about the move.

While they haven’t had much public involvement either, Amazon are working to make blockchain solutions like those mentioned above (especially Corda and Sawtooth) work seamlessly with AWS, unsurprisingly positioning themselves directly against Microsoft’s Coco.

The bottom line: Anyone’s game

While individual businesses have a broad range of options open to them for developing blockchain and ledger solutions, we are really in the early stages of this domain, where there are few head-to-head offerings or clearly defined markets. It will be interesting to see if established companies gain more traction with ledger technology or if the crypto-native platforms hold their market share in this nascent industry.

About the Author

Eoghan Gannon is a senior writer at TechNative, a cryptocurrency researcher, and an entrepreneur. His interests lie in how blockchain technology is changing business.

8 AI Use Cases Every CxO Should Know About


Artificial intelligence is changing from promising concept to a core technology businesses will need to use going forward

While reading about tools and frameworks can give C-level decision-makers ideas for incorporating the technology, it’s also worth exploring how others are making use of AI; after all, it’s important to close gaps created by your AI-adopting competitors. Here are a few AI use cases executives need to explore.

Fraud and Theft Detection

Dealing with fraud is all but inevitable when working in the financial sector, and retail stores assume some amount of theft as part of doing business. Classic monitoring is still essential for detecting fraud and theft, but AI systems can go a step further and detect problems while they’re underway. Modern AI systems are trainable: given one or more sets of data, they can detect patterns that are hard to spot using more traditional techniques. By feeding systems banks of legitimate and banks of fraudulent data, they can be trained to detect fraud as it occurs. Likewise, AI in retail can detect popular locations for theft and, in some cases, even determine the time when theft is most likely to occur. This data gives store owners and managers useful information for improved store monitoring.
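A minimal sketch of the training idea: given labelled banks of legitimate and fraudulent transactions, even a toy nearest-centroid classifier can flag suspicious activity. The features here (amount, hour of day) and all the data are invented for illustration; production systems use far richer features and models:

```python
def centroid(rows):
    """Mean point of a list of equal-length feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def train(legit, fraud):
    """'Training' here is just summarising each labelled bank of data."""
    return centroid(legit), centroid(fraud)

def is_fraud(tx, model):
    """Classify a transaction by whichever centroid it sits closer to."""
    legit_c, fraud_c = model
    return distance(tx, fraud_c) < distance(tx, legit_c)

# Invented examples: [amount, hour-of-day]
legit = [[20, 12], [35, 14], [15, 10]]
fraud = [[900, 3], [1200, 2]]
model = train(legit, fraud)
print(is_fraud([1000, 4], model))  # True: resembles the fraudulent bank
print(is_fraud([25, 13], model))   # False: resembles normal spending
```

The point is the workflow, not the algorithm: real systems swap the centroid for gradient-boosted trees or neural networks, but they are trained from labelled banks of data in exactly this way.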

The AI-Enabled Office

Today’s offices are filled with technology. However, it’s easy for employees to become overwhelmed with technology, and the urge to multitask can lead to poorer performance. AI-based voice assistants and robotic processes can let employees disconnect from monitors for a moment to complete tasks. AI can also serve a purpose in intra-office communication; AI can alert users when changes are made to collaborative works, and it can help route information to the employees who need it most. AI is already being used for time management systems, and it can help employees schedule their days more effectively. The smart office is coming along, piece by piece. For executives and others focused on employee productivity and satisfaction, it’s important to take a more holistic view of AI and find out how it can be integrated to create a truly modern workplace.

We recently spoke to Mihir Shukla, CEO and Co-Founder of one of the market leaders in robotic process automation, Automation Anywhere.

Customer Service

Telephone helplines of the past have a well-earned reputation for being hard to use, and their infamy has poisoned the well when it comes to automated assistance. However, AI is increasingly playing a role in customer service, and recent reviews of modern technology are far more positive. By reading through past interactions with customers, modern systems are better able to sort through human language and determine what the customer is asking for. In addition, these systems can improve over time, as each new interaction provides more data to improve performance. While some customers will always prefer to speak with a human, many people claim to prefer interacting with a chatbot or other automated interface. Although these bots will lack the so-called human touch for some time, they can access large databases that are easy to update instantly, which can lead to better overall assistance.


Recruiting and Hiring
Jobs throughout much of the developed world are becoming more intellectually demanding, and they’re becoming more specialized. Even though online recruiting and online resumes have helped companies connect with potential employees, the task of hiring the right person has become more of a challenge. AI tools are better able to sort through potential candidates and provide valuable heuristics to recruiters. Benefits don’t end after an employee is hired: AI systems are valuable tools for onboarding new hires, and they can even replace a number of human resources tasks. Companies don’t have to develop all of this technology internally, as a number of companies now provide AI-based recruiting services that can perform nearly everything up to an in-person interview, letting HR employees spend their time on other tasks.

Logistics Management

The value of computer technology is clear when it comes to logistics, and this is especially true when it comes to dealing with warehouses and shipping. As more and more customers expect free shipping, reducing logistics costs will be essential. Computerized inventory management and tracking is effectively mandatory in many fields, but AI is able to use this data to provide useful information and let companies find means of cutting costs. These cost savings tend to be incremental, but they add up over time. One area in which AI excels is finding counter-intuitive information. Shipping routes that no human or traditional computer system would even consider might have some unexpected benefits that machine learning can uncover. Furthermore, there may be subtle annual trends that are easy to miss in the data, and AI systems might be able to uncover unforeseen yet reliable trends.
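Those subtle annual trends can fall out of even a simple monthly profile over historical shipping records. The records below are invented to illustrate the shape of the analysis:

```python
from collections import defaultdict

def monthly_profile(shipments):
    """Average shipping cost per calendar month from (month, cost) records."""
    totals, counts = defaultdict(float), defaultdict(int)
    for month, cost in shipments:
        totals[month] += cost
        counts[month] += 1
    return {m: totals[m] / counts[m] for m in totals}

def peak_month(shipments):
    """The month with the highest average cost, i.e. the seasonal hot spot."""
    profile = monthly_profile(shipments)
    return max(profile, key=profile.get)

# Invented records spanning two winters: December costs run hot every year
shipments = [(11, 100.0), (12, 180.0), (1, 95.0),
             (11, 110.0), (12, 170.0), (1, 105.0)]
print(peak_month(shipments))  # 12
```

Machine-learning systems generalise this idea to many dimensions at once, which is where the counter-intuitive route and timing findings come from.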


Cybersecurity

The internet is invaluable for businesses of all sizes. With the power of internet connectivity, however, comes a number of risks, as hacks can be expensive and lead to lost customers and a drop in reputation. The fundamentals of cybersecurity haven’t changed with the dawn of AI, but AI-based tools are better able to detect threats as they occur and take preemptive steps to halt would-be attackers. AI tools are also able to audit systems and find bugs that humans might miss. It’s worth noting that hackers are often among the first to adopt new technology, and it stands to reason that AI will lead to more sophisticated attacks in the coming years. It’s important to first focus on basic security, but executives should explore if AI-enabled cybersecurity is worth the investment.

Callsign’s Intelligence Driven Authentication (IDA) protects identities and defends against data breaches. We recently spoke to their commercial VP Daniel Grimes.

Energy Management

Computing has long assisted energy management, from power stations down the lines to customers’ homes and businesses. AI provides even better capabilities for detecting power usage patterns and ensuring data is routed properly. Multiple plants can share data to optimize their operations, as can regulators. Customers can benefit as well: AI-derived data can inform customers about their energy usage and help them make more informed decisions about their consumption. For C-level decision-makers, energy management serves as a shining example of how cooperation, even among competitors, can make better use of resources. Two companies depending on shared resources can often save on expenses by combining their efforts to provide faster and more precise resource allocation. Similarly, providing customers with the kind of transparent information AI is good at delivering can serve as a win-win for both parties.

Leveraging IoT Technology

The software dealing with IoT devices is complex, and many of the most popular tools rely on AI and machine learning to provide base functionality. However, AI can do so much more when fed data from IoT devices. Some IoT devices will inevitably fail, and AI algorithms can detect when a device is likely to break in the near future, allowing for preventative maintenance. Furthermore, AI can detect use modes that lead to bugs and other problems, enabling companies to request software fixes. Perhaps the most powerful technique, however, is to let AI roam free across IoT data and look for correlations. While correlated data doesn’t necessarily show causation, it’s often the case that unexplored metrics have more of an impact than initially imagined. Machine learning, in particular, can be great for finding unexpected knowledge buried within volumes of IoT-generated data, and it’s worth exploring.
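The preventative-maintenance idea can be sketched in a few lines: fit a trend line to a device’s recent sensor readings and flag it if the trend is projected to cross a failure threshold. The readings, horizon, and threshold below are invented for illustration:

```python
def slope(values):
    """Least-squares slope of evenly spaced readings."""
    n = len(values)
    mx = (n - 1) / 2
    my = sum(values) / n
    num = sum((x - mx) * (y - my) for x, y in enumerate(values))
    den = sum((x - mx) ** 2 for x in range(n))
    return num / den

def needs_maintenance(temps, limit, horizon=24):
    """Flag a device whose temperature trend is projected to cross
    `limit` within the next `horizon` readings."""
    projected = temps[-1] + slope(temps) * horizon
    return projected >= limit

print(needs_maintenance([60, 61, 62, 63, 64], limit=85))  # True: rising trend
print(needs_maintenance([60, 60, 60, 60, 60], limit=85))  # False: stable
```

Production systems replace the straight line with models trained on the failure histories of thousands of devices, but the decision they feed is the same: service the device before it breaks.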

AI today is far less human-like than experts predicted in the mid-20th century. But while we haven’t quite reached the singularity yet, hardware has become many orders of magnitude faster and cheaper, and the economic benefits of AI are making it a red hot field for research. While we’re just at the beginning of what many believe is an AI revolution, companies of all sizes need to take note of how the technology is advancing and how others are leveraging it. For C-level decision-makers who prefer to defer decisions to others, it’s time to personally take a closer look at what AI can do.

IoT’s Growing Security Challenge


The Internet of Things holds tremendous promise. However, having so many connected devices presents significant security issues, and companies that fail to properly secure their devices place themselves at tremendous risk

In their latest report, entitled “The Internet of Things (IoT): A New Era of Third-Party Risk,” the Ponemon Institute found that companies are acutely aware of the potential risks of IoT adoption. In total, 97 percent of respondents believed a data breach or cyberattack caused by an unsecured IoT device could be catastrophic. This comes in an era where IoT technology is growing rapidly and attacks are becoming more common. In fiscal year 2017, 15 percent of companies reported experiencing a data breach due to unsecured IoT devices; this number jumped to 21 percent in 2018. Similarly, cyberattacks caused by IoT devices rose five percentage points, from 16 to 21 percent.

Although awareness of the potential dangers posed by IoT devices in general, and third-party devices specifically, has risen significantly, security practices have been slow to increase. Only 29 percent of survey respondents claim to actively monitor IoT devices used by third parties. This stands in contrast to further fears respondents expressed, as 81 percent of respondents believe their company is likely to be a victim of a data breach caused by improperly secured IoT devices, and 82 percent believe IoT devices will lead to cyberattacks, including denial-of-service attacks.

© 2018 Ponemon Institute and The Santa Fe Group

In all, the report conveyed a mixed state of affairs regarding third-party IoT risk and IoT security management in general. Although approximately two-thirds of those surveyed believe setting a strong tone on security from the top is important, 58 percent also stated that it is currently impossible to determine whether IoT and third-party safeguards are appropriate. Furthermore, 53 percent of those surveyed rely on contracts, while 46 percent already have policies allowing them to disable devices that might pose a security threat. However, fewer than half can monitor compliance in these scenarios, which can increase the risk of data breaches and cyberattacks.

Perhaps most surprising was how frequently companies fail to inventory their IoT devices. Among the 56 percent of those surveyed who don’t keep an inventory of their IoT devices, 88 percent blamed the lack of centralized tools for managing IoT devices. In all, less than 20 percent of respondents are able to identify a majority of their IoT devices.

Findings in the report indicate that companies can improve their security by treating third-party IoT devices similarly to their internal devices. Although 50 percent of respondents report monitoring their internal IoT devices, only 29 percent monitor third-party devices. There were also signs of progress: The number of respondents who require third parties to identify IoT devices connected to their network increased from 41 percent to 46 percent.

So how do businesses offset the risk and ensure their adoption of IoT devices is secure? Properly defending the IoT requires using multiple technologies. Here are some of the most popular.

Traditional Networking Security

In some cases, tried and true solutions are still the best. IoT devices often connect through the public internet, and standard network security should be used wherever the IoT and public internet meet. Firewalls can prevent a broad class of attacks, and antivirus and antimalware products can determine if devices have been affected. One of the benefits of combining the IoT with the public internet is the maturity of security suites. Cisco, for example, has decades of experience crafting appropriate tools that are great options for securing IoT devices. Intrusion detection technology is especially useful, as it has the potential to detect attacks before they can cause harm.

Symantec EMEA CTO & VP Darren Thomson sees traditional security methods as “absolutely valid” in the age of IoT. Below he explains why security should be front of mind when making IoT buying decisions.


Encryption

Again, tested technology is often the most effective, and encrypting data can prevent many popular attacks. All data should be encrypted as it’s sent over any network, even proprietary and internal networks. However, it’s also worth encrypting data as it’s being used on the device itself, as preventing physical access to extensive IoT networks can be difficult. Not all data sent over IoT infrastructure is critical; sensor data relating to the temperature of hardware, for example, doesn’t present a risk. However, any data leaked to hackers can present further risks, so leave no data unprotected. Major vendors, including HPE, have extensive experience with encryption. Symantec also offers a number of encryption tools.

Public Key Infrastructure Security

When it comes to control, PKI security is a step above other available options. Although it will take some time to set up, baking PKI security into an IoT installation will provide a degree of assurance unmatched by more ad hoc solutions to security. Furthermore, PKI technology is mature, and popular implementations, including X.509, have proven successful in critical environments. Fortunately, there are ample options for PKI security on IoT networks, and vendors are already familiar with the technology. DigiCert and Entrust Datacard are well known in the field, and large vendors including HPE and Symantec can provide proven solutions as well.



Patching

We sometimes fall into the trap of thinking IoT devices are simpler than they really are. Nearly all IoT devices are full-fledged computers capable of general purpose tasks, and their processors are often surprisingly powerful. The operating systems on IoT devices are substantial in size, and they’re likely to have bugs on occasion. Furthermore, the software they run can also potentially be hijacked. Because IoT networks are so large, releasing patches for all devices can be especially difficult, but it’s essential for preventing your infrastructure from becoming a zombie network. Make patching capabilities a key element of your network from day one to avoid potentially expensive problems down the line.
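One common way to build that patching capability in is a staged rollout: push the update to a small canary group first, then to the rest of the fleet in fixed-size waves so a bad patch can be halted before it reaches every device. The fractions and wave sizes below are illustrative defaults, not a standard:

```python
def rollout_waves(device_ids, canary_frac=0.05, wave_size=100):
    """Split a device fleet into update waves.

    The first wave is a small canary group; if its devices stay healthy
    after patching, the remaining devices follow in fixed-size waves.
    """
    n_canary = max(1, int(len(device_ids) * canary_frac))
    canary, rest = device_ids[:n_canary], device_ids[n_canary:]
    waves = [canary]
    for i in range(0, len(rest), wave_size):
        waves.append(rest[i:i + wave_size])
    return waves

fleet = [f"device-{i}" for i in range(1000)]
waves = rollout_waves(fleet)
print(len(waves[0]), len(waves))  # 50-device canary, then waves of 100
```

Pausing between waves to check health metrics is what turns a mass update into a safe one.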

System Monitoring

The scalability of IoT technology accounts for much of its appeal, but it’s easy to lose track of just how large the network is and how many devices it contains. Because of this, visualization software has become a critical part of managing IoT networks. When picking tools for monitoring performance and visualizing the network, it’s worth choosing tools that provide security feedback as well. Solutions from Kaspersky Lab, SAP, and others can use machine learning and other technology to detect potential threats and send alerts, letting you respond in a timely manner. IoT protection requires vigilance, and 24/7 monitoring is essential.

Focus on the APIs

Although lower-level security is key to designing a safe network, higher-level security is important as well. The most popular area to focus on is the APIs used to protect data while it’s in transit. Again, encryption plays an important role, but API-level security provides another means of ensuring data isn’t rerouted or compromised while in use. Furthermore, APIs provide a means for developers to make their software secure from the beginning, so programming mistakes, which are inevitable, are far less likely to lead to intrusions. There are ample options to choose from when picking an API, including options from Google, CA Technologies, and Akana.
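As a small sketch of API-level protection, many APIs sign each request with a shared-secret HMAC so the server can reject anything tampered with in transit. The scheme below is a generic illustration, not any particular vendor’s API:

```python
import hashlib
import hmac

def sign_request(secret: bytes, method: str, path: str, body: bytes) -> str:
    """HMAC-SHA256 signature over the request's method, path, and body."""
    message = method.encode() + b"\n" + path.encode() + b"\n" + body
    return hmac.new(secret, message, hashlib.sha256).hexdigest()

def verify_request(secret: bytes, method: str, path: str,
                   body: bytes, signature: str) -> bool:
    """Recompute the signature server-side and compare in constant time."""
    expected = sign_request(secret, method, path, body)
    return hmac.compare_digest(expected, signature)

secret = b"shared-device-secret"
sig = sign_request(secret, "POST", "/v1/telemetry", b'{"temp": 21}')
print(verify_request(secret, "POST", "/v1/telemetry", b'{"temp": 21}', sig))  # True
print(verify_request(secret, "POST", "/v1/telemetry", b'{"temp": 99}', sig))  # False
```

Real API gateways add timestamps and nonces to the signed message to block replayed requests, but the verify-before-process pattern is the same.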

The Internet of Things is transforming companies of all sizes, and this transformation shows no signs of slowing. With the opportunities provided by the IoT, however, comes a set of new obligations, as the many potential attack vectors on IoT infrastructure will prove tempting for malicious actors. Fortunately, there are plenty of options available for ensuring your IoT infrastructure is secure, and investing in security early on can allow you to reap the benefits of the IoT while keeping your critical data safe.

How Will Quantum Computing Impact Cyber Security?


Quantum computing is not an incremental improvement on existing computers

It’s an entirely new way of performing calculations, one that can solve problems in a single step that would take traditional computers years or even longer. While this power is a boon in a number of fields, it also makes certain types of computer security techniques trivial to break. Here are a few of the ways quantum computing will affect cybersecurity and other fields.

Today’s Security

Cryptography powers many of today’s security systems. Although computers are great at solving mathematical problems, factoring especially large numbers can be effectively impossible for even the most powerful computers, with modern algorithms requiring decades or even longer to crack. The nature of quantum computing, however, means that cryptography based on factoring numbers will be effectively useless.
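To make the factoring point concrete, here is a toy RSA round trip using the textbook primes 61 and 53. Real keys use primes hundreds of digits long, and the scheme’s security rests entirely on how hard it is to factor n back into p and q, which is precisely the step a quantum computer running Shor’s algorithm makes easy:

```python
def make_keys(p, q, e=17):
    """Build a toy RSA keypair from two primes (Python 3.8+ for pow(e, -1, m)).

    Anyone who can factor n = p * q can recompute phi and hence the
    private exponent d, so breaking RSA reduces to factoring n.
    """
    n = p * q
    phi = (p - 1) * (q - 1)          # Euler's totient of n
    d = pow(e, -1, phi)              # private exponent: modular inverse of e
    return (e, n), (d, n)            # (public key, private key)

def encrypt(m, public_key):
    e, n = public_key
    return pow(m, e, n)

def decrypt(c, private_key):
    d, n = private_key
    return pow(c, d, n)

public, private = make_keys(61, 53)   # n = 3233: trivially factorable on purpose
ciphertext = encrypt(65, public)
print(decrypt(ciphertext, private))   # 65: the round trip recovers the message
```

With n = 3233 any laptop can recover p and q instantly; with a 2048-bit n, classical computers cannot, and that gap is exactly what quantum computing threatens to close.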

Fortunately, many cryptography approaches in use today are designed to be safe from quantum computers that haven’t yet been built. Businesses, government agencies, and other entities that place a high priority on security don’t necessarily need to switch to quantum-safe approaches just yet, but it’s important that organizations are able to make the transition promptly should quantum computing technology develop faster than anticipated. It’s also worth noting that other forms of security won’t be affected by quantum computing. Two-factor authentication, for example, will be just as effective.

Tomorrow’s Security

The basics of quantum computing sound almost unbelievable, but they’re grounded in well-established science and mathematics. Modern computers rely on discrete values: a bit is either a 0 or a 1. Quantum computers, on the other hand, are able to store both of these possibilities simultaneously in what are called qubits, and the value only resolves when it is observed.
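The idea can be sketched with a little classical arithmetic (a simulation on an ordinary computer, not quantum hardware): a qubit’s state is a pair of amplitudes, and measurement yields 0 or 1 with probabilities given by the squared amplitudes. The function names here are illustrative.

```python
# Minimal sketch: a qubit as a pair of amplitudes (alpha, beta).
# Measurement gives 0 with probability |alpha|^2 and 1 with |beta|^2.
import math

def hadamard(state):
    """Apply the Hadamard gate, which puts a basis state into superposition."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

def probabilities(state):
    alpha, beta = state
    return (abs(alpha) ** 2, abs(beta) ** 2)

zero = (1.0, 0.0)              # the classical bit 0
superposed = hadamard(zero)    # holds both outcomes until observed
print(probabilities(superposed))  # roughly (0.5, 0.5)
```

One qubit simulated this way is trivial; the power comes from the fact that n entangled qubits require tracking 2^n amplitudes, which quickly overwhelms any classical machine.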

Combine this with the equally baffling concept of quantum entanglement, which allows qubits to remain linked no matter how far apart they are, and quantum computing opens the door to cryptography techniques that are theoretically unbreakable. No matter how much computing power is dedicated to attacking quantum-based security implementations, they’ll still provide a safe conduit for data. With certain implementations, keys used to encrypt data will instantly stop working if anyone attempts to uncover them, leading to inherent security.

Quantum Arms Race

The ability to defeat common security implementations makes quantum computers a goal for intelligence agencies. Anticipating their eventual invention, many intelligence agencies are believed to be intercepting and storing traffic that can’t yet be cracked, in the hope of decrypting it once the technology matures. The first agencies to gain access to quantum computing power will have a substantial edge on their counterparts in other nations, and news of quantum computing successes will spur further investment elsewhere.

Unlike the development of weapons, however, there are also commercial and academic interests in quantum computers, so developing an arms treaty seems unlikely. Furthermore, non-government entities can likely gain access to quantum computers as well, presenting even more risk for compromising data. These threats underline the importance of ensuring new security measures are able to handle existing computers as well as potential quantum computers.

Who Will be the First?

The first intelligence agency with access to a quantum computer will gain a significant edge, and the first company with quantum computers for sale will stand to gain tremendously. Some of the names are long-time staples, including IBM, which has made slow but steady progress toward quantum computing over the years and expects major advances during the next decade. Another big name is Microsoft. We recently spoke to their senior technologist Rob Fraser about the transformative impact of quantum computing.

While other companies are attempting to build quantum computers, IBM seems most likely to be the first to succeed.

However, it’s important to appreciate the role academia is playing. Quantum computing is built on quantum mechanical theory, a field in which typical hardware engineers have no experience. Contributions have come from academic institutions with a strong history in technology, including MIT and Harvard. The perplexing nature of quantum mechanics, which is difficult to comprehend even for the world’s leading researchers, means that development will always rest largely on theory and not just engineering.

What to Expect

Although quantum computers can perform tasks that are impossible or impractical on standard computers, they may never replace the typical computer architecture we’re used to. Quantum computers in development today are incredibly sensitive to their environment, and no straightforward engineering solution to this sensitivity has emerged. Furthermore, the calculations quantum computers excel at aren’t especially useful for everyday computing tasks.

However, quantum computing will lead to scientific advances that can benefit society at large, and it may play a role in internet infrastructure, potentially improving performance. Although there may not be a quantum computer in every home, the impact of quantum computing will be substantial if, much like quantum mechanics itself, unpredictable.

Toshiba Combines Edge Computing With Wearable Technology

Toshiba recently announced the release of new technology designed to combine the power of edge computing with assisted reality

Edge devices are typically viewed as self-contained computing devices that offer little user interaction, but Toshiba’s portable dynaEdge DE-100 pairs with the AR100 Viewer smart glasses to combine the power of edge computing with wearable technology.

Use Cases

Toshiba’s new technology offers excellent portability. The smart glasses are easy to wear, and the computing device is lightweight and simple to slip into a pocket. Although it’s being marketed as a general purpose device, Toshiba is placing an early emphasis on manufacturing and maintenance, fields that require workers to move around large areas. Logistics is a focus as well, as the camera, especially when combined with GPS, Wi-Fi, and Bluetooth capabilities, allows for better tracking and management than more traditional solutions. With up to 6.5 hours of battery life, the device lasts for most of the working day, and the battery can be swapped out if needed. Training is likely to be a popular use case as well, as the glasses can deliver information while keeping the user’s hands free.

We spoke to David Sims from Toshiba to find out more about the device.

Powerful Capabilities

There’s plenty of power in Toshiba’s new device. With a 6th generation Intel Core vPro processor, up to sixteen gigabytes of RAM, and 512 gigabytes of SSD storage, this portable device can perform complex tasks that go beyond the capabilities of many edge computing devices. Furthermore, it runs Windows 10, a first for an edge device of its type, providing excellent software compatibility. Skype for Business is supported, making the DE-100 a great device for communicating. Other Windows applications can be used as well, which might let some workers who were dependent on laptops use their smart glasses instead.

Also supported is the Toshiba Vision DE Suite, which allows frontline workers to share files in addition to live video and images, making collaboration easier even across a large area. Toshiba’s partnership with Ubimax, a market leader in enterprise software for smart glasses, makes the product compelling for ever deeper integration into a company’s workflows.

The future of retail still lies in the store — it just needs an IT refresh


The last few years have been tough for the retail industry, and 2018 hasn’t seen the situation improve

In the UK we have already seen large companies such as Toys R Us and Maplin fall into administration, resulting in significant store closures and job losses.

This has led both industry experts and armchair pundits to question the future of the retail store. Many have predicted ‘the death of the high street’.

Despite the ever-increasing portion of retail that is gobbled up by e-commerce, to say that the high street will soon be obsolete is clearly an oversimplification. In fact, the future of retail will still be centred around the physical store, but its role and function will need to evolve.

Recent research found that physical retail will still account for 80 per cent of sales globally by 2025, an indicator that consumers still see huge value in high street shopping. However, the in-store demands and expectations of these consumers are changing quickly, and so retailers need to keep up if they are to remain healthy and competitive.

Ultimately, retailers need to take advantage of innovative technologies in order to deliver an in-store customer experience that drives engagement and loyalty. This is backed up by research from Vista which found that more than two thirds (67%) of consumers believe retailers should be taking advantage of technologies such as AI, AR and VR. The challenge, however, lies in embracing these technologies while simultaneously delivering cost savings.

Despite the best intentions of many retailers, they often face a common stumbling block in their journey towards the ‘store of the future’ — their existing store IT infrastructures. Many of these infrastructures are preventing the innovative in-store experience customers now expect, and so there needs to be a significant shift in IT approach if retailers are to realise their ambitions.


If stores are to flourish in the future, they require agile IT infrastructures that can deliver the applications on which new services are built quickly and cost-effectively, without fuelling the costly hardware proliferation seen in some stores. These infrastructures also need to be secure, easy to support and financially viable: IT needs to deliver cost savings today, as well as a platform for innovation.

Many current retail IT infrastructures are not optimised for the technologies customers now expect to see in-store, cannot easily be integrated to deliver personalisation across in-store and online channels, and often generate unnecessary and untenable costs.

What’s required is a focus on the ‘retail edge’ – transforming in-store technology to become cost effective, manageable and agile. This ‘retail edge’ approach enables the distributed store estate to be managed as an integrated whole — one that can run all required applications and experiences, and be deployed, upgraded, secured and supported as one entity from a central point, rather than as a collection of disparate and dedicated in-store systems with separate management and support needs.

This implies powerful, distributed, virtualised in-store technology that can securely run multiple applications, peripheral hardware and more. And while this technology delivers many of the flexibility and cost benefits of traditional cloud architectures, it needs to be in-store rather than in-cloud: the dependency on cloud availability and its in-built latency is simply too high a risk for retailers to accept, particularly for POS and other related peripherals. At the heart of such an approach must be intelligent automation technology that enables the control and updating of IT across the entire retail estate, simplifying previously complex IT tasks and ensuring a consistent and secure IT environment.


For physical stores to reach their full potential in today’s market their role needs to be redefined, and this cannot be achieved without a redefinition of the technology landscape. Old alternatives of “run it all from the cloud”, or “keep the old stuff working” aren’t enough anymore. Fortunately, managed, distributed and virtualised in-store solutions provide the way forward and most importantly, retailers are not forced into an expensive ‘rip and replace’ buying cycle. There are many retailers now taking a ‘step by step’ approach to digitising the physical store with their legacy IT and achieving significant commercial benefits that have an immediate impact on the bottom line while providing a more innovative customer experience.

About the Author

Nick East is CEO at Zynstra. Zynstra’s intelligent infrastructure is transforming edge computing for retailers. Purpose-built for retail, it delivers high reliability automation.

IoT in Healthcare: Balancing Patient Privacy & Innovation


Medical technology tends to lag behind other technologies, as the cost of mistakes at medical practices and hospitals can be astronomical

As a result, the field can be slow to adopt the latest digital and IoT technologies. Patient privacy is a major issue, so every new technology must be adopted carefully while adhering to the various data compliance obligations that apply to companies in general and to healthcare organisations specifically.

Streamlining Hospitals and Clinics

Managing clinics and hospitals is complex, and it’s expensive. Many healthcare organizations rely on multiple computer and networking systems. Through smart bracelets, administrators can better track patient movement and determine how often patients meet with their doctors. In addition, IoT technology can make it easier to track and analyze patients’ vital signs and other metrics, offering invaluable feedback and resolution not possible with manual measurements. Successfully managing a healthcare center is about maximizing resources, and IoT data and its analysis will prove to be invaluable.

More Time at Home

Occupied beds in a hospital are expensive, and most patients would rather spend their time at home. Through the IoT, doctors and other healthcare providers can receive real-time feedback on various medical metrics in patients regardless of their location. Sensors for patients with diabetes, for example, can provide instant feedback if certain levels are too high. Furthermore, IoT devices, when combined with artificial intelligence and deep learning, can even predict if individuals are likely to encounter problems in the near future, leading to more prompt and effective intervention.
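The diabetes example above can be sketched in a few lines. This is a hypothetical illustration of how a sensor feed might be screened against thresholds; the limits, function names, and readings are invented for clarity and are not clinical guidance.

```python
# Hypothetical sketch: screening a real-time glucose feed against
# care-team thresholds. Values are illustrative, not clinical guidance.

GLUCOSE_HIGH = 180  # mg/dL -- example threshold set by the care team
GLUCOSE_LOW = 70    # mg/dL

def check_reading(mg_dl):
    """Classify a single sensor reading so the care team can be alerted."""
    if mg_dl > GLUCOSE_HIGH:
        return "alert: high"
    if mg_dl < GLUCOSE_LOW:
        return "alert: low"
    return "ok"

readings = [95, 110, 210, 65]  # simulated stream from a wearable sensor
for r in readings:
    print(r, check_reading(r))
```

In practice this logic would run on the device or a nearby gateway, with the predictive layer the article mentions built on top of the accumulated readings.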

Covering the Costs

Technological costs are unavoidable in healthcare. Fortunately, many hospitals and clinics have moved to cloud-based approaches over the years, and these systems tend to be better suited to IoT technology than older infrastructure. IoT devices are relatively inexpensive, and hospitals can conduct small trials to determine their effectiveness. Upgrade cycles are common when budgeting for medical care, so folding IoT adoption into the next refresh period can make the costs more affordable and, in some cases, cheaper than maintaining current systems.

Security Demands

In the IoT context, some leaked data is fairly insignificant; if a farmer’s temperature sensors are hacked, for example, the impact is minimal. In the medical field, however, any patient data accessed illicitly can lead to lawsuits and other sanctions. Much of the slow adoption of the IoT in healthcare comes down to ensuring devices are hardened against attacks and meet stringent regulations. Data storage is central to compliance requirements in healthcare, but as Shannon Hulbert, CEO of Opus Interactive, points out, “accessibility is key”.

Established Companies and Startups

Established technology entities have been somewhat slow to provide medical IoT services. However, Amazon, Google, and other cloud providers now offer services for healthcare centers, pharmacology entities, and biotech companies. Startups are playing a major role as well, and many are using smartphones patients likely already have. Orbita brings healthcare services to digital assistants, including Amazon Echo and Google Home. Even relatively simple apps can provide significant benefits. Sunshine, for example, can track how often a patient is outdoors, which is useful for treating many conditions.

The high stakes involved in the medical industry mean organizations must take a conservative approach when adopting new technology. As the IoT continues to prove invaluable in other sectors, however, organizations are taking note. While the fully connected hospital will take some time to come to fruition, patients can expect to see their healthcare centers evolve over the coming years.

HPE announces general availability for HPE OneSphere

Developers in office

Hewlett Packard Enterprise (HPE) recently announced the general availability of HPE OneSphere, the industry’s first SaaS-based multi-cloud management solution for on-premises IT and public clouds

Delivered as a service, HPE OneSphere provides users with a single point to access all their applications and data from anywhere. HPE OneSphere is unique in that it provides a unified view of your hybrid IT estate—consolidating IT operations, developer tools, and business analytics in a single, collaborative platform.

Designed to bring collaboration across the entire enterprise, HPE OneSphere helps IT management, developers, and lines of business accelerate digital transformation.

With HPE OneSphere, IT managers simplify operations through visibility and automation. They can view their entire hybrid IT environment, from a variety of public clouds to traditional on-premises IT, across containers, VMs and even bare metal. Automation and a unified portal for managing resources and applications from anywhere deliver a low-ops experience, giving IT more time to pursue new, strategic tasks that can help grow the business.

HPE OneSphere also enables fast app deployment. With it, developers get self-service access to curated tools, templates, and resources. IT can also provide unique project workspaces for workgroups, projects, or individuals, streamlining DevOps workflows.

Business executives can make better business decisions because of increased transparency. Real-time, cross-cloud insights enable CIOs and lines of business to increase resource utilization and reduce costs, improving efficiency across the board.

To learn more about HPE OneSphere, register for the live demo webinars at

*This initial release will be available in North America (US and Canada), the United Kingdom, and Ireland and will be extended to other regions with future releases.

About the Author

Chris Purcell drives analyst relations for the Software-Defined and Cloud Group at Hewlett Packard Enterprise. HPE has assembled an array of resources that are helping businesses succeed in a hybrid IT world. Learn about HPE’s approach to managing hybrid IT by checking out the HPE website, HPE OneSphere. To read more articles from Chris Purcell, check out the HPE Converged Data Center Infrastructure blog.
