By Molly Wasonga

Technology has revolutionised the way every facet of the world operates. Driving this revolution is data, without which, none of the changes we’ve seen in the last few decades would have been possible.

The common maxim today is that data is more valuable than oil, as Rezelde Botha, Business Unit Manager for data and analytics at Axiz, SA’s leading value-added ICT distributor, points out. This commodity is driving all major business decisions.

“Data is vital to helping businesses better understand their customers’ needs, as well as what is happening in the markets they operate in. It is also the key to understanding what drives our customers’ purchasing decisions, how successful our marketing efforts were, and more importantly, what we need to be doing better,” she says.

She adds that there is no question that in today’s world, data is indispensable. For any business, having the ability to gather its data, and analyse it to gain insights that can help the company make strategic decisions, is critical. However, the amount of data is growing exponentially, and organisations across the board are bursting at the seams, making this a highly complex and onerous task.

Data is big and it’s only going to get bigger, Botha says: “Consider the number of Internet of Things and other connected devices now producing and capturing data – this number has exploded over the last decade. The data we accumulate every second has reached monumental proportions, and it is growing exponentially each day.”

Big data, she notes, is made up of three Vs, namely volume, velocity and variety. “In this way, organisations in every industry need to deal with data that is huge (volume), is growing rapidly (velocity), and comes from a wide range of sources (variety).

“And of course, with each ‘V’ comes a unique set of challenges. In this day and age, I believe velocity is the greatest challenge, as new data is flooding in so rapidly that there simply isn’t enough manpower or time to deal with it manually. Data is outpacing our ability to manage it.”

“This is where big data analytics can be enormously beneficial to any business. These tools are purpose-built to examine vast amounts of data to pinpoint hidden patterns, correlations and many other insights,” adds Botha.

The big data and analytics tools we find today give organisations the ability to analyse their data and get almost instant answers from it, something that would never have been possible before. Big data analytics enables businesses to harness the power of their data and use it to identify new opportunities. In turn, this leads to smarter business decisions, more efficient operations, cost savings and an improved customer experience.

“These technologies bring dramatic cost advantages when it comes to storing vast quantities of data and, while doing so, can highlight more efficient ways of doing business. These tools help businesses analyse their data immediately and make decisions based on what it tells them,” says Botha.

With this, a business has a much better ability to measure customer needs and satisfaction, and through identifying patterns of behaviour, can give customers exactly what they want, and design products and solutions that meet their needs, she adds.

“There’s no question that data can be a competitive differentiator for the world’s top businesses as they look to transform themselves digitally too,” says Botha. “Data and analytics are fuelling every business’s digitisation and transformation efforts, and these tools are being used to stay ahead of the competition. In fact, analyst firm Gartner has predicted that within three years, a staggering 90% of business strategies will cite information as a critical business asset and analytics as a non-negotiable competency.”

Any organisation’s ability to compete in the emerging digital economy will require the right decisions to be made, and at speed. These decisions will need to be predictive, and look to the future too.

At the end of the day, Botha says that businesses of every size, and in every industry, that are not investing in the way they manage their data to move into the technological future are at risk of becoming organisations of the past.

“Businesses that wish to succeed and remain ahead of the curve need to constantly strive for operational excellence. Staying relevant and improving all the time is key to innovating and improving products, processes and services by boosting efficiency and increasing quality. It is this ongoing effort that drives a competitive advantage for companies that get it right, but consistency is impossible to achieve without the help of data and analytics tools,” she concludes.

Path to 4IR paved by SMEs

Source: IT Web

South Africa is poised to become a smart nation, as described recently by the acting CEO of the State Information Technology Agency (SITA) Ntutule Tshenye: delivering digital services to all citizens, introducing innovative technologies that make government business processes smoother and finding the best solutions to ICT challenges in public service delivery.

This is the vision for the public sector, but it appears across the country’s socioeconomic picture. Businesses are adapting to the new possibilities and citizens are coming to grips with the benefits offered by digital technologies.

Yet, while the country is on the right trajectory, it’s still got a long way to go. From realising digital strategies in a tough economy to closing the digital divide, the promises of the industry 4.0 era will only pay dividends once more has been accomplished. SME technology providers are particularly well-placed to get us there. But this requires more support and development for them.

“It’s important to look at avenues to activate the South African economy,” says George Masemola, Business Unit Manager at Axiz. “SMEs are widely regarded as the growth engines for modern economies. They can increase employment and grow GDP. But they must be capacitated with skills and other forms of support.”

SME growth engine

Faith in SMEs is well placed. Economies featuring strong growth almost inevitably have a vibrant SME community driving it. Globally, 95% of enterprises are SMEs and they are responsible for up to 70% of formal employment. But South Africa lags behind these trends: its SME sector employs 47% of the workforce and contributes around 20% to SA’s GDP – a figure that again lags well behind international norms.

For many, this represents an opportunity, one that will require different types of support. The most cited barrier is the policy environment, says Masemola: “State policy must support SMEs. There are already some policies, but those aren’t enough. There also needs to be better execution of policies. But we are seeing improvements under the current Presidential administration, such as rising seven spots in the WEF competitiveness ranking.”

Yet he doesn’t want this to come across as a problem for the state alone: “There can be more buy-in from the private sector. If more of them realise they can rely on SMEs to provide services to them – especially technology services – then it will really stimulate SME growth.”

Delivering technology services to South African businesses used to be the reserve of large providers, with SMEs focused on lesser projects. But the shifts in the market – such as the cloud, consumption-based costs, integrated platforms and commodity services – are letting SMEs participate in high-end and large digital projects.

In other words, they are poised to make a difference. But there are still challenges to overcome.

Enabling SMEs

Several barriers stand in front of technology SMEs. According to Masemola, these range from technical requirements such as skills to business capabilities such as access to market and trust-building.

SMEs can collaborate as one player in a project with larger technology providers. They might operate as an SME consortium or focus unilaterally on a customer project. This gives an idea of the different types of relationships an SME has to build to participate in SA’s digital channel. Most SMEs have a service to offer, but they struggle to establish those relationships.

This is where the larger channel players can make a big difference. Says Masemola: “The market is more services-orientated, which is good for SMEs. They can be more agile and stay close to the customer journey. But they can fall short in several ways, such as experience, capacity, ability to scale, finances, technical skills, business skills and sales skills. They might need coaching around contracts or someone to facilitate meeting customers and build trust. These are things larger players such as distributors and OEMs can help with.”

Some market leaders are proactively chasing this responsibility. For example, Axiz has established an SME programme that has been addressing all the above-mentioned issues. SME partners are vetted and then introduced to a supportive environment where coaching, support and trust-building pipelines are available to them.

By establishing criteria and frameworks with different OEMs and vendors, the programme gives SMEs opportunities to upskill and also meet suitable partners to collaborate with. This extends to private sector customers as well as the public sector, facilitated by a special framework agreement with SITA.

“The programme is a platform to cultivate trust,” says Masemola. “Those procuring technology services should know that an SME represented by our programme is reliable and suited to the project at hand. The SMEs involved get to develop the skills, performance and experience required to satisfy the market. It bridges the gap between what the market needs and SMEs can provide.”

Such programmes are critical if we expect SMEs to excel and add fuel to the South African economy. We can take hope in the market’s momentum and the excellent work done by these programmes. As more stakeholders get involved, it can pave the way for the emergence of Industry 4.0.

The recent slew of distributed denial of service (DDoS) attacks against banks and ISPs, particularly in South Africa, highlighted the fact that even the largest organisations with the latest, well-built security tools and solutions need to revisit their technology governance strategies.

A DDoS attack happens when threat actors attempt to make it impossible for a company to deliver its services, which is usually achieved by preventing access to networks, applications, servers, devices and suchlike. Essentially, these attacks work by flooding a system with requests for data, such as sending a web server so many requests to serve a page that it inevitably crashes under the onslaught. Similarly, it could be a database being hit with a massive volume of queries, overwhelming the central processing unit, random access memory and available internet bandwidth.

“These attacks should be a wake-up call to organisations across the board,” says Rezelde Botha, Business Unit Manager for Citrix at Axiz, a leading value-added ICT distributor in South Africa. She adds: “Such attacks are not just on the rise, they are changing in nature.”

Just as older DDoS attack methods are successfully combated by the security community and law enforcement, new ones take their place, and they are unfortunately becoming increasingly complex. Businesses must find ways to protect themselves effectively.

Botha says the impact of a DDoS attack can range from the minor irritation of disrupted services to whole Web sites, applications, or even the entire business being taken offline.

“There are several symptoms that indicate a DDoS attack is happening, although initially, it can appear as if there are normal availability issues, such as a server or system being down. Sometimes, it can seem as if there are simply too many legitimate requests from real users taking place. However, traffic analysis will quickly separate the wheat from the chaff,” she notes.
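
To make the ‘separating the wheat from the chaff’ idea concrete, below is a minimal, illustrative Python sketch of flood detection: count each source’s requests in a sliding window and flag sources whose rate no legitimate user would generate. The window length and threshold are assumed values chosen purely for illustration; real DDoS mitigation relies on dedicated appliances and upstream scrubbing rather than application code.

    from collections import defaultdict, deque
    import time

    # Illustrative only: flag source IPs whose request rate in a sliding
    # window far exceeds what a legitimate user would generate.
    WINDOW_SECONDS = 10
    THRESHOLD = 200  # requests per window; an assumed, arbitrary cut-off

    requests_by_ip = defaultdict(deque)  # ip -> timestamps of recent requests

    def looks_like_flood(ip, now=None):
        """Record a request and return True if the source looks like flood traffic."""
        now = time.time() if now is None else now
        window = requests_by_ip[ip]
        window.append(now)
        # Drop timestamps that have fallen out of the sliding window.
        while window and now - window[0] > WINDOW_SECONDS:
            window.popleft()
        return len(window) > THRESHOLD

    # A normal user (a few requests) versus a flooding source (hundreds per second).
    for i in range(5):
        normal = looks_like_flood("203.0.113.7", now=i)
    for i in range(1000):
        flood = looks_like_flood("198.51.100.9", now=i * 0.005)
    print("normal user flagged:", normal)   # False
    print("flood source flagged:", flood)   # True

This per-source view of traffic is, in essence, the kind of analysis Botha describes: a genuine spike in legitimate demand looks very different from an orchestrated flood.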

Back in the 90s, an attack might have involved 150 server requests per second, enough to bring down the systems of the time. Today, thanks to massive botnets, attacks can exceed 1 000Gbps. Over the years, DDoS attacks have become both bigger and more sophisticated.

Three years ago, as Botha narrates, the notorious Mirai botnet reared its head, attacking the internet performance management company Dyn. The Mirai botnet employed around a hundred thousand hijacked IoT devices to achieve its ends, sending a barrage of DNS queries from tens of millions of IP addresses. This led to services from giants such as Netflix, Amazon, Spotify, Tumblr and Twitter being disrupted, she explains.

“The Mirai botnet was notable in that, unlike the majority of DDoS attacks, it leveraged vulnerable IoT devices instead of PCs and servers. This is particularly concerning considering that there are already tens of billions of IoT devices in play, and this number is growing exponentially.” Rezelde Botha.

Early last year, another DDoS technique reared its ugly head when software development platform GitHub was hit with an enormous DDoS attack, with a record 1.35Tbps of traffic hitting its site. The platform managed to fight the attack off in under an hour, and only went down intermittently. The sheer scale of this attack raised an alarm within the security community.

Interestingly, she says, the attack didn’t make use of massive botnets like the one seen in the Dyn attack, but rather employed a far simpler method.

“This attack stemmed from memcached servers. Essentially, these database caching systems work to speed up networks and Web sites, but they aren’t meant to be exposed on the public Internet. Anyone can query them, and similarly, they will respond to anyone. Approximately 100 000 memcached servers, most of which are owned by businesses and other organisations, sit exposed online with zero authentication protection,” she avers.

Bad actors merely spoof the IP address of their target and send small queries, tailored to draw a far larger response, to multiple memcached servers at around 10 per second per server. The memcached systems then return roughly fifty times the data of the requests back to the victim.

Threat actors can thus send these servers a special command packet that each server will respond to with a much larger reply. Unlike typical botnet-driven DDoS attacks, memcached DDoS attacks therefore do not need the power of a malware-driven botnet to achieve their ends.
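
A rough back-of-the-envelope calculation, sketched below in Python, shows why this reflection technique is so effective. Only the fifty-times amplification factor, the roughly 10 queries per second per server and the figure of about 100 000 exposed servers come from the description above; the request size and the number of servers actually abused are assumptions chosen purely for illustration.

    # Rough, illustrative arithmetic for a reflection/amplification attack.
    request_size_bytes = 1_000          # assumed size of one spoofed query
    queries_per_second_per_server = 10  # per the description above
    amplification_factor = 50           # per the description above
    servers_abused = 10_000             # assumed: a tenth of the ~100 000 exposed servers

    attacker_bps = request_size_bytes * 8 * queries_per_second_per_server * servers_abused
    victim_bps = attacker_bps * amplification_factor

    print(f"attacker sends roughly {attacker_bps / 1e9:.1f} Gbps")   # ~0.8 Gbps
    print(f"victim receives roughly {victim_bps / 1e9:.1f} Gbps")    # ~40 Gbps

The point is simply that a modest amount of attacker bandwidth, bounced off exposed servers, becomes a crippling volume of traffic at the victim.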

One thing is clear: businesses need to find better ways to protect themselves against this sort of attack.

Enter Citrix NetScaler, which checks the client’s connection and request parameters to prevent flood attacks until a valid application request has been submitted. Using NetScaler, businesses can defend against attacks at multiple layers, including the application layer, the transport layer and the network layer.

A company’s best defence against loss of sensitive mobile data is having an effective enterprise mobility management solution, says Adeshni Rohit, Business Unit Manager for Cisco at Axiz.

Source: ITWeb

Adeshni Rohit, Business Unit Manager for Cisco at Axiz, says enterprise mobility management (EMM) solutions are soaring in popularity. And one of the major reasons behind this has been the rise in bring your own device (BYOD) policies in which businesses permit their staff to use their personal phones, computers and tablets in the workplace.

Rohit adds that nearly every organisation today allows certain employees to work remotely, using their own tools, and research suggests this trend is on the rise. However, with these devices being outside of the IT department’s control, businesses have to find a way to manage their use, lower security risks, and ensure they remain compliant. The mobile enterprise has changed the way organisations across the board approach workplace security.

“Staff who work from home or on the road need access to company data from anywhere and from any device. They access business e-mails, applications and files from a slew of devices, creating a major security headache for the tech department. They have to find ways to keep business data safe, while still supporting the mobile workforce. Mobile devices are more difficult to keep track of. They get lost and stolen with alarming regularity. They are more difficult to patch and update. And they use their own, unsanctioned applications.”

So, where to begin? The starting point must be evaluating mobile enterprise risk, she says. “The technical team needs to consider the very real level of risk posed by mobile users. And as with every trade-off between ease of use and security, removing every single risk is simply impossible. The business needs to determine its appetite for risk, and then decide how best to protect its valuable proprietary and sensitive data.”

According to Rohit, businesses that are concerned about breaches can restrict access to critical applications, or give mobile access to data via encrypted virtual private networks (VPNs) or secure connections, preventing any packet sniffing, or interception of traffic details over connections that might be unsafe. “Highly sensitive data should never be housed where it can be reached because a bad actor happened to guess a password or network address, but even the most stringent controls can be beaten with enough determination and brute force. Those in charge of confidential data must protect it, and keep a keen eye on any vulnerabilities that may have been overlooked.”

Another way to secure data on mobile devices is through encryption, adds Rohit. Data in motion needs to be encrypted to prevent any unauthorised interception or access. Similarly, data that is stored on devices must be encrypted too. Security-conscious businesses make sure their networks are encrypted and corporate data is prevented from getting into the wrong hands. However, the plethora of employee devices and different operating systems fragment the encryption ecosystem and make centralised control an onerous process. Once again, there needs to be a balance between security and usability.
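
As a concrete illustration of encrypting data at rest, the short Python sketch below uses the widely used cryptography package’s Fernet recipe (symmetric, authenticated encryption). It is a generic sketch rather than a description of any particular EMM product, and for simplicity it keeps the key in memory; in practice the key would live in a key management service or hardware security module, never alongside the data.

    # Minimal sketch of at-rest encryption (pip install cryptography).
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # symmetric key, base64-encoded
    cipher = Fernet(key)

    record = b"customer record: account 12345, balance 9870.00"
    token = cipher.encrypt(record)     # authenticated ciphertext; useless without the key

    # Only a holder of the key can recover (or tamper-check) the data.
    assert cipher.decrypt(token) == record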

“Then there’s the human factor,” she explains. “The common maxim today is that the biggest danger to your organisation is probably sitting in the office next to yours. Employees are often not conscious of basic security hygiene, and routinely break security protocols without a second thought, such as leaving flash drives lying around, sharing passwords or accessing sensitive data over unsecured WiFi networks at coffee shops. When employees are using company-issued devices, the technical department has greater control over mobile security, but even that control isn’t foolproof; the business can’t guarantee that staff members won’t use their devices in a way that potentially exposes sensitive company data.”

Ultimately, a company’s best defence against loss of sensitive mobile data through a breach or act of negligence is having an enterprise mobility management (EMM) solution. EMM gives the organisation full control over remote devices, including remote software updates, control over device settings, device tracking, and even the ability to remotely disable, unlock or totally wipe a device. It is also the most effective way of guaranteeing compliance with company security policies. They can specify what, when and who, says Rohit.

Without EMM, the technical team is tasked with putting together a comprehensive, detailed view of network security, which would see them spending multiple hours scrutinising a plethora of data sources, manually correlating disparate data types, and trying to join the dots, she says. “This approach is onerous and fragmented, leaving plenty of room for errors and oversight. To really get on top of managing the mobile workforce and gain the upper hand over suspicious or anomalous activity across their networks, endpoints and employees, businesses need EMM to maximise their visibility into everything that is taking place on their networks and put barriers in place to prevent threats slipping through the net.”

Rohit adds a caveat: “All EMM platforms are not created equal. A comprehensive EMM platform must offer every single capability that is needed for mobile deployment, including device, application and content-level control in one, unified platform. It needs to be able to control and moderate which devices are allowed on the network according to their security posture. It needs to support all operating systems and devices, and must be prepared to support new devices as they enter the marketplace.”

With Cisco Meraki EMM, devices are centrally and securely managed from the cloud using a single Web-based dashboard. Its feature-rich, intuitive architecture enables customers to save time, lower operating costs and solve new business problems, she says. Its solutions provide total management for mobile devices and PCs. Users can provision settings and restrictions, manage inventory and device tracking, remote wipe an entire device or selectively adjust the managed apps and data, and remotely view and live troubleshoot using the included native remote desktop support.

EMM technology gives businesses the ability to manage their entire mobile ecosystem, bringing them greater control over which devices connect to the network and which applications, data and services staff members consume. “The mobile workforce is only going to grow, mobiles are already practically ubiquitous among employees, and with the appropriate controls in place, businesses will be better placed to benefit from the massive opportunities that mobile offers, while guaranteeing that sensitive information and resources are protected,” concludes Rohit.

What can AI do for cyber security?

For Tumelo Mashego, AI might be the cops that every machine needs in the fight against cyber crime.

Source: ITWeb

Axiz business unit manager Tumelo Mashego says that even though artificial intelligence (AI) can be a thorny topic, with talk of it either taking humanity to new heights or being the end of us, the truth is still undetermined, since many of these futuristic systems don’t even exist yet.

But for Mashego, AI is a powerful tool when it comes to pattern recognition, which is very useful when trying to keep cyber criminals at bay. “These threats use the speed and volume of data, as well as the complexity of modern networks and technology stacks, to ferret into systems and hide attacks.”

Mashego adds that these attacks are not truly clandestine; their actions often create alerts. The problem, however, is that so does everything else in an ICT estate. “According to a survey by the Cloud Security Alliance, over 30% of IT security professionals ignore alerts because there are so many false positives.” So the sheer volume of information generated by modern technology threatens to overwhelm security measures.

“The era of big data means there is much more going around. Adding to this is the fact that IT professionals have a lot to do because of digital transformation and other technology influences. They also don’t have the control they used to rely on, because data can now leave the company’s perimeter. Even just a poor BYOD security environment can become incredibly dangerous. Cyber security has never been harder and more complicated than it is now.”

Can AI save us?

Take out the security but leave in the factors overwhelming it, Mashego explains, and you have the very reason everyone struggles with today’s information age: we have too much information and too little time to make sense of it.

“AI has become popular for this specific reason. It can move faster than humans, take in more data, connect more dots through pattern recognition and respond at the blink of a computer’s eye,” she adds.

“Computer systems have not been ineffective against cyber attackers. But they tend to focus on high-volume and low-sophistication attacks. When a threat is much more advanced, more akin to a careful chess game than a random bug infection, it becomes much harder to spot. It’s how some black hat hackers have stayed inside systems for months and even years on end.”

But unsophisticated attacks can also have an edge that’s hard to stop. “Ransomware isn’t a very sophisticated attack. But once it’s in a system, it can spread quickly, right under the noses of security measures. You want to catch it at the source.

“Such unsophisticated attacks can also be introduced in sophisticated ways, such as spear-phishing. That’s when criminals use tailored correspondence to get to a specific person, usually to get their security credentials. Then the criminals can infect the systems using those login details.”

AI trained to spot behavioural anomalies can detect such attempts. In a practice known as risk-based or adaptive authentication, often used alongside multi-factor authentication, different indicators such as user behaviour, geography and timing are used to calculate whether something doesn’t add up around certain credentials. It’s not that different from a bank noticing your credit card is suddenly being used in Burma, only more sophisticated in the behaviours it spots.
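
The ‘does this add up’ check can be pictured as a simple risk score over login signals. The Python sketch below is deliberately naive and rule-based, with assumed signals and weights, purely to illustrate the idea; the products Mashego describes learn these baselines per user rather than hard-coding them.

    # Toy risk scoring over login signals; baselines and weights are assumptions.
    USUAL_COUNTRIES = {"ZA"}
    USUAL_HOURS = range(7, 20)       # this user normally logs in between 07:00 and 19:59

    def risk_score(country, hour, device_known):
        """Crude additive score: the higher the score, the more anomalous the login."""
        score = 0
        if country not in USUAL_COUNTRIES:
            score += 2               # unusual country
        if hour not in USUAL_HOURS:
            score += 1               # unusual time of day
        if not device_known:
            score += 1               # unrecognised device
        return score

    # A 03:00 login from Myanmar on an unknown device scores 4: step up authentication.
    if risk_score(country="MM", hour=3, device_known=False) >= 3:
        print("step-up authentication required or account flagged for review")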

Sophisticated pattern recognition can also detect behaviours such as ransomware or malware trying to spread. With the right policies in place, the AI can lock down infections before they spread.

Augmented intelligence

Should AI be tasked to look after security? Mashego says no, that would be a bad idea, but not because of AI trust issues. AI is still a machine doing a specific task, and it becomes useless outside of its parameters.

Cyber attacks are also inevitably performed by humans who make a career out of subverting security systems. AI is just another system. Despite the dominance of AI in games such as chess and Go, motivated and skilled cyber criminals can beat it. For good security, you need people in the mix.

Catch 22? Mashego points out this issue takes us back to the alert fatigue problem and another interesting statistic from that survey by Cloud Security Alliance: 40.4% of security professionals said they lacked actionable intelligence to decide on an alert. The most potent use of AI in security is perhaps to collaborate with human security professionals. “An AI can act quickly and stop certain things in their tracks. But that’s not foolproof. Humans have the intuition and experience to look at many factors and come up with creative explanations. AI can’t do that; not yet, anyway. But it can create greater context around alerts and decide what should be shown to security staff, who can then decide on the appropriate actions.”

This begs the question: Where is AI in today’s security products? Even though AI solutions are starting to appear, they are still quite scarce. One reason, Mashego says, is the cost associated: “You don’t just buy AI and install it and there it goes. AI needs to be trained and maintained. It can be a very demanding asset.”

Training is made harder by the availability of security data. Cyber attacks are a clandestine activity: even the good guys often keep serious cyber weapons secret. Access to such datasets is by its nature very limited. The massive resource demands mentioned above are also not to be underestimated. For these reasons, security AI is usually found through managed security services that can pool resources and data.

But Mashego adds that we shouldn’t focus on AI alone: “AI has potentially great benefits for security, but that doesn’t mean the other security practices fall away. Train people about good passwords and security hygiene. Put proper BYOD policies in place. Take data management seriously. Invest in end-point security and security skills, and work out the threats to your business for a security strategy. AI is emerging in today’s security products, but those products are also already really good. But they are meant to work with people and good security culture.”

AI won’t save us from cyber criminals. Yet, by giving a little help to humans and catching lightning-fast attacks before they land, it does create an advantage that we, and cyber criminals, can’t dismiss.

A wave of big data from a slew of ubiquitously networked devices, sensors and the Internet of things, is flooding organisations across the board, and putting pressure on data centres to deliver and perform at their peak. Moreover, the need for instant ‘from anywhere’ resources from the business perspective has led to the development of powerful ‘hyper-scale’ cloud data centres.

This is according to Tondani David Mphephu, Azure expert at leading ICT value-added distributor, Axiz. “It’s no surprise then that Internet giants, and the hyper-scale data centres they are creating to support their platforms, are at the centre stage of all conversations around the storage and data centre today. The sheer scale at which these providers are developing infrastructure, the innovation they are driving, and how they are competing are dominating the cloud landscape and are hot topics.”

He says the hyper-scale cloud was developed by the cloud behemoths (think Microsoft, Amazon and Google) to support the creation and delivery of software-based services at lightning speed, and with the lowest possible price tag. “They wanted a platform that underpins the ongoing, reliable and scalable delivery of software-based services without the expense and speed limitations that go hand in hand with physical hardware and networking infrastructure.”

The hyper-scale cloud is essentially a software-based environment that is removed from physical infrastructure so that all resources provided by the infrastructure can be manipulated quickly and programmatically, without long, onerous procurement cycles and time-wasting human intervention, explains Mphephu. Software applications developed to run in a hyper-scale cloud environment are designed to be fast, cheap to deploy and extremely resilient to physical infrastructure failure.

Hyper-scale cloud is not only removing the barriers to service innovation by allowing new software to be deployed almost as fast as it can be created, but it is also driving the democratisation of many technologies that were previously only available to large enterprises, such as analytics and AI, he says.

Mphephu adds a caveat: “All hyper-scale providers are not equal, and because software is at the heart of all technology innovation, those who get it right will devour market share. This is why Microsoft Azure now offers a whole new set of capabilities and features far superior to its competitors. The fact that more than 95% of Fortune 500 companies use Azure speaks for itself.”

Today’s world is an ‘everything instantly’ one. Organisations need speed to deliver, and they require vast amounts of data capacity. At its very core, hyper-scale is built on three pillars of speed – build, deploy and respond – and it can help businesses deliver on all of these. However, hyper-scale needs to be executed and deployed in phases, and this is one area where having a good partner is key, as they will not only help you plan to improve your time to value, but also help navigate your organisation through deployment, ensuring that your business has the appropriate services and the space needed to expand.

And because organisations can’t afford any downtime, a good hyper-scale partner will ensure your environment is kept up and running and is resilient, says Mphephu. A good partner will have the necessary experience in delivering hyper-scale deployments and will have proven themselves, by having a true understanding of the points of failure to avoid, and an appreciation of what challenges hyper-scale providers face.

Another major benefit of having the right hyper-scale partner is visibility. “We all know the old adage, you can’t manage what you can’t see, and in a hyper-scale environment, visibility is crucial. Too many people trying to manage such a vast ecosystem is asking for trouble, but at the same time, it is crucial that the right people know what is going on in the data centre at all times. To avoid a bad management situation, a hyper-scale provider will be able to deliver on a service-based integrated technology that gives your organisation the appropriate optics and controls that boost performance through a single pane of glass,” Mphephu continues.

Next, he says, is flexibility. “There’s no point in adopting hyper-scale if you don’t have the agility and flexibility needed to scale up and down as required, or if you are locked into a contract that doesn’t meet your needs. It is not an exact science, it is not always possible to predict exactly how much capacity the organisation needs now, never mind how much it might need in two years’ time. Finding out what works and what doesn’t can take time, and a good hyper-scale partner will work with your business, and scale and grow with you, as well as give you a flexible contract.”

Finally, in terms of cost savings, it’s no good jumping on the bandwagon without understanding the total cost of ownership (TCO). It’s not just a question of the upfront expense, or even of knowing exactly what you’re in for financially each month in perpetuity. “It is crucial to work with a hyper-scale partner who can help align your business strategy with your hyper-scale needs,” concludes Mphephu.

Robotic Process Automation (RPA) is not about industrial automation alone; it is about eliminating repetitive, onerous tasks to enable employees to focus on more valuable business activities that require true human intervention.

This can be described as an augmented workplace or one in which people and technology join forces to create something greater than either could do on their own. Organisations of every type are harnessing the power of technology to extend and enhance human capabilities in ways that help employees to be more productive.

Concurrently, this enables people to transfer lower-skilled tasks to machines, so they can focus their efforts on the high-value and creative tasks, which leads to improved efficiency, as well as the accuracy and reliability of work outcomes.

So says Bradley McCulloch, Business Head for IBM at Axiz, SA’s leading value-added ICT distributor. “All organisations that are looking to boost efficiency in their processes should be investigating RPA, to reduce cost, speed up outcomes and deliver a better customer service experience all round.”

He says RPA can be defined as a tool that enables the business to capture employee inputs in rules-based processes and then use software to automate those inputs. “It is an extremely efficient way of automating those repetitive tasks within a workflow in a way that fits in organically with current working practices as well as the preferences of employees themselves.”
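
To make that definition tangible, the toy Python sketch below mimics a rules-based bot: it reads captured inputs (rows of invoice data) and applies the same rule an employee would, routing each record accordingly. It is a generic illustration, not IBM’s tooling; the field names and the approval threshold are assumptions.

    import csv
    import io

    # Toy rules-based task: auto-approve small invoices, queue the rest for a human.
    AUTO_APPROVE_LIMIT = 5000.00     # assumed business rule

    captured_inputs = io.StringIO(
        "invoice_id,supplier,amount\n"
        "INV-001,Acme Stationery,1200.50\n"
        "INV-002,Contoso Networks,18500.00\n"
    )

    for row in csv.DictReader(captured_inputs):
        amount = float(row["amount"])
        if amount <= AUTO_APPROVE_LIMIT:
            print(f"{row['invoice_id']}: auto-approved ({amount:.2f})")
        else:
            print(f"{row['invoice_id']}: routed to a human approver ({amount:.2f})")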

However, he says not all RPA tools are created equal. “Viewing RPA as a standalone, technology-only solution is a mistake. Unfortunately, few vendors have designed their solutions with a broader view of all the technologies available to them. While RPA alone can be valuable, what is infinitely more so, is a platform that can handle content management, decision making, tasks, and workflows. What is needed is a true, digital business automation platform.”

That is what IBM brings to the table.

“IBM Robotic Process Automation with Automation Anywhere, which is part of the IBM Automation Platform for Digital Business, increases the value of stand-alone RPA tools. It leverages the platform to augment task automation with additional automation capabilities, enabling RPA bots to orchestrate workflows, integrate with business rules and decisions, manage content and capture data. This is what truly sets it apart from its competitors.”

McCulloch says the platform helps businesses address their immediate task automation requirements and use the additional automation capabilities to make their RPA bots smarter, more agile and better able to help them achieve their business objectives.

One major objective for all businesses, he says, is digital transformation, and an RPA journey is intrinsically connected to this. “Driving strategic, enterprise-wide change is the sign of any successful RPA implementation. RPA has the power to augment and free up resources, completely redefining the standards of speed and efficiency, as well as significantly changing the way organisations operate.”

According to him, RPA enables the automation of a number of other tasks too, including monitoring customer activities for opportunities to upsell, as well as preparing data for customer subscription renewals. It can also help manage marketing campaigns by collecting data through web scraping and help to get the right information out to assist with marketing and sales activities.

Bots can also be programmed to monitor the policy status of your customer base, and identify gaps and opportunities for bundles or discounts, which will enable the business to send targeted communications to maximise opportunities.

Moreover, RPA can replace troubleshooting that is reactive in nature, with the proactive identification of any challenges, allowing issues to be resolved before they become a problem for the business.

“RPA also has a crucial role to play in alleviating the enormous administrative burden that our stringent regulatory environment has placed on businesses. Think about POPIA and its comprehensive policy to promote advanced data review rights and consent management. This is going to see a tidal wave of requests drowning every business, which will drive operational costs up. RPA tools can capture and interpret the data, action what is necessary, and provide appropriate responses.”

The use cases for RPA are endless, and one thing is certain, he ends. “Businesses that outperform their competitors are the ones that strive continuously to improve how they do business, and that is what RPA is all about.”

Leading local value-added ICT distributor, Axiz, has cleaned up at the 2019 CONTEXT ChannelWatch Awards, winning in three categories – Customer Service, Retail Partner, and Overall Distributor of the Year.

CONTEXT ChannelWatch is one of the world’s largest online IT-reseller surveys, giving key insight into the behaviour, opinions and predictions of over 6500 IT resellers every year. As part of the survey, resellers in each country nominate distributors they work with for the CONTEXT ChannelWatch Distributor of the Year Award.

This year, resellers also went on to rate their distributors on a wide range of key service areas. As a result, CONTEXT introduced five new ChannelWatch awards, which reflect the breadth of service and value that distributors bring to the modern IT channel. The new awards are: Customer Service, Innovation, Logistics, Retail Partner and Cloud Partner.

Craig Brunsden, MD of Axiz, says: “We are thrilled to have won these awards, as they reflect the hard work and dedication that we have put into growing all our partners’ businesses. It is a real honour to receive this recognition, as it shows that our efforts to provide our partners with top-quality ICT solutions to meet their needs are paying off.”

He says the fact that the CONTEXT ChannelWatch awards are decided by the votes of the reseller and partner community is particularly rewarding, and is a testament to Axiz’s experience, and strong partner relationships, as well as the ecosystems it has built to meet their needs.

Howard Davies, CONTEXT CEO and co-founder said: “We are delighted to see Axiz win these three awards. It is a real testament to their ability to achieve excellence in this field, and the high regard in which they are held by their resellers.”

In the coming weeks, CONTEXT will be publishing important insights from the ChannelWatch survey, highlighting market trends in the IT distribution channel and driving greater understanding of reseller relationships with both distributors and vendors alike.

 

Building a data-driven business

By Kirsten Doyle for Brainstorm

If data is the new oil, then grab yourself some cans before things get messy.

The most successful organisations understand the potential of their data as a competitive lever to gain insights into their customers. “However, any effective data management plan must be founded on an acknowledgement that data types have different needs for access, storage, and management, even as priorities around data usage change quickly,” says Rupert Brazier, country manager, Pure Storage.

Says Kate Mollett, regional manager for Africa, Veeam, the challenge for organisations is creating an integrated data management strategy that consolidates disparate cloud services, automates the movement of data across multiple workloads when and where it’s needed, and ensures the right data is available to the right decision-makers to add business value.

“We often find that people in charge of managing data sit in the security team where they’re responsible for data access, security and governance, or they sit in an infrastructure role where they’re responsible for data storage, backup and archiving. There seems to be a grey area in terms of who the ‘landlord’ of data should be. There’s also a third stakeholder to keep in mind, the people mining the data who sit in sales, marketing and even HR. The challenge currently is that everyone is operating in a silo and none of these roles form part of the same team.”

She says many local companies are using the cloud to automate processes. “However, for this to work effectively, data must evolve from policy-based to behaviour-based. Data must have built-in ML and AI to keep getting smarter about the best actions to take in any given situation. Automation improves the responsiveness, security and business value of data while reducing the cost and time that staff spend on manually managing and storing data, empowering the IT team to focus on getting real insights from their data.”

Tailored to fit

“It’s important to think about what your business needs from a data management plan and take into account when and how data is backed up and stored, how often, and how it is then accessed or managed once backed up,” says Joshua Grunewald, cloud hosting manager at Saicom. “Businesses need to consider how they use their data, how quickly they want to be able to restore data and how long and how far back they keep data. The plan should be a blueprint for the organisation – data management should never be seen as an insurance policy in the event of data loss or corruption, but as an opportunity to test, run reports and do integrity checks in an isolated environment. The plan also needs to make provision for storing multiple copies of data. The risk of a single copy is that if it’s compromised, there’s no backup available.”

Brazier adds that a properly executed data management plan must be built on a modern IT environment, consisting of data strategies based on flexible consumption models across on-premises, hosted, and public cloud, aligning application workloads with the most effective infrastructure. “Most importantly, a modern IT environment should work harmoniously with a common management interface, 100% non-disruptive architecture and proactive and predictive support services.”

However, the ‘big data’ trend has had a significant impact on data management. According to Mollett, the explosion of big data makes traditional backup and recovery inadequate. “Data availability is something that has become a business necessity. Companies can ill afford not to have access to data, especially when customers expect products and services to be available to them around the clock. To meet the challenges of this complex environment, businesses need a new approach to data management. Good data management practices need to ensure that data remains secure no matter the device being used to access it,” she says.

“I think the two biggest points here are that big data has created a much bigger need for finding more effective ways of doing data deduplication and compression because we’re storing so much of it,” adds Grunewald. “That technology has to stay ahead of the game and keep improving – otherwise you’re consuming more and more disk. And while it’s getting cheaper, it’s not endless or free, and moving things into the hyperscalers only works to a point. It’s important to do deduplication, but once you restore from it, it can be slow, so ensure that the hardware you’re using is purpose-built and can restore at speeds that match your business requirements. The second and biggest point is now that we are collecting so much data, what are we doing with it? How are we using it to make better decisions and how can we apply business intelligence to the sheer volume of data available?”
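
A minimal way to picture the deduplication Grunewald mentions is content hashing: identical chunks of data hash to the same value, so only one copy of each unique chunk needs to be stored. The Python sketch below assumes fixed-size chunks for simplicity; real backup products use variable-length chunking, compression and purpose-built indexes.

    import hashlib

    CHUNK_SIZE = 4096  # assumed fixed chunk size; real systems chunk more cleverly

    def dedupe(data):
        """Split data into chunks, storing each unique chunk only once."""
        store = {}    # chunk hash -> chunk contents
        recipe = []   # ordered list of hashes needed to rebuild the original data
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            store.setdefault(digest, chunk)
            recipe.append(digest)
        return recipe, store

    data = b"A" * CHUNK_SIZE * 100 + b"B" * CHUNK_SIZE   # highly repetitive sample
    recipe, store = dedupe(data)
    print(f"logical size: {len(data)} bytes, unique chunks stored: {len(store)}")
    assert b"".join(store[h] for h in recipe) == data    # rebuilds losslessly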

This brings us to dark data. “With the exponential increase in data volumes, it’s predicted that by 2020, every human will generate 1.7MB of data per second,” says Brazier. “Despite the growing recognition of data’s importance, only 0.5% is analysed and used. The remaining data, known as ‘dark’ or ‘cold’ data, is a big issue for companies.”

Dark data

Rezelde Botha, business unit manager at Axiz, says this dark data represents a massive, untapped opportunity, as well as a major threat. “This data must be classified, managed and analysed appropriately to gain business insights and identify any data that might be putting the business at risk. If this doesn’t happen, businesses can’t use this data to gain a competitive advantage, and have little hope of remaining compliant in an increasingly harsh regulatory environment.”

Botha cites a recent report by Veritas, which showed that mobile and public cloud environments are two of the weakest chinks in a business’ information security armour, as the majority of data within these environments remains unclassified. “A large part of the problem could arise from the inability to assign responsibility for this data. The report revealed that a staggering 69% of global companies believe that data privacy and protection are their cloud service providers’ responsibility. This isn’t the case. Most contracts with cloud providers put the responsibility of data management squarely in the hands of the business. The problem isn’t helped by the fact that modern workforces are becoming increasingly mobile, and data increasingly distributed. Strengthening data security is key, as is gaining data visibility and control. You can’t protect what you don’t know you have, or what you can’t see.”

Says Botha: “The better the business understands its data, the better chance it has of lowering its risks. Given the flood of data drowning enterprises today, there’s no way that billions of files can be manually checked and classified. Luckily, there are good data management tools that employ intelligent algorithms, machine learning, and advanced policies to do this for you.”
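
The classification tools Botha refers to combine policies with machine learning; a heavily simplified, rules-only version is sketched below in Python to show the basic idea of scanning content for data that might put the business at risk. The patterns are illustrative assumptions, not a compliance-grade ruleset.

    import re

    # Illustrative detection rules only; real tools use many more patterns,
    # machine learning and document context.
    PATTERNS = {
        "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
        "card-like number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
        "13-digit ID-like number": re.compile(r"\b\d{13}\b"),
    }

    def classify(text):
        """Return the set of sensitive-data categories detected in a document."""
        return {label for label, pattern in PATTERNS.items() if pattern.search(text)}

    doc = "Contact jane.doe@example.com, card 4111 1111 1111 1111 on file."
    print("classification:", classify(doc) or {"unclassified / dark data"})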

Brazier says implementing a modern infrastructure and utilising today’s leading-edge technologies such as hybrid cloud, flash and NVMe can ensure that hot and cold categories are eliminated and all data can be considered hot.

Balancing on-prem and cloud

But while hybrid environments can help get a grip on dark data, to ensure effective data management across a hybrid environment, a lot of work is needed, says Brazier. “Businesses need to make strategic decisions about which cloud environment to leverage based on the type of data they’re dealing with, and the applications making use of that data. For example, if you have a mission-critical application that runs consistently every day, an on-premise configuration is best. It’s less expensive than running these kinds of apps in the cloud. On the other hand, workloads that typically spin up or down with some frequency, and that require lots of distributed compute, may be better suited for the public cloud, where they can take advantage of the economies of scale.”

For Grunewald, when it comes to hybrid environments, establishing a good framework to ensure that the data lives where it needs to is key to accessing data when it’s needed as quickly and seamlessly as possible. “Data required for long-term retention doesn’t need to live on-premise, because it’s accessed far less often. If you’re going to have a hybrid environment, only keep close what you need frequent access to; everything else can live in centralised environments in the cloud.”

Sean Hurwitz, BU lead: Insights at Trackmatic, says: “Hybrid data management is a mantra for a successful digital organisation. The strategy isn’t to move to a total, single environment, but to end up with a hybrid architecture and, possibly, multiple services and solution providers. Organisations need to move from Capex step changes to software, computing, storage, and resourcing on an elastic model. They also need to manage the explosion of data with increased governance while ensuring appropriate levels of data security and privacy. The future architecture of the digital organisation rests on a multicloud, hybrid DevOps platform underpinned by connected and trusted data. Hybrid environments try to compromise on the ownership of data versus the high costs to maintain the infrastructure. Companies opt to store less sensitive data in the cloud, while maintaining internal servers for critical data.”

So what can we expect from data management in the next five years? Hurwitz believes there will be a huge focus on data ingestion and storage. “The impact on structured data ingested is the loss of the associated metadata and data lineage, combined with a reduction in data quality, and lifecycle management for the data being ingested. In addition, the next-generation data platform needs to have the ability to ingest and govern data at speed, supported by a bimodal IT application/product development model.”

Research suggests 80% of worldwide data will be unstructured by 2025. For many large companies, it’s reached that critical mass already. Unstructured data creates a unique challenge for organisations wishing to use their information for analysis.

Brazier believes that in the next five years, we’ll see more AI and a greater degree of autonomous management of data through cloud-based software. “As we continue to generate more data, the importance of software can’t be overstated. According to Gartner, 2019 should see worldwide spending on enterprise software reach $439 billion, an increase of 8.3% from 2018. Cloud-based management software that monitors data storage and the underlying infrastructure is the key to taking infrastructure to the next level, building in automation that saves time and money, speeding up innovation and insight for the benefit of the organisation and end-user. This also allows companies to access their data from anywhere, with 24/7 predictive support that can autonomously find and fix issues before you’re even aware of them. Also, due to the nature of SaaS, upon every login, you will automatically be using the newest version of the software to benefit from the latest features and improvements. Given these benefits, it’s not an exaggeration to say that deploying the right software can have the same effect as employing a team of highly skilled IT professionals.”

Data management is not only an imperative, but essential to the future, since data will be the driving currency of how businesses shape their growing needs and products. “All industries will benefit from the universal skills of data management and analytics because all industries will just be creating more data,” concludes Hurwitz.

 

Unsanctioned and unsecured cloud applications are creating a massive amount of risk for enterprises in every industry. In an on-premise data centre, the business’s IT and security teams handle all aspects of, and are responsible for, data security.

However, in the cloud, we find a shared responsibility security model (SRSM) that splits the responsibility between the customer and the cloud provider.

Bridgette Kemp, business unit manager at Axiz, says too often the business units that execute cloud applications and infrastructure don’t know that the organisation is also partly responsible for securing those cloud applications.
“This responsibility can include thoroughly vetting potential vendors, patching the sections of the cloud that lie under their purview, monitoring security alerts, and enforcing strong authentication. Unfortunately, this results in security teams having no involvement with crucial tasks such as vendor selection, security audits and suchlike.”

The division of labour needs to be carefully laid out, adds Kemp. “The report highlighted that although certain cloud service providers offer specific cloud security options, for example, encryption, it might well fall to the customer to decide if they should apply and manage these tools. At the end of the day, the security buck stops with the business, not the cloud provider, as the business has far more to lose.”

She says problems creep in when the number of alerts and incidents reaching an average enterprise security team becomes too onerous to handle, which happens quickly, particularly if anomalous end-user behaviour alerts are included.

A recent study conducted by Oracle and KPMG revealed that the average large enterprise handles some 3.3 billion events every month. “However, a mere 31 of those events turn out to be legitimate threats. And let’s face it, there isn’t a business out there that could afford to hire and train enough security analysts to scrutinise each alert to separate the genuine from the false.”

Unpatched systems are also endangering businesses, says Kemp. “When operating systems, applications or devices are found to contain vulnerabilities, it can take an extremely long time for IT and security teams to install and test the necessary patches or changes in configurations.”

The report suggests that more human resources isn’t the solution; intelligent automation is, as it can easily handle this kind of repetitive, mundane task, freeing up people to work on more valuable activities. Encouragingly, she says, the report revealed that automated patching is used by 43% of those polled overall, and by 50% of the larger entities. Another 46% plan to use automated patching within the next year or two.

Kemp says there are other steps that businesses can take to protect the burgeoning number of critical cloud services and applications they use too. “Education remains key. Ensure that all staff are trained on the wide range of social engineering attacks that cybercriminals employ, and keep up to date, because adversaries are increasingly cunning, and always looking for new ways to pull the wool over the eyes of unsuspecting users. Also, implement solutions that block phishing emails before they reach the inbox and have a monitoring tool in place that can pinpoint any anomalous behaviours that might be indicative of an email compromise.”

Moreover, she says businesses must get a handle on shadow IT, or unsanctioned applications and services that were brought into the organisation by employees. “Have strict policies in place to ensure that any use of third-party cloud services must have the full support and approval of the technology department. Shared responsibility goes beyond the business and the cloud provider, and every stakeholder in the organisation needs to play their role.”

However, without automation, too many potential threats can slip through the cracks, concludes Kemp. “It’s more crucial than ever that organisations employ automation tools to protect their information assets because security teams can only do so much. Executives in charge of security must have full visibility and control of cloud services and applications within their organisations, and all stakeholders need to understand that security is everyone’s problem.”
