QSC18: The Need for Security Visibility in the Age of Digital Transformation

George Hulme

Last updated on: October 27, 2022

Enterprises are moving full steam ahead with their digital transformation efforts. They’ve aggressively adopted cloud infrastructure and other cloud services, IoT, application containers, serverless functions, and other technologies that are helping their organizations drive forward.

Organizations that are well down the road in their digital transformation efforts say they’ve seen improved business decision-making – both better decisions and faster ones. They also say they’ve strengthened customer relationships by delivering a better digital experience.

So it’s time to celebrate and declare digital victory, right?

Hold off before we book the band and order the champagne for the big party. In fact, those who want to move forward securely and confidently in their risk and regulatory compliance postures have some challenges ahead.

In their respective keynotes this morning at Qualys Security Conference 2018, Qualys chairman and CEO Philippe Courtot and chief product officer Sumedh Thakar clearly explained the challenges ahead – and how to meet them.

After watching both keynotes, I’m more confident that the security and visibility challenges posed by digital transformation can be met. But it’s also clear that to succeed, organizations will need diligence, a comprehensive strategy, and the right technologies.

Security professionals who have been around a while know this race well: the business needs to move fast, and security needs to keep that pace safely. As Courtot discussed in his keynote, the technology industry (and enterprises) have long rushed forward with new technology, only to look for ways to secure it after it has been deployed.

In the late 1990s and early 2000s, it was about layering security on top of endpoints, networks, and web applications with anti-malware software, intrusion detection systems, network and web application firewalls, the encryption of network traffic, and so on. Every new deployment required new security defenses.

Of course, the outcome from that approach was less than optimal. As Thakar pointed out in his keynote, with the rise of mobile, cloud computing, containerization, DevOps and continuous development – the speed of application and infrastructure deployment has created a level of complexity where security truly needs to be more tightly integrated into environments, and it needs to be continuously so.

After all, it’s never been easier than it is today to take a new software application, feature, or other enhancement from concept to deployment. And it’s also never been easier to deploy new technologies, new devices, and for users to access sensitive data from anywhere. While all of this has helped organizations with their digital transformation efforts, it’s caused many a CISO a sleepless night.

After all, speed and complexity are not natural allies of enterprise security.

Just consider the lack of visibility most CIOs and CISOs have into their mobile devices, virtualized systems, application containers, databases, serverless functions, cloud and on-premises storage systems, networks, and cloud application services. So much technology is being deployed, so quickly, that there has been a significant loss of visibility into these systems.

As Courtot said during his keynote, organizations can’t secure what they can’t see. In today’s environments, there is a lot CISOs can’t see.

Thakar outlined a way forward. As he detailed in his keynote, enterprises need to be able to assess the state of their environments and devices continuously, so that they can persistently see the security status of these assets and make swift adjustments that will reduce risk and maintain compliance with security and regulatory policies.

What does that mean, practically, for security professionals?

It means security professionals need to be able to see and catalog every asset in their environment, and they need accurate insight into the security status of their software and devices. Such a capability would enable security teams to mitigate pressing risks quickly.

Such risks could include a new vulnerability disclosure in an open source library that is used in various places in the organization. This is an area many organizations don’t have good visibility into today. It could mean blacklisting known devices that place the enterprise at an unacceptable level of risk. And it could also mean ensuring containers run with the right security policies in place.
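The open source library scenario above can be sketched in a few lines. This is a minimal illustration only – the package names, version lists, and manifest format are all hypothetical, not any vendor's actual data model: given dependency manifests collected from across the organization, flag every project that pins a version with a known disclosure.

```python
# Hypothetical feed of known-bad versions: package name -> set of vulnerable versions.
VULNERABLE = {"examplelib": {"1.0.0", "1.1.0"}}

def find_exposed(manifests):
    """manifests: {project: {package: version}} -> list of (project, package, version)."""
    hits = []
    for project, deps in manifests.items():
        for pkg, ver in deps.items():
            # A project is exposed if it depends on a version in the vulnerable set.
            if ver in VULNERABLE.get(pkg, set()):
                hits.append((project, pkg, ver))
    return hits

# Example manifests gathered from two hypothetical internal projects.
manifests = {
    "billing-api": {"examplelib": "1.0.0", "other-lib": "2.3.1"},
    "web-portal": {"examplelib": "2.0.0"},
}
print(find_exposed(manifests))  # -> [('billing-api', 'examplelib', '1.0.0')]
```

The hard part in practice is not this lookup but the visibility the article describes: knowing where the library is used at all.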

To do this, Thakar detailed how cloud agents, active scanning and passive scanning could be used to continuously collect asset data in real time so that organizations can persistently classify hardware, software and other attributes.
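The idea of fusing data from multiple collection methods into one continuously updated inventory can be sketched as follows. This is a hedged illustration – the record shapes and field names are assumptions for the example, not Qualys's actual API: observations from agents, active scans, and passive scans are merged per asset, with the most recent observation winning for each attribute.

```python
def merge_observations(observations):
    """observations: iterable of dicts with 'asset_id', 'source',
    'seen_at' (epoch seconds), and 'attrs' -> merged inventory by asset_id."""
    inventory = {}
    # Process oldest first so later observations overwrite stale attributes.
    for obs in sorted(observations, key=lambda o: o["seen_at"]):
        record = inventory.setdefault(
            obs["asset_id"], {"attrs": {}, "last_seen": 0, "sources": set()}
        )
        record["attrs"].update(obs["attrs"])   # newest value wins per attribute
        record["last_seen"] = obs["seen_at"]
        record["sources"].add(obs["source"])   # track which sensors saw this asset
    return inventory

# Two observations of the same host from different collection methods.
obs = [
    {"asset_id": "host-1", "source": "agent", "seen_at": 100,
     "attrs": {"os": "Linux"}},
    {"asset_id": "host-1", "source": "passive", "seen_at": 200,
     "attrs": {"open_ports": [22, 443]}},
]
inv = merge_observations(obs)
```

Keeping `sources` per asset also makes coverage gaps visible – an asset seen only passively, for example, has no agent reporting on it.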

That continuous asset insight means enterprises can effectively deal with blacklisted devices or out-of-date systems and applications. This means more than effective vulnerability management: it also includes the ability to stay continuously compliant with policy or common security frameworks. And it means CISOs can readily eliminate unnecessary systems and software that aren’t providing the organization value but could be increasing its attack surface.

All of this is crucial in the age of digital transformation. Enterprises that can manage their business-technology assets continuously will not only be more secure but also nimbler, better able to adjust as business conditions demand. And if anything is certain about the years ahead, it’s that enterprises will need to move with more agility – and to do so securely.

Watch all Qualys Security Conference videos.
