Thursday, April 7, 2016

America's Cyber Offensive

The Department of Defense took an intriguing step in the field of cyber warfare this week.  Defense Secretary Ash Carter signaled a move to elevate U.S. CYBERCOM from a sub-unified command under U.S. Strategic Command to a full combatant command (Tucker, 2016).  This suggests a dramatic shift in how America’s political and military leaders view the cyberspace domain.  Carter tied the change to CYBERCOM’s emerging role in the fight against ISIS.  Given the terrorist organization’s sophisticated use of social media and encrypted communication networks, moving CYBERCOM to the digital front lines seems the next logical step. 

This progression also expands CYBERCOM’s larger role in defending the United States against a host of other digital enemies.  From nation states to organized criminal syndicates, the US has long endured an onslaught of cyberattacks against its critical infrastructure.  To illustrate the point, this week the Department of Homeland Security (DHS) released details on an attack against the American power grid back in January.  The attack resulted in the exfiltration of sensitive information from American energy companies, along with the planting of CryptoLocker ransomware on networks belonging to three different utility companies (Pagliery, 2016).  This malware can lock digital files, potentially disabling portions of the electrical grid.  Although DHS described the incident as “espionage” rather than a “cyberattack,” the organization also reported that “aggressive foreign government hackers broke into American companies 17 times between October 1, 2013 and September 30, 2014.”   

Whether the intent of these intrusions was espionage, theft, or even curiosity, the most important part of this story is that our infrastructure remains woefully at risk.  In light of this newest revelation from DHS, it only makes sense to develop CYBERCOM into a more offensive asset.

References
Pagliery, J. (2016). Government reveals details about energy grid hacks. WCVB.com. Retrieved from http://www.wcvb.com/money/government-reveals-details-about-energy-grid-hacks/38877110

Tucker, P. (2016). Carter may elevate CYBERCOM to full combatant command. Defense One. Retrieved from http://www.defenseone.com/technology/2016/04/carter-may-elevate-cybercom-full-combatant-command/127243/



Friday, March 18, 2016

Cybersecurity Investment Forecast

In a blog I posted a couple of months ago, I mused about the usefulness of prognostications in the field of cybersecurity.  These sometimes less-than-educated speculations are often obvious pieces of data regurgitated from other reports, or even findings from previous years.  The one aspect of this process I find useful, however, is the financial component.  Cybersecurity experts may find fault with generic threat forecasts for the upcoming year, but decision-makers often use these reports to direct their ever-increasing IT and IS budgets. 

A 2014 PricewaterhouseCoopers survey found that 69% of executives expressed “concern about cyber threats.”  That number rose to 86% in the 2015 survey (Meola, 2016).  These figures indicate that cybersecurity and its associated expenditures are not going away anytime soon.  One of the highlights from Meola’s article was an infographic illustrating the main drivers of cyber spending.  


Meola also introduced two interesting, albeit very expensive ($495), reports from Business Insider.  Highlights from The IoT Security Report and The Cyber Insurance Report include:
*Research has repeatedly shown that many IoT device manufacturers and service providers are failing to implement common security measures in their products.

*Hackers could exploit these new devices to conduct data breaches, corporate or government espionage, and damage critical infrastructure like electrical grids.

*Investment in securing IoT devices will increase five-fold over the next five years as adoption of these devices picks up.

*Traditional IT security practices like network monitoring and segmentation will become even more critical as businesses and governments deploy IoT devices.

*Cyber insurance plans cover a variety of costs related to cyber attacks, including revenue lost from downtime, notifying customers impacted by a data breach, and providing identity theft protection for such customers.

*Annual cyber insurance premiums will more than double over the next four years, reaching ~$8 billion in 2020.

*However, many insurance companies have been hesitant to offer cyber insurance because of the high frequency of cyber attacks and their steep costs. For example, Target’s notorious data breach cost the company more than $260 million.

*Insurers also don’t have enough historical data about cyber attacks to help them fully understand their risks and exposures.

*There are large underserved markets with very low cyber insurance adoption rates such as the manufacturing sector, where less than 5% of businesses have cyber insurance coverage.
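The premium-growth bullet implies a sizable compound annual growth rate.  A quick back-of-the-envelope check (the starting figure below is my own illustrative assumption, since the report's baseline isn't given here):

```python
# "More than double over four years" implies roughly a 19% compound
# annual growth rate (CAGR). Figures here are illustrative only.
def cagr(start, end, years):
    """Compound annual growth rate as a fraction."""
    return (end / start) ** (1 / years) - 1

# Assumed doubling from ~$4B to ~$8B over 4 years.
rate = cagr(4.0, 8.0, 4)
print(f"Implied CAGR: {rate:.1%}")  # Implied CAGR: 18.9%
```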

What the infographic and these two reports demonstrate is that cyber threats, both perceived and actual, are on the rise.  Perhaps more importantly, the budgets associated with mitigating or transferring the risk from these vulnerabilities are on a similar trajectory.

References
Meola, A. (2016). This one chart explains why cybersecurity is so important. Business Insider. Retrieved from http://www.businessinsider.com/cybersecurity-report-threats-and-opportunities-2016-3

Friday, February 12, 2016

U.S. Cybersecurity Sucks

Given an ever-increasing number of cyberattacks and a seemingly inexhaustible budget, the American public would think its government is doing a better job at cybersecurity.  Spoiler alert: it is not.  A recent article by Arik Hesseldahl (2016) shows that the billions of dollars thrown at this problem have had limited effect in stemming the tide of cybercrime.

“A $6 billion security system intended to keep hackers out of computers belonging to federal agencies isn't living up to expectations, an audit by the Government Accountability Office has found.

A public version of the secret audit — a secret version containing more sensitive findings was circulated to government agencies in November — released last week concerns the Einstein system, formally called the National Cybersecurity Protection System and operated by the U.S. Department of Homeland Security.

The GAO found that the system has limited capability to detect anomalies in network traffic that sometimes indicate attempts to attack a network. What it can do is scan for and detect attacks based on a list of known methods or signatures. Most of the signatures used to scan for the attacks are available in commercial-grade products, though a few were developed specially for the government.

The system relies only on signatures and doesn't use more complex methods for detecting attacks. It doesn't analyze anomalies or odd patterns in network traffic that might indicate an attack. Analyzing anomalies can sometimes be useful in detecting attacks using "zero-day" vulnerabilities, so called because they rely on weaknesses in systems that are completely unknown, giving defenders "zero days" to figure out how to head them off.

"By employing only signature-based intrusion detection, NCPS is unable to detect intrusions for which it does not have a valid or active signature deployed. This limits the overall effectiveness of the program," the report reads. A security system that relies on signatures is only as good as the list of signatures used.

Additionally, the system was properly deployed at only five of the 23 non-military government agencies for which it was intended. And only one agency had deployed it to scan for possible attacks in email, a common vector for attacks.

The stinging report provides a reminder of just how bad government agencies have been in protecting their computers and the sensitive data on them. Last year the federal Office of Personnel Management, the government's human resources branch, disclosed a data breach that revealed information on some 22 million people who had worked for the government. The information stolen dated back decades, and included fingerprint data on nearly six million people. Private sector researchers later traced the hack to a group based in China.

It's also the latest proof that government agencies suck at securing their systems. The main reason for this is that agencies check off a list of vague requirements created by lawmakers and regulatory agencies when putting security in place. But they tend not to account for the risk that the requirements aren't sufficient.

None of this is exactly news in government circles. A study by the security firm Veracode last year found that after discovering security flaws in the software they use, government agencies fixed them by applying patches only 27 percent of the time versus 81 percent for private companies. Why? Because no specific laws or regulations require it.”
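The GAO's central criticism, signature-only detection, is easy to illustrate.  The sketch below is a toy contrast, not a model of Einstein itself; the signatures, traffic samples, and threshold are all invented:

```python
# Toy contrast between signature-based and anomaly-based detection.
# A zero-day payload matches no known signature, so a signature-only
# system is blind to it; even a crude anomaly heuristic can flag it.
KNOWN_SIGNATURES = [b"DROP TABLE", b"/etc/passwd", b"cmd.exe"]

def signature_match(payload: bytes) -> bool:
    """Flag traffic only if it contains a known-bad byte pattern."""
    return any(sig in payload for sig in KNOWN_SIGNATURES)

def anomaly_score(payload: bytes, baseline_len: int = 200) -> float:
    """Crude anomaly heuristic: deviation of payload size from a baseline."""
    return abs(len(payload) - baseline_len) / baseline_len

zero_day = b"\x90" * 5000  # novel exploit: no signature exists for it
print(signature_match(zero_day))      # False: signature IDS misses it
print(anomaly_score(zero_day) > 1.0)  # True: the size anomaly stands out
```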

References
Hesseldahl, A. (2016). Federal government confirms that it still sucks at cyber security. CNBC. Retrieved from http://www.cnbc.com/2016/02/01/federal-government-confirms-that-it-still-sucks-at-cyber-security.html

Tuesday, January 5, 2016

2016 Cybersecurity Predictions

The start of a new year always holds the promise of emerging trends, technologies and threats in the field of cybersecurity.  One of my favorite traditions in this arena is reading the commonplace and sometimes absurd predictions that “experts” prognosticate for the upcoming 12 months.  Jon Oltsik from Network World (2016) had some fairly timely and (in my opinion) likely predictions for 2016. 

“Mergers and acquisitions.  Okay, this one is somewhat obvious but allow me to add my own spin.  M&A activities will be robust with numerous big deals taking place before the RSA Security Conference at the end of February.  That said, many areas of cybersecurity are actually over-invested right now (i.e. CASB, next-generation endpoint security, etc.).  Once the first few deals happen, I foresee an industry panic where Johnny-come-lately VCs get cold feet and start fire selling.  As this happens, patient cybersecurity companies will be rewarded with cybersecurity technology startup acquisitions at relative bargain basement prices. 

The Beltway crowd jumps into the commercial market.  Federal contractors like Booz Allen Hamilton, CACI International, CSC, L-3, Lockheed Martin, and Northrop Grumman have strong cybersecurity skills and assets but little penetration into the commercial market.  Look for one or several of these federal integrators to follow Raytheon’s lead by establishing commercial cybersecurity divisions, hiring management teams with vast private sector experience, and acquiring companies with strong commercial cybersecurity market share.

Growing trusted systems offerings.  Technologies like the Trusted Platform Module (TPM) and Intel’s Trusted Execution Technology (TXT) have been around for years but few software developers have taken advantage of this system-level security functionality.  I believe we will see things start to change in 2016 as enterprises look to enhance mission-critical system integrity.  Oracle and VMware will join the trusted systems fray while phones will ring off the hook at focused players like Skyport Systems and Virtual Software Systems (VSS).

Cybersecurity technology vendors will open their own kimonos.  Driven by new types of threats, CISOs will continue to increase oversight of IT vendor risk management in 2016.  This will cause a reaction on the supply side as leading vendors trumpet their own internal cyber supply chain management and secure software development best practices as a way of differentiating themselves from more lackadaisical competitors.  Microsoft secure software development lifecycle (SDL) is a good example here, look for lots of others to emulate this type of model.”

Given past trends and predicted threats, these all seem likely to come to fruition.  As I searched for additional predictions on the future of my field, I came across an interesting article entitled “Hocus-Pocus! The stupidity of cybersecurity predictions” from Computer World’s Ira Winkler (2016).  Winkler argues that all predictions are either slight variations of each other, rehashed trends from last year’s DefCon, or, worse, self-fulfilling prophecies.  That is, if enough reporters, politicians, and security professionals say the power grid will be hacked, then eventually it will be.  Winkler does concede that occasionally the cybersecurity groundhogs predict something correctly, as one analyst firm did prior to the end of the millennium when it envisioned a Y2K-related billion-dollar theft.  Given the potential for jumping on this bandwagon, I will hazard my own, safer prediction: technology will be exploited, and the world will need more people to stop it.   

References
Oltsik, J. (2016). Cybersecurity industry predictions for 2016. Network World. Retrieved from http://www.networkworld.com/article/3019106/security/cybersecurity-industry-predictions-for-2016.html

Winkler, I. (2016). Hocus-Pocus! The stupidity of cybersecurity predictions. Computer World. Retrieved from http://www.computerworld.com/article/3019063/security/hocus-pocus-the-stupidity-of-cybersecurity-predictions.html




Thursday, December 17, 2015

Anti-Forensics

The recent terrorist attacks in Paris and California have brought to light an interesting (albeit frightening) cybersecurity phenomenon: the use of commercially available encryption by ISIS.  For security professionals, the reason these two incidents have become even more newsworthy is that the western world’s intelligence apparatuses appear incapable of breaking the encryption.  Although some in the intelligence and law enforcement communities blame Edward Snowden for tipping off terrorists to America’s surveillance capabilities, the reality of the situation is even more ominous (Gallagher, 2015).  The fact is that terrorist and criminal organizations have been using encryption and other anti-forensic techniques for decades.  Since the late 1990s we’ve known that Al Qaeda used steganography and other obfuscation techniques to conceal electronic documents on CDs and USB drives.  The latest evolution of this trend has been ISIS’ use of end-to-end encrypted communications applications such as WhatsApp, Signal, and Telegram to encrypt messages and anonymize their recipients. 

In the spirit of depressing hopeful forensic analysts, let’s take a look at what the good guys are up against.  Anti-forensics is a broad category of tools and techniques that attempt to make investigations of digital media more difficult and therefore more expensive.  Some of the more common approaches include (De Lucia, 2013):

Data Hiding, Obfuscation and Encryption
Obviously, the great advantage of hiding data is that it remains available when needed. Regardless of the operating system, using the physical disk for data hiding is a widely used technique, and techniques tied to the OS or the file system in use are quite common as well. Hiding data on the physical disk is feasible thanks to options implemented during manufacture that are intended to facilitate compatibility, while other concealment methods take advantage of the data-management properties of the operating system and/or file system. At this stage we are attacking, as we can imagine, the first phase of an investigation: “Identification.”  If evidence cannot be found, it will be neither analyzed nor reported.

– Unused Space in MBR
Most hard drives have, at the beginning, some space reserved for the MBR (Master Boot Record). This contains the code needed to begin loading an OS and also contains the partition table. The MBR defines the location and size of each partition, up to a maximum of four, yet it requires only a single sector. Between it and the first partition lie 62 unused sectors (sector 63 is considered the start of cylinder 1, where a classic DOS-style partition table requires the first partition to begin). This leaves 62 unused sectors in which we can hide data. Although the amount of data we can “hide” in this area is limited, an expert investigator will definitely examine its contents for compromising material.
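A quick calculation makes the size of this hiding spot concrete, assuming the classic geometry described above (512-byte sectors, first partition at sector 63):

```python
# Locating the classic "MBR gap": with 512-byte sectors and the first
# partition starting at sector 63, sectors 1-62 sit unallocated between
# the MBR (sector 0) and the first partition.
SECTOR_SIZE = 512
FIRST_PARTITION_SECTOR = 63  # classic DOS-style layout

gap_start = 1 * SECTOR_SIZE                      # byte offset of sector 1
gap_end = FIRST_PARTITION_SECTOR * SECTOR_SIZE   # byte offset of sector 63
gap_bytes = gap_end - gap_start
print(f"Hideable space: {gap_bytes} bytes "
      f"({gap_bytes // SECTOR_SIZE} sectors)")   # 31744 bytes (62 sectors)
```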

– Use of Slack Space
The “slack space,” in a nutshell, is the unused space between the end of a stored file and the end of a given data unit, also known as a cluster or block. When a file is written to disk and doesn’t occupy the entire cluster, the remaining space is called slack space. It’s very simple to imagine that this space can be used to store secret information.  The use of this technique is quite widespread and is more commonly known as “file slack.” However, there are many other places to hide data through the slack-space technique, such as the so-called “partition slack.” A file system usually allocates data in clusters or blocks as already mentioned, where a cluster represents multiple consecutive sectors. If the total number of sectors in a partition is not a multiple of the cluster size, there will be some sectors at the end of the partition that cannot be accessed by the OS, and these could be used to hide data.  Another common technique is to mark some fully usable sectors as “bad” in such a way that they will no longer be accessible by the OS. By manipulating the file system metadata that identifies bad blocks, such as $BadClus in NTFS, it’s possible to obtain blocks that will contain hidden data.
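The file-slack arithmetic can be sketched in a few lines; the cluster size below is an assumed NTFS-style default, not a universal constant:

```python
# File slack: the unused bytes between end-of-file and the end of the
# last allocated cluster. 4096 bytes is a common NTFS default cluster size.
def slack_bytes(file_size: int, cluster_size: int = 4096) -> int:
    """Bytes of slack left in the file's final cluster."""
    remainder = file_size % cluster_size
    return 0 if remainder == 0 else cluster_size - remainder

print(slack_bytes(10_000))  # 10000 bytes fills 3 clusters, leaving 2288 bytes of slack
print(slack_bytes(8_192))   # exact multiple of the cluster size: 0 bytes of slack
```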

– Steganography / Background Noise
In information security, steganography is a form of security through obscurity. Steganographic algorithms, unlike cryptographic ones, aim to preserve the “plausible” form of the data they are intended to protect, so that no suspicion is raised regarding actual secret content. The steganographic technique currently most widespread is the Least Significant Bit, or LSB. It is based on the fact that a high-resolution image will not change its overall appearance if we change some minor bits inside it.  For example, consider the 8-bit binary number 11111111 (1 byte): the right-most bit is considered the least significant because, if changed, it has the least effect on the value of the number.  Given a carrier image, the idea is therefore to break the message down into its binary form and place it in the LSBs of each pixel of the image. Steganography, obviously, may be used with many types of file formats, such as audio, video, binary and text. Other steganographic techniques that should surely be mentioned are Bit-Plane Complexity Segmentation (BPCS), Chaos Based Spread Spectrum Image Steganography (CSSIS) and Permutation Steganography (PS).
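A minimal LSB embed/extract can be sketched with a synthetic byte array standing in for real pixel data (the function names are my own, not from any tool):

```python
# Minimal LSB steganography sketch: hide each message bit in the least
# significant bit of consecutive carrier bytes (a stand-in for pixels).
def embed_lsb(carrier: bytearray, message: bytes) -> bytearray:
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    assert len(bits) <= len(carrier), "carrier too small for message"
    out = bytearray(carrier)
    for idx, bit in enumerate(bits):
        out[idx] = (out[idx] & 0xFE) | bit  # overwrite only the LSB
    return out

def extract_lsb(carrier: bytes, n_bytes: int) -> bytes:
    bits = [b & 1 for b in carrier[: n_bytes * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k : k + 8]))
        for k in range(0, len(bits), 8)
    )

pixels = bytearray(range(256)) * 4   # fake 1024-byte "image"
stego = embed_lsb(pixels, b"hi")
print(extract_lsb(stego, 2))         # b'hi'
```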

– Encryption
Encryption is one of the most effective techniques for mitigating forensic analysis; we refer to it as the nightmare of every analyst. Using strong cryptographic algorithms, for example AES-256, together with the techniques described above adds a further fundamental layer of anti-forensic security for the data we want to hide. In addition, the type and content of the information we want to protect or hide can never be compared to anything already known, because the resulting ciphertext of a good cryptographic algorithm is computationally indistinguishable from a random data stream, adding so-called “plausible deniability” on top of all our encrypted documents.  The most widely used tool for anti-forensic encryption is certainly TrueCrypt, an open source tool that can create and mount virtual encrypted disks on Windows, Linux and OS X systems. 
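The claim that good ciphertext is indistinguishable from random data can be made visible with a quick entropy measurement; here random bytes stand in for real ciphertext, since the comparison is statistical either way:

```python
# Why ciphertext frustrates analysts: well-encrypted data is statistically
# indistinguishable from noise. Shannon entropy makes the contrast visible.
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Entropy in bits per byte; 8.0 is the maximum (pure randomness)."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

english = b"the quick brown fox jumps over the lazy dog " * 200
random_like = os.urandom(len(english))  # stand-in for good ciphertext

print(f"plaintext entropy:  {shannon_entropy(english):.2f} bits/byte")     # ~4
print(f"ciphertext entropy: {shannon_entropy(random_like):.2f} bits/byte") # near 8
```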

– Timestamp Alterations / MACB Scrambling
In a few words, the purpose of these activities is to prevent a reliable reconstruction of the operations performed by a user or during the breach of a system.  Usually, these events are reconstructed in a “timeline,” primarily through the MACB timestamp parameters of the file system, where MACB stands for “Modified, Accessed, Changed, Birth.”  It’s important to note that not all file systems record the same information about these parameters, and not all operating systems take advantage of the opportunity given by the file system to record this information.  When we want to change these attributes to confuse a forensic analyst, the tool that certainly comes first to mind is “Timestomp.” The software’s goal is to allow the deletion or modification of timestamp-related information on files. The practice of completely deleting these attributes, however, is not advisable, as it is itself evidence of changes occurring in the system.  It’s important to note that Timestomp can modify only the SI ($STANDARD_INFO) MACE values and, after modification, a forensic analyst could still compare these values with those in FN ($FILE_NAME) MACE to check the accuracy of the information found. The comparison with the FN MACE values is the only point where it is useful to look for changes that occurred in the timestamp parameters (excluding other data from external systems). This means that if we can modify the FN MACE attributes as well, we can profoundly confuse even an expert analyst.
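At its simplest level, what a tool like Timestomp does is rewrite file timestamps.  A hedged sketch using only the POSIX-visible timestamps (note that os.utime cannot touch NTFS $FILE_NAME attributes, which is precisely the gap an analyst exploits):

```python
# Back-dating a file's access and modification times with os.utime.
# This only reaches POSIX-visible timestamps, not NTFS $FILE_NAME values.
import os
import tempfile
import time

with tempfile.NamedTemporaryFile(delete=False) as f:
    path = f.name

# Back-date the file to January 1, 2009 (illustrative date).
backdated = time.mktime((2009, 1, 1, 0, 0, 0, 0, 0, -1))
os.utime(path, (backdated, backdated))  # sets (atime, mtime)

st = os.stat(path)
print(time.strftime("%Y-%m-%d", time.localtime(st.st_mtime)))  # 2009-01-01
os.unlink(path)
```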

– Log Files
There’s not much to say about log files. Every computer professional knows of their existence and the ease with which they can be altered. Specifically, to counter a forensic analysis, log files can be altered to insert dummy, misleading or malformed data. They can also simply be destroyed. The latter, however, is not recommended: a forensic analyst expects to find some data when he looks in a specific place, and if he doesn’t find it, he will immediately suspect that some manipulation is in place, which of course could also be demonstrated. The best way to deal with log files is to allow the analyst to find what he is looking for, while of course making sure that he sees only what we want him to see.  It’s good to know that the first thing a forensic analyst will do if he suspects log alteration is try to find as many alternative sources as possible, both inside and outside the analyzed system. So it is wise to pay attention to any replicated or redundant log files (backups?!).

– Data Deletion
The first mission of a forensic examiner is to find as much information as possible (files) relating to a current investigation. For this purpose, he will do anything to try to recover as many files as possible from among those deleted or fragmented. However, there are some practices to prevent or hinder this process in a very efficient way.

– Wiping
If you want to irreversibly delete your data, you should consider adopting this technique. When we delete a file on our system, the space it formerly occupied is in fact only marked as free. The content of that space remains available, and a forensic analyst could still recover it. The technique known as “disk wiping” overwrites this space with random data, or with the same data for each sector of the disk, in such a way that the original data is no longer recoverable. Generally, to counter advanced file-recovery techniques, multiple passes over each sector with specific overwriting patterns are adopted.  Data wiping can be performed at the software level, with dedicated programs able to overwrite entire disks or specific areas corresponding to individual files.
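A single-pass software wipe can be sketched as follows.  This is a simplified illustration of the magnetic-disk technique the text describes; on SSDs and copy-on-write file systems, overwriting in place offers no such guarantee:

```python
# Minimal single-pass wipe: overwrite a file's bytes with random data
# before unlinking it, so the "freed" space no longer holds the original.
import os

def wipe_file(path: str, passes: int = 1) -> None:
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # overwrite every byte in place
            f.flush()
            os.fsync(f.fileno())       # push the overwrite to disk
    os.unlink(path)
```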

– Physical Destruction
The technique of physically destroying media is certainly self-explanatory. However, we should focus on the most effective and cleanest of these: disk degaussing.  “Degaussing” refers to the process of reducing or eliminating a magnetic field. For hard drives, floppy disks or magnetic tape, this means total erasure of the data they contain.  Although it’s very effective, degaussing is rarely used because of the high cost of the equipment needed to put it into practice. With modern magnetic media, using this technique renders the media totally unusable for future writes. (De Lucia, 2013)

I’d like to report that there’s some good news for those agencies hoping to thwart evildoers, but it only gets worse.  Agencies like the NSA no longer receive backdoors into these tools from the developers.  Moreover, “even if the US government were to press forward a demand for companies such as Apple, Facebook, and Google to provide a way to tap into message traffic, that would do little to prevent the use of existing peer-to-peer encryption and other encrypted social media tools by terror organizations” (Gallagher, 2015).  Long story short: cybersecurity professionals need to stay vigilant.  The best tool we have moving forward is staying current on trends and techniques.  See you at the next Def/Derby-con.  

References
De Lucia, E. (2013). Anti-forensics: Part 0x01. Forensics. Retrieved from http://resources.infosecinstitute.com/anti-forensics-part-1/

Gallagher, S. (2015). ISIS using encrypted apps for communications; former intel officials blame Snowden. Ars Technica. Retrieved from http://arstechnica.com/information-technology/2015/11/isis-encrypted-communications-with-paris-attackers-french-officials-say/




Thursday, November 5, 2015

Cybersecurity in the 2016 Presidential Election

Colossal data breaches, persistent cyberattacks, and contentious legislation all dominate the headlines, except when an executive branch hopeful is involved.  To date, presidential debate topics have included the economy, gun control, overzealous policing, and even the regulation of fantasy sports, but not cybersecurity.  This is ironic considering that last week the Senate passed the Cybersecurity Information Sharing Act (CISA), a carbon copy of the privacy-destroying bill first defeated in 2012.

A little background on CISA: “supporters say that it could prevent security breaches in the future by encouraging private companies to voluntarily share information on cyberattacks with the government. Opponents don't like the potential for abuse, especially after the details of the National Security Agency's surveillance program were made public” (Wagstaff, 2015).  To date, the only major candidate with a stance on CISA or national cybersecurity legislation has been Bernie Sanders.  Although Sanders supported the Cybersecurity Act of 2012, like Rand Paul he opposed CISA on privacy grounds.  Hillary Clinton, on the other hand, hasn’t taken a public stance on the legislation at all.  Although the former Secretary of State has campaigned on the importance of enhancing America’s cyberdefenses, her stance on the subject is somewhat muddled by her use of an insecure personal email server.

On the Republican side of this equation, none of the major candidates has issued a definitive opinion on cybersecurity.  Jeb Bush comes the closest with his criticism of President Obama’s handling of the OPM breach.  The former Florida governor has written at length about cybersecurity on his website, outlining his position on the topic.  And unlike Carson or Paul, Bush supports CISA, writing that the United States should “reduce legal and technical barriers to cybersecurity information sharing between the federal government and private sector” (Wagstaff, 2015).

Unlike many of the other topics dominating the headlines, few experts see cybersecurity as a partisan issue; there shouldn’t be a Democratic or Republican position on the matter.  Although the president holds little budgetary power, the executive office does nominate the heads of the Departments of Justice, Defense, and Homeland Security, all influential positions when it comes to cyber.  Given the lack of appeal this topic holds for most Americans, it isn’t surprising how little cyber is discussed in the presidential cycle; it is, however, still somewhat unsettling.   

References
Wagstaff, K. (2015). Why aren’t presidential candidates talking about cybersecurity? NBC News. Retrieved from http://www.nbcnews.com/tech/tech-news/why-arent-presidential-candidates-talking-about-cybersecurity-n451826




Friday, October 23, 2015

Enhancing Cybersecurity

One of the most discussed cybersecurity topics in recent years has been regulatory compliance.  Many agencies and industries within the United States are covered by some form of legislation, or at least a set of best practices, and yet most of this guidance fails when it comes to “advising organizations on the ins and outs of information security” (Sharkasi, 2015).  This is where organizations like ISACA and NIST play an important role in covering the gaps in IT education.  In a recently published ISACA article entitled “Addressing Cybersecurity Vulnerabilities,” Sharkasi covers a lengthy framework of improvements organizations should pursue to strengthen their overall security posture.  The following are some of the more salient points.

Emerging Technology Risk
“Assessing and minimizing the risk of emerging technology security are the first things enterprises do before using Internet of Things (IoT) technologies to manage IT systems, building equipment, smart phones and other web-enabled intelligent systems. To reduce risk, enterprises should pay more attention to newly proposed technology initiatives, ensure involvement of IT auditors in the early stages of any IT project, and extend the audit scope to include new technologies and management systems. Additionally, the performance of post-implementation review should be considered or viewed as a value-added audit project by the audit team. The audit team needs to have the right level of support and sponsorship to engage in the early stage of any IT projects. Auditors should play a significant role in IT projects and be part of the monitoring processes to ensure quality inputs and the merits of the project, rather than simply being involved with the outcome.”

Mind the Internal Threat
“While the majority of enterprises use networks as the backbone for secure data exchange transactions, standard encryption and firewall technologies can provide some measure of protection from outside attacks and theft by competitors, hackers or mercenaries. But what about the internal threat committed by the enterprise’s employees armed with computer access and passwords? The employee element is commonly overlooked. In fact, one of the most common bugs exploited by hackers to gain access to the inner workings of equipment is using default passwords. Default passwords are, from a manufacturing point of view, a convenient way of ensuring that its engineers can get into the company’s own computers when carrying out maintenance. Too often, security administration is overwhelmed with the task of trying to do it all (e.g., managing operating systems, applications, network, mobile devices, physical security). Security administration must segregate duties and define and deploy a security policy for one area before moving on to another hot spot. In conjunction with preventing internal irregularities, segregation of duties (SoD) should be applied so that the person responsible for assessing users’ level of access authorization is not the same person who implements the access controls.”
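The default-password exposure described above lends itself to a simple audit sketch; the host names and credential pairs below are invented for illustration, not drawn from any real device:

```python
# Sketch of a default-credentials audit: check harvested account data
# against a list of well-known factory credential pairs.
# All credential pairs and hosts here are illustrative.
DEFAULT_CREDS = {("admin", "admin"), ("root", "root"), ("admin", "password")}

def flag_default_accounts(accounts):
    """Return (host, user) pairs still using a factory credential pair."""
    return [(host, user) for host, user, pw in accounts
            if (user, pw) in DEFAULT_CREDS]

inventory = [
    ("plc-01", "admin", "admin"),        # never changed: flagged
    ("hmi-02", "operator", "S3cure!x"),  # custom password: passes
]
print(flag_default_accounts(inventory))  # [('plc-01', 'admin')]
```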

Struggling to Deal With Legacy Systems
“Now that Microsoft has pulled the support plug for Windows XP, financial institutions (FIs) and companies that have not switched to Windows 7 need to explore their options. For FIs, this means upgrades to Windows 7 and Agilis 3 are required to keep up with the latest patches and maintain Payment Card Industry Data Security Standard (PCI DSS) compliance. Most FIs began a legacy system replacement early in 2014. But some FIs failed to truly understand the complexity of management reporting they had developed internally over the years, not to mention integrating multiple systems from different vendors. Specifically, neglecting the reliance on numerous system features or databases that tied to the old system required processing and culture changes to switch software and get off of those old functions. For these reasons, FIs felt that they needed a more comprehensive compliance plan before jumping in with upgrades. As a best practice, many FIs found it possible to get by with a special contract with Microsoft in which they could keep Windows XP and get the necessary security patches to remain compliant until they are ready to upgrade in conjunction with other planned changes. Now that the Windows XP transition deadline has passed, continuing to ignore the upgrade puts FIs at risk. And because other requirements are coming, it makes sense to create a plan that addresses not only a Windows 7 upgrade, but future needs as well.”

Cybersecurity Test Tools
“Cyberattacks on enterprises and banks worldwide reflect a frightening new era in cyberwarfare. As many security experts say, ‘You cannot hack or protect what you cannot see.’ Traditional network security strategies have become increasingly complex and costly, yet they do not deliver the level of reliability that modern mission-critical computing environments require. The solution is moving to a deeper, inside-out software-based approach that greatly reduces the number of vulnerabilities that hackers and cybercriminals can exploit. Cybersecurity stealth tools do exactly this and are an innovative, software-based approach to security that saves money, increases security, and is an agile component that adapts to changes in critical business networks and rapidly evolving regulatory requirements. To that end, it is good to see developers starting to introduce security tools that bring together maintenance and help-desk products with the security system. Security professionals should become familiar with the tools, techniques and weapons used in attacking their security infrastructure. Then they will be prepared to make a number of wise acquisitions, bringing in the best-of-breed products.”

The report goes on to detail a host of additional topics, all of which represent critical points of entry into a facility’s IT infrastructure.  The point Sharkasi and ISACA are making is that “attackers need to find only one weakness to get into an enterprise system and spread their reach.”  While one weakness is all an attacker may require, as defenders we are responsible for securing the whole system.  This demands a holistic approach encompassing hardware, software and wetware (people), and it must be a concerted effort embraced by both the public and private sectors to be effective.

References
Sharkasi, O. Y. (2015). Addressing cybersecurity vulnerabilities. ISACA Journal. Retrieved from http://www.isaca.org/Journal/archives/2015/Volume-5/Pages/addressing-cybersecurity-vulnerabilities.aspx