John Reed Stark

As I noted in a post at the time, on February 20, 2018, the SEC issued its guidance for cybersecurity-related disclosures. In the following guest post, John Reed Stark, President of John Reed Stark Consulting and former Chief of the SEC’s Office of Internet Enforcement, has pulled together a list of 12 takeaways for corporate officials from the SEC’s guidance. I would like to thank John for his willingness to allow me to publish his article as a guest post on this site. I welcome guest post submissions from responsible authors on topics of interest to this blog’s readers. Please contact me directly if you would like to submit a guest post. Here is John’s article.




On February 20th, 2018, the U.S. Securities and Exchange Commission (“SEC”) issued further interpretive guidance to assist public companies in preparing disclosures about cybersecurity risks and incidents (the “2018 SEC Guidance”).


The 2018 SEC Guidance offers the SEC’s views about public companies’ disclosure obligations under existing law with respect to matters involving cybersecurity risk and incidents.  It also addresses the importance of cybersecurity policies and procedures and the application of disclosure controls and processes, insider trading prohibitions, and Regulation FD and selective disclosure prohibitions in the cybersecurity context.


The 2018 SEC Guidance serves as a follow-up to the October 13, 2011 SEC Division of Corporation Finance staff guidance, which also pertained exclusively to the cybersecurity-related disclosure obligations of public companies (the “2011 SEC CF Guidance”).


The fact that the 2011 SEC CF Guidance was published by the staff while the 2018 SEC Guidance was adopted by the Commission itself, though indicative of the gravity the SEC assigns to the issue of cyber-incident disclosure, makes little actual difference for practitioners.  Whether guidance emanates from the SEC staff or from the SEC itself, it should be taken with the same high level of significance and attentiveness.


Along those lines, much of the 2018 SEC Guidance tracks the 2011 SEC CF Guidance, retaining a focus on “material” cyber risks and incidents and expanding upon its predecessor while also reinforcing the SEC’s expectations about cyber-disclosure.  But if the 2011 SEC CF Guidance was a wake-up call for public companies, the 2018 SEC Guidance is a resounding fire alarm — and is a must-read for any C-suite executive at a public company.


In short, the 2018 SEC Guidance:


  • Stresses the need for public companies to put into practice disclosure controls and procedures designed to escalate cybersecurity risks and incidents to the right C-suite executives;
  • Emphasizes the urgency for public companies to make appropriate disclosure to investors; and
  • Articulates the SEC’s growing concerns about unlawful trading involving data security incidents.


The 2018 SEC Guidance is also a stark reminder for public companies that disclosures relating to data security events present an array of regulatory and litigation issues and have quickly evolved into an increasingly specialized area of securities regulation.


To help manage this emerging challenge, this article unpacks the 2018 SEC Guidance into 12 key takeaways, including a discussion of a particularly relevant SEC enforcement action and parallel criminal prosecution announced, probably not coincidentally, on the same day as the release of the SEC 2018 Guidance.


Pre-2018 SEC Cyber-Disclosure Guidance


On October 13, 2011, the SEC released the 2011 SEC CF Guidance, its first-ever staff guidance pertaining exclusively to the cybersecurity-related disclosure obligations of public companies.

With the 2011 SEC CF Guidance, the SEC officially (and quite noticeably) added cybersecurity into the mix of disclosure by putting every public company on notice that cyber-attacks and cybersecurity vulnerabilities fell squarely within a public company’s reporting responsibilities.


The 2011 SEC CF Guidance covered a public company’s reporting responsibilities both just after a cyberattack as a “material” event, and before as a “risk factor.” In essence, these notions clarified the SEC’s long-standing requirement that public companies report “material” events to their shareholders.  What precisely renders an event material has plagued securities lawyers for years and has been the subject of countless judicial decisions, SEC enforcement actions, law review articles, law firm guidance and the like – but can be effectively summed up as any important development or event that “a reasonable investor would consider important to an investment decision.”


Prior to the 2011 SEC CF Guidance, publicly traded companies were not necessarily required to report in their SEC filings if a data security incident had occurred or if they had fixed the problem. After the 2011 SEC CF Guidance, however, publicly traded companies were more compelled to acknowledge cyber-attacks and other data security incidents to regulators and explain the measures they planned to take to close their cybersecurity gaps.


With respect to the aftermath of a cyberattack, the 2011 SEC CF Guidance discussed the myriad ways a cyber-attack can impact the operations of a public company.  Next, the 2011 SEC CF Guidance set forth the various reporting sections of typical SEC filings that could warrant mention of the cyber-attack, including Risk Factors; Management’s Discussion and Analysis of Financial Condition and Results of Operations; Description of Business; Legal Proceedings; Financial Statement Disclosures; and Disclosure Controls and Procedures.


With respect to the mere possibility of a cyber-attack, the 2011 SEC CF Guidance noted that companies should also “consider the probability of cyber incidents occurring and the quantitative and qualitative magnitude of those risks, including the potential costs and other consequences resulting from misappropriation of assets or sensitive information, corruption of data or operational disruption.”


Even though the SEC staff might have viewed the 2011 SEC CF Guidance as simply a reiteration of previously existing requirements, there remained little doubt at the time of its publication that the 2011 SEC CF Guidance imposed an arguably unprecedented and certainly significant obligation upon public companies.


Soon thereafter, companies began to step up both their processes for defending themselves against cyber-attack, and their plans for incident response. According to corporate finance experts at the law firm of Sullivan & Cromwell, the 2011 SEC CF Guidance jump-started a new era of cyber-disclosure in public company filings, which most public companies have now already integrated into their SEC disclosure processes:


“Following the [2011 SEC CF Guidance’s] Release, many public companies included additional cybersecurity-related disclosures in their annual and quarterly reports, often in the form of risk factors, as well as in forward-looking statement disclosure. Our recent review of risk factor disclosure indicates that many public companies in industries that are particularly vulnerable to cybersecurity risks, such as financial services, technology and healthcare, have been disclosing cybersecurity risks with specific attention to the risks facing their particular businesses.”


The 2018 SEC Guidance:  Key Takeaways


Since publishing the 2011 SEC CF Guidance, the SEC has issued intermittent informal statements relating to cybersecurity, including in connection with its Cybersecurity Roundtable in 2014 and SEC Chairman Jay Clayton’s September 2017 Statement on Cybersecurity, which disclosed an intrusion into the SEC’s own data systems. But the 2018 SEC Guidance is the first time the SEC has specifically revisited the 2011 SEC CF Guidance – and it is chock-full of meaningful takeaways for public companies.


  1. Formalize Policies, Practices and Procedures Relating to Disclosure of Data Security Incidents.


The 2018 SEC Guidance encourages companies to implement policies, practices and procedures mandating that important cyber risk and incident information escalate “up the chain,” from IT teams to senior management, allowing for informed, intelligent and knowledgeable decisions.


This particular communications edict must have hit close to home for SEC Chairman Clayton, who, when testifying before Congress about the data breach at the SEC, was clearly miffed that the SEC staff had not shared certain critical information with the various SEC Commissioners, including the Chairman.  At that time, then-SEC Commissioner Michael S. Piwowar even went so far as to issue a formal statement about the lack of communication to him about the SEC data breach, stating:


“I commend Chairman Clayton for initiating an assessment of the SEC’s internal cybersecurity risk profile and approach to cybersecurity from a regulatory perspective.  In connection with that review, I was recently informed for the first time that an intrusion occurred in 2016 in the SEC’s Electronic Data Gathering, Analysis, and Retrieval (“EDGAR”) system.  I fully support Chairman Clayton and Commission staff in their efforts to conduct a comprehensive investigation to understand the full scope of the intrusion and how to better manage cybersecurity risks related to the SEC’s operations.”


  2. Be Prepared for Cyber-Disclosure Executive Certifications.


In one of its more specific and impactful requirements, the 2018 SEC Guidance advises that required executive certifications regarding the design and effectiveness of disclosure controls (such as certifications made pursuant to the Exchange Act Rules 13a-14 and 15d-14 as well as Item 307 of Regulation S-K and Item 15(a) of Exchange Act Form 20-F) include controls governing relevant cyber risk disclosures.


Disclosure controls and procedures should ensure that relevant cybersecurity risk and incident information is reported to management so that they may make required certifications and disclosure decisions.  Here, SEC Chairman Clayton is stealing a page from the playbook of former SEC Chairman Harvey Pitt, who originated the idea of executive certifications way back in 2002.


Shocked after officials of such scandal-plagued companies as Enron and WorldCom testified on Capitol Hill that they did not know their companies were reporting false or misleading information, Chairman Pitt conjured up the idea of executive certifications, which was remarkably successful, effective — and quite ingenious.


Chairman Pitt then dictated that top corporate officials, chief executive officers and chief financial officers, must declare personally — literally, take an oath — that their most recent financial statements are accurate. The new rule applied to companies’ future reports as well.


By making a “certification,” these officers are swearing that they know, for certain, that the financial reports are true. If the reports are not, the executives must explain why the results are not accurate. This eventually led some companies to restate their results to comply with the certification requirement.


Just as SEC Chairman Pitt’s certification requirement sought to ensure accurate financial reporting and responsible executive conduct regarding financial results, SEC Chairman Jay Clayton’s new cyber-disclosure certification guidance seeks to ensure accurate cyber-attack reporting and responsible executive conduct regarding data security incidents.


These required certifications by a company’s principal executive officer and principal financial officer as to the design and effectiveness of cyber-related disclosure controls and procedures could become somewhat challenging.  Company executives making these certifications will have to consider whether a company’s disclosure controls and procedures for cybersecurity are, in particular, capable of fully assessing and escalating such cyber risks and incidents.  Company executives will need to develop and implement some methodology to “drill down” into the conclusions of their IT personnel, perhaps even independently validating IT conclusions and representations when necessary.


  3. The Board Must Be Engaged with Respect to Data Security Incidents.


The 2018 SEC Guidance advises that public companies should disclose the role of boards of directors in cyber risk management, at least where cyber risks are material to a company’s business.


Just as occurred in the financial accounting realm, old and stale governance models must be modified and enhanced to address the very real, difficult to control and ever-increasing enterprise threat of cyber-attacks.  In practical terms, this means that, just as it does for financial reporting, every corporate board should:


  • Create a cybersecurity committee (just like its audit committee);
  • Engage an independent cybersecurity firm to conduct an annual cybersecurity audit (just like an independent accounting firm conducts and signs off on an annual financial audit); and
  • Add cybersecurity expertise and knowledge to the board (sitting right beside the board’s accounting and financial expert).


Historically, when it comes to their CFOs and the financial reporting function, the successful board paradigm has been one of vigorous and independent supervision, requiring the participation of independent third parties.  The same should go for CTOs, CIOs and CISOs, and the maxim of “trust but verify” should be equally operative in both contexts.


  4. The SEC Remains Schizophrenic About the Nature of Disclosure.


Much like the 2011 SEC CF Guidance, the 2018 SEC Guidance can be somewhat maddening with respect to the actual content of a company’s disclosure of a data security incident.


For example, as to the particularity of any data security incident’s disclosure, the SEC seems to want to have its cake and eat it too. On the one hand, the 2018 SEC Guidance appears to allow for a lack of specifics so as not to compromise a company’s security, stating:


“This guidance is not intended to suggest that a company should make detailed disclosures that could compromise its cybersecurity efforts – for example, by providing a “roadmap” for those who seek to penetrate a company’s security protections. We do not expect companies to publicly disclose specific, technical information about their cybersecurity systems, the related networks and devices, or potential system vulnerabilities in such detail as would make such systems, networks, and devices more susceptible to a cybersecurity incident.”


On the other hand, the 2018 SEC Guidance cautions companies not to use any sort of generic “boilerplate” type of language in their disclosures, stating somewhat opaquely:


“We expect companies to provide disclosure that is tailored to their particular cybersecurity risks and incidents. As the Commission has previously stated, we ‘emphasize a company-by-company approach [to disclosure] that allows relevant and material information to be disseminated to investors without boilerplate language or static requirements while preserving completeness and comparability of information across companies.’ Companies should avoid generic cybersecurity-related disclosure and provide specific information that is useful to investors.”


  5. The SEC Understands That Digital Forensic Investigations Take Time.


The SEC recognizes that the investigation of data security incidents can take time, and that some companies may not want to make any disclosures about an incident before they have a comfortable handle on the facts of the situation.  The 2018 SEC Guidance states:


“Understanding that some material facts may not be available at the time of the initial disclosure, we recognize that a company may require time to discern the implications of a cybersecurity incident. We also recognize that it may be necessary to cooperate with law enforcement and that ongoing investigation of a cybersecurity incident may affect the scope of disclosure regarding the incident.”


But the SEC also qualifies this recognition of the complexity of data security incidents and warns companies that the need for a lengthy investigation is not necessarily an automatic excuse for delaying disclosure of a data security incident, stating:


“However, an ongoing internal or external investigation – which often can be lengthy – would not on its own provide a basis for avoiding disclosures of a material cybersecurity incident.”


Public companies should take some comfort that the 2018 SEC Guidance recognizes that data security incident investigations are complicated and can take time.  Of course, when a data breach happens, the public’s demand for immediate answers is understandable. Life savings are at risk while the perpetrators of hacking schemes are rarely identified, let alone captured and prosecuted. However, in the aftermath of most data breaches, there exists no CSI-like evidence which would allow for speedy evidentiary findings and rapid remediation.


The most effective cyber-attack investigative methodology is a tedious and exhaustive iterative process of digital forensics, malware reverse engineering, monitoring and scanning. As analysis identifies any possible indicator of compromise (IOC), investigators examine network traffic and logs, in addition to scanning system hosts for these IOCs. When this effort reveals additional systems that may have been infiltrated, investigators will then forensically image and analyze those systems, and the process repeats itself. Armed with the information gathered during this “lather, rinse, repeat” phase, investigators can detect additional attempts by an attacker to regain access and begin to contain the attack.


While some breaches may provide key evidence early on, most never do, or even worse, provide a series of false positives and other stumbling blocks. The evidence among the artifacts, remnants and fragments of a data breach is rarely in plain view; it rests among disparate logs (if they even exist), volatile memory captures, server images, system registry entries, spoofed IP addresses, snarled network traffic, haphazard and uncorrelated timestamps, Internet addresses, computer tags, malicious file names, user account names, network protocols and a range of other suspicious activity.


In short, the evidence analyzed during a data breach response is a massive, jumbled and chaotic morass of terabytes of data. That is why the investigation of a data breach can take weeks, perhaps months, before any concrete conclusions begin to take shape. Rushing to judgment (and disclosure) not only creates further confusion and expense, but it also undermines the objectivity, truth and confidence that the public (especially shareholders) deserves.


  6. The Importance of Timely and Comprehensive Disclosure of Data Security Incidents.


The 2018 SEC Guidance clearly emphasizes the need for timely disclosure, probably taking a lesson from the Equifax data breach and, ironically, from the SEC’s own data breach experience, which SEC Chairman Jay Clayton believes should have been disclosed earlier.


Equifax, one of three elite repositories of personal credit information, and a trusted source for personal security and identity theft defense products, disclosed a cyber-attack that could potentially affect 143 million consumers — nearly half of the U.S. population. The accessed Equifax data reportedly included sensitive information such as Social Security numbers, birthdays, addresses, and in some instances, driver’s license numbers — a virtual treasure trove for identity thieves.


Not long after the Equifax data breach, SEC Chairman Jay Clayton also announced a data breach into the SEC’s EDGAR system, a vast database that contains information about company earnings, share dealings by top executives and corporate activity such as mergers and acquisitions. Accessing that information before it’s disclosed publicly could allow hackers to profit by trading ahead of the information’s release.


With respect to the Equifax data breach, now “retired” Equifax CEO Richard Smith told a breakfast meeting in mid-August 2017 that data fraud is a “huge opportunity,” allowing Equifax to sell consumers more offerings. Smith touted the company’s credit-monitoring offerings, according to a video recording of the meeting at the University of Georgia’s Terry College of Business, and declared that protecting consumer data was “a huge priority” for the company.


But what the Equifax CEO failed to mention was that less than three weeks earlier, Equifax had apparently discovered a potentially massive data security incident and had called in expert incident response firm Mandiant to investigate. Yet it was not until a few weeks later, on September 7th, that Equifax disclosed the massive data breach to the public.


With respect to the SEC data breach, the SEC itself may have opted for a similar path of delayed notification. Reports and SEC Chairman Clayton’s testimony before the Senate Banking Committee indicate that the SEC data breach was discovered in 2016, and the possible illegal trades were detected in August of 2017, but the SEC did not disclose any information about the incident until September 20th, 2017.


Senior executives at both the SEC and Equifax have angered their constituents with their arguably sluggish disclosure. Both entities probably focused too much upon what they were legally and contractually obligated to disclose, rather than taking a more holistic approach to the question.


Per the 2018 SEC Guidance, when a company has learned of a cybersecurity incident or cyber-risk that is material to its investors, it is expected to make appropriate disclosures, including filings on Form 8-K or Form 6-K as appropriate.  The 2018 SEC Guidance even goes so far as to remind public companies to consider obligations under stock listing requirements, such as Section 202.05 of the NYSE Listed Company Manual and NASDAQ Listing Rule 5250(b)(1).  Additionally, when a company experiences a data security incident of any size, the 2018 SEC Guidance emphasizes the need to “refresh” previous disclosures during the process of investigating a cybersecurity incident or past events.


When organizing the disclosure of data security incidents and overall cybersecurity risks, just like the 2011 SEC CF Guidance, the 2018 SEC Guidance explains that disclosure of data security incidents may be required in sections of public filings addressing Risk Factors, MD&A, Description of Business, Legal Proceedings and Financial Statement Disclosures.  Thus, the 2018 SEC Guidance will have little impact on corporate practices regarding the placement and organization of cybersecurity disclosures.


Indeed, for most of today’s public companies, their counsel has by now incorporated cybersecurity concerns into their disclosure reviews and sprinkled them in various parts and sections of corporate filings in an orderly, appropriate and consistent fashion.


  7. Watch Out for Insider Trading Based on Data Security Incidents.


While empirical data suggests that most breaches have little lasting effect on share prices, some data security incidents clearly can impact a company’s stock price – witness the drops immediately following disclosure of the Equifax and Target breaches.  In such situations, executives who learn of a data security incident that is material and nonpublic could be violating insider trading laws if they engage in any trading of the company’s stock.


Thus, it is not surprising that the 2018 SEC Guidance warned corporate insiders not to sell shares of a company when holding confidential knowledge about cyberattacks and breaches that could affect stock price.  This is an area not covered by the 2011 SEC CF Guidance, but one that made sense to include in the 2018 SEC Guidance.


Equifax once again probably triggered the SEC’s concerns and prompted inclusion of this principle in the 2018 SEC Guidance.  The Equifax data breach also involved a stock sell-off by some of its executives before the company disclosed its cyber-attack, and it spurred an SEC insider trading investigation that may still be ongoing. Intel CEO Brian Krzanich got hit with a similar backlash, too, for selling a large block of shares after learning of the Meltdown and Spectre computer chip vulnerabilities, but before disclosing them to the public.


The 2018 SEC Guidance should prompt companies to evaluate their insider trading policies and procedures.  Companies should review, with data security incidents in mind, their trade restriction policies, permissible trading windows, insider trading training curricula, codes of ethics, trade authorization procedures, trading training manuals and the like.


The SEC is obviously expecting thoughtful and well-documented consideration of data security incidents in the context of possible trading on material, nonpublic information – and carefully drafted, robust and precise policies, practices and procedures that evidence a rigorous culture of compliance.


  8. Be Mindful of Regulation FD When Briefing Outsiders About a Data Security Incident.


Regulation FD (for “Fair Disclosure”), promulgated by the SEC under the Securities Exchange Act of 1934, as amended, prohibits companies from selectively disclosing material nonpublic information to analysts, institutional investors, and others without concurrently making widespread public disclosure.


Regulation FD reflects the view that all investors should have equal access to a company’s material disclosures at the same time. Since its enactment in 2000, Regulation FD has fundamentally reshaped the ways in which public companies conduct their conference calls, group investor meetings, and so‐called “one‐on‐one” meetings with analysts and investors.


The SEC adopted Regulation FD to address the selective disclosure by issuers of material nonpublic information. In its adopting release, the SEC expressed concerns about reported instances of public companies disclosing important nonpublic information, such as advance warnings of earnings results, to securities analysts or selected institutional investors or both, before making full disclosure of the same information to the general public.  Those privy to the information beforehand were able to profit or avoid a loss at the expense of everyone else.


The 2018 SEC Guidance emphasizes that companies subject to Regulation FD should have policies and procedures to promote compliance with Regulation FD regarding cybersecurity risks and incidents.


In particular, these policies and procedures should work to ensure that the company does not make any selective disclosures about cybersecurity risks and incidents to Regulation FD-enumerated persons without the required broadly disseminated public disclosure.  This can create unanticipated problems for a public company experiencing any form of cyber-attack, because Regulation FD can throw a wrench into an already challenging disclosure process.


For example, in the aftermath of a data breach of any kind, in addition to any consumer notifications, a broad range of other important notifications may immediately arise, such as briefings to customers, partners, employees, vendors, affiliates, insurance carriers, and a range of other interested/impacted parties.  Given the broad swath of interested parties, making sure disclosures are consistent, let alone not selective, will require careful and methodical communications practices.


  9. Failure to Disclose a Data Security Incident Can Operate as a Fraud on Customers (and Can Even Result in a CEO’s Federal Arrest).


Perhaps the most important takeaway from the 2018 SEC Guidance is a notion not specifically stated in the four corners of the document, but rather found in an SEC enforcement action (and parallel DOJ criminal prosecution) filed on the very same day of the 2018 SEC Guidance’s release.


In the SEC enforcement action, captioned SEC v. Jon E. Montroll and BitFunder, the SEC charged a former bitcoin-denominated platform and its operator with operating an unregistered securities exchange and defrauding users of that exchange.  The SEC also charged the operator with making false and misleading statements in connection with an unregistered offering of securities.


Among other accusations, the SEC alleges that BitFunder and its founder Jon E. Montroll operated BitFunder as an unregistered online securities exchange and defrauded exchange users by misappropriating their bitcoins and failing to disclose a cyberattack on BitFunder’s system that resulted in the theft of more than 6,000 bitcoins.

The SEC actually alleges fraud because of the lack of disclosure of the data security incident to customers and account holders, effectively bypassing the issue of whether there is actually any statutory or regulatory disclosure obligation. In other words, by keeping the data security incident a secret, the exchange (which was unlawfully unregistered) committed a fraud upon its customers.  The SEC Complaint states:


“Montroll failed to disclose the theft [which occurred by means of a cyber-attack] and the deficit to Ukyo Notes investors and potential investors. By failing to disclose these facts, Montroll misled investors and potential investors – who were led to believe they would profit, at least in part, from BitFunder’s operations – to reasonably believe that BitFunder was a secure and profitable business.”


Concealing a data security incident can not only prompt SEC enforcement actions but can also lead to being arrested and taken into custody.  In a parallel criminal case, the U.S. Attorney’s Office for the Southern District of New York filed a complaint against Montroll for perjury and obstruction of justice during the SEC’s investigation.  In other words, whether a public company or a private company, and whether a regulated entity or an unregulated one — keeping a data security incident secret can be the kind of act that triggers an indictment.  The SDNY’s press release about its parallel case states:


“As alleged, Montroll committed a serious crime when he lied to the SEC during sworn testimony.  In an attempt to cover up the results of a hack that exploited weaknesses in the programming code of his company, he allegedly went to great lengths to prove the balance of bitcoins available to BitFunder users in the WeExchange Wallet was sufficient to cover the money owed to investors.  It’s said that honesty is always the best policy – this is yet another case in which this virtue holds true.”


  10. Beware the Perils of “Incidental Disclosure.”


If a company opts not to disclose a data security incident, perhaps because of a lack of materiality, it should be mindful that the SEC might learn about the incident another way and be none too thrilled about the initial lack of disclosure.


Hence the perils of “incidental disclosure,” where public companies can too often find themselves in bigger trouble than if they had simply disclosed the data security incident when it first occurred.


There exists a broad range of triggers relating to the disclosure of a data security incident, including: 1) statutes, rules and regulations; 2) contractual requirements; and 3) a particular event, such as an audit, negotiation, happenstance or communication.


Ultimately, whether a cyber-attack victim has a legal obligation to disclose the attack to regulators; partners; customers; operators; employees; vendors and a range of other constituencies will be driven by a robust, methodical and independent forensic investigation. In other words, before a victim can make decisions about any legal responsibility for disclosure, the cyber-attack victim will need to conduct its own investigation and determine, among other things, the nature of the attacker’s efforts; the scope of the attack vector; and whether credit card data, personal identifying information, personal health information, intellectual property or any other relevant data was targeted, accessed or exfiltrated.


Similarly, whether a cyber-attack victim has a contractual obligation to disclose an attack to partners, customers, operators, employees, vendors and other constituencies will turn upon the various agreements in place with those parties.


However, even when a cyber-attack victim has no legal or contractual requirement to disclose an attack, the victim company might still opt to disclose it to regulators, partners, customers, operators, employees, vendors and others. Indeed, circumstances can arise in which disclosure of the attack becomes necessary, prudent or simply practical.


Such so-called “incidental disclosures” can occur during certain events or because of certain relationships, such as:


  • PCI Audit. If a cyber-attack victim accepts credit cards and is about to undergo a Payment Card Industry (PCI) Compliance Assessment, the Qualified Security Assessor conducting the compliance review will undoubtedly ask questions about the victim company’s overall cybersecurity, which could prompt or necessitate candid disclosure of the attack;
  • Cybersecurity Due Diligence. Certain vendors, customers, partners, etc. may send a cyber-attack victim company a data security questionnaire as part of their due diligence concerning the cybersecurity of the relationship. Along those lines, cybersecurity-related inquiries, solicitations and demands to a victim company have become increasingly common. Any one of these kinds of cybersecurity requests or queries directed at a victim company could prompt or require candid disclosure of an attack;
  • Whistleblowers. So-called “bad leavers” (employees who depart on bad terms) or disgruntled insiders with an axe to grind could learn of the cyber-attack and disclose the details to the media, to regulators, to contracting parties or to any other interested party. Any such unauthorized disclosure could force the company into candid disclosure of the attack;
  • Law Enforcement Actions. The sophistication and prevalence of cyber-threats continue to grow, and law enforcement seems committed not only to deterring the attacks but also to capturing the perpetrators. If a company discloses a data security incident to law enforcement, the company should assume that law enforcement will share information concerning the incident with the SEC.  Moreover, should the federal government bring a prosecution against a cyber-attack perpetrator or issue public warnings concerning an attacker, the attack could become public or could prompt or require incidental disclosure of the attack;
  • Contractual Negotiations. If the cyber-attack victim company is a sophisticated corporation with many business relationships, contracts and agreements, and has contractual negotiations ongoing or contractual relationships up for renewal, discussions of cybersecurity could arise, which, in turn, could trigger incidental disclosure. The same holds if the victim company is pursuing any new corporate associations, affiliations, acquisitions or other relationships;
  • Special Relationships. Some contractual relationships carry with them such a unique inherent or implied degree of trust and confidence, or are of such extraordinary business importance, that, despite having no legal or contractual requirement to do so, the victim company may nonetheless feel inclined or obligated to disclose the attack (such as during a status conference or other routine get-together); and
  • The Public Interest. Some companies (like Equifax) or government agencies (like the U.S. Office of Personnel Management) have special and complex relationships with the public, and there is a definitive expectation that when a cyber-attack occurs, the company or government agency will disclose the situation publicly, not because of a legal or contractual obligation, but out of a sense of public safety and possible danger.


  11. Two SEC Commissioners Believe the 2018 SEC Guidance Did Not Go Far Enough.


Though voting to approve the 2018 SEC Guidance, SEC Commissioners Robert Jackson and Kara Stein both took the unusual step of publishing separate statements outlining their concerns (Jackson Statement and Stein Statement) – essentially asserting that the 2018 SEC Guidance did not go far enough.


Commissioners Jackson and Stein, along with other commentators and legislators, have sought more rigorous rulemaking to police disclosure around cybersecurity issues, or to require certain cybersecurity policies at public companies.


Commissioner Jackson, pointing to an analysis from the White House Council of Economic Advisers finding that companies frequently under-report data security incidents to investors, asserted that some companies were ignoring the 2011 SEC CF Guidance, consistently failing to report data security incidents and failing to adequately disclose the risk of data security mishaps and cyber-attacks.


Commissioner Stein was even more critical, stating pointedly:


“While it may have the potential of providing both companies and investors with incremental benefit, the guidance does not sufficiently advance the ball—even in the context of disclosure guidance. Even more, it may provide investors a false sense of comfort that we, at the Commission, have done something more than we have.”


Commissioner Stein also suggested proposing rules to improve boards’ risk management frameworks related to cyber risks and threats; requiring companies to provide notice to investors in an appropriate time frame following a cyber-attack; and requiring companies to create cybersecurity-related policies and procedures beyond mere disclosure requirements.


Though the SEC is unlikely to propose any of Commissioner Stein’s recommendations, such a public statement of a Commissioner’s preferences and predispositions is telling, and public companies would do well to bear it in mind.  This dissension and lingering dissatisfaction means that at least two Commissioners are hungry for enforcement actions relating to cybersecurity disclosure failures and will likely strongly support any enforcement staff recommendation along those lines.


  12. Disclosure Guidelines from the U.S. Department of Justice (DOJ) Should Always Be Kept in Mind.


Aside from SEC-mandated disclosure of data security incidents, public companies will almost always be considering, at the same time, disclosure of a data security incident to the U.S. Federal Bureau of Investigation (“FBI”).  The FBI will not only potentially investigate the data security incident but, as noted above, may also opt to share its investigative files with the SEC.


The DOJ’s Best Practices for Victim Response and Reporting of Cyber Incidents is an official government publication that encourages companies to engage with law enforcement when a data security incident occurs, stating:


“If an organization suspects at any point during its assessment or response that the incident constitutes criminal activity, it should contact law enforcement immediately. Historically, some companies have been reticent to contact law enforcement following a cyber incident fearing that a criminal investigation may result in disruption of its business or reputational harm. However, a company harboring such concerns should not hesitate to contact law enforcement. The FBI and U.S. Secret Service place a priority on conducting cyber investigations that cause as little disruption as possible to a victim organization’s normal operations and recognize the need to work cooperatively and discreetly with victim companies. They will use investigative measures that avoid computer downtime or displacement of a company’s employees. When using an indispensable investigative measure likely to inconvenience a victim organization, they will do so with the objective of minimizing the duration and scope of any disruption.”


Interestingly, the DOJ Guidance reminds companies of one of the benefits of notifying federal law enforcement of a data security incident: a possible temporary reprieve from other reporting obligations. The DOJ Guidance states:


“ . . . [M]any [state] data breach reporting laws allow a covered organization to delay notification if law enforcement concludes that such notice would impede an investigation. State laws also may allow a victim company to forgo providing notice altogether if the victim company consults with law enforcement and thereafter determines that the breach will not likely result in harm to the individuals whose personal information has been acquired and accessed. Organizations should consult with counsel to determine their obligations under state data breach notification laws. It is also noteworthy that companies from regulated industries that cooperate with law enforcement may be viewed more favorably by regulators [such as the SEC] looking into a data breach.” (emphasis added)


Even when it does not result in a prosecution, notification of federal law enforcement after a data security incident is often expected by regulators, shareholders, customers, partners and the many other constituencies potentially impacted by the incident. After all, a cyber-attack victim has been unlawfully violated and needs federal help, protection and advice. Moreover, a cyber-attack victim will want to demonstrate that it is availing itself of all available resources to protect against ongoing or future harm from the attackers.


However, notification to federal law enforcement can have costly and complicated ramifications. On one hand, the FBI, Secret Service, U.S. Air Force and other law enforcement agencies may be trying to identify and prosecute the intruders – and may even share with a cyber-attack victim the results of their investigation. On the other hand, myriad attorneys general and other regulatory agencies are issuing requests and demanding answers about the safety of the personal information of their respective citizenries. Managing this delicate balance can become challenging.


Law enforcement agencies may also: seek from the victim company forensic images of affected systems; request to attach a recording appliance to a victim company’s network in hope of capturing traces of possible future attacker activity; ask to receive briefings of all findings from any incident response efforts; and want a range of other information, technological data and interviews. These kinds of law enforcement requests raise a host of legal issues, including whether providing information to law enforcement could violate customer privacy or inadvertently waive the attorney-client privilege.


Looking Ahead


Beyond a company’s shareholders, a data security incident can trigger a litany of legal notification and disclosure requirements, including notice to state regulators, federal regulators, vendors, partners, insurance carriers, customers, consumers, employees and any other constituency with a vested interest in the victim company.


Hence, it is not surprising that disclosure of cybersecurity incidents has evolved into one of the most perplexing and multifaceted aspects of data breach response – and that the SEC enforcement division has yet to file an enforcement or administrative action alleging any sort of data security incident disclosure failure.  Clearly, having the SEC add the transparency and intelligibility of the 2018 SEC Guidance into the mix of regulatory and statutory requirements is a big plus.


Those who would argue that the 2018 SEC Guidance is not much more than a mere reiteration and reprocessing of the 2011 SEC CF Guidance are sorely misguided.  Though the 2018 SEC Guidance certainly reinforces some of the key principles of the 2011 SEC CF Guidance and aligns with its prior interpretations and current public company disclosure practices, the 2018 SEC Guidance is also jam-packed with an assortment of helpful takeaways and reminders – and is worthy of thoughtful consideration and attention.


John Reed Stark is president of John Reed Stark Consulting LLC, a data breach response and digital compliance firm. Formerly, Mr. Stark served for almost 20 years in the Enforcement Division of the U.S. Securities and Exchange Commission, the last 11 of which as Chief of its Office of Internet Enforcement. He currently works as an adjunct professor at the Duke University Law School Winter Session and also worked for 15 years as an Adjunct Professor of Law at the Georgetown University Law Center, where he taught several courses on the juxtaposition of law, technology and crime, and for five years as managing director of a global data breach response firm, including three years heading its Washington, D.C. office. Mr. Stark is the author of, “The Cybersecurity Due Diligence Handbook.”