Earlier this week, media reports circulated that this past spring Google had exposed the private data of thousands of users of the Google+ social network and then opted not to disclose the issue, in part because of concerns that doing so would draw regulatory scrutiny and cause reputational damage. In the wake of these revelations, one question is whether the SEC will look into these circumstances. In the following guest post, John Reed Stark, President of John Reed Stark Consulting and former Chief of the SEC’s Office of Internet Enforcement, takes a look at what he regards as a likely SEC investigation and the questions that the SEC likely will be asking. A version of this article originally appeared on Securities Docket. I would like to thank John for allowing me to publish his article on this site. I welcome guest post submissions from responsible authors on topics of interest to this blog’s readers. Please contact me directly if you would like to submit an article. Here is John’s post.
Google has a problem and, having served for over 11 years as Chief of the SEC’s Office of Internet Enforcement, my guess is that the SEC is probably investigating.
On October 8, 2018, Google announced that it would close most of its failing social media platform Google+ and implement several new privacy measures because of a previously undisclosed software bug relating to its Google+ application programming interface (API). Google created the API to help app developers access an array of profile and contact information about the people who sign up to use their apps, as well as the people they are connected to on Google+.
Google also mentioned that up to 500,000 Google+ users potentially had their personal data exposed. In addition, Google reported that up to 438 applications may have used the defective Google+ API, which makes the number of impacted individuals difficult to ascertain.
Meanwhile, the Wall Street Journal and the Washington Post are reporting that Google hid the Google+ API defect from shareholders and others for fear of regulatory examination, Congressional inquiry and other negative ramifications.
What a mess. Let the onslaught of scrutiny begin, which in my opinion will undoubtedly include an investigation by the U.S. Securities and Exchange Commission (SEC), the federal regulator tasked with policing the disclosures to shareholders by public companies like Google.
But what precisely will the SEC want to know? Assuming I am right about an ongoing SEC investigation, this article presents ten questions SEC enforcement staff will be posing to Google executives and others, in connection with an investigation of Google’s public disclosures of the Google+ API defect.
Some Background: Post 2018 SEC Cybersecurity Disclosure Guidance
On February 20th, 2018, the SEC issued further interpretive guidance to assist public companies in preparing disclosures about cybersecurity risks and incidents (the “2018 SEC Guidance”).
The 2018 SEC Guidance offers the SEC’s views about public companies’ disclosure obligations under existing law with respect to matters involving cybersecurity risk and incidents. It also addresses the importance of cybersecurity policies and procedures and the application of disclosure controls and processes, insider trading prohibitions, and Regulation FD and selective disclosure prohibitions in the cybersecurity context.
The 2018 SEC Guidance serves as a follow-up to the October 13, 2011 SEC Division of Corporation Finance staff guidance, which also pertained exclusively to the cybersecurity-related disclosure obligations of public companies (the “2011 SEC CF Guidance”).
The fact that the 2011 SEC CF Guidance was published by the staff while the 2018 SEC Guidance was adopted by the Commission itself, though indicative of the gravity the SEC assigns to cyber incident disclosure, makes little actual difference for practitioners. Whether guidance emanates from the SEC staff or from the SEC itself, it should be taken with the same high level of significance and attentiveness.
Along those lines, much of the 2018 SEC Guidance tracks the 2011 SEC CF Guidance, retaining a focus on “material” cyber risks and incidents and expanding upon its predecessor while also reinforcing the SEC’s expectations about cyber-disclosure. But if the 2011 SEC CF Guidance was a wake-up call for public companies, the 2018 SEC Guidance was a resounding fire alarm — and is a must-read for any C-suite executive at a public company.
In short, the 2018 SEC Guidance:
- Stresses the need for public companies to put into practice disclosure controls and procedures designed to escalate cybersecurity risks and incidents to the right c-suite executives;
- Emphasizes the urgency for public companies to make appropriate disclosure to investors; and
- Articulates the SEC’s growing concerns about unlawful trading involving data security incidents.
The 2018 SEC Guidance also serves as a stark reminder for public companies that disclosures relating to data security events present an array of regulatory and litigation issues and has quickly evolved into an increasingly specialized area of securities regulation.
Pre-2018 SEC Cyber-Disclosure Guidance
On October 13, 2011, the SEC released the 2011 SEC CF Guidance, its first ever staff guidance pertaining exclusively to the cybersecurity-related disclosure obligations of public companies.
With the 2011 SEC CF Guidance, the SEC officially (and quite noticeably) added cybersecurity into the mix of disclosure by putting every public company on notice that cyber-attacks and cybersecurity vulnerabilities fell squarely within a public company’s reporting responsibilities.
The 2011 SEC CF Guidance covered a public company’s reporting responsibilities both just after a cyberattack as a “material” event, and before as a “risk factor.” In their essence, these notions clarified the SEC’s long-standing requirement that public companies report “material” events to their shareholders. What precisely renders an event material has plagued securities lawyers for years and has been the subject of countless judicial decisions, SEC enforcement actions, law review articles, law firm guidance and the like – but can be effectively summed up as any important development or event that “a reasonable investor would consider important to an investment decision.”
Prior to the 2011 SEC CF Guidance, publicly traded companies were not necessarily required to report in their SEC filings if a data security incident had occurred or if they had fixed the problem. After the 2011 SEC CF Guidance, however, publicly traded companies were effectively compelled to acknowledge cyber-attacks and other data security incidents to regulators, and to explain the measures they planned to take to close their cyber-security gaps.
With respect to the aftermath of a cyberattack, the 2011 SEC CF Guidance discussed the myriad ways a cyber-attack can impact the operations of a public company. Next, the 2011 SEC CF Guidance set forth the various reporting sections of typical SEC filings that could warrant mention of the cyber-attack, including Risk Factors; Management’s Discussion and Analysis of Financial Condition and Results of Operations; Description of Business, Legal Proceedings, Financial Statement Disclosures; and Disclosure Controls and Procedures.
With respect to the mere possibility of a cyber-attack, the 2011 SEC CF Guidance noted that companies should also “consider the probability of cyber incidents occurring and the quantitative and qualitative magnitude of those risks, including the potential costs and other consequences resulting from misappropriation of assets or sensitive information, corruption of data or operational disruption.”
Even though the SEC staff might have viewed the 2011 SEC CF Guidance as simply a reiteration of previously existing requirements, there remained little doubt at the time of its publication that the 2011 SEC CF Guidance imposed an arguably unprecedented and certainly significant obligation upon public companies.
Against the backdrop of the 2011 CF Guidance and the 2018 SEC Guidance, below are ten questions Google should expect from SEC enforcement staff.
Question #1: Was Google’s Internal Investigation of the Google+ API Vulnerability a Thorough and Robust Process of Neutrality, Objectivity, Transparency and Candor?
When it comes to data security incidents, one option for Google is to investigate the problem itself – get to the bottom of it, improve security practices, policies and procedures and move forward with better, stronger and more robust security. This seems to have been Google’s approach – and it may have very well succeeded.
However, there is a far more effective, more rewarding and more cost-effective option, which SEC enforcement staff has come to respect and appreciate. Whether it is British Petroleum struggling to handle the aftermath of a Texas refinery explosion that killed 15 workers; Wells Fargo adjusting its operations after a massive company fraud committed by 5,300 employees against over two million customer accounts; or any company experiencing a threat to its customers, the same lesson always rings true. Confront the issue head-on with independence, transparency and integrity.
For starters, strong leaders seek answers from independent and neutral sources of information when responding to data security incidents. Google’s leaders should:
- Engage a former law enforcement agent or prosecutor from an independent and neutral law firm or consulting firm (preferably never engaged before) to conduct an investigation and report its findings to the board;
- Direct the law firm to engage an independent digital forensics firm to examine the Google+ API issue;
- Report the investigation’s progress to shareholders, regulators and other constituencies and fiduciaries every step of the way; and
- Disclose the details of the findings to those users impacted.
Instead of trying to characterize an incident, effective leaders begin with these steps, which evidence strong corporate ethics; fierce customer dedication; and steadfast corporate governance.
Next is to anoint someone from the engaged outside investigative team to serve as the face of the response. This person should be a former law enforcement official with impeccable credentials and the kind of gravitas that customers, shareholders and other important members of the public will trust and respect.
By navigating problems with integrity and transparency, Google can turn the tide in its favor, seizing the opportunity to reinforce strong business ethics; renewed customer dedication; and steadfast corporate governance.
But Google’s announcement makes no mention of any independent investigation. Rather Google’s announcement focuses on findings and conclusions rendered by its own “Privacy & Data Protection Office,” a council of top Google product executives who oversee key decisions relating to privacy.
C-Suite executives and boards of directors are not politicians and do not have the luxury of conducting potentially self-serving investigations and offering sanitized reports and findings; they have fiduciary obligations to shareholders and others to seek the truth, and they should do so with independence, neutrality, transparency and candor. Otherwise, any formal findings can lack credibility and integrity, and no one, including the SEC enforcement staff, will take the investigative and remedial effort seriously.
Question #2: What Did the Google Board Know, and When Did They Know It?
The 2018 SEC Guidance for public companies on cybersecurity-related disclosures garnered a great deal of attention for what it says about the threat and risk that cybersecurity presents for public companies — large and small. With cyber-incidents capturing headlines around the world with increasing frequency, businesses and regulators have come to recognize that cyber-incidents are not a passing trend, but rather in our digitally connected economy, an embedded risk that is here to stay. Indeed, these cybersecurity risks represent a mounting threat to businesses — risks that can never be completely eliminated.
Much of the published commentary concerning the 2018 SEC Guidance focused on the technical aspects of the SEC’s instructions regarding the need for additional disclosure in a company’s periodic filings and the SEC’s updated views on the timing of cyber-related disclosures and what that means for insider trading windows. However, the 2018 SEC Guidance also says a lot about the SEC’s expectations of boards with respect to data security incidents, including disclosure-related responsibilities.
The SEC’s views on the role of the board have evolved over the past few years, culminating with the release of the 2018 SEC Guidance, which likely prompted many corporate boards to take tangible steps to translate their general awareness and high-level concerns around cybersecurity risks into specific behaviors and precise actions that are identifiable, capable of being readily implemented and heavily documented.
The comments contained in the 2018 SEC Guidance evidence the SEC’s strong views regarding the board’s essential role in this emerging area of enterprise risk and remove any doubt that for those who serve as corporate directors, “cybersecurity” can no longer be just a buzz word or a simple talking point. While many board members characterize cybersecurity risks as “an existential threat,” few, if any, have taken the time to go beyond attaining a superficial understanding of what that really means for their companies. Corporate directors now must consider themselves on notice. When it comes to cybersecurity, they are expected to dig in and, therefore, must demand greater visibility into what is oft presented as a murky and highly complex area best left to technologists.
Specifically, the 2018 SEC Guidance advises that public companies should disclose the role of boards of directors in cyber risk management, at least where cyber risks are material to a company’s business. With respect to the Google+ vulnerability, the SEC enforcement staff will probe the lines of communication from the ground level at Google up to the board, searching for any broken links; any lack of transparency or candor; any concealment or “cleansing” of inculpatory information; and any other related corporate governance failure.
Historically, when it comes to their CFOs and the financial reporting function, the successful board paradigm has been one of vigorous and independent supervision, requiring the participation of independent third parties. The same should go for CTOs, CIOs and CISOs, and the maxim of trust but verify should be equally operative in both contexts.
The SEC enforcement staff will want to know what steps, if any, Google took to enhance its board’s cybersecurity oversight in response to the 2018 SEC Guidance, and will want to understand the Google board’s approach to cybersecurity and the Google board’s involvement in the handling and management of the Google+ API defect.
Question #3: Where was Google’s CEO?
If Google’s CEO does not embrace and understand the importance of cybersecurity, the company has little chance of effectively carrying out its responsibility to ensure proper risk-based measures are in place and functioning. It is the CEO who is charged with day-to-day management responsibility and, as history tells us, those in the organization will, in fact, “follow the leader.” This may seem like an obvious point, but its criticality cannot be overstated.
Why would a CEO not take the issue of cybersecurity seriously? CEOs have a lot on their plate. And, like it or not, it is a reality of human behavior that there is a tendency to downplay the potential for certain risks — “this is not going to happen to us” — until those risks manifest themselves and then it is just too late. By then, the damage is already done, and the consequences can be immediate and, at times, catastrophic.
Recognizing this reality, the 2018 SEC Guidance actually offers shareholders an assist in the effort to focus the attention of the CEO. The SEC explicitly recognizes the importance of “tone at the top,” as demonstrated by one of its more specific and impactful directives, requiring that so-called executive certifications regarding the design and effectiveness of disclosure controls now encompass cybersecurity matters (such as certifications made pursuant to the Exchange Act Rules 13a-14 and 15d-14 as well as Item 307 of Regulation S-K and Item 15(a) of Exchange Act Form 20-F).
Disclosure controls and procedures should ensure that relevant cybersecurity risk and incident information is reported to management so that they may make required certifications and disclosure decisions. Here, the SEC actually stole a page from the playbook of former SEC Chairman Harvey Pitt, who originated the idea of executive certifications way back in 2002.
Shocked after officials of scandal-plagued companies such as Enron and WorldCom testified on Capitol Hill that they did not know their companies were reporting false or misleading information, Chairman Pitt conjured up the idea of executive certifications, which was remarkably successful, effective — and quite ingenious.
Then Chairman Pitt dictated that top corporate officials, chief executive officers and chief financial officers, must declare personally — literally, to take an oath — that their most recent financial statements are accurate. The new rule applied to companies’ future reports as well.
By making a “certification,” these officers are swearing that they know, for certain, that the financial reports are true. If the reports are not, the executives must explain why the results are not accurate. This eventually led some companies to restate their results to comply with the certification requirement.
Just like former SEC Chairman Pitt’s certification requirement sought to ensure accurate financial reporting and responsible executive conduct regarding financial results, current SEC Chairman Jay Clayton’s 2018 SEC Guidance seeks to ensure accurate cybersecurity reporting and responsible executive conduct regarding data security incidents.
These required certifications by a company’s principal executive officer and principal financial officer as to the design and effectiveness of cyber-related disclosure controls and procedures can be somewhat challenging. Company executives making these certifications have to consider whether a company’s disclosure controls and procedures for cybersecurity are, in particular, capable of fully assessing and escalating such cyber risks and incidents. Along these lines, the SEC will look to see if Google executives have developed and implemented some methodology to “drill down” into Google’s technical conclusions, perhaps even independently validating IT conclusions and representations when necessary.
The expanded certification rule seeks to drive executive-level ownership and accountability with respect to the reporting of cybersecurity incidents and the broader area of data security. Indeed, the 2018 SEC Guidance states:
“These certifications and disclosures should take into account the adequacy of controls and procedures for identifying cybersecurity risks and incidents and for assessing and analyzing their impact.”
The SEC certainly understands the centrality of the CEO’s role and now the CEO must affirmatively certify to the adequacy of the organization’s cybersecurity controls. SEC enforcement staff will attempt to determine if Google’s CEO and senior executives have accepted – and embraced – both the spirit and the language of this new SEC certification requirement.
Question #4: What Were Google’s Formal Policies, Practices and Procedures Relating to Disclosure of Data Security Incidents?
SEC enforcement staff will want to review all of Google’s formal policies, practices and procedures relating to the disclosure of cybersecurity incidents.
The 2018 SEC Guidance encourages companies to implement policies, practices and procedures mandating that important cyber risk and incident information escalate “up the chain,” from IT teams to senior management, allowing for informed, intelligent and knowledgeable decisions.
This particular communications edict must have hit close to home for SEC Chairman Clayton, who, when testifying before Congress about a data breach at the SEC, was clearly miffed that the SEC staff had not shared certain critical information with the various SEC Commissioners, including the Chairman. At that time, then-SEC Commissioner Michael S. Piwowar even went so far as to issue a formal statement about the lack of communication to him about the SEC data breach, stating:
“I commend Chairman Clayton for initiating an assessment of the SEC’s internal cybersecurity risk profile and approach to cybersecurity from a regulatory perspective. In connection with that review, I was recently informed for the first time that an intrusion occurred in 2016 in the SEC’s Electronic Data Gathering, Analysis, and Retrieval (“EDGAR”) system. I fully support Chairman Clayton and Commission staff in their efforts to conduct a comprehensive investigation to understand the full scope of the intrusion and how to better manage cybersecurity risks related to the SEC’s operations.”
Question #5: What was the Nature of any Google Disclosure Related to the Data Security Incident or the Risks of Data Security Incidents?
Much like the 2011 SEC CF Guidance, the 2018 SEC Guidance can be somewhat maddening with respect to the actual content of a company’s disclosure regarding a data security incident.
For example, as to the particularity of any data security incident’s disclosure, the SEC seems to want to have its cake and eat it too. On the one hand, the 2018 SEC Guidance appears to allow for a lack of specifics so as not to compromise a company’s security, stating:
“This guidance is not intended to suggest that a company should make detailed disclosures that could compromise its cybersecurity efforts – for example, by providing a “roadmap” for those who seek to penetrate a company’s security protections. We do not expect companies to publicly disclose specific, technical information about their cybersecurity systems, the related networks and devices, or potential system vulnerabilities in such detail as would make such systems, networks, and devices more susceptible to a cybersecurity incident.”
On the other hand, the 2018 SEC Guidance cautions companies not to use any sort of generic “boilerplate” type of language in its disclosures, stating somewhat opaquely:
“We expect companies to provide disclosure that is tailored to their particular cybersecurity risks and incidents. As the Commission has previously stated, we ‘emphasize a company-by-company approach [to disclosure] that allows relevant and material information to be disseminated to investors without boilerplate language or static requirements while preserving completeness and comparability of information across companies.’ Companies should avoid generic cybersecurity-related disclosure and provide specific information that is useful to investors.”
Along these lines, any conclusions about the adequacy (or inadequacy) of the actual substance of any Google disclosure concerning the Google+ API defect will be the subject of debate. SEC staff will review texts, emails, reports and other relevant documents pertaining to the Google+ vulnerability discovery and remediation, and then follow up by seeking testimonial evidence from Google employees and outside experts to better understand the particulars of the “bug.”
With respect to Google’s risk disclosures, the SEC enforcement staff will likely consider Google’s vague reference to risk in its October 8th announcement, which stated:
“The review did highlight the significant challenges in creating and maintaining a successful Google+ that meets consumers’ expectations. Given these challenges and the very low usage of the consumer version of Google+, we decided to sunset the consumer version of Google+.”
The SEC staff will want to know if Google incorporated into its SEC filings the risk associated with the “challenges,” and will want to read the details of the Google “review,” cited above in Google’s October 8th announcement.
Question #6: How Long Did It Take for Google to Remediate After Discovery of the Data Security Vulnerability?
Good news on this front for Google. Google should take some comfort that the 2018 SEC Guidance recognizes that data security incident investigations are complicated and cannot be completed overnight.
The SEC recognizes that the investigation of data security incidents can take time, and that some companies may not want to make any disclosures about an incident when they do not have some comfortable handle on the facts of the situation. The 2018 SEC Guidance states:
“Understanding that some material facts may be not available at the time of the initial disclosure, we recognize that a company may require time to discern the implications of a cybersecurity incident. We also recognize that it may be necessary to cooperate with law enforcement and that ongoing investigation of a cybersecurity incident may affect the scope of disclosure regarding the incident.”
But the SEC also qualifies its recognition of the complexity of data security incidents and warns companies that the need for a lengthy investigation into a data security incident is not necessarily an automatic excuse for delaying the disclosure of a data security incident, stating:
“However, an ongoing internal or external investigation – which often can be lengthy – would not on its own provide a basis for avoiding disclosures of a material cybersecurity incident.”
Of course, when a data security incident happens, the public’s demand for immediate answers is understandable. Life savings are at risk, while the perpetrators of hacking schemes are rarely identified, let alone captured and prosecuted. However, in the aftermath of most data security incidents, there exists no CSI-like evidence which would allow for speedy evidentiary findings and rapid remediation.
While some data security incidents may provide key evidence early on, most never do, or even worse, provide a series of false positives and other stumbling blocks. The evidence among the artifacts, remnants and fragments of a data security incident is rarely in plain view; it can rest among disparate logs (if they even exist), volatile memory captures, server images, system registry entries, spoofed IP addresses, snarled network traffic, haphazard and uncorrelated timestamps, Internet addresses, computer tags, malicious file names, user account names, network protocols and a range of other suspicious artifacts.
Moreover, evidence can become difficult to nail down — logs are destroyed or overwritten in the course of business; archives become corrupted; hardware is repurposed; and the list goes on.
For instance, in Google’s case, according to the Wall Street Journal, certain key logs were simply never retained, which created obstacles for its internal investigators:
“Because the company kept a limited set of activity logs, it was unable to determine which users were affected and what types of data may potentially have been improperly collected, the two people briefed on the matter said. The bug existed since 2015, and it is unclear whether a larger number of users may have been affected over that time.”
In short, the evidence analyzed during a data security incident can be a massive, jumbled and chaotic morass of terabytes of data. That is why the investigation of a data security incident can take weeks, perhaps months, before any concrete conclusions begin to take shape. Rushing to judgment (and disclosure) might not only create further confusion and expense, but it can also undermine the objectivity, truth and confidence that the public (especially shareholders) deserves.
Question #7: Did Google Undertake a Timely and Comprehensive Disclosure of its Data Security Incident?
The Wall Street Journal reported one blockbuster fact that will be a lightning rod for SEC enforcement attention:
“[Google] opted not to disclose the issue this past spring, in part because of fears that doing so would draw regulatory scrutiny and cause reputational damage.”
The SEC will likely begin its investigation by reviewing Google’s disclosures since March of 2018, when a privacy task force formed inside Google, code-named Project Strobe, apparently discovered the API problem during a company-wide audit of the company’s APIs. Google’s announcement states that:
“We discovered and immediately patched this bug in March 2018. We believe it occurred after launch as a result of the API’s interaction with a subsequent Google+ code change.”
The 2018 SEC Guidance clearly emphasizes the need for timely disclosure, probably taking a lesson from the Equifax data breach and, ironically from the SEC’s own data breach experience, which SEC Chairman Jay Clayton admitted should have been disclosed earlier.
Equifax, one of three elite repositories of personal credit information, and a trusted source for personal security and identity theft defense products, disclosed a cyber-attack that could potentially affect 148 million consumers — nearly half of the U.S. population. The accessed Equifax data reportedly included sensitive information such as social security numbers, birthdays, addresses, and in some instances, driver’s license numbers — a virtual treasure trove for identity thieves.
Not long after the Equifax data breach, SEC Chairman Jay Clayton also announced a data breach of the SEC’s EDGAR system, a vast database that contains information about company earnings, share dealings by top executives and corporate activity such as mergers and acquisitions. Accessing that information before it’s disclosed publicly could allow hackers to profit by trading ahead of the information’s release.
With respect to the Equifax data breach, now “retired” Equifax CEO Richard Smith told a breakfast meeting in mid-August 2017 that data fraud is a “huge opportunity,” allowing Equifax to sell consumers more offerings. Smith touted the company’s credit-monitoring offerings, according to a video recording of the meeting at the University of Georgia’s Terry College of Business, and declared that protecting consumer data was “a huge priority” for the company.
But what the Equifax CEO failed to mention was that less than three weeks earlier, Equifax had apparently discovered a potentially massive data security incident and had called in the expert incident response firm Mandiant to investigate. Yet it was not until a few weeks later, on September 7, that Equifax disclosed the massive data breach to the public.
With respect to the SEC data breach, the SEC itself may have opted for a similar path of delayed notification. Reports and SEC Chairman Clayton’s testimony before the Senate Banking Committee indicate that the SEC data breach was discovered in 2016, and the possible illegal trades were detected in August of 2017, but the SEC did not disclose any information about the incident until September 20th, 2017.
Senior executives at both the SEC and Equifax have angered their constituents with their arguably sluggish disclosure. Both entities probably focused too much upon what they were legally and contractually obligated to disclose, rather than taking a more holistic approach to the question.
Per the 2018 SEC Guidance, if Google learned of a cybersecurity incident or cyber-risk that was material to its investors, then Google was expected to make appropriate disclosures. The 2018 SEC Guidance even goes so far as to remind public companies to consider obligations under the stock listing requirements, such as Section 202.05 of the NYSE Listed Company Manual and NASDAQ Listing Rule 5250(b)(1). Additionally, when Google experienced a data security incident of any type, the 2018 SEC Guidance emphasizes the possible need to “refresh” previous disclosures during the process of investigating a cybersecurity incident or past events.
Regarding the disclosure of data security incidents and overall cybersecurity risks, the 2018 SEC Guidance, just like the 2011 SEC CF Guidance, explains that disclosure of data security incidents may be required in the sections of public filings addressing Risk Factors, MD&A, Description of Business, Legal Proceedings and Financial Statement Disclosures.
No doubt, SEC enforcement staff will be poring over these various sections of disclosure, looking for any possibly misleading information or material omission.
Question #8: Was There any Trading by Any Google Personnel Who Knew of the Data Security Incident?
While empirical data may suggest otherwise, some data security incidents can impact a company’s stock price – as demonstrated by the stock price drops that immediately followed disclosure of the Equifax and Target breaches. In such situations, executives who learn of a material, nonpublic data security incident could violate insider trading laws if they engage in any trading of the company’s stock.
Along these lines, the 2018 SEC Guidance warned corporate insiders not to sell shares of a company while holding confidential knowledge about cyberattacks and breaches that could affect the stock price. This is an area not covered by the 2011 SEC CF Guidance, but one that made sense to include in the 2018 SEC Guidance.
Equifax once again probably triggered the SEC’s concerns and prompted inclusion of this principle in the 2018 SEC Guidance. The Equifax data breach also involved a stock sell-off by some of its executives before the company disclosed the cyber-attack, spurring an SEC insider trading investigation that resulted in at least one SEC enforcement action against an Equifax manager for unlawful insider trading. Intel CEO Brian Krzanich faced a similar backlash for selling a large block of shares after learning of the Meltdown and Spectre computer chip vulnerabilities, but before disclosing them to the public.
The SEC obviously expects Google to have given thoughtful and well-documented consideration to data security incidents in the context of possible trading on material, nonpublic information – and to have carefully drafted, robust and precise policies, practices and procedures in place to demonstrate a rigorous culture of compliance.
SEC enforcement staff will likely explore whether the 2018 SEC Guidance prompted Google to review, with data security incidents in mind, its trade restriction policies, permissible trading windows, insider trading training curricula, codes of ethics, trade authorization procedures, trading training manuals and the like.
Question #9: Was Google Mindful of Regulation FD When Briefing Outsiders About its Data Security Incident?
Regulation FD (for “Fair Disclosure”), promulgated by the SEC under the Securities Exchange Act of 1934, as amended, prohibits companies from selectively disclosing material nonpublic information to analysts, institutional investors, and others without concurrently making widespread public disclosure.
Regulation FD reflects the view that all investors should have equal access to a company’s material disclosures at the same time. Since its adoption in 2000, Regulation FD has fundamentally reshaped the ways in which public companies conduct their conference calls, group investor meetings, and so‐called “one‐on‐one” meetings with analysts and investors.
The SEC adopted Regulation FD to address the selective disclosure by issuers of material nonpublic information. In its adopting release, the SEC expressed concerns about reported instances of public companies disclosing important nonpublic information, such as advance warnings of earnings results, to securities analysts or selected institutional investors or both, before making full disclosure of the same information to the general public. Those privy to the information beforehand were able to profit or avoid a loss at the expense of everyone else.
The 2018 SEC Guidance emphasizes that companies subject to Regulation FD (like Google) should have policies and procedures to promote compliance with Regulation FD regarding cybersecurity risks and incidents.
In particular, these policies and procedures should work to ensure that Google did not make any selective disclosures about cybersecurity risks and incidents to Regulation FD-enumerated persons without the required broadly disseminated public disclosure. This can create unanticipated problems for any public company experiencing any form of data security incident, because Regulation FD can throw a wrench into an already challenging disclosure process.
For example, in the aftermath of a data security incident of any kind, in addition to any consumer notifications, a broad range of other important notifications may immediately arise, such as briefings to customers, partners, employees, vendors, affiliates, insurance carriers, and a range of other interested/impacted parties.
Given the broad swath of interested parties, SEC enforcement staff will be looking to make sure Google maintained careful and methodical communications practices to ensure that its disclosures were consistent, and not selective.
Question #10: Do Google’s Disclosures, or Lack Thereof, Amount to Criminal Behavior?
Perhaps the most important takeaway from the 2018 SEC Guidance is a notion not specifically stated in the four corners of the document, but rather found in an SEC enforcement action (and parallel DOJ criminal prosecution) filed on the very same day of the 2018 SEC Guidance’s release.
In the SEC enforcement action, captioned SEC v. Jon E. Montroll and Bitfunder, the SEC charged a former bitcoin-denominated platform and its operator with operating an unregistered securities exchange and defrauding users of that exchange. The SEC also charged the operator with making false and misleading statements in connection with an unregistered offering of securities.
Among other accusations, the SEC alleges that BitFunder and its founder Jon E. Montroll operated BitFunder as an unregistered online securities exchange and defrauded exchange users by misappropriating their bitcoins and failing to disclose a cyberattack on BitFunder’s system that resulted in the theft of more than 6,000 bitcoins.
The SEC actually alleges fraud because of the lack of disclosure of the data security incident to customers and account holders, effectively bypassing the issue of whether there was any statutory or regulatory disclosure obligation at all. In other words, by keeping the data security incident a secret, the exchange (which was unlawfully unregistered) committed a fraud upon its customers. The SEC Complaint states:
“Montroll failed to disclose the theft [which occurred by means of a cyber-attack] and the deficit to Ukyo Notes investors and potential investors. By failing to disclose these facts, Montroll misled investors and potential investors – who were led to believe they would profit, at least in part, from BitFunder’s operations – to reasonably believe that BitFunder was a secure and profitable business.”
Concealing a data security incident can not only prompt SEC enforcement actions but can also lead to arrest and criminal prosecution. In a parallel criminal case, the U.S. Attorney’s Office for the Southern District of New York filed a complaint against Montroll for perjury and obstruction of justice during the SEC’s investigation. In other words, whether a public company or a private company, and whether a regulated entity or an unregulated one, keeping a data security incident secret can be the kind of act that triggers an indictment. The SDNY’s press release about its parallel case states:
“As alleged, Montroll committed a serious crime when he lied to the SEC during sworn testimony. In an attempt to cover up the results of a hack that exploited weaknesses in the programming code of his company, he allegedly went to great lengths to prove the balance of bitcoins available to BitFunder users in the WeExchange Wallet was sufficient to cover the money owed to investors. It’s said that honesty is always the best policy – this is yet another case in which this virtue holds true.”
Nothing in any of the few public reports of the Google+ incident indicates any clear-cut nefarious form of fraud or chicanery. However, the Wall Street Journal reports that Google failed to disclose the Google+ API defect for fear of regulatory and other ramifications, stating:
“The [internal Google] document shows Google officials felt that disclosure could have serious ramifications. Revealing the incident would likely result “in us coming into the spotlight alongside or even instead of Facebook despite having stayed under the radar throughout the Cambridge Analytica scandal,” the memo said. It “almost guarantees Sundar [Google’s CEO] will testify before Congress.”
Google executives should realize that SEC enforcement staff will be looking for any hint of deception in their handling of the Google+ API defect — and that SEC enforcement staff will gladly pass along any evidence of fraud to the Federal Bureau of Investigation and the U.S. Department of Justice.
Beyond disclosure to Google’s shareholders, a data security incident can trigger a litany of legal notification/disclosure requirements, including notice to state regulators; federal regulators; GDPR Supervisory Authorities; vendors; partners; insurance carriers; customers; consumers; employees; and any other constituency that may have a vested interest in a victim-company.
Hence, it is not surprising that disclosure of cybersecurity incidents and cybersecurity risks has evolved into one of the most important program areas of SEC enforcement, mentioned in so many SEC speeches and panel discussions. Yet the SEC enforcement division has only filed one SEC enforcement action, against Altaba, formerly known as Yahoo!, alleging any sort of data security incident disclosure failure. This is probably because cybersecurity disclosures are usually made in good faith and lack the kind of obvious misconduct and fraud that the SEC typically prosecutes.
But that should provide Google with little solace. The SEC enforcement division is always looking to prosecute the “big fish” to reinforce a regulatory priority or decree — and Google could fit that bill, especially if the Wall Street Journal report about Google’s efforts at concealment turns out to be even slightly treacherous or outrageous.
As an aside, the Wall Street Journal report implies the existence of an active whistleblower at Google who provided inculpatory memoranda and other documents. The SEC loves whistleblowers; cultivates whistleblowers; seeks out whistleblowers — and financially rewards whistleblowers for their efforts, especially when a whistleblower is a company insider. Whoever was speaking to the Wall Street Journal regarding the Google+ API issue will probably be cooperating with the SEC soon enough.
For Google, perhaps the Google+ API defect was only a minor data mishap and an aberration, and its internal investigative team acted promptly, carefully, swiftly and in the best interest of Google shareholders to remediate the vulnerability. Only time will tell.
In the meantime, my advice is for Google to prepare itself for a vigorous and relentless SEC enforcement division investigation – and have its responses ready not just to the ten questions cited above, but also to the many other questions that the SEC will most certainly pile on.
And if it has not done so by now, Google should consider engaging an independent law firm and digital forensics firm to confirm the findings of Google’s Privacy & Data Protection Office and recommend future remedial actions. To me, injecting independence, transparency and sunlight into Google’s process not only seems like a no-brainer, but would also contribute to a far more thoughtful, conscientious and meritorious defense.
*John Reed Stark is president of John Reed Stark Consulting LLC, a data breach response and digital compliance firm. Formerly, Mr. Stark served for almost 20 years in the Enforcement Division of the U.S. Securities and Exchange Commission, the last 11 of which as Chief of its Office of Internet Enforcement. He has taught most recently as Senior Lecturing Fellow at Duke University Law School Winter Sessions and will likely be teaching a cyber-law course at Duke Law in the Spring of 2019. Mr. Stark also worked for 15 years as an Adjunct Professor of Law at the Georgetown University Law Center, where he taught several courses on the juxtaposition of law, technology and crime, and for five years as managing director of global data breach response firm, Stroz Friedberg, including three years heading its Washington, D.C. office. Mr. Stark is the author of, “The Cybersecurity Due Diligence Handbook.”