Malware and Disease: Lessons from Cyber Intelligence for Public Health Surveillance

Abstract

Malicious software and infectious diseases are similar in several respects, as are the functional requirements for surveillance and intelligence to defend against these threats. Given these similarities, this article compares and contrasts the actors, relationships, and norms at work in cyber intelligence and disease surveillance. Historical analysis reveals that civilian cyber defense is more decentralized, private, and voluntary than public health in the United States. Most of these differences are due to political choices rather than technical necessities. In particular, political resistance to government institutions has shaped cyber intelligence over the past 30 years, which is a troubling sign for attempts to improve disease surveillance through local, state, and federal health departments. Information sharing about malware is also limited, despite information technology being integral to cyberspace. Such limits suggest that automation through electronic health records will not automatically improve public health surveillance. Still, certain aspects of information sharing and analysis for cyber defense are worth emulating or, at the very least, learning from to help detect and manage health threats.

Discourse about cybersecurity is riddled with biological analogies and metaphors. We describe malicious software in terms of "viruses" and "worms," in part because some of their similarities are significant for security. Most important, contagion of either kind can threaten the community at large. Infectious diseases can spread through travel and trade, as illustrated by the Ebola outbreak in West Africa, Middle East respiratory syndrome, and the H1N1 pandemic, among countless other examples. Like the human ecosystem, the internet is another kind of global network, and malware ranging from Conficker and Zeus to Stuxnet has spread through cyberspace in recent years.

To the extent that malware and infectious diseases are analogous threats, they may warrant similar solutions. In particular, several policy analysts claim that "cyber public health" and "cyber hygiene" would be improved by creating a "cyber CDC," mirroring the US Centers for Disease Control and Prevention (CDC), or even a "cyber WHO" modeled on the World Health Organization. 1-8 "Both public health and cybersecurity aim to achieve a positive state (health or security) in a loosely affiliated but highly interdependent network." 9(p77) While the details differ, some of the functional requirements for achieving cyber and health security are similar as well. Among others, these functions include collecting data about threats to the population (eg, malware or disease), analyzing this information, and reporting to inform action: in a word, surveillance.

What insight does cybersecurity offer for improving public health surveillance, given their family resemblance? Can public health learn from its younger, digital cousin? To answer these questions, this article compares disease surveillance with what is now called cyber intelligence, specifically as it relates to malicious software. The research methodology is described first. Important similarities between malware and infectious disease are identified next, followed by the functional similarities between disease surveillance and cyber intelligence.
The organizational histories of these fields are then compared and contrasted at the local, national, and international levels. Despite roughly analogous threats and functional requirements, there are striking differences in how these fields are organized. Cyber intelligence is more decentralized, private, and voluntary than disease surveillance in the United States. These differences inform at least 3 lessons for public health policy: (1) the prevailing politics of the time have changed to the detriment of health agencies inside government; (2) electronic health records will not automatically improve disease surveillance; and (3) the CDC could adopt aspects of information sharing and analysis from cyber defense to help improve its image and data exchange. The logic of each lesson is examined in the final discussion.

This article provides a qualitative, historical analysis of cyber intelligence and disease surveillance. It compares and contrasts the actors, relationships, and norms at work in each field. The empirical evidence draws on more than a dozen semistructured interviews with practitioners in public health and cyber defense (30 to 120 minutes in length, tape-recorded or written notes, most not for attribution), as well as archival research and existing literature (grey and peer-reviewed). While the primary focus is on the United States, international cooperation is also addressed.

Cybersecurity would be important for public health even if malware and disease were unrelated. The confidentiality, integrity, and availability of health information are vital to this profession and threatened by security challenges ranging from medical data breaches to ransomware targeting hospitals. Of course, it is important to note that malicious software and infectious diseases are not the same. Malware is computer code processed by digital electronics, whereas diseases and their hosts are biological organisms. Malware is not lethal, while infectious diseases kill millions of people each year. Usually, malware is created with malicious intent (as its name implies), but the vast majority of infectious diseases are naturally occurring and lack hostile motives (with the rare exception of biological weapons). 10

Nevertheless, these threats are similar in several ways that are significant for security. First and foremost, both malware and infectious disease can be contagious and widely spread. Transmissibility varies: Not all cyber attacks are broadcast or intended to spread beyond a specific computer, and not all infectious diseases are contagious. Yet, the potential for spread means that the threat to any individual or computer depends in part on the health of the overall population or network. This interdependence also means that securing either kind of system through intelligence or surveillance depends in part on local, national, and international cooperation. Second, both malware and disease have nonkinetic effects that can be difficult to detect. 11 This difficulty complicates surveillance and source attribution because symptoms can be delayed, hidden, or nonspecific, leaving victims and vectors unaware. Detection, diagnosis, and treatment can therefore require specialized knowledge and tools: Just as medical researchers have to study a virus before they can create a vaccine, antivirus makers must capture a computer virus, take it apart, and identify its "signature," the unique signs in its code, before they can write a program that removes it. 12
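
To make the signature idea concrete, the minimal sketch below (illustrative only, not drawn from the article or from any real antivirus product) shows one of the simplest forms of signature matching: reducing a suspect file to a cryptographic hash and comparing it against a set of hashes of known-malicious files, such as those exchanged between vendors and CERTs. Real antivirus engines also use byte patterns, heuristics, and behavioral analysis, and the hash value shown is a placeholder rather than a real indicator of compromise.

```python
# Illustrative sketch of hash-based "signature" matching (Python).
# Assumptions: KNOWN_BAD_HASHES stands in for a shared threat feed;
# the value below is a placeholder, not a real malware hash.
import hashlib
from pathlib import Path

KNOWN_BAD_HASHES = {
    "0" * 64,  # placeholder SHA-256 digest of a hypothetical malicious file
}

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan(directory: Path) -> list[Path]:
    """Return files under `directory` whose hashes match a known-bad signature."""
    return [
        p for p in directory.rglob("*")
        if p.is_file() and sha256_of(p) in KNOWN_BAD_HASHES
    ]

if __name__ == "__main__":
    for hit in scan(Path(".")):
        print(f"Possible known malware: {hit}")
```

Sharing such digests is one of the simplest forms of the malware information exchange discussed in the paragraphs that follow.
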
Here again, cooperation is helpful for developing these countermeasures. For example, antivirus companies now share malware samples to improve detection, and the vaccine industry has long benefited from the virus samples shared through the Global Influenza Surveillance and Response System (GISRS). 13, 14 Both the advantages of cooperation and the difficulties of detection are compounded by the variety of malware and disease. New pathogens are always emerging (eg, through antigenic drift and shift in influenza). New malware is also common. According to Symantec, "there were more than 317 million new pieces of malware created" in 2014, "meaning nearly one million new threats were released into the wild each day." 15(p7)

Finally, some of the specialized knowledge and tools used to detect and treat malware and disease are dual use. This means that they can be used for both benefit and harm, or, alternatively, for both military and civilian applications. Cyber intelligence and disease surveillance are both conducted by military and civilian agencies because malware and disease are both interpreted as threats to national security. 16 Furthermore, knowledge of how to detect malware can be used either to help remove it or to hide the code and cause harm. Disease surveillance is rarely thought of in terms of potential harm, since the information it provides is intended to help reduce morbidity and mortality. But this information can also be used to stigmatize and discriminate against individuals and groups, as well as restrict travel and trade. Fear of misuse has fueled resistance to disease surveillance, including popular opposition to name-based reporting of HIV and government opposition to reporting outbreaks such as SARS and H5N1. 17-19 One consequence of these dual-use dilemmas is that cooperation is not inevitable, even when it would be advantageous for cyber or health security.

Just as malware and infectious disease are roughly analogous threats, so too are certain aspects of acquiring, interpreting, and communicating information about them. According to the CDC, "Public health surveillance is the ongoing, systematic collection, analysis, interpretation, and dissemination of data regarding a health-related event for use in public health action." 20 Similar terminology is used to define "cyber intelligence," namely, "the acquisition and analysis of information to identify, track, and predict cyber capabilities, intentions, and activities that offer courses of action to enhance decision making." 21(p2) Malware is one kind of cyber capability subject to intelligence, just as infectious disease is one kind of health-related event (along with noncommunicable diseases, injuries, medical services and devices, and the like) subject to surveillance, biosurveillance, or "health intelligence." 22 And both fields differ from espionage. Following revelations by Edward Snowden, associating the word "cyber" with "intelligence" or "surveillance" may bring to mind warrantless wiretapping, bulk collection of metadata, and other controversial programs conducted by the National Security Agency (NSA). But clandestinely gathering intelligence for counterterrorism differs from collecting, analyzing, and sharing information with consent to improve internet security, just as disease surveillance differs from spying. As shown in Table 1, similar kinds of information can be collected, analyzed, and shared through cyber intelligence and disease surveillance.
First, information about malware includes the source code (human readable) and binary files (machine readable). These files can be summarized as signatures or hash values to help automate detection; they can also be analyzed to identify other indicators of compromise and clues about their author or origin. Second, data on susceptible, infected, and exposed hosts can provide information about vectors and vulnerable computers in the population. Third, cyber intelligence, like disease surveillance, relates to action. Information about defensive action includes the actors involved, as well as how they detect, contain, and remediate threats using patched or reconfigured software, firewalls, blacklists, access controls, and other countermeasures.

To be clear, computer forensics is not the same as shoe leather epidemiology. At an abstract level, however, cyber intelligence is sufficiently similar to disease surveillance that comparing their respective histories and policies can provide useful insight into each field. Since cyberspace is a relatively new domain, the organizations that have been created to defend it are more sensitive to recent trends in politics and technology than many older institutions in public health. Therefore, cyber intelligence may serve as a sentinel case for the forces that stand to reshape disease surveillance in the years ahead.

Comparative Analysis: Local, National, and International Organizations

The basic structure of disease surveillance at the local, national, and international levels is familiar to many public health professionals. But neither the organizations involved nor their future should be taken for granted. These actors, relationships, and norms look different from those in cyber intelligence, despite similar threats and functional requirements. In addition, politics have had a profound impact on both fields, and the prevailing politics have changed over time.

American public health consisted mostly of local, uncoordinated, and episodic sanitation and quarantine measures until the bacteriological revolution. 23-25 These local roots are still evident in the design of many health departments across the United States. The modern practice of public health surveillance also reflects important political compromises, quite distinct from the germ theory of disease. Local laws about infectious disease date back to the colonial era, but the proliferation of modern regulations for surveillance coincided with the new science of bacteriology in the 1880s and 1890s. For instance, one of the first bacteriological laboratories in the United States was established during this period. 27(p3) Bacteriology "brought professionalism to disease control," and laboratory diagnosis provided these professionals with newfound "status and authority." 17(p3) However, the political power afforded by this status and authority was insufficient to overcome countervailing interests. "Public health ... sought to expand the role of government as a guardian against disease," 17(p256) but many physicians in private practice opposed surveillance as a "demand for unpaid labor" and a threat to their clinical authority. 17(p13) To win or defuse conflicts of interest over reporting tuberculosis and venereal disease at the turn of the century, public health officials, including outspoken proponents of surveillance such as Hermann Biggs, conceded that they would not encroach on the health care provided through private practice in exchange for notification. This concession was a political compromise.
While it was expedient, limiting the actions that might otherwise have been informed by disease surveillance is hard to justify in terms of maximizing technical efficiency. Nevertheless, this compromise was one of many, and it endures to this day as a practical constraint on the relationship between disease surveillance and local intervention.

Germ theory revolutionized biology, but cyberspace is manmade and thus somewhat less mysterious (eg, malware was never blamed on miasma or treated with leeches, although scam and snake oil solutions are not uncommon). John von Neumann theorized about self-replicating automata as early as 1949, and Fred Cohen coined the term "computer virus" in 1984, years before malware became a common threat. 28, 29 Instead of a new scientific theory, organizations for civilian cyber defense emerged in response to communication and coordination problems highlighted by the Morris worm. In November 1988, the Morris worm exploited flaws in the Unix operating system to spread quickly, infecting and slowing about 10% of the 60,000 to 80,000 computers that constituted the entire internet at that time. A few weeks later, another hack compromised the US military's network. Postmortem reviews of these incidents concluded that response was hampered by limited communication and coordination. 30 To address these problems, the Defense Advanced Research Projects Agency (DARPA) funded the Software Engineering Institute at Carnegie Mellon University to establish the Computer Emergency Response Team Coordination Center (CERT/CC).

Naming the CERT/CC a "coordination center" did not mean that the collection, analysis, or dissemination of data about malware and software vulnerabilities was centrally coordinated. The CERT system was a decentralized approach to cyber intelligence. From the beginning, "other agencies and constituencies were encouraged to create and sustain their own teams." 31(p19) These teams typically serve narrow and discrete constituencies (eg, a single company), unlike the local, municipal, and state health departments that serve the entire population in a given location. The result is a loosely coordinated "mosaic of hundreds of independently operating CERTs across the world." 32(p92) The CERT/CC is not a government agency, in contrast to the health departments that conduct most disease surveillance. Nor can it make or enforce rules. "From the very beginning," according to one founder of the CERT/CC (interview October 2015), "DARPA stressed ... that we had no authority." Moreover, little capacity for cybersecurity was developed inside local, municipal, or state governments. Even now, fewer than half of local law enforcement agencies have a cybercrime unit. 33 Therefore, unlike other threats to public health and safety, the only recourse that most malware victims have is through a commercial service provider, such as an antivirus vendor like Symantec or McAfee. In sum, the United States chose an approach to cyber defense at the local level that is more decentralized, private, and voluntary than public health.

Disease surveillance in the United States is a patchwork of state systems, developed through a long political process rather than an inevitable response to contagion. Congress created the US Marine Hospital Service in 1798, but control was not centralized until John Woodworth was appointed the first Surgeon General in 1871.
Woodworth "adopted a military model for his medical staff" that was later institutionalized as the commissioned corps of the Public Health Service (PHS). 34 He also aimed "to expand the service into a national quarantine agency" and "gather information about the spread of contagious diseases throughout the world." 23(p162) Despite political resistance, the Marine Hospital Service eventually gained control over quarantine through bureaucratic empire-building within the federal government. Here, as elsewhere, federal authority can be seen as constrained by the Tenth Amendment to the Constitution or empowered by the General Welfare Clause and the Commerce Clause. 35 Either way, deference to states' rights remained the norm for disease surveillance. The PHS appointed epidemiologists to serve in state health departments in 1914, but the federal government did not establish an agency dedicated to disease surveillance until 1946. 27

Even then, when the CDC was built on a wartime campaign against malaria, its bureaucratic survival was not assured. "The CDC sometimes encountered opposition from better-established wings of the Public Health Service that resented the intrusion of a newcomer," and "the National Institutes of Health [NIH] ... also resented the upstart institution in Atlanta," 36(p28) so "the possibility that CDC would be dismantled and merged with NIH was very real." 36(p49) To survive, the CDC sought to distinguish its fieldwork in service to the states from research at NIH. Similarly, Alexander Langmuir cited the threat of biological weapons during the Korean War to justify CDC's Epidemic Intelligence Service, which was also used to support state and local health officials. 11 Service to the states therefore became gospel at CDC, helping it become an authoritative federal agency while still deferring to states' rights. Prompted by Langmuir at CDC, the Council of State and Territorial Epidemiologists (CSTE) created an annual list of notifiable diseases in 1951. Reporting through the National Notifiable Diseases Surveillance System is voluntary and somewhat variable; however, notification is compulsory at the state level, and, "in practice, states do notify the CDC." 35(p1038) Disease surveillance is conducted by other federal agencies as well (eg, the Department of Agriculture, the Food and Drug Administration, and the Department of Defense), but CDC is the dominant authority in this field. It is also the biggest source of public health grant money, and this funding gives the CDC considerable power over the state health departments that it serves (interview with local public health official and former EIS officer, July 2015).

There is no national institution for cyber intelligence with the same authority and resources as the CDC. Instead, the CERT system spread alongside a global marketplace, now worth almost $100 billion, for cybersecurity products and services. These products and services include various forms of threat data and analysis, as well as hardware and software to limit the damage caused by cyber attacks (eg, patches, antivirus, firewalls, encryption, intrusion detection and prevention systems, etc). How effective is this organizational ecosystem? Paradoxically, while the internet has drastically decreased the cost of communication, limited information is shared about threats to information technology. 2,3,37-39 Reporting is often ad hoc and idiosyncratic, depending more on interpersonal trust than institutional affiliation.
Data collection and analysis are also balkanized. 40 Unlike epidemiology, where there is some consensus over CSTE case definitions, different vendors and CERTs use different nomenclature to name malware and count its frequency. Consequently, "comparing incident statistics across teams is difficult and often meaningless," which diminishes the value of what little information is shared. 31(p83)

Proposals for the federal government to establish authoritative standards and agencies have gained little traction. In 1989, the General Accounting Office (GAO) argued that internet security required a focal point. It proposed that the Office of Science and Technology Policy (OSTP) establish this focal point, but to no avail. 30 In 1991, the National Research Council recommended that the government help establish a nongovernmental organization to serve a similar function, again with no apparent impact. 41 CERT/CC even pitched a similar proposal in 1997, which, by implication, was a concession that the CERT/CC had failed to establish itself as the single clearinghouse for cyber intelligence. 42 In 1998, President Bill Clinton issued Presidential Decision Directive 63 (PDD-63), 43 calling for "A Public-Private Partnership" to reduce the vulnerability of critical infrastructure to cyber attack. PDD-63 encouraged the private sector to create an Information Sharing and Analysis Center (ISAC), modeled on the CDC: "As ultimately designed by private sector representatives, the ISAC may emulate particular aspects of such institutions as the Centers for Disease Control and Prevention that have proved highly effective ..." 43 However, unlike the CDC (ie, a federal agency with police powers, informed by compulsory reporting at the state level), this ISAC was to be private and voluntary. Moreover, "the single ISAC, the one PDD-63 calls for, was never built" (interview with former US National Security Council staff, September 2015). Rather than a single clearinghouse, building separate ISACs became a business opportunity. 44 At least 19 sector-specific ISACs were subsequently established for financial services, energy, healthcare, and other parts of critical infrastructure. While reasonable, this outcome was not inevitable, especially since variation inside a given sector (eg, small and medium enterprises versus multinational corporations) may be just as significant for security as variation across sectors (eg, finance versus energy or healthcare).

The Department of Homeland Security (DHS) was created in the aftermath of September 11, and it built capacity for cyber intelligence through the United States Computer Emergency Readiness Team (US-CERT) and the National Cybersecurity and Communications Integration Center (NCCIC). In the future, the NCCIC or US-CERT may come to resemble the CDC. Yet, the CDC started out with hundreds of employees. 45 US-CERT started out with a few dozen people and a small room, "with one table and one chair in one corner" (interview with CERT/CC staff, October 2015). 46 Today, the CDC provides more than $3 billion per year in grants for domestic public health, along with more than $400 million for global health. 47, 48 The Homeland Security Grant Program provides more than $1 billion to help states counter terrorism and other hazards, but most of this money is spent on physical rather than cyber security (interview with former MS-ISAC staff, November 2015).
In addition, US-CERT was intended to play an international role, and yet it focuses on domestic agencies, with less global engagement than initially imagined (interview with former US-CERT staff, September 2015). Analogous functions and threats aside, national cyber intelligence is structured differently from disease surveillance.

Less than a year after the Morris worm, the CERT/CC helped the US Department of Energy and the National Aeronautics and Space Administration respond to the WANK worm, an early attempt at antinuclear hacktivism that protested the plutonium-powered Galileo spacecraft in 1989. Initial signs pointed to France and then Australia, and this incident fueled interest in international information sharing. 49 CERT/CC therefore helped create the Forum of Incident Response and Security Teams (FIRST) in 1990. FIRST provides "access to best practices, tools, and trusted communication." 50 It started out as a small club (except for 1 French team, the other founding members were all American). 31 It has since grown into an international forum for more than 300 national and industry teams from around the world.

Given this international membership, FIRST can be compared with the World Health Organization. As with comparisons between CERTs, ISACs, and the CDC, however, FIRST is no WHO. Unlike the WHO, FIRST did not build on previous international organizations. FIRST was the first of its kind. In contrast, the WHO was preceded by several international organizations and false starts, including the Pan American Sanitary Bureau (PASB), the Office International d'Hygiène Publique, and the Health Organization of the League of Nations. The structure of WHO reflects the politics of this legacy. For example, PASB leveraged its existing status to maintain autonomy and resist full integration into the WHO in 1948, a political precedent that helped other regional offices lock in similar arrangements and authorities. 51, 52 Furthermore, FIRST did not build on international law. In contrast, WHO administers the International Health Regulations (IHR), the origins of which trace back to the 19th century. After decades or more of noncompliance with these legally binding regulations, the IHR were revised in 2005. The revised IHR require member states to report all public health events of international concern, and they authorize WHO to acknowledge unofficial information about these events. In theory, WHO can use this authority to "name and shame" noncompliance (even though it rarely does so, bowing instead to political pressure from member states, and noncompliance persists). 52, 53 However, "FIRST exercises no authority over the organization and operation of individual member teams." 54 It does not pretend to be an operational agency or wield international law. FIRST merely consists of a small board, a few committees, and an annual conference. It is dwarfed by WHO, which has a $4 billion budget and oversight of surveillance through programs such as GISRS and the Global Public Health Intelligence Network (GPHIN).

As small as FIRST appears in this comparison, it is still more prominent than other international organizations involved with cyber intelligence. For instance, the International Multilateral Partnership Against Cyber Threats (IMPACT) is supposedly "modelled after the Centers for Disease Control and Prevention ...
to enhance the global community's capabilities in dealing with cyber threats." 55 But IMPACT has little following or credibility in advanced industrialized countries (interview with former AusCERT staff, May 2015). Here again, this international ecosystem looks very different from disease surveillance.

The institutional differences between disease surveillance and cyber intelligence are striking. On balance, cyber defense is more private, voluntary, and decentralized than public health. Which field is more effective is beyond the scope of this study, and "there is no simple relationship between central control and effective surveillance." 22(p1071) Nevertheless, the chorus of criticism surrounding cyber defense suggests that public health may have the advantage. Either way, the history of cyber intelligence provides at least 3 important lessons for disease surveillance.

First, both fields reflect political choices at least as much as any technological imperative. For example, in public health, the power and interests of physicians in private practice constrain the potential for local intervention informed by disease surveillance. The CDC's service to (or federal deference toward) the states is a political choice as well, as is the patchwork of different state systems. "US public health law and policy are inescapably political," and they have been "from their very inception." 56(p46) The same is true for global health governance. Cyber intelligence is also shaped by politics, particularly the American ideology of "anti-statism." 57 Ideological opposition to government institutions and interventions was ascendant during the 1980s and 1990s, at the same time that cyberspace and network threats were coming online. Anti-statism was bipartisan: The debate was less over whether to cut government and more over how much to cut. Thus, it was politically infeasible to provide cyber intelligence as a public good or service through government. This ideology helps explain why civilian cyber defense is now more decentralized, private, and voluntary than public health.

Things could have been different. After all, the internet was born on public infrastructure with military funding. The United States could have built a focal point for cyber intelligence inside the government, like the CDC. For better or worse, it could have worked through an intergovernmental organization such as the International Telecommunication Union, which, like the WHO, is part of the United Nations (UN). Instead, the United States chose to rely on CERTs, ISACs, FIRST, and other private sources for cyber intelligence. The 1980s and 1990s were not the 1940s, when the United States had been willing to build government agencies at home (eg, CDC) and abroad (eg, the UN and the WHO). The ideological pendulum had swung towards anti-statism, and cyber intelligence reflects this shift in the marketplace of ideas.

Cyber intelligence therefore provides a cautionary tale for public health. As underfunded and uncoordinated as disease surveillance may be, it could be worse. If the CDC did not already exist, it would be difficult to build this kind of government agency in this day and age. Building the WHO would be even harder. Path dependency suggests that the CDC and the WHO will survive, but they should not be taken for granted. More important, if these agencies are going to help improve disease surveillance, then their policies and actions must be politically sophisticated, given the prevailing ideology in the United States.
Without this kind of leadership, progress is unlikely.

A second lesson to learn from cyber intelligence is that information technology does not guarantee information sharing. Electronic records are native to the internet, as are the means to share them (email, listservs, web portals, etc). But information sharing about malware and other cyber threats remains limited, which, in turn, limits how much "big data" is available and amenable to analysis. Sometimes, critical information is not even shared with potential victims. For example, computers infected by the Conficker worm are routinely identified by IP address without informing the users of these compromised systems or removing the malware. 13 This practice may break with sound medical ethics for diagnosing and treating disease (although it pales in comparison to the infamous Tuskegee experiments). Either way, the lack of information sharing is not due to any lack of information technology.

Therefore, as revolutionary as information technology and electronic health records may be for disease surveillance, they are no panacea. The technical challenges of moving from manual to automated data management are easy when compared to the political barriers to sharing information through interoperable systems. These barriers range from conflicts between public and private interests to privacy and liability concerns derived from the dual use of data, all of which have contributed to the balkanization of cyber intelligence. Public health officials should work to avoid similar fragmentation. To this end, it is important to recognize that the design and implementation of electronic records are not merely functions of technical efficiency but rather choices with distributional consequences (ie, there are winners and losers). One potential advantage for public health agencies is that disease surveillance is "hierarchical and relatively simple in structure." 58(p2) This structure places CDC and the Department of Health and Human Services (HHS) in a strong bargaining position to promote open standards for information sharing (including common nomenclature and data types). If harmonized, these standards can create network effects, namely positive externalities, that substantially increase the volume and value of information available. So, in addition to modernizing the National Electronic Disease Surveillance System (NEDSS), the CDC and HHS should aggressively push the development and adoption of data standards, as well as "meaningful use" criteria for electronic health records to include public health. 59

Finally, while cyber intelligence leaves much to be desired, some aspects of this field may be worth replicating. After all, the relatively hierarchical system whereby state health departments report information to the CDC is not above reproach. The CDC can be criticized for being a black box into which information flows "but does not come back." Ironically, this criticism echoes what some cybersecurity experts say about sharing information with intelligence agencies like the NSA. Given that the politics of American anti-statism resist increasing federal administration of public health surveillance, alternative approaches for improving collection, analysis, and reporting also warrant consideration. Because ISACs for cyber intelligence never actually emulated the CDC, one alternative approach would be for the CDC to sponsor an ISAC for public health surveillance. 60
The CDC Office of Public Health Scientific Services could seed this public-private partnership, which might look like the National Health Information Sharing and Analysis Center (NH-ISAC) for cybersecurity in the healthcare sector. Ideally, it would combine the lessons learned from NH-ISAC with experience from CDC's pilot fusion center, BioPHusion. 61 Likewise, private sector participation may help this ISAC avoid some of the problems suffered by the DHS National Biosurveillance Integration Center. 62 The goals would be consistent with the CDC's surveillance strategy. 63 The ISAC could help make surveillance "more adaptable to the rapidly changing technology landscape" by allowing tech vendors and users to engage in an operational environment on an ongoing basis. 63 It could help make surveillance "more versatile in meeting demands for expanding knowledge about evolving threats to health" 63 by providing a sanctioned forum for stakeholders to share and evaluate heterogeneous data alongside internet-based surveillance (GPHIN, HealthMap, ProMED, etc), official sources (eg, NEDSS, Epi-X), and similar systems. The ISAC could also help make surveillance "more able to meet the demands for timely and ... [specific] information," especially demands from outside the CDC, by broadening participation. 63 Of course, the devil is in the details, and an ISAC cannot replace surveillance by local, state, and federal authorities (a lesson that may be dawning on cybersecurity experts). This article provides a foundation for further research, and an ISAC for public health surveillance may be found to complement the existing architecture.

Even if the CDC rejects this particular approach, it can still learn from the successes and failures of cyber intelligence. This field is in touch with information technology, by definition. It is sensitive to contemporary politics, lacking the longer legacy of public health. Therefore, while cyber intelligence and disease surveillance are not the same, the threats and functional requirements are too similar to ignore when crafting better policy.
