
Transcript

INFORMATION, COMMUNICATION & SOCIETY, 2017
VOL. 20, NO. 1, 14–29
http://dx.doi.org/10.1080/1369118X.2016.1154087

Thinking critically about and researching algorithms

Rob Kitchin
NIRSA, National University of Ireland Maynooth, County Kildare, Ireland

ABSTRACT
More and more aspects of our everyday lives are being mediated, augmented, produced and regulated by software-enabled technologies. Software is fundamentally composed of algorithms: sets of defined steps structured to process instructions/data to produce an output. This paper synthesises and extends emerging critical thinking about algorithms and considers how best to research them in practice. Four main arguments are developed. First, there is a pressing need to focus critical and empirical attention on algorithms and the work that they do given their increasing importance in shaping social and economic life. Second, algorithms can be conceived in a number of ways – technically, computationally, mathematically, politically, culturally, economically, contextually, materially, philosophically, ethically – but are best understood as being contingent, ontogenetic and performative in nature, and embedded in wider socio-technical assemblages. Third, there are three main challenges that hinder research about algorithms (gaining access to their formulation; they are heterogeneous and embedded in wider systems; their work unfolds contextually and contingently), which require practical and epistemological attention. Fourth, the constitution and work of algorithms can be empirically studied in a number of ways, each of which has strengths and weaknesses that need to be systematically evaluated. Six methodological approaches designed to produce insights into the nature and work of algorithms are critically appraised. It is contended that these methods are best used in combination in order to help overcome epistemological and practical challenges.

ARTICLE HISTORY
Received 30 September 2015; Accepted 10 February 2016

KEYWORDS
Algorithm; code; epistemology; methodology; research

CONTACT Rob Kitchin [email protected]
© 2016 Informa UK Limited, trading as Taylor & Francis Group

Introduction: why study algorithms?

The era of ubiquitous computing and big data is now firmly established, with more and more aspects of our everyday lives – play, consumption, work, travel, communication, domestic tasks, security, etc. – being mediated, augmented, produced and regulated by digital devices and networked systems powered by software (Greenfield, 2006; Kitchin & Dodge, 2011; Manovich, 2013; Steiner, 2012). Software is fundamentally composed of algorithms – sets of defined steps structured to process instructions/data to produce an output – with all digital technologies thus constituting 'algorithm machines' (Gillespie, 2014a). These 'algorithm machines' enable extensive and complex tasks to be tackled that would be all but impossible by hand or analogue machines. They can perform millions of operations per second; minimise human error and bias in how a task is performed; and can significantly reduce costs and increase turnover and profit through automation and creating new services/products (Kitchin & Dodge, 2011). As such, dozens of key sets of algorithms are shaping everyday practices and tasks, including those that perform search, secure encrypted exchange, recommendation, pattern recognition, data compression, auto-correction, routing, predicting, profiling, simulation and optimisation (MacCormick, 2013).

As Diakopoulos (2013, p. 2) argues: 'We're living in a world now where algorithms adjudicate more and more consequential decisions in our lives. ... Algorithms, driven by vast troves of data, are the new power brokers in society.' Steiner (2012, p. 214) thus contends:

algorithms already have control of your money market funds, your stocks, and your retirement accounts. They'll soon decide who you talk to on phone calls; they will control the music that reaches your radio; they will decide your chances of getting lifesaving organ transplants; and for millions of people, algorithms will make perhaps the largest decision of their life: choosing a spouse.

Similarly, Lenglet (2011), Arnoldi (2016), MacKenzie (2014) and Pasquale (2015) document how algorithms have deeply and pervasively restructured how all aspects of the finance sector operate, from how funds are traded to how credit agencies assess risk and sort customers. Amoore (2006, 2009) details how algorithms are used to assess security risks in the 'war on terror' through the profiling of passengers and citizens. With respect to the creation of Wikipedia, Geiger (2014, p. 345) notes how algorithms 'help create new articles, edit existing articles, enforce rules and standards, patrol for spam and vandalism, and generally work to support encyclopaedic or administrative work.' Likewise, Anderson (2011) details how algorithms are playing an increasingly important role in producing content and mediating the relationships between journalists, audiences, newsrooms and media products.

In whatever domain algorithms are deployed they appear to be having a disruptive and transformative effect, both to how that domain is organised and operates, and to the labour market associated with it. Steiner (2012) provides numerous examples of how algorithms and computation have led to widespread job losses in some industries through automation. He concludes:

programmers now scout new industries for soft spots where algorithms might render old paradigms extinct, and in the process make mountains of money ... Determining the next field to be invaded by bots [automated algorithms] is the sum of two simple functions: the potential to disrupt plus the reward for disruption. (Steiner, 2012, pp. 6, 119)

Such conclusions have led a number of commentators to argue that we are now entering an era of widespread algorithmic governance, wherein algorithms will play an ever-increasing role in the exercise of power, a means through which to automate the disciplining and controlling of societies and to increase the efficiency of capital accumulation. However, Diakopoulos (2013, p. 2, original emphasis) warns that: 'What we generally lack as a public is clarity about how algorithms exercise their power over us.' Such clarity is absent because although algorithms are imbued with the power to act upon data and make consequential decisions (such as to issue fines or block travel or approve a loan) they are largely black boxed and beyond query or question. What is at stake, then, with the rise of 'algorithm machines' is new forms of algorithmic power that are reshaping how social and economic systems work.

In response, over the past decade or so, a growing number of scholars have started to focus critical attention on software code and algorithms, drawing on and contributing to science and technology studies, new media studies and software studies, in order to unpack the nature of algorithms and their power and work. Their analyses typically take one of three forms: a detailed case study of a single algorithm, or class of algorithms, to examine the nature of algorithms more generally (e.g., Bucher, 2012; Geiger, 2014; Mackenzie, 2007; Montfort et al., 2012); a detailed examination of the use of algorithms in one domain, such as journalism (Anderson, 2011), security (Amoore, 2006, 2009) or finance (Pasquale, 2014, 2015); or a more general, critical account of algorithms, their nature and how they perform work (e.g., Cox, 2013; Gillespie, 2014a, 2014b; Seaver, 2013).

This paper synthesises, critiques and extends these studies. Divided into two main sections – thinking critically about and researching algorithms – the paper makes four key arguments. First, as already noted, there is a pressing need to focus critical and empirical attention on algorithms and the work that they do in the world. Second, it is most productive to conceive of algorithms as being contingent, ontogenetic, performative in nature and embedded in wider socio-technical assemblages. Third, there are three main challenges that hinder research about algorithms (gaining access to their formulation; they are heterogeneous and embedded in wider systems; their work unfolds contextually and contingently), which require practical and epistemological attention. Fourth, the constitution and work of algorithms can be empirically studied in a number of ways, each of which has strengths and weaknesses that need to be systematically evaluated. With respect to the latter, the paper provides a critical appraisal of six methodological approaches that might profitably be used to produce insights into the nature and work of algorithms.

Thinking critically about algorithms

While an algorithm is commonly understood as a set of defined steps to produce particular outputs it is important to note that this is somewhat of a simplification. What constitutes an algorithm has changed over time and they can be thought about in a number of ways: technically, computationally, mathematically, politically, culturally, economically, contextually, materially, philosophically, ethically and so on.

Miyazaki (2012) traces the term 'algorithm' to twelfth-century Spain when the scripts of the Arabian mathematician Muḥammad ibn Mūsā al-Khwārizmī were translated into Latin. These scripts describe methods of addition, subtraction, multiplication and division using numbers. Thereafter, 'algorism' meant 'the specific step-by-step method of performing written elementary arithmetic' (Miyazaki, 2012, p. 2) and 'came to describe any method of systematic or automatic calculation' (Steiner, 2012, p. 55). By the mid-twentieth century and the development of scientific computation and early high-level programming languages, such as Algol 58 and its derivatives (short for ALGOrithmic Language), an algorithm was understood to be a set of defined steps that if followed in the correct order will computationally process input (instructions and/or data) to produce a desired outcome (Miyazaki, 2012).
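A minimal worked example may help fix this mid-twentieth-century sense of the term (the example is my own, not the paper's): Euclid's method for finding the greatest common divisor of two integers is exactly such a set of defined steps that, followed in order, processes input to produce a desired outcome.

```python
# A minimal illustration (my own example, not from the paper) of an algorithm in the
# sense discussed above: a set of defined steps that, followed in the correct order,
# computationally process input to produce a desired outcome.

def gcd(a: int, b: int) -> int:
    """Euclid's method: repeatedly replace (a, b) with (b, a mod b) until b is zero;
    the remaining value of a is the greatest common divisor."""
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # prints 21
```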

From a computational and programming perspective an 'Algorithm = Logic + Control'; where the logic is the problem domain-specific component and specifies the abstract formulation and expression of a solution (what is to be done) and the control component is the problem-solving strategy and the instructions for processing the logic under different scenarios (how it should be done) (Kowalski, 1979). The efficiency of an algorithm can be enhanced by either refining the logic component or by improving the control over its use, including altering data structures (input) to improve efficiency (Kowalski, 1979). As reasoned logic, the formulation of an algorithm is, in theory at least, independent of programming languages and the machines that execute them; 'it has an autonomous existence independent of "implementation details"' (Goffey, 2008, p. 15).

Some ideas explicitly take the form of an algorithm. Mathematical formulae, for example, are expressed as precise algorithms in the form of equations. In other cases problems have to be abstracted and structured into a set of instructions (pseudo-code) which can then be coded (Goffey, 2008). A computer programme structures lots of relatively simple algorithms together to form large, often complex, recursive decision trees (Neyland, 2015; Steiner, 2012). The methods of guiding and calculating decisions are largely based on Boolean logic (e.g., if this, then that) and the mathematical formulae and equations of calculus, graph theory and probability theory. Coding thus consists of two key translation challenges centred on producing algorithms. The first is translating a task or problem into a structured formula with an appropriate rule set (pseudo-code). The second is translating this pseudo-code into source code that when compiled will perform the task or solve the problem. Both translations can be challenging, requiring the precise definition of what a task/problem is (logic), then breaking that down into a precise set of instructions, factoring in any contingencies such as how the algorithm should perform under different conditions (control). The consequences of mistranslating the problem and/or solution are erroneous outcomes and random uncertainties (Drucker, 2013).
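To make Kowalski's formulation concrete, the sketch below (a hypothetical example of my own, not drawn from the paper) states one piece of logic, namely 'return the position of a target value in a sorted list, or nothing if it is absent', and pairs it with two different control strategies. Both satisfy the same logic but with very different efficiency, which is the sense in which an algorithm can be improved by refining the control over how its logic is processed; the docstrings play the role of pseudo-code that is then translated into source code.

```python
# A minimal sketch (my own, not from the paper) of 'Algorithm = Logic + Control'.
# Logic (what is to be done): return the index of target in a sorted list, or None.
# Control (how it is to be done): two different strategies realise the same logic.

from typing import List, Optional

def find_linear(items: List[int], target: int) -> Optional[int]:
    """Control strategy 1: examine every element in turn (O(n) comparisons)."""
    for index, value in enumerate(items):
        if value == target:
            return index
    return None

def find_binary(items: List[int], target: int) -> Optional[int]:
    """Control strategy 2: repeatedly halve the search space (O(log n) comparisons),
    exploiting the structure of the input, i.e. that the list is sorted."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return None

if __name__ == "__main__":
    scores = [3, 9, 14, 21, 35, 48]
    # Same logic, same answer; only the control strategy (and so the efficiency) differs.
    assert find_linear(scores, 21) == find_binary(scores, 21) == 3
```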

The processes of translation are often portrayed as technical, benign and commonsensical. This is how algorithms are mostly presented by computer scientists and technology companies: that they are 'purely formal beings of reason' (Goffey, 2008, p. 16). Thus, as Seaver (2013) notes, in computer science texts the focus is centred on how to design an algorithm, determine its efficiency and prove its optimality from a purely technical perspective. If there is discussion of the work algorithms do in real-world contexts this concentrates on how algorithms function in practice to perform a specific task. In other words, 'algorithms are understood to be strictly rational concerns, marrying the certainties of mathematics with the objectivity of technology' (Seaver, 2013, p. 2). 'Other knowledge about algorithms – such as their applications, effects, and circulation – is strictly out of frame' (Seaver, 2013, pp. 1–2). As are the complex set of decision-making processes and practices, and the wider assemblage of systems of thought, finance, politics, legal codes and regulations, materialities and infrastructures, institutions, inter-personal relations, which shape their production (Kitchin, 2014).

Far from being objective, impartial, reliable and legitimate, critical scholars argue that algorithms possess none of these qualities except as carefully crafted fictions (Gillespie, 2014a). As Montfort et al. (2012, p. 3) note, '[c]ode is not purely abstract and mathematical; it has significant social, political, and aesthetic dimensions,' inherently framed and shaped by all kinds of decisions, politics, ideology and the materialities of hardware and infrastructure that enact its instruction. Whilst programmers might seek to maintain a high degree of mechanical objectivity – being distant, detached and impartial in how they work and thus acting independent of local customs, culture, knowledge and context (Porter, 1995) – in the process of translating a task or process or calculation into an algorithm they can never fully escape these. Nor can they escape factors such as available resources and the choice and quality of training data; requirements relating to standards, protocols and the law; and choices and conditionalities relating to hardware, platforms, bandwidth and languages (Diakopoulos, 2013; Drucker, 2013; Kitchin & Dodge, 2011; Neyland, 2015). In reality then, a great deal of expertise, judgement, choice and constraints are exercised in producing algorithms (Gillespie, 2014a). Moreover, algorithms are created for purposes that are often far from neutral: to create value and capital; to nudge behaviour and structure preferences in a certain way; and to identify, sort and classify people.

At the same time, programming is 'a live process of engagement between thinking with and working on materials and the problem space that emerges' (Fuller, 2008, p. 10) and it is not a dry technical exercise but an exploration of aesthetic, material and formal qualities (Montfort et al., 2012, p. 266). In other words, creating an algorithm unfolds in context through processes such as trial and error, play, collaboration, discussion and negotiation. They are ontogenetic in nature (always in a state of becoming), teased into being: edited, revised, deleted and restarted, shared with others, passing through multiple iterations stretched out over time and space (Kitchin & Dodge, 2011). As a result, they are always somewhat uncertain, provisional and messy fragile accomplishments (Gillespie, 2014a; Neyland, 2015). And such practices are complemented by many others, such as researching the concept, selecting and cleaning data, tuning parameters, selling the idea and product, building coding teams, raising finance and so on. These practices are framed by systems of thought and forms of knowledge, modes of political economy, organisational and institutional cultures and politics, governmentalities and legalities, subjectivities and communities. As Seaver (2013, p. 10) notes, 'algorithmic systems are not standalone little boxes, but massive, networked ones with hundreds of hands reaching into them, tweaking and tuning, swapping out parts and experimenting with new arrangements.' Creating algorithms thus sits at the social and material 'intersection of dozens of practices' that are culturally, historically and institutionally situated (Montfort et al., 2012, p. 262; Napoli, 2013; Takhteyev, 2012). As such, as Mackenzie (2007, p. 93) argues, treating algorithms simply 'as a general expression of mental effort, or, perhaps even more abstractly, as process of abstraction, is to lose track of proximities and relationalities that algorithms articulate.' Algorithms cannot be divorced from the conditions under which they are developed and deployed (Geiger, 2014). What this means is that algorithms need to be understood as relational, contingent, contextual in nature, framed within the wider context of their socio-technical assemblage. From this perspective, 'algorithm' is one element in a broader apparatus which means it can never be understood as a technical, objective, impartial form of knowledge or mode of operation.

Beyond thinking critically about the nature of algorithms, there is also a need to consider their work, effects and power. Just as algorithms are not neutral, impartial expressions of knowledge, their work is not impassive and apolitical. Algorithms search, collate, sort, categorise, group, match, analyse, profile, model, simulate, visualise and regulate people, processes and places. They shape how we understand the world and they do work in and make the world through their execution as software, with profound consequences (Kitchin & Dodge, 2011). In this sense, they are profoundly performative as they cause things to happen (Mackenzie & Vurdubakis, 2011).

And while the creators of these algorithms might argue that they 'replace, displace, or reduce the role of biased or self-serving intermediaries' and remove subjectivity from decision-making, computation often deepens and accelerates processes of sorting, classifying and differentially treating, and reifying traditional pathologies, rather than reforming them (Pasquale, 2014, p. 5).

Far from being neutral in nature, algorithms construct and implement regimes of power and knowledge (Kushner, 2013) and their use has normative implications (Anderson, 2011). Algorithms are used to seduce, coerce, discipline, regulate and control: to guide and reshape how people, animals and objects interact with and pass through various systems. This is the same for systems designed to empower, entertain and enlighten, as they are also predicated on defined rule-sets about how a system behaves at any one time and situation. Algorithms thus claim and express algorithmic authority (Shirky, 2009) or algorithmic governance (Beer, 2009; Musiani, 2013), often through what Dodge and Kitchin (2007) term 'automated management' (decision-making processes that are automated, automatic and autonomous; outside of human oversight). The consequence for Lash (2007) is that society now has a new rule set to live by to complement constitutive and regulative rules: algorithmic, generative rules. He explains that such rules are embedded within computation, an expression of 'power through the algorithm'; they are 'virtuals that generate a whole variety of actuals. They are compressed and hidden and we do not encounter them in the way that we encounter constitutive and regulative rules. ... They are ... pathways through which capitalist power works' (Lash, 2007, p. 71).

It should be noted, however, that the effects of algorithms or their power is not always linear or always predictable, for three reasons. First, algorithms act as part of a wider network of relations which mediate and refract their work, for example, poor input data will lead to weak outcomes (Goffey, 2008; Pasquale, 2014). Second, the performance of algorithms can have side effects and unintended consequences, and left unattended or unsupervised they can perform unanticipated acts (Steiner, 2012). Third, algorithms can have biases or make mistakes due to bugs or miscoding (Diakopoulos, 2013; Drucker, 2013). Moreover, once computation is made public it undergoes a process of domestication, with users embedding the technology in their lives in all kinds of alternative ways and using it for different means, or resisting, subverting and reworking the algorithms' intent (consider the ways in which users try to game Google's PageRank algorithm). In this sense, algorithms are not just what programmers create, or the effects they create based on certain input, they are also what users make of them on a daily basis (Gillespie, 2014a).

Steiner's (2012, p. 218) solution to living with the power of algorithms is to suggest that we '[g]et friendly with bots.' He argues that the way to thrive in the algorithmic future is to learn to 'build, maintain, and improve upon code and algorithms,' as if knowing how to produce algorithms protects oneself from their diverse and pernicious effects across multiple domains. Instead, I would argue, there is a need to focus more critical attention on the production, deployment and effects of algorithms in order to understand and contest the various ways that they can overtly and covertly shape life chances. However, such a programme of research is not as straightforward as one might hope, as the next section details.

Researching algorithms

The logical way to flesh out our understanding of algorithms and the work they do in the world is to conduct detailed empirical research centrally focused on algorithms. Such research could approach algorithms from a number of perspectives: a technical approach that studies algorithms as computer science; a sociological approach that studies algorithms as the product of interactions among programmers and designers; a legal approach that studies algorithms as a figure and agent in law; a philosophical approach that studies the ethics of algorithms (Barocas, Hood, & Ziewitz, 2013, p. 3); and a 'code/software studies' perspective that studies the politics and power embedded in algorithms, their framing within a wider socio-technical assemblage and how they reshape particular domains. There are a number of methodological approaches that can be used to operationalise such research, six of which are critically appraised below. Before doing so, however, it is important to acknowledge that there are three significant challenges to researching algorithms that require epistemological and practical attention.

Challenges

Access/black boxed

Many of the most important algorithms that people encounter on a regular basis and which (re)shape how they perform tasks or the services they receive are created in environments that are not open to scrutiny and their source code is hidden inside impenetrable executable files. Coding often happens in private settings, such as within companies or state agencies, and it can be difficult to negotiate access to coding teams to observe them work, interview programmers or analyse the source code they produce. This is unsurprising since it is often a company's algorithms that provide it with a competitive advantage and they are reluctant to expose their intellectual property even with non-disclosure agreements in place. They also want to limit the ability of users to game the algorithm to unfairly gain a competitive edge. Access is a little easier in the case of open-source programming teams and open-source programmes through repositories such as Github, but while they provide access to much code, this is limited in scope and does not include key proprietary algorithms that might be of more interest with respect to holding forms of algorithmic governance to account.

Heterogeneous and embedded

If access is gained, algorithms, as Seaver (2013) notes, are rarely straightforward to deconstruct. Within code algorithms are usually woven together with hundreds of other algorithms to create algorithmic systems. It is the workings of these algorithmic systems that we are mostly interested in, not specific algorithms, many of which are quite benign and procedural. Algorithmic systems are most often 'works of collective authorship, made, maintained, and revised by many people with different goals at different times' (Seaver, 2013, p. 10). They can consist of original formulations mashed together with those sourced from code libraries, including stock algorithms that are re-used in multiple instances. Moreover, they are embedded within complex socio-technical assemblages made up of a heterogeneous set of relations including potentially thousands of individuals, data sets, objects, apparatus, elements, protocols, standards, laws, etc. that frame their development.

Their construction, therefore, is often quite messy, 'full of flux, revisability, and negotiation' (p. 10), making unpacking the logic and rationality behind their formulation difficult in practice. Indeed, it is unlikely that any one programmer has a complete understanding of a system, especially large, complex ones that are built by many teams of programmers, some of whom may be distributed all over the planet or may have only had sight of smaller outsourced segments. Getting access to a credit rating agency's algorithmic system then might give an insight into its formula for assessing and sorting individuals, its underlying logics and principles, and how it was created and works in practice, but will not necessarily provide full transparency as to its full reasoning, workings or the choices made in its construction (Bucher, 2012; Chun, 2011).

Ontogenetic, performative and contingent

As well as being heterogeneous and embedded, algorithms are rarely fixed in form and their work in practice unfolds in multifarious ways. As such, algorithms need to be recognised as being ontogenetic, performative and contingent: that is, they are never fixed in nature, but are emergent and constantly unfolding. In cases where an algorithm is static, for example, in firmware that is not patched, its work unfolds contextually, reactive to input, interaction and situation. In other cases, algorithms and their instantiation in code are often being refined, reworked, extended and patched, iterating through various versions (Miyazaki, 2012). Companies such as Google and Facebook might be live running dozens of different versions of an algorithm to assess their relative merits, with no guarantee that the version a user interacts with at one moment in time is the same as five seconds later. In some cases, the code has been programmed to evolve, re-writing its algorithms as it observes, experiments and learns independently of its creators (Steiner, 2012).

Similarly, many algorithms are designed to be reactive and mutable to inputs. As Bucher (2012) notes, Facebook's EdgeRank algorithm (that determines what posts and in what order are fed into each user's timeline) does not act from above in a static, fixed manner, but rather works in concert with each individual user, ordering posts dependent on how one interacts with 'friends'. Its parameters then are contextually weighted and fluid. In other cases, randomness might be built into an algorithm's design, meaning its outcomes can never be perfectly predicted. What this means is that the outcomes for users inputting the same data might vary for contextual reasons (e.g., Mahnke and Uprichard (2014) examined Google's autocomplete search algorithm by typing in the same terms from two locations and comparing the results, finding differences in the suggestions the algorithm gave), and the same algorithms might be being used in quite varied and mutable ways (e.g., for work or for play). Examining one version of an algorithm will then provide a snapshot reading that fails to acknowledge or account for the mutable and often multiple natures of algorithms and their work (Bucher, 2012).

Algorithms then are often 'out of control' in the sense that their outcomes are sometimes not easily anticipated, producing unexpected effects in terms of their work in the world (Mackenzie, 2005). As such, understanding the work and effects of algorithms needs to be sensitive to their contextual, contingent unfolding across situation, time and space. What this means in practice is that single or limited engagements with algorithms cannot be simply extrapolated to all cases and that a set of comparative case studies need to be employed, or a series of experiments performed with the same algorithm operating under different conditions.
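The kind of contextual weighting Bucher describes can be sketched in code. The example below is entirely hypothetical (EdgeRank itself is proprietary and its details are not public); it simply shows how a ranking whose parameters are conditioned on a user's interaction history orders identical posts differently for different users, or for the same user at different moments.

```python
# A toy sketch of contextually weighted ranking (hypothetical; it does not reproduce
# EdgeRank or any actual platform's algorithm). The weights applied to each post
# depend on the individual user's interaction history, so the 'same' algorithm
# produces different orderings in different contexts.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Post:
    post_id: str
    author: str
    age_hours: float
    interactions: int  # likes/comments the post has already received

def affinity(history: Dict[str, int], author: str) -> float:
    """A weight derived from how often this user has interacted with this author."""
    return 1.0 + 0.5 * history.get(author, 0)

def score(post: Post, history: Dict[str, int]) -> float:
    """Illustrative scoring: affinity x engagement, decayed by the age of the post."""
    decay = 1.0 / (1.0 + post.age_hours)
    return affinity(history, post.author) * (1 + post.interactions) * decay

def rank_feed(posts: List[Post], history: Dict[str, int]) -> List[str]:
    """Return post ids ordered by score for this particular user's history."""
    return [p.post_id for p in sorted(posts, key=lambda p: score(p, history), reverse=True)]

if __name__ == "__main__":
    posts = [Post("p1", "alice", 2.0, 10), Post("p2", "bob", 0.5, 2)]
    print(rank_feed(posts, {"bob": 6}))    # ['p2', 'p1']
    print(rank_feed(posts, {"alice": 6}))  # ['p1', 'p2']
```

Probing such a system from the outside would therefore yield different snapshots depending on which profile and which moment were sampled, which is why single or limited engagements cannot simply be extrapolated to all cases.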

Approaches

Keeping in mind these challenges, this final section critically appraises six methodological approaches for researching algorithms that I believe present the most promise for shedding light on the nature and workings of algorithms, their embedding in socio-technical systems, their effects and power, and dealing with and overcoming the difficulties of gaining access to source code. Each approach has its strengths and drawbacks and their use is not mutually exclusive. Indeed, I would argue that there would be much to be gained by using two or more of the approaches in combination to compensate for the drawbacks of employing them in isolation. Nor are they the only possible approaches, with ethnomethodologies, surveys and historical analysis using archives and oral histories offering other possible avenues of analysis and insight.

Examining pseudo-code/source code

Perhaps the most obvious way to try and understand an algorithm is to examine its pseudo-code (how a task or puzzle is translated into a model or recipe) and/or its construction in source code. There are three ways in which this can be undertaken in practice. The first is to carefully deconstruct the pseudo-code and/or source code, teasing apart the rule set to determine how the algorithm works to translate input to produce an outcome (Krysa & Sedek, 2008). In practice this means carefully sifting through documentation, code and programmer comments, tracing out how the algorithm works to process data and calculate outcomes, and decoding the translation process undertaken to construct the algorithm. The second is to map out a genealogy of how an algorithm mutates and evolves over time as it is tweaked and rewritten across different versions of code. For example, one might deconstruct how an algorithm is re-scripted in multiple instantiations of a programme within a code library such as Github. Such a genealogy would reveal how thinking with respect to a problem is refined and transformed with respect to how the algorithm/code performs 'in the wild' and in relation to new technologies, situations and contexts (such as new platforms or regulations being introduced). The third is to examine how the same task is translated into various software languages and how it runs across different platforms. This is an approach used by Montfort et al. (2012) in their exploration of the '10 PRINT' algorithm, where they scripted code to perform the same task in multiple languages and ran it on different hardware, and also tweaked the parameters, to observe the specific contingencies and affordances this introduced.
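The programme Montfort et al. examine is a single line of Commodore 64 BASIC, 10 PRINT CHR$(205.5+RND(1)); : GOTO 10, which endlessly prints one of two diagonal characters chosen at random to produce a maze-like pattern. In the spirit of their exercise, a port into another language might look like the sketch below (the port is my own; substituting Unicode characters for the original PETSCII diagonals and bounding the endless loop are exactly the kinds of contingency that such translation makes visible).

```python
# A sketch of porting the one-line BASIC programme studied by Montfort et al. (2012):
#   10 PRINT CHR$(205.5+RND(1)); : GOTO 10
# The PETSCII diagonals are approximated with Unicode characters and the infinite
# loop is bounded; both are translation decisions forced by the new language/platform.

import random

def ten_print(columns: int = 64, rows: int = 16) -> None:
    """Print a grid of randomly chosen diagonal characters, one row at a time."""
    for _ in range(rows):
        print("".join(random.choice("╱╲") for _ in range(columns)))

if __name__ == "__main__":
    ten_print()
```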

While these methods do offer the promise of providing valuable insights into the ways in which algorithms are built, how power is vested in them through their various parameters and rules, and how they process data in abstract and material terms to complete a task, there are three significant issues with their deployment. First, as noted by Chandra (2013), deconstructing and tracing how an algorithm is constructed in code and mutates over time is not straightforward. Code often takes the form of a 'Big Ball of Mud': '[a] haphazardly structured, sprawling, sloppy, duct-tape and bailing wire, spaghetti code jungle' (Foote & Yoder, 1997; cited in Chandra, 2013, p. 126). Even those that have produced it can find it very difficult to unpack its algorithms and routines; those unfamiliar with its development can often find that the ball of mud remains just that. Second, it requires that the researcher is both an expert in the domain to which the algorithm refers and possesses sufficient skill and knowledge as a programmer that they can make sense of a 'Big Ball of Mud'; a pairing that few social scientists and humanities scholars possess. Third, these approaches largely decontextualise the algorithm from its wider socio-technical assemblage and its use.

Reflexively producing code

A related approach is to conduct auto-ethnographies of translating tasks into pseudo-code and the practices of producing algorithms in code. Here, rather than studying an algorithm created by others, a researcher reflects on and critically interrogates their own experiences of translating and formulating an algorithm. This would include an analysis of not only the practices of exploring and translating a task, originating and developing ideas, writing and revising code, but also how these practices are situated within and shaped by wider socio-technical factors such as regulatory and legal frameworks, forms of knowledge, institutional arrangements, financial terms and conditions, and anticipated users and market. Ziewitz (2011) employed this kind of approach to reflect on producing a random routing algorithm for directing a walking path through a city, reflecting on the ontological uncertainty in the task itself (that there is often an ontological gerrymandering effect at work as the task itself is re-thought and re-defined while the process of producing an algorithm is undertaken), and the messy, contingent process of creating the rule set and parameters in practice and how these also kept shifting through deferred accountability. Similarly, Ullman (1997) uses such an approach to consider the practices of developing software and how this changed over her career.

While this approach will provide useful insights into how algorithms are created, it also has a couple of limitations. The first is the inherent subjectivities involved in doing an auto-ethnography and the difficulties of detaching oneself and gaining critical distance to be able to give clear insight into what is unfolding. Moreover, there is the possibility that in seeking to be reflexive what would usually take place is inflected in unknown ways. Further, it excludes any non-representational, unconscious acts from analysis. Second, one generally wants to study algorithms and code that have real concrete effects on people's everyday lives, such as those used in algorithmic governance. One way to try and achieve this is to contribute to open-source projects where the code is incorporated into products that others use, or to seek access to a commercial project as a programmer (on an overt, approved basis with non-disclosure agreements in place). The benefit here is that the method can be complemented with the sixth approach set out below, examining and reflecting on the relationship between the production of an algorithm and any associated ambitions and expectations vis-à-vis how it actually does work in the world.

Reverse engineering

In cases where the code remains black boxed, a researcher interested in the algorithm at the heart of its workings is left with the option of trying to reverse engineer the compiled software. Diakopoulos (2013, p. 13) explains that '[r]everse engineering is the process of articulating the specifications of a system through a rigorous examination drawing on domain knowledge, observation, and deduction to unearth a model of how that system works.'

While software producers might desire their products to remain opaque, each programme inherently has two openings that enable lines of enquiry: input and output. By examining what data are fed into an algorithm and what output is produced it is possible to start to reverse engineer how the recipe of the algorithm is composed (how it weights and preferences some criteria) and what it does. The main way this is attempted is by using carefully selected dummy data and seeing what is outputted under different scenarios. For example, researchers might search Google using the same terms on multiple computers in multiple jurisdictions to get a sense of how its PageRank algorithm is constructed and works in practice (Mahnke & Uprichard, 2014), or they might experiment with posting and interacting with posts on Facebook to try and determine how its EdgeRank algorithm positions and prioritises posts in user timelines (Bucher, 2012), or they might use proxy servers and feed dummy user profiles into e-commerce systems to see how prices might vary across users and locales (Wall Street Journal, detailed in Diakopoulos, 2013). One can also get a sense of an algorithm by looking closely at 'how information must be oriented to face them, how it is made algorithm-ready'; how the input data are delineated in terms of what input variables are sought and structured, and the associated meta-data (Gillespie, 2014a). Another possibility is to follow debates on online forums by users about how they perceive an algorithm works or has changed, or to interview marketers, media strategists, and public relations firms that seek to game an algorithm to optimise an outcome for a client (Bucher, 2012).

While reverse engineering can give some indication of the factors and conditions embedded into an algorithm, they generally cannot do so with any specificity (Seaver, 2013). As such, they usually only provide fuzzy glimpses of how an algorithm works in practice but not its actual constitution (Diakopoulos, 2013). One solution to try and enhance clarity has been to employ bots, which, posing as users, can more systematically engage with a system, running dummy data and interactions. However, as Seaver (2013) notes, many proprietary systems are aware that many people are seeking to determine and game their algorithm, and thus seek to identify and block bot users.
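In practical terms, such probing amounts to feeding the system systematically varied dummy inputs and logging what comes back for later comparison. The skeleton below is a hypothetical sketch of that workflow; query_service and its parameters are placeholders rather than any real interface, and the point is simply the systematic variation of inputs and the recording of ranked outputs across, for example, search terms, locations and profiles.

```python
# A hypothetical skeleton of the input/output probing described above. query_service
# is a placeholder for however the system under study is actually interrogated (its
# public interface, a proxy, a logged-in dummy account, etc.); nothing here encodes
# a real service's API.

import csv
import itertools
from typing import List

def query_service(term: str, locale: str, profile: str) -> List[str]:
    """Return the ranked results the black-boxed system serves for this combination
    of inputs. Left unimplemented: substitute a real probe for the system studied."""
    raise NotImplementedError

def probe(terms, locales, profiles, outfile: str = "probe_log.csv") -> None:
    """Systematically vary the inputs and log every output for later comparison."""
    with open(outfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["term", "locale", "profile", "rank", "result"])
        for term, locale, profile in itertools.product(terms, locales, profiles):
            for rank, result in enumerate(query_service(term, locale, profile), start=1):
                writer.writerow([term, locale, profile, rank, result])
```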

Interviewing designers or conducting an ethnography of a coding team

While deconstructing or reverse engineering code might provide some insights into the workings of an algorithm, they provide little more than conjecture as to the intent of the algorithm designers, and examining that and how and why an algorithm was produced requires a different approach. Interviewing designers and coders, or conducting an ethnography of a coding team, provides a means of uncovering the story behind the production of an algorithm and to interrogate its purpose and assumptions.

In the first case, respondents are questioned as to how they framed objectives, created pseudo-code and translated this into code, and quizzed about design decisions and choices with respect to languages and technologies, practices, influences, constraints, debates within a team or with clients, institutional politics and major changes in direction over time (Diakopoulos, 2013; MacKenzie, 2014; Mager, 2012). In the second case, a researcher seeks to spend time within a coding team, either observing the work of the coders, discussing it with them, and attending associated events such as team meetings, or working in situ as part of the team, taking an active role in producing code. An example of the former is Rosenberg's (2007) study of one company's attempt to produce a new product, conducted over a three-year period in which he was given full access to the company, including observing and talking to coders, and having access to team chat rooms and phone conferences. An example of the latter is Takhteyev's (2012) study of an open-source coding project in Rio de Janeiro where he actively worked on developing the code, as well as taking part in the social life of the team. In both cases, Rosenberg and Takhteyev generate much insight into the contingent, relational and contextual way in which algorithms and software are produced, though in neither case are the specificities of algorithms and their work unpacked and detailed.

Unpacking the full socio-technical assemblage of algorithms

As already noted, algorithms are not formulated or do not work in isolation, but form part of a technological stack that includes infrastructure/hardware, code platforms, data and interfaces, and are framed and conditioned by forms of knowledge, legalities, governmentalities, institutions, marketplaces, finance and so on. A wider understanding of algorithms then requires their full socio-technical assemblage to be examined, including an analysis of the reasons for subjecting the system to the logic of computation in the first place. Examining algorithms without considering their wider assemblage is, as Geiger (2014) argues, like considering a law without reference to the debate for its introduction, legal institutions, infrastructures such as courts, implementers such as the police, and the operating and business practices of the legal profession. It also risks fetishising the algorithm and code at the expense of the rest of the assemblage (Chun, 2011).

Interviews and ethnographies of coding projects, and the wider institutional apparatus surrounding them (e.g., management and institutional collaboration), start to produce such knowledge, but they need to be supplemented with other approaches, such as a discursive analysis of company documents, promotional/industry material, procurement tenders and legal and standards frameworks; attending trade fairs and other inter-company interactions; examining the practices, structures and behaviour of institutions; and documenting the biographies of key actors and the histories of projects (Montfort et al., 2012; Napoli, 2013). Such a discursive analysis will also help to reveal how algorithms are imagined and narrated, illuminate the discourses surrounding and promoting them, and how they are understood by those that create and promote them. Gaining access to such a wider range of elements, and being able to gather data and interlink them to be able to unpack a socio-technical assemblage, is no easy task but it is manageable as a large case study, especially if undertaken by a research team rather than a single individual.

Examining how algorithms do work in the world

Given that algorithms do active work in the world it is important not only to focus on the construction of algorithms, and their production within a wider assemblage, but also to examine how they are deployed within different domains to perform a multitude of tasks. This cannot be simply denoted from an examination of the algorithm/code alone for two reasons. First, what an algorithm is designed to do in theory and what it actually does in practice do not always correspond, due to a lack of refinement, miscodings, errors and bugs. Second, algorithms perform in context – in collaboration with data, technologies, people, etc. under varying conditions – and therefore their effects unfold in contingent and relational ways, producing localised and situated outcomes.

When users employ an algorithm, say for play or work, they are not simply playing or working in conjunction with the algorithm, rather they are 'learning, internalizing, and becoming intimate with' it (Galloway, 2006, p. 90); how they behave is subtly reshaped through the engagement, but at the same time what the algorithm does is conditional on the input it receives from the user. We can therefore only know how algorithms make a difference to everyday life by observing their work in the world under different conditions.

One way to undertake such research is to conduct ethnographies of how people engage with and are conditioned by algorithmic systems and how such systems reshape how organisations conduct their endeavours and are structured (e.g., Lenglet, 2011). It would also explore the ways in which people resist, subvert and transgress against the work of algorithms, and re-purpose and re-deploy them for purposes for which they were not originally intended; for example, examining the ways in which various mobile and web applications were re-purposed in the aftermath of the Haiti earthquake to coordinate disaster response, remap the nation and provide donations (Al-Akkad et al., 2013). Such research requires detailed observation and interviews focused on the use of particular systems and technologies by different populations and within different scenarios, and how individuals interfaced with the algorithm through software, including their assessments as to their intentions, sense of what is occurring and associated consequences, tactics of engagement, feelings, concerns and so on. In cases where an algorithm is black boxed, such research is also likely to shed some light on the constitution of the algorithm itself.

Conclusion

On an average day, people around the world come into contact with hundreds of algorithms embedded into the software that operates communications, utilities and transport infrastructure, and powers all kinds of digital devices used for work, play and consumption. These algorithms have disruptive and transformative effect, reconfiguring how systems operate, enacting new forms of algorithmic governance and enabling new forms of capital accumulation. Yet, despite their increasing pervasiveness, utility and the power vested in them to act in autonomous, automatic and automated ways, to date there has been limited critical attention paid to algorithms in contrast to the vast literature that approaches algorithms from a more technical perspective. This imbalance in how algorithms are thought about and intellectually engaged with is perhaps somewhat surprising given what is at stake in a computationally rich world. As such, there is a pressing need for critical attention across the social sciences and humanities to be focused on algorithms and forms of algorithmic governance. The contribution of this paper to this endeavour has been to: advance an understanding of algorithms as contingent, ontogenetic, performative in nature and embedded in wider socio-technical assemblages; to detail the epistemological and practical challenges facing algorithm scholars; and to critically appraise six promising methodological options to empirically research and make sense of algorithms. It is apparent from the studies conducted to date that there is a range of different ways of making sense of algorithms and the intention of the paper has not been to foreclose this diversity, but rather to encourage synthesis, comparison and evaluation of different positions and to create new ones. Indeed, the more angles taken to uncover and debate the nature and work of algorithms the better we will come to know them.

Likewise, the six approaches appraised were selected because I believe they hold the most promise in exposing how algorithms are constructed, how they work within socio-technical assemblages and how they perform actions and make a difference in particular domains, but they are by no means the only approaches that might be profitably pursued. My contention is, given each approach's varying strengths and weaknesses, that how they reveal the nature and work of algorithms needs to be systematically evaluated through methodologically focused research. Studies that have access to the pseudo-code, code and coders may well be the most illuminating, though they still face a number of challenges, such as deciphering how the algorithm works in practice. Moreover, there is a need to assess: (1) how they might be profitably used in conjunction with each other to overcome epistemological and practical challenges; (2) what other methods might be beneficially deployed in order to better understand the nature, production and use of algorithms? With respect to the latter, such methods might include ethnomethodologies, surveys, historical analysis using archives and oral histories, and comparative case studies. As such, while the approaches and foci I have detailed provide a useful starting set that others can apply, critique, refine and extend, there are others that can potentially emerge as critical research and thinking on algorithms develops and matures.

Acknowledgements

Many thanks to Tracey Lauriault, Sung-Yueh Perng and the referees for comments on earlier versions of this paper.

Disclosure statement

No potential conflict of interest was reported by the author.

Funding

The research for this paper was funded by a European Research Council Advanced Investigator award [ERC-2012-AdG-323636-SOFTCITY].

Notes on contributor

Rob Kitchin is a professor and ERC Advanced Investigator at the National University of Ireland Maynooth. He is currently a principal investigator on the Programmable City project, the Digital Repository of Ireland, the All-Island Research Observatory and the Dublin Dashboard. [email: [email protected]]

References

Al-Akkad, A., Ramirez, L., Denef, S., Boden, A., Wood, L., Buscher, M., & Zimmermann, A. (2013). 'Reconstructing normality': The use of infrastructure leftovers in crisis situations as inspiration for the design of resilient technology. Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration (pp. 457–466). New York, NY: ACM. Retrieved October 16, 2014, from http://dl.acm.org/citation.cfm?doid=2541016.2541051
Amoore, L. (2006). Biometric borders: Governing mobilities in the war on terror. Political Geography, 25, 336–351.

Amoore, L. (2009). Algorithmic war: Everyday geographies of the war on terror. Antipode, 41, 49–69.
Anderson, C. W. (2011). Deliberative, agonistic, and algorithmic audiences: Journalism's vision of its public in an age of audience. Journal of Communication, 5, 529–547.
Arnoldi, J. (2016). Computer algorithms, market manipulation and the institutionalization of high frequency trading. Theory, Culture & Society, 33(1), 29–52.
Barocas, S., Hood, S., & Ziewitz, M. (2013). Governing algorithms: A provocation piece. Retrieved October 16, 2014, from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2245322
Beer, D. (2009). Power through the algorithm? Participatory Web cultures and the technological unconscious. New Media and Society, 11(6), 985–1002.
Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media and Society, 14(7), 1164–1180.
Chandra, V. (2013). Geek sublime: Writing fiction, coding software. London: Faber.
Chun, W. H. K. (2011). Programmed visions. Cambridge: MIT Press.
Cox, G. (2013). Speaking code: Coding as aesthetic and political expression. Cambridge: MIT Press.
Diakopoulos, N. (2013). Algorithmic accountability reporting: On the investigation of black boxes. A Tow/Knight Brief. Tow Center for Digital Journalism, Columbia Journalism School. Retrieved August 21, 2014, from http://towcenter.org/algorithmic-accountability-2/
Dodge, M., & Kitchin, R. (2007). The automatic management of drivers and driving spaces. Geoforum, 38(2), 264–275.
Drucker, J. (2013). Performative materiality and theoretical approaches to interface. Digital Humanities Quarterly, 7(1). Retrieved June 5, 2014, from http://www.digitalhumanities.org/dhq/vol/7/1/000143/000143.html
Foote, B., & Yoder, J. (1997). Big Ball of Mud. Pattern Languages of Program Design, 4, 654–692.
Fuller, M. (2008). Introduction. In M. Fuller (Ed.), Software studies – A lexicon (pp. 1–14). Cambridge: MIT Press.
Galloway, A. R. (2006). Gaming: Essays on algorithmic culture. Minneapolis: University of Minnesota Press.
Geiger, S. R. (2014). Bots, bespoke, code and the materiality of software platforms. Information, Communication & Society, 17(3), 342–356.
Gillespie, T. (2014a). The relevance of algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167–193). Cambridge: MIT Press.
Gillespie, T. (2014b, June 25). Algorithm [draft] [#digitalkeyword]. Culture Digitally. Retrieved October 16, 2014, from http://culturedigitally.org/2014/06/algorithm-draft-digitalkeyword/
Goffey, A. (2008). Algorithm. In M. Fuller (Ed.), Software studies – A lexicon (pp. 15–20). Cambridge: MIT Press.
Greenfield, A. (2006). Everyware: The dawning age of ubiquitous computing. Boston, MA: New Riders.
Kitchin, R. (2014). The data revolution: Big data, open data, data infrastructures and their consequences. London: Sage.
Kitchin, R., & Dodge, M. (2011). Code/space: Software and everyday life. Cambridge: MIT Press.
Kowalski, R. (1979). Algorithm = Logic + Control. Communications of the ACM, 22(7), 424–436.
Krysa, J., & Sedek, G. (2008). Source code. In M. Fuller (Ed.), Software studies – A lexicon (pp. 236–242). Cambridge: MIT Press.
Kushner, S. (2013). The freelance translation machine: Algorithmic culture and the invisible industry. New Media & Society, 15(8), 1241–1258.
Lash, S. (2007). Power after hegemony: Cultural studies in mutation. Theory, Culture & Society, 24(3), 55–78.
Lenglet, M. (2011). Conflicting codes and codings: How algorithmic trading is reshaping financial regulation. Theory, Culture & Society, 28(6), 44–66.
MacCormick, J. (2013). Nine algorithms that changed the future: The ingenious ideas that drive today's computers. Princeton, NJ: Princeton University Press.

Mackenzie, A. (2005). The performativity of code: Software and cultures of circulation. Theory, Culture & Society, 22(1), 71–92.
Mackenzie, A. (2007). Protocols and the irreducible traces of embodiment: The Viterbi algorithm and the mosaic of machine time. In R. Hassan & R. E. Purser (Eds.), 24/7: Time and temporality in the network society (pp. 89–106). Stanford, CA: Stanford University Press.
Mackenzie, A., & Vurdubakis, T. (2011). Code and codings in crisis: Signification, performativity and excess. Theory, Culture & Society, 28(6), 3–23.
MacKenzie, D. (2014). A sociology of algorithms: High-frequency trading and the shaping of markets. Working paper, University of Edinburgh. Retrieved July 6, 2015, from http://www.sps.ed.ac.uk/__data/assets/pdf_file/0004/156298/Algorithms25.pdf
Mager, A. (2012). Algorithmic ideology: How capitalist society shapes search engines. Information, Communication, & Society, 15(5), 769–787.
Mahnke, M., & Uprichard, E. (2014). Algorithming the algorithm. In R. König & M. Rasch (Eds.), Society of the query reader: Reflections on web search (pp. 256–270). Amsterdam: Institute of Network Cultures.
Manovich, L. (2013). Software takes control. New York, NY: Bloomsbury.
Miyazaki, S. (2012). Algorhythmics: Understanding micro-temporality in computational cultures. Computational Culture, Issue 2. Retrieved June 25, 2014, from http://computationalculture.net/article/algorhythmics-understanding-micro-temporality-in-computational-cultures
Montfort, N., Baudoin, P., Bell, J., Bogost, I., Douglass, J., Marino, M. C., ... Vawter, N. (2012). 10 PRINT CHR$(205.5+RND(1)); : GOTO 10. Cambridge: MIT Press.
Musiani, F. (2013). Governance by algorithms. Internet Policy Review, 2(3). Retrieved October 7, 2014, from http://policyreview.info/articles/analysis/governance-algorithms
Napoli, P. M. (2013, May). The algorithm as institution: Toward a theoretical framework for automated media production and consumption. Paper presented at the Media in Transition Conference, Massachusetts Institute of Technology, Cambridge, MA. Retrieved from ssrn.com/abstract=2260923
Neyland, D. (2015). On organizing algorithms. Theory, Culture & Society, 32(1), 119–132.
Pasquale, F. (2014). The emperor's new codes: Reputation and search algorithms in the finance sector. Draft for discussion at the NYU 'Governing Algorithms' conference. Retrieved October 16, 2014, from http://governingalgorithms.org/wp-content/uploads/2013/05/2-paper-pasquale.pdf
Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Cambridge, MA: Harvard University Press.
Porter, T. M. (1995). Trust in numbers: The pursuit of objectivity in science and public life. Princeton, NJ: Princeton University Press.
Rosenberg, S. (2007). Dreaming in code: Two dozen programmers, three years, 4,732 bugs, and one quest for transcendent software. New York: Three Rivers Press.
Seaver, N. (2013). Knowing algorithms. Media in Transition 8, Cambridge, MA. Retrieved August 21, 2014, from http://nickseaver.net/papers/seaverMiT8.pdf
Shirky, C. (2009). A speculative post on the idea of algorithmic authority. Shirky.com. Retrieved October 7, 2014, from http://www.shirky.com/weblog/2009/11/a-speculative-post-on-the-idea-of-algorithmic-authority/
Steiner, C. (2012). Automate this: How algorithms took over our markets, our jobs, and the world. New York, NY: Portfolio.
Takhteyev, Y. (2012). Coding places: Software practice in a South American city. Cambridge: MIT Press.
Ullman, E. (1997). Close to the machine. San Francisco, CA: City Lights Books.
Ziewitz, M. (2011, September 29). How to think about an algorithm? Notes from a not quite random walk. Discussion paper for symposium on 'Knowledge Machines between Freedom and Control'. Retrieved August 21, 2014, from http://ziewitz.org/papers/ziewitz_algorithm.pdf
