A Framework for Governing Emerging Technologies
January 1, 2025
Executive Summary
Governing emerging technologies is challenging due to the rapid pace of technological advancement and the uncertainties in evaluating their opportunities and risks. As these technologies evolve, governance frameworks must adapt to address multiple competing objectives, such as fostering innovation, minimising risks and navigating geopolitical considerations. The growing gap between the pace of technological advancements and governance frameworks, referred to as the “pacing problem”, further complicates this process.
This paper presents a structured framework for governing emerging technologies, drawing on Eugene Bardach’s guide for policy analysis and Takshashila’s work on emerging technology governance. The framework begins with an in-depth understanding of different aspects of the technology, such as costs, supply chains, maturity, global competitiveness and societal impacts. This is followed by identifying stakeholders impacted by the technology and their concerns, alongside grappling with the moral dilemmas that arise from adopting it, as governance stems from needing to address these concerns. These steps are followed by narrowing down the governance objectives – such as addressing market failures or other priorities – and identifying suitable alternatives from a repository of instruments. The framework then evaluates these alternatives by projecting their outcomes, confronting the trade-offs involved, and deciding on an optimal policy. The final step addresses effectively communicating the policy to the intended audience.
This structured approach ensures that governance decisions are backed by analytical rigour to balance competing interests and address the pacing problem.
Authors
Bharath Reddy is an Associate Fellow with the High Tech Geopolitics programme at the Takshashila Institution.
Shambhavi Naik is the Head of Research and Chairperson of the Advanced Biology programme at the Takshashila Institution.
Disclosures
The authors have used Grammarly and ChatGPT for assistance in copy-editing.
Acknowledgements
The authors would like to thank Nitin Pai and Pranay Kotasthane for their valuable comments and feedback.
The Challenge with Governing Emerging Technologies
While there is no widely accepted definition of emerging technologies, there is a general understanding of their characteristics. For instance, Rotolo et al.1 have characterised emerging technologies as having the attributes of radical novelty, relatively fast growth, coherence, prominent impact, and uncertainty and ambiguity of use and risks. The classification of a technology as “emerging” is not permanent: as the technology matures, its novelty fades, and its uncertainty and ambiguity also reduce.
Household electricity is an excellent example that illustrates this transition. When it was introduced in the 1880s, it was regarded as dangerous and unreliable compared to gas-powered lighting. At the time, there were no established standards for electricity supply, such as voltage levels, and very few appliances ran on electricity. The Edison Electric Company had to manufacture everything, from the incandescent bulbs to the wires, switches, and generators needed to power them2. It required massive investments and could initially only be afforded by the wealthy. Today, we have near-universal electrification, and access to electricity is considered a basic necessity. Contemporary emerging technologies such as artificial intelligence, advanced communications, and quantum technologies are likely to experience a similar transition.
Hagemann et al.3 highlight a “pacing problem” in technology policy, defined as the “gap between the introduction of a new technology and the establishment of laws, regulations, and oversight mechanisms for shaping its safe development.” As the rate of technology adoption increases, the gap between technology and regulation also widens. This pacing problem is likely to be a defining characteristic of the information age when compared to the industrial age, when the relative pace of technology adoption was slower. In the process of closing the gap, different stakeholders, each with their own interests, often try to influence the regulatory landscape around these technologies, which adds further complexity to governing the technology.
In the post-pandemic world, the growing narratives around techno-strategic autonomy add a new dimension to governance. Supply chain security, protection of critical infrastructure, and technological competitiveness have become important governance considerations. Several states, including the US, Australia, the UK, and the EU, have published lists of critical and emerging technologies that they consider vital for national security. Thus, geopolitics and great power competition add additional governance imperatives.
This paper focuses on governing emerging technologies, including but not limited to those with national security considerations. The latter are strategic and might necessitate more active government involvement in their development and adoption.
Lastly, governance objectives also evolve across a technology’s lifecycle from scientific discovery to commercialisation. For instance, for gene editing, Chandavarkar, Kanisetti, et al.4 recommend that the safety guardrails become increasingly rigorous as the technology or application gets closer to being released to the public. This helps enable scientific progress while also minimising the risks of adopting an emerging technology.
Thus, governing emerging technologies requires balancing multiple and sometimes competing objectives, such as promoting economic growth, mitigating risks, enabling innovation, promoting strategic autonomy, setting standards, managing cultural impact, protecting local employment and addressing environmental concerns. This paper presents an approach to address this challenge, which is derived heavily from Eugene Bardach’s guide for policy analysis5 and the expertise of researchers at the Takshashila Institution in emerging technology governance.
The Framework
This framework provides a structured approach to policy development. It begins with understanding the technology and stakeholder concerns, consequently identifying possible policy alternatives. It then proceeds to square the trade-offs to select the optimal policy and concludes with effectively communicating the chosen policy. These are captured as eight steps, each of which is detailed below.
While Bardach’s framework is an excellent tool for policy analysis, there are specific challenges associated with governing emerging technologies. The proposed framework attempts to address these challenges, such as understanding the different facets of the technology, identifying stakeholder concerns, and confronting moral dilemmas that arise during its governance.
A representation of the steps in the proposed framework for governing emerging technologies.
The different steps involved in Bardach’s eightfold path are listed below:
- Define the Problem
- Assemble Some Evidence
- Construct the Alternatives
- Select the Criteria
- Project the Outcomes
- Confront the Trade-Offs
- Decide
- Tell Your Story
Step 1: Understand the Technology and its Ecosystem
Understanding the technology and its potential impact is a prerequisite to effective policy-making. This understanding can be approached through several key dimensions:
Costs, Inputs and Supply Chains
Understanding the costs, inputs and supply chains involved in developing or deploying the technology provides valuable insights that inform policy-making. For instance, the semiconductor industry is a highly specialised and interconnected global supply chain. Different companies worldwide specialise in distinct stages such as research, design, manufacturing, assembly, testing, and distribution. An appreciation of this complexity is essential for effective policy-making.
Technology Maturity
Technological progress over time typically follows an S curve, moving through the stages of scientific discovery, invention, commercialisation, adoption and commoditisation. Governance priorities evolve across this lifecycle. During the early stages of discovery and invention, opportunities and risks are often uncertain, and strict regulation can stifle innovation. However, as the technology is commercialised and adoption grows, the opportunities and risks become better understood and the scale of impact is greater, which might necessitate more regulatory guardrails. NASA’s technology readiness levels (TRLs)6 are a widely used classification developed to provide a common language for discussing technology development stages and to help guide decision-making.
NASA’s Technology Readiness Levels7
- TRL 9: Actual system “flight proven” through successful mission operations
- TRL 8: Actual system completed and “flight qualified” through test and demonstration (ground or space)
- TRL 7: System prototype demonstration in a space environment
- TRL 6: System/subsystem model or prototype demonstration in a relevant environment (ground or space)
- TRL 5: Component and/or breadboard validation in a relevant environment
- TRL 4: Component and/or breadboard validation in a laboratory environment
- TRL 3: Analytical and experimental critical function and/or characteristic proof-of-concept
- TRL 2: Technology concept and/or application formulated
- TRL 1: Basic principles observed and reported
Global Competitiveness
An important dimension in assessing a technology is a state’s competitive advantage, especially for critical technologies. Porter’s Diamond Framework8 helps us understand why some states innovate better than others in a specific industry. The framework proposes analysing the following four interrelated components that can enable a globally competitive environment:
Factor conditions include both basic and advanced inputs for technological development. Porter emphasises the importance of advanced inputs such as a skilled workforce, research and education facilities, digital infrastructure and sophisticated capital markets. These advanced inputs are typically cultivated factors that add more value to a state’s long-term competitiveness than inherited basic inputs such as natural resources or unskilled labour.
Demand conditions: a sophisticated and demanding domestic market can drive companies to innovate faster than their global competitors.
Related and supporting industries: the presence of competitive supporting industries and services helps create innovation clusters that strengthen the ecosystem.
Firm strategy, structure and rivalry: the nature of domestic competition, governance structures and management practices determines local firms’ ability to innovate.
Understanding these strengths and shortcomings helps enable a globally competitive environment.
Strategic Autonomy and National Security
As the narratives around techno-strategic autonomy and supply chain security are gathering steam, Drezner argues9 that post 9/11, the scope of “national security” has continuously expanded to include everything from climate change to artificial intelligence to critical minerals. Political and market incentives exist to label something as having national security implications, and bureaucratic incentives work against downgrading it. When everything is considered to be strategic, nothing can be prioritised.
Ding and Dafoe10 propose a useful framework for categorising a technology as strategic based on three criteria: importance (economic/military utility), externality (effects that firms/military organisations won’t optimise for by themselves), and nationalisation (how rivalrous these effects are between nations). The authors further identify three main types of strategic externalities: cumulative-strategic (high barriers to entry, first-mover advantages), infrastructure-strategic (positive spillovers across economy/military), and dependency-strategic (vulnerable supply with few substitutes).
Further, in his book “Technology and the Rise of Great Powers”, Ding discusses the diffusion of general-purpose technologies and its influence on the rise of great powers. Unlike the popular theory that technology confers advantages on those who develop it first, the diffusion theory suggests that the rapid and widespread adoption of general-purpose technologies is a better determinant of technological power. Such frameworks bring analytical rigour to debates over classifying something as strategic, rather than relying on instinct.
Comparison with Existing Technologies
Another important dimension of understanding technology is to benchmark it against existing solutions or those that it replaces. This could include considerations such as cost-effectiveness, long-term potential, or relative impact on the environment. Equally important is to anticipate what the technology might displace or render obsolete. As technology becomes part of the natural order of things, we risk losing older practices, knowledge, and systems that have stood the test of time.
Step 2: Identifying Stakeholders and Their Concerns
Policy-making operates within a complex adaptive system that involves dynamic interaction between various stakeholders. Such systems exhibit path dependence, where different initial conditions can lead to different outcomes. They can be unpredictable even if all the underlying parts are well understood. There are also self-correcting forces at play in order to keep overall system behaviour within a certain range.
Predicting how such a complex system will respond to a certain technology is difficult. However, it is useful to model such a system as it helps anticipate the response of various stakeholders, their interactions and the overall behaviour of the system. The model thus serves as an approximation of a complex reality.
In developing this model, it is essential to identify the various stakeholders who may either support or oppose the technology’s adoption based on their perceived impact. These stakeholders fall into four groups:
Individuals: When identifying stakeholders from this category, consideration must be given to factors such as income, education, social status, religion, location and caste, as these can influence their response to the technology.
Markets: This category refers to participants in the supply side of the exchange of goods and services. Markets are not monolithic. They consist of multiple participants with diverse interests.
Society: Stakeholders in this category are not a single large entity but should be viewed through various socio-economic lenses, similar to individuals, as that determines how different groups may react to the technology.
Government: This includes specific individuals or government bodies at union, state or local levels. Each of the stakeholders may have different interests regarding the technology. Different parts of the technology value chain may be governed by different agencies with overlapping or sometimes conflicting priorities that need to be harmonised.
The ISO 26000:2010 standard11 on social responsibility includes questions aimed at identifying organisations’ stakeholders. These have been reframed in the context of the particular technology to assist in identifying stakeholders.
Clause 5.3.2 of the ISO 26000:2010 standard on social responsibility
- To whom do creators or deployers of the technology owe legal obligations?
- Who might be positively or negatively affected by the technology?
- Who is likely to express concerns about the technology?
- Who has been involved in the past when similar concerns needed to be addressed?
- Who can help address the specific impacts of the technology?
- Who can affect the technology’s ability to meet its expectations?
- Who would be disadvantaged if excluded from accessing the technology?
- Who in the value chain is affected?
Once the stakeholders impacted by the technology have been identified, the next step involves identifying the main concerns of each category of stakeholders. This requires anticipating and prioritising the responses of various stakeholders. Appendix 1 provides a comprehensive list of likely responses for each stakeholder category to assist with this task.
Upon completing this step, the desired outcome is to understand the perceptions or responses of the different groups of stakeholders. What we then have is a rudimentary model of the complex system, identifying its various actors and their interests.
Step 3: Confront Moral Dilemmas
The uncertainty associated with the impact of emerging technologies often leads to moral dilemmas. The lack of conclusive data on the future impact of adopting the technology during the policy-making process makes it difficult to resolve these dilemmas completely. Instead, governance should attempt to confront these dilemmas by directly addressing the more pressing matters and planning mitigation measures for others.
These dilemmas may range from challenges to personal beliefs and values to broader societal concerns. For instance, gene editing raises fundamental concerns about whether humans are transgressing against nature. Such concerns are rooted in professional codes, religious beliefs, or traditional values. On the other hand, broader societal concerns arise from the interactions between technology and society, such as fears of exacerbating inequalities or causing social disruptions. Some societal issues, such as ensuring equitable access to a new therapy, can be addressed through policy measures such as subsidies. However, more abstract questions about faith or morality cannot be resolved through policy alone. For such questions, public engagement is necessary to gauge public opinion, though it might not necessarily dictate policy decisions. The framework below presents an approach to understanding and addressing moral dilemmas in policy-making.
A representation of the stages in the framework for confronting moral dilemmas.
Identify moral dilemmas: The first step is to clearly outline the moral dilemmas associated with the technology.
Separate facts from assumptions: Given the uncertainty associated with emerging technologies and the prevalence of misinformation, some of the perceived risks may be based on unfounded assumptions. Separating facts from assumptions ensures that ethical concerns are grounded in evidence.
Recognise biases: Policymakers and analysts must be cognisant of biases that influence their analysis. While personal ethics are important, they should not dictate governance priorities or policies.
Assess risks: Risk can be evaluated on two key dimensions — the likelihood of the event occurring (probability) and the potential negative consequences if it does occur (impact). This evaluation should be informed by analysing how similar technologies have been deployed in other regions, considering whether ethical concerns raised elsewhere have materialised, the severity of the impact and the size of the population affected.
Determine the potential for mitigation through policy: As discussed above, while resolving an individual’s concerns through policy can be challenging, managing societal impacts can often be achieved through targeted policy interventions.
Prioritise concerns: This can be informed by the risk assessment conducted in the earlier step.
Identify root causes: For each prioritised concern, identify whether it stems from government, market or societal failures.
Gene editing is an example that illustrates these moral dilemmas. While some may view it as transgressing nature, the technology has several beneficial and potentially harmful applications. Confronting these dilemmas through a structured approach can enhance the understanding of the problem and help refine the policy priorities for the subsequent steps.
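The assess-and-prioritise stages above can be sketched as a simple scoring exercise. The sketch below is a minimal illustration, not part of the paper's framework: the concerns, probabilities and impacts are hypothetical placeholders loosely based on the gene-editing example.

```python
# Minimal sketch of the "assess risks" and "prioritise concerns" stages:
# each concern is scored on probability and impact (here on a 1-5 scale),
# and concerns are ranked by the combined score. All concerns and scores
# below are hypothetical placeholders, not findings from the paper.

def risk_score(probability: int, impact: int) -> int:
    """Combined risk: likelihood of the event times its negative consequences."""
    return probability * impact

concerns = [
    {"name": "off-target edits cause harm",   "probability": 2, "impact": 5},
    {"name": "inequitable access to therapy", "probability": 4, "impact": 3},
    {"name": "misuse for human enhancement",  "probability": 1, "impact": 4},
]

# Rank concerns from highest to lowest combined risk.
ranked = sorted(
    concerns,
    key=lambda c: risk_score(c["probability"], c["impact"]),
    reverse=True,
)
```

A real assessment would replace the point scores with evidence from deployments of similar technologies elsewhere, as the paper suggests; the resulting ranking then feeds the prioritisation and root-cause stages.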
Step 4: Narrowing Down the Governance Objectives
Once the stakeholder analysis and ethical assessment are completed, they provide a good understanding of the broad objectives of governing the technology. While not all stakeholder responses will translate into the need for government action, some of them are indicative of market failures and other governance objectives. Market failures arise when a free market allocates goods and services inefficiently, leading to sub-optimal outcomes for society. In addition, governments might have other goals they wish to achieve through technology governance. Some common governance objectives are listed below, spanning specific market failures as well as broader state priorities.
Disincentivise Negative Externalities
Some economic activities can negatively impact third parties not involved in the transaction. Pollution is an example of such an externality that negatively affects someone who might not be a producer or consumer in the economic activity. Policy interventions should aim to prevent these adverse outcomes.
Incentivise Positive Externalities and Public Goods
Some economic activities can also positively impact third parties, such as vaccination, which benefits the broader community, including those who are not vaccinated. Public goods, which are non-rival and non-excludable, are often underprovided by the market. The policy instruments in this area aim to incentivise the production of goods with positive externalities and the creation of public goods.
Address Market Power Concerns
Market domination by monopolies or oligopolies can lead to sub-optimal outcomes for society due to practices such as price manipulation or reduced outputs compared to a competitive market. The policy instruments here aim to create checks and balances to prevent the concentration and abuse of market power.
Address Information Asymmetry
Where one party in a transaction has more information than the other, problems such as moral hazard or adverse selection can arise, disadvantaging the less informed party. The policy instruments here are aimed at increasing disclosure and promoting accountability.
Economic Growth
Barriers such as inadequate infrastructure, regulatory uncertainty, or a lack of a skilled workforce can hinder the adoption of new technologies and the realisation of the productivity gains from them. Governments may choose to create an enabling environment by facilitating the adoption of technology.
Strategic Autonomy
These governance objectives stem from the geopolitics and great power competition increasingly associated with technology. They fall into three broad dimensions of strategic autonomy: ensuring the trustworthiness of critical or national security infrastructure, promoting supply chain security, and enhancing technology competitiveness.
The governance of a technology might have one or more of these objectives. It is now necessary to frame each objective within the specific context of this particular technology’s governance. The framing should not contain a pre-defined solution. A good test is to check if there can be at least three mutually exclusive solutions to the objective. It is also useful to frame it in degrees rather than absolute terms, and to narrow it down by factors such as geography, time scale, etc.
Step 5: Identify Suitable Policy Alternatives and Select Evaluation Criteria
The next step involves identifying suitable policy alternatives for each governance objective identified. Appendix 2 lists several policy instruments and their associated unintended or underrated repercussions, which need to be considered while making the choice. While different policy instruments could help attain a certain objective, the following broad principles can guide the choice among them.
Policy-making Lifecycle
While not strictly linear or clearly demarcated, policy-making progresses through distinct phases. Technologies typically exist outside the policy radar until emerging trends, research, or public opinion bring them into focus. The process then moves to the phase of debate and policy formulation, characterised by evidence gathering, stakeholder consultations, and iterative drafting of policy solutions. Implementation is the next stage, where the focus turns to transforming written policy into action. Finally, the policy enters an assessment phase, where its effectiveness is monitored and evaluated. Rather than always reinventing the wheel, policymakers and analysts can learn from experiences in other jurisdictions and from past experiences governing similar technologies.
Introducing Competition and Market-like Mechanisms
Emerging technologies face a lot of uncertainty stemming from the technology itself, the regulatory landscape or access to talent, materials, infrastructure and capital. Markets typically have better incentives to respond to such conditions. Meanwhile, the state can focus its time and resources on addressing market failures and streamlining regulations so that the private sector can take the lead on innovation.
In some sectors, the state often plays multiple roles, such as policymaker, regulator, and service provider, which might be in conflict with each other. Osborne and Plastrik’s recommendations for uncoupling steering (deciding the priorities) and rowing (executing the tasks)12 are equally important for emerging technologies. A notable example of such reforms is the liberalisation of the telecommunications sector in India. Post the reforms, the Telecom Regulatory Authority of India (TRAI) manages regulation, the Telecom Disputes Settlement and Appellate Tribunal (TDSAT) oversees dispute resolution, Bharat Sanchar Nigam Limited (BSNL) handles service delivery and the Department of Telecommunications (DoT) focuses on policy-making. Prior to the reforms, all these functions were managed by the DoT.
Soft Law
Hagemann et al.13 argue for an amorphous and constantly evolving set of informal “soft law” mechanisms for governing emerging technologies. As the understanding of opportunities and risks evolves, multiple stakeholders aim to shape the regulatory landscape in their interests. As discussed in the previous sections, there might also be varied governance objectives for the technology that may conflict with each other. Emerging technologies also further exacerbate the challenge of regulation keeping pace with rapid technological advancements. Considering these challenges, Hagemann et al. argue for more collaborative, transparent, and adaptable systems of technological governance that accomplish their goals without stifling innovation. Multistakeholder efforts such as consultations and sandboxing are the core of a soft law approach to governance.
Once we have the policy alternatives for each governance objective, the next step is to establish criteria for evaluating the outcomes of their implementation.
The criteria should assess different aspects of the problem definition, such as the extent to which the problem has been addressed, the resources required, the timeline, and whether the solution is equitable. It is essential to focus on evaluating the outcomes rather than the policies themselves. Therefore, the criteria should be based on the problem statement as opposed to being tailored to favour any particular policy.
The criteria should be mutually exclusive as far as possible to avoid double counting certain aspects of the outcomes. Some standard criteria for evaluation are effectiveness, efficiency, timeliness, administrative capacity, and equity. These criteria work well in measuring different aspects of the policy’s outcomes. Depending on the governance objective, additional specific criteria may be necessary. For example, fairness or transparency might be an important criterion when trying to address risks posed by AI.
Step 6: Project Policy Outcomes
This step can be particularly challenging as it involves predicting how a policy will perform within a complex adaptive system. The goal is to anticipate the outcomes of the policies across a long enough time horizon. This involves estimating how the different policies will measure up against the identified criteria, understanding stakeholder responses, identifying the potential winners and losers, and considering possible unintended consequences.
A useful framework for projecting the outcomes involves examining inputs, activities, outputs and outcomes as dimensions for analysis in the policy implementation roadmap. This approach helps project the outcomes effectively. Key questions to consider include: What inputs are required, such as human resources, materials, capital, and infrastructure? Is there sufficient state capacity for implementation? What are the specific activities that will be carried out, and by whom? What are the timelines for implementation? Who are the policy actors at various levels, and what are their incentives?
This process also helps identify potential winners and losers from a policy, allowing for the development of strategies to compensate those adversely affected. It also helps anticipate unintended consequences such as moral hazards, rent-seeking, and over-regulation and devise possible mitigation strategies.
Data is extremely valuable at this stage. Analysing how similar policies have performed in the past or in other regions can offer valuable insights. However, data is often unavailable or noisy in the policy environment. In such cases, proxies can serve as substitutes.
Having a baseline scenario for comparison is helpful. This can be the current status quo or a future scenario without the policy. This helps in understanding the incremental impact of the different policies. Exploring different potential outcomes of a policy, such as best-case, worst-case and most likely scenarios, also highlights likely risks and opportunities.
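The baseline-and-scenarios idea can be illustrated with a toy calculation. The outcome metric and all numbers here are hypothetical assumptions, not estimates from the paper.

```python
# Hypothetical projection: adoption of a technology (in arbitrary units)
# under a policy, compared against a no-policy baseline scenario.

baseline = 100  # projected adoption without the policy (assumed)

# Projected adoption with the policy under three scenarios (assumed).
scenarios = {"best_case": 180, "most_likely": 140, "worst_case": 90}

# Incremental impact of the policy relative to the baseline in each scenario.
incremental = {name: value - baseline for name, value in scenarios.items()}
# A negative worst case signals that the policy could underperform the
# status quo -- a risk that mitigation planning should address.
```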
This step’s end goal is to thoroughly consider the implementation details of the policy alternatives, project their outcomes, and estimate how each alternative performs against the different criteria identified.
Step 7: Confront Trade-offs and Decide
This step involves confronting the trade-offs at two levels. At the first level, the different alternatives might outperform one another on different criteria. This necessitates examining the trade-offs between their outcomes within the constraints of the policy environment. For instance, for the objective of safeguarding consumers from the risks of AI, there can be different alternatives, such as encouraging disclosures, setting standards, imposing regulations that require certifications or audits, and imposing fines. The outcomes of these alternatives need to be compared against each other based on the evaluation criteria identified to select the best alternative.
At the second level, multiple governance objectives might also conflict, requiring trade-offs between them. For instance, in governing AI, there can be different objectives for the government that might be in tension with each other, such as protecting consumers from harm, strategic autonomy, and enabling globally competitive industries. Maximising some of these objectives would require compromising on the others; therefore, some trade-offs between the different objectives are necessary. By addressing the trade-offs at both levels, the various policy alternatives within and across governance objectives can be combined to determine the preferred policy proposal.
Step 8: Tell the Story
The key to effective policy communication is understanding the audience, which might include policymakers or other interested stakeholders. Each might be interested in different aspects of the policy proposal, which must be communicated accordingly.
Leading with the main conclusion and most important information—a communication strategy known as the “bottom line up front”—helps orient the audience and keep them engaged. This ensures that the audience can quickly grasp the core ideas without being bogged down by the technical or process details. The message must be accessible to a curious and intelligent reader, including those unfamiliar with the technology.
While developing the policy proposal involves a structured analytical process, recounting this background work in detail can detract from effective communication. It is not always necessary to showcase the stakeholder concerns, alternatives considered, or evaluation criteria; the focus should instead be on the final recommendation and the evidence supporting it.
Conclusion
Governance of emerging technologies presents policymakers with a complex challenge. They must govern a technology whose opportunities and risks are uncertain, while stakeholders with vested interests try to shape the regulatory environment in their favour. This paper presents a structured approach to addressing these challenges. It begins by developing a comprehensive understanding of the technology and anticipating stakeholder concerns and ethical considerations. Next, it identifies possible policy alternatives and projects their outcomes. The process concludes by confronting the trade-offs to select the optimal policy and effectively communicating the chosen approach.
While the structured approach helps guide the policy-making process, it requires a multidisciplinary perspective that can anticipate impacts across various domains. This involves coordinating multiple stakeholders and balancing their competing interests, which is more of an art than a science. This framework can be viewed as a starting point for developing more sophisticated approaches that keep up with the rapid pace of technological change while prioritising the public interest.
Footnotes
Rotolo, Daniele, Diana Hicks, and Ben Martin. “What Is an Emerging Technology?” SWPS 2015-06, February 11, 2015. https://doi.org/10.2139/ssrn.2743186.↩︎
Rees, Jonathan. “Industrialization and Urbanization in the United States, 1880–1929.” In Oxford Research Encyclopedia of American History. Oxford University Press, 2016. https://doi.org/10.1093/acrefore/9780199329175.013.327.↩︎
Hagemann, Ryan, Jennifer Huddleston, and Adam D. Thierer. “Soft Law for Hard Problems: The Governance of Emerging Technologies in an Uncertain Future.” Colorado Technology Law Journal, May 2, 2018. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3118539.↩︎
Chandavarkar, Madhav, Anirudh Kanisetti, et al. A Framework for Governing Gene Editing. Takshashila Discussion Document. The Takshashila Institution, October 2, 2017. https://takshashila.org.in/research/discussion-document-framework-governing-gene-editing.↩︎
Bardach, Eugene. A Practical Guide for Policy Analysis: The Eightfold Path to More Effective Problem Solving. 4th ed. SAGE, 2012.↩︎
Manning, Catherine G. “Technology Readiness Levels.” NASA, September 27, 2023. https://www.nasa.gov/directorates/somd/space-communications-navigation-program/technology-readiness-levels/.↩︎
Manning, Catherine G. “Technology Readiness Levels.” NASA, September 27, 2023. https://www.nasa.gov/directorates/somd/space-communications-navigation-program/technology-readiness-levels/.↩︎
Porter, Michael E. Competitive Advantage: Creating and Sustaining Superior Performance. Simon and Schuster, 2008. https://www.simonandschuster.co.in/books/Competitive-Advantage/Michael-E-Porter/9780743260879.↩︎
Drezner, Daniel W. “How Everything Became National Security.” Foreign Affairs, August 12, 2024. https://www.foreignaffairs.com/united-states/how-everything-became-national-security-drezner.↩︎
Ding, Jeffrey, and Allan Dafoe. “The Logic of Strategic Assets: From Oil to AI.” Security Studies 30, no. 2 (June 3, 2021): 1–31. https://doi.org/10.1080/09636412.2021.1915583.↩︎
International Organization for Standardization. “ISO 26000:2010 Guidance on Social Responsibility.” ISO, 2010. https://www.iso.org/standard/42546.html.↩︎
Osborne, David, and Peter Plastrik. Banishing Bureaucracy: The Five Strategies for Reinventing Government. Plume Books, 1998.↩︎
Hagemann, Ryan, Jennifer Huddleston, and Adam D. Thierer. “Soft Law for Hard Problems: The Governance of Emerging Technologies in an Uncertain Future.” Colorado Technology Law Journal, May 2, 2018. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3118539.↩︎
Social and Environmental Impact
The adoption of new technologies may have significant effects on society and the environment, creating winners and losers. Some groups might benefit from increased productivity and opportunities, whereas others might face reduced opportunities or job displacement. Adoption might also exacerbate inequalities along existing fault lines, such as socio-economic, gender, or digital divides. Beyond economic impacts, technologies influence social, behavioural and cultural norms in ways that are not always obvious, as social media’s impact on society and politics clearly demonstrates.
The environmental impact is equally significant: technologies require raw materials, energy, water and other resources, and might generate pollution across their lifecycle. Their ecological footprint can disrupt ecosystems and threaten biodiversity. Understanding both the negative and positive impacts of a technology is essential for its responsible adoption.