How the United States became a world power in science and technology

This paper analyzes the factors that made the United States a science and technology power from three perspectives: historical development, government policy, and institutional innovation. It argues that the United States became a science and technology power on a foundation laid by its historical development; that strong government policies and support after World War II propelled it to world leadership in science and technology; and that, in response to changes and challenges, it has maintained that leadership by driving the entire innovation system through bounded institutional change. The U.S. experience offers China both direct lessons and broader inspiration for building itself into a strong science and technology nation.

  The United States is the world's leading science and technology power today. Since World War II, it has ranked first in the world in the number of Nobel Prizes and other major awards won, in scientific papers and citations, in overseas students coming to study, and in high-technology companies founded out of universities. The new cutting-edge technologies that emerged during World War II and the ensuing Cold War became the high technology that fueled the growth of the U.S. economy and the development of the world economy and society: electronic computers, commercial transport aircraft, semiconductors, solid-state electronics, integrated circuits, nuclear energy, lasers, satellite communications, microwave communications, radar applications (such as navigation control), antibiotics, pesticides, new materials (such as high-strength ferrous alloys, titanium, high-temperature ceramics, fiber-reinforced plastics, and composites), new methods of manufacturing and machining metals (such as CNC machine tools), and today's ubiquitous Internet. Yet at the outbreak of World War II, the United States still lagged behind Germany, Britain, and France in the number of Nobel Prize winners. A decade and more earlier, until Hitler came to power in the early 1930s, the brightest and most ambitious young Americans would travel all the way to Germany to pursue doctoral degrees at universities in cities such as Heidelberg and Leipzig. What made America's scientific and technological prowess leap forward, and what has kept it strong for so long?

  The experience of American science and technology innovation deserves attention. According to the American scholar D. Hart, "The importance of the American experience derives from the United States' leadership in the global economy, both in highly innovative industrial sectors and in scientific research. Unless scholars understand the U.S. innovation process, they will have difficulty understanding the world's innovation process as a whole." He noted that the U.S. case also has cognitive and analytical importance, stemming from the vastness and institutional complexity of the U.S. innovation system; the emergence of large, complex innovation systems in Europe, China, and India makes a better understanding of the U.S. innovation process all the more necessary. [1] The purpose of this paper is to explore how the United States became a science and technology power, analyzing the contributing factors from three perspectives: historical roots, policy, and institutional innovation, and suggesting what can be learned and borrowed.

I. The Roots of History: The Formation of the U.S. Science and Technology Innovation System

  1 From Learning to Self-Reliance

  Science in the North American colonies originated in Europe, and American science grew through scientific exchange with Europe. Between American independence and the end of the Civil War (1776-1865), American science, though still small in scale, freed itself from its dependence on Europe, and especially on Britain.

  Although there was much research on the natural world in the early North American colonies, the researchers were mostly amateurs, and much of the work was descriptive natural history. In the 18th century, the American colonies produced one truly significant scientific contribution, Benjamin Franklin's work on electric charge, which earned him genuine standing as a scientist in the world. Even so, the American scientific community was very small, with few scientists of the first rank. Joseph Henry, the most talented American scientist of the 19th century and the one who contributed most to the development of electromagnetism, was still overshadowed by his British contemporaries Faraday and Maxwell. Until the end of the 19th century, the development of science and technology in the United States was dominated by practical concerns. On the one hand, after independence, the nation's emphasis on resource exploration, geographic expeditions, and agricultural improvement promoted applied sciences related to natural resources and geography; on the other hand, industrial development brought a large number of technological inventions and innovations. As Tocqueville observed when he visited the United States in the 1830s, Americans were interested only in the purely applied side of science, and he doubted that the purely theoretical research of European societies could take root in so young a country.

In fact, however, applied science was already showing its importance by the 1840s, as with the invention of the telegraph. In the mid-19th century, Harvard and Yale began to value science and appreciate its applications. In 1862, Congress passed the Morrill Act, establishing the land-grant colleges, which favored applied disciplines such as agriculture and mechanics and became part of the state universities. In 1876, Johns Hopkins University was founded, putting graduate education and academic research first and ushering in the era of the American research university. With the development of research universities, science grew within American universities. By the late 19th and early 20th centuries, the spirit of American science emphasized indigenization and independence from European science. In 1907, the physicist Michelson became America's first Nobel laureate in science, marking the point at which American science came into its own. By the 1930s, American science and technology had become dominant in certain fields; physics, for example, had begun to win worldwide renown, and a number of world-class scientists such as Millikan and Compton had emerged.

  2 The American University System and Scientific Research

  Except for Harvard (1636) and Yale (1701), which were modeled on the old English tradition, most early American universities were established to meet local practical needs. Public universities (state universities) began to develop in the early 19th century, and universities began to emphasize science by mid-century. Even so, American universities of the mid-19th century were relatively backward, and young Americans went to Europe, especially Germany, to obtain their doctoral degrees. When they returned, they brought back and developed the German model of combining research with education, contributing to the development of American universities, including the creation of new ones (Johns Hopkins, the University of Chicago, etc.) and new developments at older ones. By the 1870s, encouraging faculty to engage in scholarly research and training students through research had become the practice of many universities; the educational value of research was fully recognized, and the American research university began to emerge. By 1920, the modern form of the American research university had taken shape. It dominated American higher education in the first four decades of the 20th century and continued to flourish in the century's later years, becoming a dominant force not only in the development of science and education but also in the economic and social development of the United States.

  In his famous book The Scientist's Role in Society, the sociologist of science Joseph Ben-David points out that the American university system innovates through the system's own dynamics. American universities have two distinctive characteristics. First, they form a highly decentralized and competitive system. Unlike many European countries, where a central decision-making body (a ministry of education) determines university policy and imposes uniform regulation on individual universities, authority over higher education in the United States rests with the states rather than the federal government. Each state can manage the development of its universities according to its own circumstances, and the autonomy of university operation is emphasized. Drawing resources from private donations, philanthropic foundations, state governments, and student tuition, universities are managed with considerable autonomy. Although there are many state universities in the United States, they are far from dominant in the overall university system; the most prestigious and wealthiest universities are private. The result of decentralization is that universities compete with one another: not only private universities among themselves and state universities among themselves, but state universities against private universities as well. [3] In such an autonomous, decentralized, and competitive system, scientists are free to choose the questions they study according to their own scientific value judgments, and peers reward what they consider high-level research. This promotes university scholarship and the advancement of science. In this way, a geographically distributed, highly autonomous, yet competitive community of scientists matured and advanced the basic sciences in the United States. According to Ben-David's study of achievements in medical science in England, France, Germany, and the United States from 1800 to 1926, the key element that enabled Germany, and later the United States, to take the lead in medical science was precisely the decentralized and competitive character of the university system. [4]

  The second characteristic is pragmatism. American universities developed in active response to the needs of local economic and industrial development, and the development of universities went hand in hand with the development of industry. Not only were some private universities established in connection with industry, but state government support for state universities was also closely tied to local development. In the first half of the 20th century, new engineering disciplines were institutionalized in universities around emerging industries, linking universities to their development. Because of these characteristics, U.S. universities stand out for responding faster and more broadly to changes in their economic and social environment.

  3 The Important Role of Industrial Research Laboratories

  An industrial research laboratory is a research and development (R&D) institution established within a company, in accordance with its business strategy, to carry out R&D related to the company's development. Industrial research laboratories began in the late 19th century in the German chemical industry. Subsequently, Kodak (1893), General Electric (1900), DuPont (1901), and the Bell Telephone System (1907) in the United States established their own industrial laboratories one after another. The industrial research laboratory marked the beginning of a new period in which industrial invention moved away from complete dependence on individual inventors, making innovation a self-sustaining system. By the middle of the 20th century, a large number of research laboratories had been established in U.S. industries such as chemicals, rubber and petroleum, and electricity, including the famous laboratories of DuPont, AT&T, and General Electric. By 1930-1940, industrial research laboratories had become the principal locus of innovation in the United States; over this period, the sectoral shares of R&D funding were 12-19% for government, 63-70% for industry, and 9-13% for universities.

  Professor N. Rosenberg, one of the pioneers of research on technological innovation, observed that the industrial research laboratory was one of the most important institutional innovations of the 20th century, if not the most important. Although not an American invention, this institution diffused more widely and had a stronger impact in the U.S. economy than in any other country. [6]

  Industrial research made the development of science and technology endogenous to the development of the economy, organically integrating the generation of new knowledge with its application. At the same time, it allowed enterprises to establish communication and cooperation with universities and research institutes on an equal footing, which not only enabled enterprises to tap external scientific and technological resources more widely and effectively, strengthening their survival and development, but also made the national science and technology system sound and effective as a whole. As the eminent American business historian Chandler pointed out, industrial research shapes the healthy development of the entire national economic system. The decline of Britain in the late 19th and early 20th centuries vividly illustrates the point: one major cause of the "British disease" was the failure of British manufacturing to establish the organizational arrangements and linkages needed to exploit the commercial potential of scientific research, which prevented effective use of the country's other scientific resources (e.g., universities) and cost Britain its competitiveness in international markets. [7] The United States, by contrast, grew strong in part through the role of its industrial laboratories.

  In short, before World War II the United States had formed a science and technology innovation system centered on universities and industrial research laboratories. Grounded in market competition, this system responded actively to the needs of economic and social development, was highly flexible, featured natural connections and full mobility among its parts, and emphasized the spirit of bottom-up innovation, laying the foundation for the greater development of American science and technology.

II. Post-War Science and Technology Policies Promoted the Transformation and Development of the Modern U.S. Science and Technology Innovation System

  1 The Transformation of the Government's Role in Supporting Science and Technology

  The Second World War had a profound impact on the development of science and technology in the United States. Before the war, the federal government took essentially no responsibility for supporting the development of science. During the war, it formed a new partnership with science. After the war, the federal government became the principal patron of science and technology, and over the next decade or so it supported the establishment of a modern national science and technology system that made the United States the world leader in science and technology.

  During the war, inventions such as the atomic bomb, radar, and penicillin helped the United States win, convincingly demonstrating to the world the enormous power of science and technology. These outstanding achievements owed chiefly to the government's extensive mobilization of the nation's civilian scientific and technological forces, including universities and businesses, in a national innovation system that linked laboratory research, mass production, battlefield tactics, and command strategy. The experience gained in organizing and managing scientific research during the war provided the basis for designing postwar science and technology policy.

  At the request of President Roosevelt, Vannevar Bush completed the report Science, the Endless Frontier, which set out the promise of science: as the "endless frontier," science would replace the physical frontier of the American West as a new engine of economic development, higher living standards, and social progress for the nation. The report rests on several basic ideas: (1) scientific progress is indispensable to the health of the people, national security, and the public welfare; (2) basic research is the source of all knowledge, and its development will inevitably bring broad benefits to society; (3) the scientific community needs to maintain relative autonomy and freedom of inquiry in order to avoid pressure from political and other interest groups and to ensure the progress of scientific knowledge. Accordingly, the report proposed that the federal government assume responsibility for sustaining the progress of scientific knowledge and fostering the rising generation of scientific talent. It recommended the creation of a National Research Foundation (the original name of what became the National Science Foundation), a funding agency that would comprehensively cover all areas of the natural sciences and include a division supporting long-term military research. Bush made universities the center of postwar science policy. His report laid the foundation of U.S. science and technology policy from the postwar period to the present day.

After the war, there was heated debate among parties holding different views on science and technology policy. Ultimately, Bush's case for government support of science prevailed, and science gained an important place in government; however, his specific vision of a single unified national body to support the development of science, the National Research Foundation, did not succeed. During the five-year debate over the establishment of the National Science Foundation, from 1945 to 1950, the Office of Naval Research, the Atomic Energy Commission (the predecessor of the Department of Energy), and the National Institutes of Health (NIH) began, one after another, to support scientific research. By the time the National Science Foundation (NSF) was established in 1950, it was only one of a number of federal departments and agencies supporting scientific research, and one of the smaller ones. The United States had in fact developed a diverse funding system. Between 1945 and 1957, various government departments and agencies supported research at universities and businesses, and a number of national laboratories and research institutes were established by the Atomic Energy Commission, the Department of Defense (DOD), and the NIH.

  In 1957, the Soviet Union launched Sputnik, the first artificial Earth satellite, ushering in the space age and profoundly shocking the United States. The alarmed U.S. government reacted quickly, mobilizing enormous national resources to meet the Soviet threat. In just one year, from late 1957 to 1958, the United States established the National Aeronautics and Space Administration (NASA) to develop and implement a national space program, and the Department of Defense established the Advanced Research Projects Agency (ARPA) to ensure that advanced defense R&D was carried out; the National Defense Education Act, passed by Congress in 1958, greatly enhanced government support for science education at all levels. On January 31, 1958, the United States also successfully launched its own artificial Earth satellite. From 1957 to 1968, the United States enjoyed a golden period of scientific and technological development.

  2 Massive National Investment in Science and Technology

  After the war, two distinctive features of U.S. research and development (R&D) spending were the sheer scale of national R&D investment and the size of the federal R&D budget. In the early postwar years, total U.S. R&D spending stood at slightly more than 1 percent of GNP; the share rose rapidly in the second half of the 1950s and peaked at about 3 percent in the mid-1960s. In 1969, U.S. R&D investment was $25.6 billion, far exceeding the combined R&D spending of the largest foreign economies (the Federal Republic of Germany, France, the United Kingdom, and Japan), which together totaled $11.3 billion. Federal funding accounted for one-half to two-thirds of overall national R&D investment, reaching the two-thirds share in the mid-1960s. [9] Beginning in the 1980s, federal government investment began to lag behind that of industry.

  Federal funding for universities increased significantly: in the mid-1930s, federal money accounted for roughly one-quarter of total funding for university research; by 1960 the share was over 60%. From 1935 to 1960, funding for university research as a whole increased tenfold, and it tripled again by 1965. [10]

  In the context of the Cold War, most U.S. national R&D investment went into defense- and space-related fields, yielding many advanced technologies. In 1961, President Kennedy proposed the Apollo moon landing program, and in 1969 U.S. astronauts successfully landed on the moon. The moon landing program inspired a generation of the best young Americans to devote themselves to science and engineering and trained them into outstanding scientists and engineers.

  3 Government Funding of Scientific Research: Mission-Oriented Basic Research

  In the postwar period, federal funding operated under a diversified funding system, decentralized across various federal departments and agencies rather than centralized in a single federal investment. Of the more than ten U.S. government departments and agencies involved in funding R&D, the Department of Defense (DOD), the Department of Health and Human Services (HHS, primarily NIH), NASA, the Department of Energy (DOE), the National Science Foundation (NSF), and the Department of Agriculture (USDA) account for more than 90% of total federal R&D funding. According to FY 2013 data, the shares of federal R&D funding for these six major agencies were 51% for DOD, 23% for HHS, 8% for NASA, 8% for DOE, 4% for NSF, and 2% for USDA. [11]

  After the war, the focus of U.S. science and technology policy was on basic research and defense technology. A relatively widespread misconception in China holds that basic research in U.S. universities is entirely free inquiry without applied goals. In fact, mission-oriented funding of basic research is not unrelated to application. The focus areas of postwar U.S. funding were computing, electronics, materials science, and military-related applied science and engineering, as well as medicine and the life sciences, funded on the principle that basic research should ultimately yield benefits, converge on goals, and concentrate on areas of interest to the funder. Funding decisions weighed both intrinsic scientific criteria and potential contributions: basic research proceeds at some distance from direct application, but not without application in view. [12] As a result, the government's diversified investment in universities built substantial strength in areas such as cutting-edge electronics, space technology, and medicine, generating significant scientific, technological, and economic benefits.

  4 The Formation of the Modern Science and Technology System

  The strong postwar government support for science and technology created a modern, efficient, and dynamic U.S. science and technology innovation system based on the universities and industrial laboratories that existed before the war.

  The federal government provided strong support for universities. Since World War II, the federal government has invested heavily in universities for a variety of reasons, including strategic and military considerations and, more recently, health-related concerns. This substantial investment strengthened the ranks of scientific researchers and provided the instruments and tools needed for high-quality research. By supporting both university education and university research, the federal government reinforced the universities' commitment to research and enhanced the link between research and teaching, making U.S. universities the world center of basic research and graduate education. The postwar era created a consensus, and an atmosphere, in which basic research was what universities should be doing and engaging in it was respectable. It is worth noting that the federal government, while giving universities substantial funding, respected scientists' freedom to explore and did not interfere, instead encouraging scientists to do the research they thought worth doing. The invention of the laser demonstrates the importance of that freedom.

  After the war, large corporations continued to be an important part of American scientific and technological innovation; companies such as General Electric, DuPont, AT&T, and Kodak contributed greatly to the nation's defense and related industries. Many major inventions came out of industrial research, such as the transistor at Bell Labs in 1947 and the first laser at Hughes Research Laboratories in 1960. For a considerable period, most of the federal government's R&D funding went to business.

  Emerging high-technology small businesses played an important role in commercializing new technologies (in semiconductors, electronics, biotechnology, and medicine) and contributed to the growth of the U.S. economy, a distinguishing feature of the postwar U.S. innovation system that set it apart both from the prewar period and from other developed countries. Several factors were involved: (1) government funding for emerging fields, which facilitated the commercialization of basic research results; (2) defense procurement policies, which lowered market barriers to entry and facilitated the development of small businesses; (3) innovations in financial markets, including venture capital, which nurtured the growth of small businesses; and (4) a suitable innovation environment, which enhanced small businesses' capacity to innovate. The representative home of emerging high-tech small businesses is Silicon Valley, where a large number of startups clustered around Stanford University created an innovative environment of fierce competition and high mobility of people, producing high-tech companies that have influenced the development of the United States and the world (such as Hewlett-Packard and Apple).

After the war, the federal government vigorously built national laboratories and research centers within the DOD, DOE, and NASA systems; together with the NIH's intramural research institutes, these formed a set of national research institutions dedicated to national security and related fields.

  By the mid-1960s, against the backdrop of the Cold War and with the primary goal of safeguarding national interests and national security, America's vigorous and sustained support for science and technology had created a competitive and efficient science and technology innovation system, and the United States had reached the world's peak in science and technology. It played a leading role in most areas of world science and technology, reflected not only in its rise to first place in the number of science Nobel Prizes but also in the massive influx of European students to the United States, in stark contrast to the prewar period. Government policies strengthened and expanded the ties between universities and industry that had existed in the prewar science and technology system, and created new national research institutions. By supporting science and technology geared toward long-term national development, government, universities, and industry formed what Americans themselves describe as a good partnership. As major technological developments move from research to market, there is a complex interplay among industry, universities, and government, and a rich flow of ideas and people among university research, industrial research, and product development.

III. Bounded Institutional Change Promotes the Overall Functioning of the National Innovation System

  The basic framework of the U.S. science and technology system and research system was already in place by the late 1960s: (1) at the highest level of government, coordinating and advisory bodies for science and technology policy: the White House Office of Science and Technology (OST, now the Office of Science and Technology Policy, OSTP), the Federal Council for Science and Technology (FCST, now the National Science and Technology Council, NSTC), and the President's Science Advisory Committee (PSAC, now the President's Council of Advisors on Science and Technology, PCAST); (2) a diverse funding system: the six major departments and agencies closely involved in science and technology today, the Department of Defense, the Department of Health and Human Services (primarily the National Institutes of Health), NASA, the Department of Energy, the National Science Foundation, and the Department of Agriculture, were all formed during this period; (3) a research system with a clear division of labor among its components: universities primarily responsible for basic research, government research institutions primarily responsible for applied and major mission-oriented research, and corporate research institutions primarily responsible for applied research and experimental development.

Since the 1970s, as the international environment has changed, the United States has faced the impacts and challenges of external competition: the oil crisis of the 1970s, Japan's economic challenge in the 1980s, the September 11 terrorist attacks in 2001, and today the rise of Asia, represented by China. Its overall strength and leadership in science and technology have declined in relative terms. At the same time, the forms and organization of science and technology innovation have changed in many ways. Throughout this process, Americans have continued to innovate to adapt to the changing situation, learning from competitors such as Japan, and have maintained very strong momentum. For example, after being seriously challenged by Japan in the 1980s, the U.S. worked tirelessly to bring about a boom in economic and technological development in the 1990s.

  One strength of change in the U.S. science and technology innovation system, as Hart points out, is that it proceeds not by reconstruction or reorganization of the entire system but by bounded change: "the innovation or reconstruction of certain central institutions, relationships, and expectations within the innovation system, but this change does not amount to a transformation of the entire system. A new logic of action is adopted by the key players in this system." [12] For example, the 1980 Bayh-Dole Act led to important changes in university behavior: some universities became more closely tied to industry than before and some became more oriented toward commercial development, but many universities continued in their previous ways. Bounded change allows the innovation system as a whole to maintain its traditions and strengths while containing innovative parts that adapt to or lead new developments. Below we give two examples: ARPA, which represents the innovation of new elements in the U.S. funding system, and the National Nanotechnology Initiative (NNI), which represents the innovation of relationships within the innovation system.

  1 ARPA/DARPA

  ARPA was established in 1958 in response to the Soviet Sputnik satellite and initially focused on space technology. In 1960, it was repositioned toward basic research; in the 1970s, it shifted to a military mission, and in 1972 it was renamed the Defense Advanced Research Projects Agency (DARPA). The great success of ARPA/DARPA has been emulated by other agencies, such as the Department of Homeland Security with HSARPA and the Department of Energy with ARPA-E.

  ARPA created a critical organizational and management structure: a high-quality management team; extensive use of scientists who moved between industry and academia; reliance on existing research laboratories and collaborative mechanisms (rather than the creation of new research centers); and funding for projects in new and complex areas of long-term significance. ARPA was given considerable autonomy to focus "resources on centers of research excellence (such as MIT, Carnegie Mellon, and Stanford) without regard to the geographic distribution issues that NSF must consider. Such an approach helps create university-based research communities with the scale effects and stability necessary to make the needed progress in a particular field." [14] Moreover, it is free to award multi-year block grants to fund research of a high-risk nature.

  2 The National Nanotechnology Initiative (NNI)

  The NNI was proposed by the Clinton administration in 2000 and launched in fiscal year 2001. Contrary to a widespread misconception in China, the NNI is not a central program with its own national budget for supporting nanotechnology, but a coordinating initiative: it promotes cooperation among the relevant federal departments and agencies that support nanotechnology through mechanisms such as shared concepts, planning documents, communication, dialogue, and evaluation, and it coordinates national efforts so as to advance the development of U.S. nanotechnology as a whole. The NNI is coordinated by the Nanoscale Science, Engineering, and Technology (NSET) Subcommittee under the National Science and Technology Council, together with the White House Office of Science and Technology Policy, which work with the various departments and agencies to establish NNI priority areas and criteria for evaluating activities.

  In 2014, the NNI completed the first phase of its mission. PCAST's fifth evaluation of the NNI states: "Since the NNI was launched in 2001, the federal government has brought together a growing number of cross-agency nanotechnology activities. From FY 2005 to FY 2014, 638 interagency collaborations were generated, growing from 35 collaborations in FY 2005 to 159 collaborations in FY 2013. In conjunction with these individual coordination activities, in 2010 the federal government's cross-agency collaborations began to focus on the Nanotechnology Signature Initiatives (NSIs), collaborative initiatives in which at least three federal agencies invest and coordinate around areas of significant national interest. Collaboration is booming in some of these important NSI areas." [15] The PCAST report concluded that the NNI was successful in its first phase.

IV. What Can We Learn from the United States?

  1 Summary of the U.S. Experience

  The development of science and technology in the United States is rooted in its history and culture, and it is also the result of government policy guidance and support. To summarize, several factors made the United States a science and technology powerhouse:

  (1) Industrial research and university systems compatible with market mechanisms;

  (2) Appropriate government positioning and long-term, sustained support for science and technology development;

  (3) A tripartite partnership among government, universities, and business;

  (4) An emphasis on bottom-up creativity and autonomy;

  (5) Adequate support systems (e.g., venture capital);

  (6) The capacity for institutional innovation.

  Last but not least, there is talent. The United States became a scientific and technological powerhouse not only because it trained a large number of talented scientists and engineers at home, but also because of the outstanding European scientists who fled persecution and came to the United States around World War II, and the talented people it has continued to attract from all over the world, including China.

  2 What Does the American Experience Teach Us?

  The U.S. development experience rests on national conditions and a culture different from China's; many practices cannot be directly borrowed or imitated, let alone copied. However, there are general laws governing the development of science and technology and its use in serving national economic and social development; moreover, for a large country that likewise develops science and technology in pursuit of national goals, the experience of the pioneer surely offers lessons and inspiration. This paper holds that the U.S. experience can help China build itself into a strong science and technology nation in two ways. First, some good experiences and practices can be directly borrowed and applied. In fact, since the reform and opening up, China has learned and borrowed many good practices from the United States that have played an important role in promoting the development of science and technology in China, such as the establishment of the National Natural Science Foundation, the establishment of the Small Business Innovation Fund, and the construction of large scientific facilities. Second, in the face of the same or similar problems, the U.S. experience can serve as a frame of reference to help us think through our own, including the following important issues: (1) the relationship between the government and the market; (2) the creativity of scientific and technological talent and the building of teams; (3) making enterprises the main actors in innovation; (4) the construction and development of research universities; (5) cooperation among industry, universities, and research institutes; (6) the construction and management of national laboratories; (7) the coordination of major scientific and technological innovation activities; and (8) the scientific advisory system supporting science and technology decision-making.


