US lead on AI will shrink without much more funding and education

The hearing’s only pleasant surprise was its bipartisan support. Senators from both sides of the aisle, along with Cruz, all took the expert panel’s testimony seriously. Granted, AI still has the new-car smell of a nascent field with great potential, one that could boost US labor productivity by 40%, Cruz said in his introductory remarks. A golden bullet it might seem, but even the current experiments using AI to assist drivers or take over driving entirely could take a chunk out of the 35,000 annual vehicular deaths, 94% of which are caused by human error, committee member Senator Gary Peters noted.

Artificial intelligence could save even more lives, said the hearing’s first witness, Microsoft Research Lab’s managing director Eric Horvitz. AI could sift through vast quantities of medical data and catch things human doctors miss, as IBM’s Watson did back in August when it identified a rare form of leukemia and saved a patient’s life.

When people think of the cost benefits of AI, they think of automation. But reducing death and debilitating injury affects the overall economy, too: AI-assisted driving could also cut down on the 300,000 incapacitating vehicular injuries every year, which means more people remaining in the workforce and less time and money spent finding and training temporary or permanent replacements.

The fear looming over the hearing was ever-greater competition from China and India in AI R&D. Logically, America’s lead over China and India could shrink simply because of how many more computer scientists they can train, given their colossal populations. But letting US artificial intelligence slide could also be dangerous to national security. Back in August, the Defense Department suggested “immediate action” should be taken to boost development of AI war technology.

The US can retain its lead in pioneering artificial intelligence by training America’s youth in AI programming as early as middle school, recommended the hearing’s second witness, Andrew Moore, dean of the School of Computer Science at Carnegie Mellon University. In his opinion, there’s a staggering amount of work and not enough trained computer scientists to perform it. Train a million middle school kids in AI and perhaps 1% will stick with it; even if you ended up with 400 experts at the level of Moore and his fellow witnesses, there would still be too much work to do, Moore said. Pumping out more AI professionals isn’t just a smart move to fill a wanting workforce: for every programmer trained in artificial intelligence that a tech company hires, Moore estimates, it earns $5 million to $10 million more.

Collaboration could also help the US keep its lead, said the third witness, cofounder of the nonprofit OpenAI Greg Brockman. Making more AI systems open source drives innovation, Brockman said, along with unlocking datasets for anyone to use. But it’s not just amateurs and corporations working together: The tech industry, the government and academia should coordinate to establish standards of safety, security and ethics.

The last witness, senior research scientist at NASA’s Jet Propulsion Laboratory Steve Chien, noted that the space agency put an AI-controlled spacecraft in orbit to track earthbound phenomena — which has been continuously snapping photos from the high atmosphere for a dozen years. Many of NASA’s vehicles, including its Mars rovers, rely on AI to navigate and triage environmental conditions.

With technological possibilities come dangers, and AI is no exception. Cruz’s limp Skynet joke aside, the pressing concern with creating more complex and prevalent artificial intelligence is the accompanying increase in cyber vulnerabilities. We don’t have to look further than the past year to see government and political organizations hacked by both independent and state-sponsored foreign agents.

But even something as mundane as liability could get in the way of AI adoption here in the US. The prospect of AI-controlled cars getting into collisions could lead to a legal impasse between carmakers, insurance companies and citizens as fault becomes uncertain. Public uncertainty or displeasure could derail AI implementation in those applications, too.

To keep the US from slipping out of first place in the AI race, the panel of witnesses ultimately recommended more investment and collaboration. That means far more emphasis on AI programming earlier in education, as Moore pointed out, but also simply more money injected into research: government investment in AI over the past year was $1 billion, while the tech industry spent $8 billion, Brockman noted. That funding will likely help make the roads safer and people healthier, but as Chien stated, it will also help us find deep-space answers to a few questions that have bothered mankind for eons, namely: how did life form, along with the universe around it?

Engadget RSS Feed

Falling Into the Wormhole Connecting Physics and Education

Before Helen Quinn was a well-known theoretical physicist, she thought about becoming a teacher. Now, in the second act of her career, she has come full circle, helping to craft the Next Generation Science Standards, which have been adopted by 17 states plus the District of Columbia. But her path to becoming both a world-class physicist and a leader of science education reform was one she almost didn’t take.


Quinn, who is now 73, grew up in Australia, where she had to choose an academic focus by her sophomore year in high school. Her father was an engineer, and family conversations often revolved around how things work. “The kind of problem solving that I advocate as valuable for learning science was part of our family culture,” she said.

She recalled how a high school teacher encouraged her to become a mathematician, telling her, “Because you’re so lazy, you will never solve a problem the hard way. You always have to figure out a clever way.” But in the 1950s, she said, “the idea that a woman could be an engineer was nonexistent. I once walked into the engineering school at the University of Melbourne, and one guy said, ‘Look what’s got in here,’ and the other one says, ‘You think it’s real?’”

After Quinn transferred to Stanford University in 1962, her adviser encouraged her to consider graduate school, even though, as he explained, “graduate schools are generally reluctant to accept women because they get married and they don’t finish. But I don’t think we need to worry about that with you.” Which made her wonder: “Is he telling me I’m never going to get married?”

Helen Quinn in her Stanford Linear Accelerator Center office around 1977. Courtesy of Helen Quinn

Quinn applied to graduate school, but she hedged her bets. “There were no women on the faculty in the physics department at Stanford at that time,” she said. “I didn’t see myself there.” She thought she would “apply for Ph.D. programs because good universities don’t offer master’s degrees in physics, but really I’d do a master’s degree and then go take education courses and be a high school teacher.”

Instead, she went on to make seminal contributions to our understanding of fundamental particle interactions. In the 1970s, she worked with Roberto Peccei on a proposed solution to the strong charge-parity (CP) problem. The puzzle has to do with why a kind of symmetry between matter and antimatter is broken in weak interactions, which drive nuclear decay, but not in strong interactions, which hold matter together. Peccei and Quinn’s solution, known as the Peccei-Quinn mechanism, involves a new kind of symmetry that predicts the existence of an “axion” field, and hence a hypothetical axion particle. Axions have been invoked in theories of supersymmetry and cosmic inflation, and have been proposed as a candidate for dark matter. Physicists are looking high and low for the elusive particle.
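For readers who want the idea in symbols, here is a minimal sketch in the standard textbook notation (the formulas below are conventional statements of the problem, not something drawn from the interview): the strong force allows a CP-violating “theta term” whose coefficient is measured to be unnaturally close to zero, and the Peccei-Quinn symmetry promotes that coefficient to a dynamical axion field whose value relaxes to cancel it.

% Standard statement of the strong CP problem and the Peccei-Quinn idea.
% The QCD Lagrangian admits a CP-violating theta term; bounds such as the
% neutron electric dipole moment force |theta| to be tiny (roughly < 10^{-10}).
% Peccei-Quinn symmetry replaces the fixed angle theta with a dynamical axion
% field a(x), whose potential is minimized where the effective angle vanishes.
\[
  \mathcal{L}_{\theta} \;=\; \theta\,\frac{g_s^{2}}{32\pi^{2}}\,
      G^{a}_{\mu\nu}\tilde{G}^{a\,\mu\nu},
  \qquad
  \theta \;\longrightarrow\; \theta_{\mathrm{eff}}(x) \;=\; \theta + \frac{a(x)}{f_a},
  \qquad
  \langle \theta_{\mathrm{eff}} \rangle \;=\; 0 .
\]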

Her work on the strong CP problem and other contributions to particle physics have been recognized with prestigious awards including the Dirac Medal, the J.J. Sakurai Prize, the Klein Medal and the Compton Medal. Meanwhile her attention has shifted back to science education. Beginning in the late 1980s she led the science education outreach effort at the Stanford Linear Accelerator Center (SLAC), and she later chaired the National Research Council’s Board on Science Education, which created the framework that led to the Next Generation Science Standards. Quanta Magazine caught up with Quinn at last year’s International Teacher-Scientist Partnership Conference in San Francisco. An edited and condensed version of the conversation follows.

QUANTA MAGAZINE: What was it like entering the field of particle physics in the 1960s?

HELEN QUINN: It was a very exciting time. The thing we now call the Standard Model was just starting to take shape, and SLAC had just been built at Stanford. In fact, the reason I became a particle physicist is probably because there were so many people around me who were so excited about the science. But I never at any point said, “I’m going to be a physicist. That is what I want to do.” It just sort of grew on me as I learned more about it.

You did a year of student teaching.

I did my Ph.D. in four years, and it was an exciting piece of work that got noticed. During graduate school, I’d married. My husband was another physicist, and we took postdocs in Germany. Coming back, my husband was offered a faculty position at Tufts, and I said, “Well, if there’s any town in the country where there ought to be another job, it’s Boston, because there are seven universities in the Boston area, or possibly more.” But I didn’t get a job.

I thought, “OK, I’ll fall back and I’ll be a teacher,” and I took education courses at Tufts and did the student teaching.

Then what happened?

During that semester when I was doing the student teaching, I happened to run into one of my graduate school friends, Joel Primack, who was then a junior fellow at Harvard, and he said, “Why don’t you come talk to us at Harvard sometime?” At that moment, a piece of research came along which was really fundamental to the development of the Standard Model. Gerard ’t Hooft and Martinus Veltman [who shared the 1999 Nobel Prize in physics] supplied a technique for calculating the mathematics in gauge theories, which underlie the Standard Model. So I began working with my friend and one other junior faculty member at Harvard, Tom Appelquist, on applying that technique to what we call a one-loop calculation.

Before the Standard Model, there was a problem with weak interaction theory. You could do the first-order calculation, but the next order (the one-loop calculation) was infinite. So the theory was not well-defined and not stable. We did the first finite one-loop calculation of weak interactions using the new theory. At that point I realized this was drawing me in more than the teaching.

You didn’t like teaching?

I loved the teaching. I hated supervising study hall and the intellectual atmosphere of the high school. So it was not the teaching that put me off as much as it was the intellectual draw of something really exciting going on right in my field, in my area of interest in physics, that essentially was the beginning of the development of the Standard Model. It was an opportunity that I couldn’t turn down.

Later in your career, why did you become involved in trying to fix science education?

After I was elected to the National Academy of Sciences, my background in education outreach work meant I was invited to join the Board on Science Education. The opportunity this offered to be involved with science education more broadly was appealing, but more than that it was a chance to learn some interesting things about teaching and learning. As a scientist, if you believe you know something without having done any research on it, you probably don’t know. So I asked, “Who does understand what’s effective in teaching science?”

There was a study called “Taking Science to School” for which I was part of a committee with people who research learning. I was able to see how they studied the question: What is most effective in teaching science? That was the beginning of my education about research on learning.

The challenge for me was to understand what the other people in the room were arguing about. At the beginning of that study, I was the physicist, and these were education researchers. And they were having an argument, and I did not know what they were arguing about. I couldn’t discern the differences in their positions because I didn’t know the history.

Later, after the Common Core came along and 47 states adopted common standards in math and language arts, the Carnegie Corporation of New York came to the Board on Science Education and said, “We should be doing this for science, too.” If many states are doing common things in math and language arts, why not think about what they could do in common in science?

You were the chair of the Board on Science Education by that time. What areas of science education did you think needed to improve?

The general conclusion really is embodied in the “Framework for K-12 Science Education” we developed: You have to engage the students in doing things in order for the learning to become meaningful. Just memorizing the knowledge that other people have produced doesn’t really lead to transferable knowledge. The big thing is knowledge that you can apply.

The question is: How do you change learning so that the knowledge becomes much more integrated into the way a person approaches problems outside of school?

What was the greatest challenge for you in developing these standards?

The challenge, but also the fun, of doing it is to try to take a group of people, all of whom have expertise in different areas, and come up with a common view that is based solidly enough on everyone’s expertise that other people will buy into it and carry it forward. And I think we succeeded with the Framework. Science teachers are typically enthusiastic about the picture we’re putting before them. When I speak to scientists, they’re generally enthusiastic about this way of describing science. So the synthesis works, but reaching it is a group effort. Chairing such a group and bringing it to consensus is a challenging but rewarding process.

And so in some sense, the common view that came out of the Framework became the Next Generation Science Standards.

The standards are based on the Framework, and it helps to read the Framework to understand the intent of them. Standards are, by their very nature, learning in pieces. A standard has to be something where you can say: Can the student do that or not?

Essentially, standards are the basis on which you build assessments, and they’re a set of guideposts for teachers and curriculum developers. So standards are in reality not the way to convey the bigger vision. They’re all the little bits and pieces students need to know or be able to do, and in and of themselves, they don’t make sense. Unless they’re built on a bigger vision and unless you have some idea of what that vision is, reading the standards is confusing.

So the Framework is the vision.

The Framework is the vision. The standards are a set of stakes in the ground. If students can do this in third grade, if students can do that in fifth grade, if students can do that in 12th grade, then they have learned enough science.

Helen Quinn giving her Dirac Medal lecture in 2000. Courtesy of Helen Quinn

You describe the Next Generation Science Standards as three-dimensional science learning. What does that mean?

What I mean is that to learn science, you have to learn core ideas from the disciplines of science. [In physical science, these ideas include matter and its interactions, motion and stability, and energy.] But you also have to learn how these ideas were arrived at, what scientists do, the practices of science, and the practices of engineering, both in order to understand the nature of science and in order to engage in those practices to make the learning your own. That is a second dimension to science learning. And finally, the third dimension is that there are some big concepts which you need in order to know where you’re going and to know which kinds of questions to ask when you are looking at a new problem. These are concepts such as the fact that explanations in science are about cause-and-effect mechanisms, or that, in order to decipher these mechanisms, it is useful to define and make a model for the system in which a phenomenon occurs. And these big concepts are very often not taught. Students are sort of expected to get them as a side effect of doing things over and over again.

And you call that third dimension “crosscutting.” Is that meant to imply that you’re cutting across different disciplines?

Right. These are the concepts which apply no matter whether you’re doing physics, chemistry, biology, earth science or any other area of science. They are useful lenses to look at a problem with.

Isn’t it harder to assess whether students have learned crosscutting concepts and the process of science?

Queensland and other states in Australia in fact do this. Some part of the state assessment is an external exam, but part of it is performance assessments in the classroom that are graded by the teachers. First of all, this approach trusts teachers as professionals, but secondly, it has a cross-check system. If there’s an imbalance between the external testing portion and the teacher’s grading of the in-class part, then inspectors come and they watch. So there’s a whole structure built around having the teachers be part of a professional system and monitoring that system.

In the US, we have adopted a system of drop-in-from-the-sky external testing where the teachers play no role in it. That is actually a very inefficient model, because the teachers know a lot more about the students than any drop-in test can discover. Assessments that drop in from the sky are designed to be cheap and to be scored by machine; it’s very limited. Mostly it just tests what has been memorized. And having our whole education system designed to have students get high scores on these tests is counterproductive. It drives all the wrong behaviors into the classroom. So we need new kinds of testing tasks to test whether students have achieved these new three-dimensional standards and to drive the teaching and learning behaviors that we know are more productive.

Now that the standards are out there, what are you focusing on?

My term on the Board on Science Education is up, so I no longer have that particular platform to work from. I go where I’m invited to give talks, to sort of wave the flag and talk at the county level or the state level about what the standards are and why they were developed, and to help people understand how to implement them.

When you talk to teachers, what advice do you give them about making science more interesting for students?

There are two things: first, building learning around observable events or phenomena. And, second, getting students engaged with a question before you give them the answer. We all get much more interested in things if we have a question about them than if somebody is telling us something that we haven’t any reason to know we want to know.

What’s the endgame?

I want educated people. I want citizens who can look at a problem in their community and think like a scientist about the part of the problem that is science. I want high school and college graduates with skills that employers want, whether they come from well-educated families or not. I want them to be able to take on a problem and solve it, because that is what employers are looking for. They want you to be able to work on a team, to be given some information, interpret it and use it, to not have to be told: “This is what you do tomorrow.” And all of these things require something more than just being able to repeat back what you have been told. So that is where I’m going. I think it is a big equity issue.

Original story reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.


WIRED