Digital natives have an advantage as part of government AI engineering teams

John P. Desmond, AI Trends editor

AI is more accessible to young people in the workforce who have grown up as “digital natives” with Alexa and self-driving cars as part of the landscape, giving them expectations, grounded in their own experience, of what is possible.

That idea set the stage for a panel discussion on the needs and skill sets of AI engineering teams, held virtually and in person this week at AI World Government in Alexandria, Va.

Dorothy Aronson, CIO and Chief Data Officer, National Science Foundation

“People feel that artificial intelligence is available because the technology is available, but the technology is ahead of our cultural maturity,” said panelist Dorothy Aronson, CIO and Chief Data Officer of the National Science Foundation. “It’s like giving a child a sharp object. We may have access to big data, but it may not be right to work with it in all cases.”

Things are speeding up, which increases expectations. When panelist Vivek Rao, a lecturer and researcher at the University of California, Berkeley, was working on his PhD, a paper on natural language processing could have been a master’s thesis. “Now we assign it as homework with a two-day turnaround. We have tremendous amounts of computing power that weren’t available even two years ago,” he said of his students, whom he described as “digital natives” with high expectations for what artificial intelligence makes possible.

Rachel Dzombak, Head of Digital Transformation, Software Engineering Institute, Carnegie Mellon University

Panel moderator Rachel Dzombak, Digital Transformation Lead at the Software Engineering Institute of Carnegie Mellon University, asked the panelists what is unique about working on artificial intelligence in government.

Aronson said the government can’t go too far with technology or users won’t know how to interact with it. “We don’t build iPhones,” she said. “We have experiments and we are always looking ahead, anticipating the future so that we can make the most effective decisions. In the government, at this very moment, we see the convergence of the emerging and retiring generations, whom we also have to serve.”

At the beginning of her career, Aronson did not want to work in government. “I thought that meant you were either in the armed forces or the Peace Corps,” she said. “But what I’ve learned over time is what motivates federal employees to serve larger, problem-solving institutions. We’re trying to solve really big problems of equity and diversity and getting people food and keeping people safe. People who work for the government are dedicated to those missions.”

She referred to her two children in their 20s, who love the idea of service, but in “bits and pieces.” “They see it as a lock-in situation, but in reality it is not,” she said.

Berkeley students learn about government’s role in disaster response

Berkeley’s Rao said his students see wildfires in California and ask who is working on the challenge of doing something about them. When he tells them it is almost always local, state, and federal government agencies, “students are generally surprised to find that out.”

In one example, he developed a disaster response innovation course in collaboration with CMU and the Department of Defense, Army Futures Lab, and Coast Guard Search and Rescue. “This was an eye-opener for the students,” he said. At the start of the course, two of the 35 students expressed interest in a career in the federal government; by the end, 10 did. One was hired as a software engineer at the Naval Surface Warfare Center in Corona, Calif., Rao said.

Aronson described the onboarding process for new federal employees as an “uphill climb,” suggesting “if we could have prepared ahead of time, it would have moved a lot faster.”

Brian Lane, Director of Data and AI, General Services Administration

When asked by Dzombak what skills and mindsets matter for AI engineering teams, panelist Brian Lane, director of data and AI at the General Services Administration (who announced during the event that he is taking on a new role at the FDIC), said that flexibility is a necessary quality.

Lane is a technology executive in the GSA IT Modernization Center of Excellence (CoE) with more than 15 years of experience leading analytics and technology initiatives. He led the GSA partnership with the DoD Joint Artificial Intelligence Center (JAIC). [Ed. Note: Known as “the Jake.”] Lane is also the founder of DATA XD, and has industry experience managing acquisition portfolios.

“The most important thing for agile teams going into the AI journey is that you have to be prepared for the unexpected, and the mission is ongoing,” he said. “If you all agree on the importance of the mission, the team can stick together.”

A good sign that team members admit they’ve “never done this before”

As for mindset, he said more of his team members are coming to him and saying, “I’ve never done anything like this before.” He considers that a good sign, one that opens an opportunity to talk about risks and alternative solutions. When team members have the psychological safety to say they don’t know something, Lane sees it as a positive. “The focus is always on what you’ve done and what you’ve delivered. It’s rare to focus on what you haven’t done before and what you want to grow into,” he said.

Aronson has found it difficult to get AI projects off the ground. “It’s hard to tell management that you have a use case or a problem to solve and you want to deal with it, and there’s a 50-50 chance it’s going to happen, and you don’t know how much it’s going to cost,” she said. “It comes down to clarifying the rationale and convincing others that it’s the right thing to do moving forward.”

Rao said he talks to students about experimentation and having an experimental mindset. “AI tools may be readily available, but they can mask the challenges you may face. When you apply the vision API, for example, in the context of your business or government agency challenges, things may not be smooth sailing,” he said.

Moderator Dzombak asked the panelists how they build teams. “You need a mix of people,” Aronson said. She has tried “communities of practice” organized around solving specific problems, where people could come and go. “You’re bringing people together around a problem, not a tool,” she said.

Lane seconded that. “I’ve really stopped focusing on tools at all,” he said. He has conducted experiments at the JAIC in accounting, finance, and other fields. “We found it’s not really about the tools. It’s about getting the right people together to understand the problems, and then looking at the tools available,” he said.

Lane said he creates “cross-functional teams” that are “a little more formal than a community of interest.” He has found them effective for perhaps 45 days of working together on a problem. He also enjoys working with client organizations on services, and has seen those clients learn about data management and AI as a result. “We’ll pick one or two along the way who will become advocates for accelerating AI across the organization,” Lane said.

Lane believes it will take five years of thinking and working to develop proven best practices for building AI systems that serve the government. He pointed to the US Census Bureau’s The Opportunity Project (TOP), launched in 2016 to work on challenges such as ocean plastic pollution, economic recovery from COVID-19, and disaster response. In that time, TOP has been involved in more than 135 public projects and has more than 1,300 alumni, including developers, designers, community leaders, data and policy experts, students, and government agencies.

“It’s based on mindset and the way you organize the work,” Lane said. “We need to expand the delivery model, but after five years we will have enough proof of concept to know what works and what doesn’t.”

Learn more at AI World Government, the Software Engineering Institute, DATA XD, and The Opportunity Project.