Ask the CIO: Federal Emergency Management Agency
How is digital transformation impacting the mission at FEMA?

In this exclusive webinar edition of Ask the CIO, host Jason Miller and his guest, Charlie Armstrong, chief information officer at FEMA, will discuss how digital transformation is supporting the mission at FEMA. In addition, Don Wiggins, senior global solutions architect at Equinix, will provide an industry perspective.

Learning Objectives:

  • Digital transformation at FEMA
  • Shifting FEMA to the cloud
  • Edge computing for the future
  • Employing artificial intelligence
  • Industry analysis

Federal Executive Forum Zero Trust Strategies in Government Progress and Best Practices 2024

Zero trust continues to be a crucial piece of cybersecurity initiatives. But how are strategies evolving to stay ahead of tomorrow’s cyber threats?

During this webinar, you will gain the unique perspective of top government cybersecurity experts:

  • Sean Connelly, Federal Zero Trust Technical Architect, Cybersecurity and Infrastructure Security Agency
  • Roy Luongo, CISO, US Secret Service, Department of Homeland Security
  • Louis Eichenbaum, Zero Trust Program Manager, Department of the Interior
  • Chris Roberts, Director, Federal Sales Engineering, Public Sector, Quest Software
  • Steve Faehl, Federal Chief Technology Officer, Microsoft
  • Wes Withrow, Senior Client Executive, Cybersecurity, Verizon
  • Moderator: Luke McCormack, Host of the Federal Executive Forum

Panelists also will share lessons learned, challenges and solutions, and a vision for the future.

Securing the Nation: Deep dive into federal SOCs
On the cyber frontlines with federal SOCs

Discover how the government’s security operations centers continue to evolve to stay ahead of cyberthreats, how they collaborate closely with industry to staff their operations and how that collaboration helps agencies modernize their cybersecurity toolkits.

Download this exclusive Federal News Network Expert Edition now!

NASA tech-forward inventory system supports unique mission

NASA has a unique mission that requires it to keep a huge inventory of specialized and varied items on hand. Those can range from high-quality aircraft parts and bespoke spacecraft parts to perishable food products and biological research materials. Sharrief Wilson, deputy director of NASA’s logistics management division, said that inventory averages around $6 billion in value across the agency. As a result, NASA requires a robust warehousing system; luckily, the agency has a long history of being at the forefront of technology.

“We were an early adopter of RFID. So we’ve implemented that and we’ve been using RFID to do inventories, I believe, over the past ten years. So we’ve always looked for a new way. Even upgrades, even within that technology,” Wilson said on The Modernized Warehouse. “Our partners at Caltech – that’s at [NASA’s Jet Propulsion Laboratory] – they’re even going a little bit further with having readers that are attached to the warehouses. So it’s getting pinged as things are entering and leaving the warehouses. So we were looking at some case studies to see if that would work across the full agency. And then also NASA’s using RFID on the International Space Station. So they’re tracking inventories of the things on the space station using RFID as well.”

Other agencies are only just beginning to explore RFID and pilot its use in their warehouses. But NASA is already looking ahead to the next steps, beyond even integrated readers in the warehouses: investing in production of the technology itself, rather than simply remaining a consumer.

“Right now, we were purchasing from commercial vendors of tags. But then we’ve also started to get more advancements in the technologies where we’re creating our own tags and printing our own tags now,” Wilson told the Federal Drive with Tom Temin. “So we were looking to expanding that, one, as a way to centralize some of that capability, but also as a cost saving. We think that we could get a return on investment over the next 5 to 10 years. If we invested in creating our own tags, then we would save on new procurements of tags.”
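
Wilson’s return-on-investment logic amounts to a simple break-even comparison between buying tags and printing them in-house. Here is a toy sketch with invented figures, since the article doesn’t give NASA’s actual costs or volumes:

```python
def breakeven_years(setup_cost: float, per_tag_buy: float,
                    per_tag_make: float, tags_per_year: int) -> float:
    """Years until in-house tag printing recovers its upfront investment."""
    annual_savings = (per_tag_buy - per_tag_make) * tags_per_year
    return setup_cost / annual_savings

# Invented figures: $500,000 of printing equipment recouped by saving
# $0.40 per tag on 250,000 tags a year gives a 5-year payback, which
# falls inside the 5-to-10-year window Wilson describes.
print(breakeven_years(500_000, 0.75, 0.35, 250_000))  # 5.0
```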

Tracking the data

All of the data gathered from those RFID tags feeds into NASA’s enterprise inventory solution, SAP. That lets them track the items, their quantities and the total value of the property. From there, smaller subsystems plug into SAP to create a better front-end user experience. That gives everyone from end users to logistics teams to the chief financial officer and the Office of the Chief Information Officer the capability to enter, track and ensure the quality of the data.
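
As a rough illustration of that flow, the sketch below normalizes RFID read events into inventory adjustments. The field names, and the in-memory dictionary standing in for the enterprise system, are assumptions for illustration, not NASA’s actual SAP integration:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class RfidRead:
    """A single read event from a warehouse RFID reader (fields assumed)."""
    tag_id: str        # identifier burned into the tag
    reader_id: str     # which portal or reader saw the tag
    direction: str     # "in" or "out" of the warehouse
    timestamp: datetime

def apply_read(inventory: dict[str, int], read: RfidRead) -> None:
    """Adjust the on-hand quantity for the tagged item. A real integration
    would post this adjustment to the ERP rather than mutate a dict."""
    delta = 1 if read.direction == "in" else -1
    inventory[read.tag_id] = inventory.get(read.tag_id, 0) + delta

inventory: dict[str, int] = {}
apply_read(inventory, RfidRead("TAG-0001", "dock-door-3", "in",
                               datetime.now(timezone.utc)))
print(inventory)  # {'TAG-0001': 1}
```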

That helps NASA maintain a full audit trail as well, because sometimes it has to do research for property accountability reasons. For example, if a part fails on a spacecraft, NASA needs to know everything about where that part came from and how it was manufactured to ensure it doesn’t happen again.

Finally, it helps the agency better manage its supply stock, so it can anticipate the needs of its various components and laboratories.

“So we try to take an inventory of what our mission customers are using and then how much we should keep on hand to give them a very fast capability to get that from us,” Wilson said. “And then we manage to reorder points to ensure that we have the correct level of stocking of those supplies and parts of materials that they may need on a fast and regular basis.”
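
The reorder points Wilson manages correspond to the standard inventory-control formula: expected demand over the replenishment lead time, plus a safety buffer. A minimal sketch with invented numbers:

```python
def reorder_point(daily_demand: float, lead_time_days: float,
                  safety_stock: float) -> float:
    """Reorder when on-hand stock falls to expected lead-time demand
    plus safety stock."""
    return daily_demand * lead_time_days + safety_stock

# Invented example: a lab consumes 4 units a day, resupply takes 10 days
# and 15 units of safety stock are held, so reorder at 55 units on hand.
print(reorder_point(daily_demand=4, lead_time_days=10, safety_stock=15))  # 55.0
```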

NASA’s unique requirements

NASA also has some specialized requirements for the way certain products are stored. While many agencies have perishable items that require cold storage, few have as low a tolerance for imperfections or contamination. After all, making repairs to something in space is extremely difficult and expensive; making repairs to something on another planet is downright impossible.

“We go through a lot of effort to ensure that we’re not degrading or damaging the property. Sometimes there’s other electrostatic sensitivities to property as well,” Wilson said. “So we go through a lot of detail to ensure that when we’re handling it … we’re protecting the property that supports the mission.”

Defending critical assets from increasing security threats

Keeping ahead of cyberthreats while implementing zero trust

In an exclusive new ebook, discover how the Consumer Financial Protection Bureau, the Department of Health and Human Services and Akamai are tackling zero trust. Also, learn how cyber research builds on zero trust at the Pacific Northwest National Laboratory.

Download the ebook now!

Take a use case-driven approach to artificial intelligence
The most promising applications of artificial intelligence let people do more analytical and strategic work.

Artificial intelligence (AI) has come into its own for federal agencies. Federal IT professionals have acquired a practical understanding of AI technology, and now they can concentrate on identifying use cases and employing the appropriate AI technology for the job.

Those are among the observations of two of IBM’s top U.S. federal market executives.

“We’ve gotten past the ‘help us understand the technology’ to agencies really beginning to get hands-on with the technology to understand and imagine what the future looks like,” said Susan Wedge, managing partner for the U.S. public and federal market at IBM Consulting. Now, she said, agencies are thinking about “how can they reimagine delivery of their mission, the outcomes that they can achieve.”

“The AI executive order certainly put into focus how agencies need to be thinking about AI and the adoption of AI,” Wedge said. “And we’re really seeing agencies make a shift.”

Generative AI operates differently than what you might call traditional AI. Therefore, said Mark Johnson, vice president of technology for the U.S. federal market at IBM, agencies should take a step-by-step approach to generative AI. The process involves “finding those use cases, applying generative AI [or other AI technologies] and seeing what comes out,” Johnson said. “Then iterating back again, as we discover some interesting things, and we realize we want to know more [about new] questions.”

For example, Johnson cited human resources and its sometimes-convoluted processes. Generative AI, he said, can reveal ways to simplify or re-engineer HR processes and make operations more efficient for HR practitioners. IBM has had success with AI in its own HR function, to the point that 94% of employee questions are successfully answered by the technology.

“That doesn’t mean there’s not a human in the loop,” Wedge said. “It means that a human is there to handle the more complex, more strategic issues.”
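
The division of labor Wedge describes, with AI answering routine questions and a human taking the complex ones, is commonly implemented as a confidence-gated router. A hedged sketch; the threshold, function names and stub model are assumptions, not IBM’s implementation:

```python
from typing import Callable

CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff; tuned per use case in practice

def route_question(question: str,
                   model_answer: Callable[[str], tuple[str, float]]) -> str:
    """Return the model's answer when its confidence is high enough,
    otherwise hand the question to a human specialist."""
    answer, confidence = model_answer(question)
    if confidence >= CONFIDENCE_THRESHOLD:
        return answer
    return escalate_to_human(question)

def escalate_to_human(question: str) -> str:
    # Placeholder for a ticketing or queue integration.
    return f"Escalated to an HR specialist: {question!r}"

# Toy model stub for demonstration.
stub = lambda q: ("Your leave balance resets in January.", 0.93)
print(route_question("When does my leave balance reset?", stub))
```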

In all use cases, success in AI requires careful curation and handling of training data. Moreover, Johnson said, the algorithm or large language model you train must itself have guardrails to protect data.

“You don’t want to go just throwing [your data] out there onto the Internet, into some large language model that you don’t know the provenance of,” Johnson said.

More than software development

AI projects have some characteristics in common with software development, Wedge suggested. As with software development, it’s “important to curate the stakeholders that participate within those pilots or proofs of technology.” More than simply a technology and data exercise, AI projects must pull in a cross section of program managers and anyone else with an interest in performance, safety and efficiency of mission delivery, Wedge said.

Johnson said that, to a greater extent than in pure coding, you must involve users throughout the process. AI touches “the mission of the agency,” he said. “And that’s where you must get it in the hands of those folks who know what they want the outcome to be. And then let them play with it.”

A crucial best practice, Johnson said, establishes oversight of the ethics and fairness of AI as deployed. He noted that IBM has its own internal AI ethics board.

Equally important: a governance setup to ensure AI outcomes stay within acceptable ranges and to avoid the kind of drift that can affect generative AI such that, at some point, one plus one fails to equal two, Wedge and Johnson said.
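
One concrete way to operationalize that kind of governance is a known-answer monitor that flags the model when outputs fall outside the accepted range. A minimal sketch; the probes, threshold and stub model below are invented for illustration:

```python
from typing import Callable

def drift_check(probes: list[tuple[str, str]],
                model: Callable[[str], str],
                min_accuracy: float = 0.95) -> bool:
    """Run known-answer probes through the model; return False (drift
    detected) when accuracy drops below the accepted floor."""
    correct = sum(1 for prompt, expected in probes
                  if model(prompt).strip() == expected)
    return correct / len(probes) >= min_accuracy

# Illustrative probes: prompts with fixed, verifiable answers.
probes = [("1 + 1 =", "2"), ("Capital of France?", "Paris")]
model = lambda p: {"1 + 1 =": "2", "Capital of France?": "Paris"}[p]
print(drift_check(probes, model))  # True while outputs stay in range
```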

The most promising use cases “are not about the technology doing the work of a human, but about making the human more productive,” Wedge said. Case management provides another rich possibility, aside from HR.

“Multiple federal agencies are responsible for managing, responding to, engaging on various cases,” Wedge said. “Imagine if you could use generative AI to generate a summary of the case, and then enable that caseworker to drill down in specific areas.”

How CDC’s data office is applying AI to public health
Public health is ripe for opportunities to leverage AI, but it’s not as simple as just picking the shiny new tool and feeding it data.

As federal agencies push forward on their IT modernization goals, many are exploring the potential of artificial intelligence tools that can supplement human employees. Federal agencies are currently applying AI to a variety of missions, and public health is no different. The Centers for Disease Control and Prevention’s new Office of Public Health Data, Surveillance and Technology (DST) is looking into ways to apply AI to public health data, as well as ways to leverage generative AI to bolster its efforts.

“There was actually a series of 15 pilots that were run across different centers and offices across the agency,” Jennifer Layden, director of DST, said on Federal Monthly Insights – Operationalizing AI. “These were used to help evaluate the type of infrastructure we would need, what type of capabilities we would use, what would be the security factors that we’d have to consider? And the variety of these projects or pilots ranged from more programmatic work to more operational work, such as website redesign, or evaluating comments back on a protocol, or whatnot.”

CDC stood up DST last year to coordinate its data strategy. That includes improving data exchange with other federal, state and local agencies and non-governmental partners; improving the ways data informs public health initiatives; and ways to better visualize and distribute data for public consumption. AI is quickly becoming a part of those efforts.

AI use cases

For example, automated processes can flag potential health threats quicker, facilitating more rapid notifications and communication. But it can also improve internal workflows, making CDC employees more efficient at their jobs. And generative AI can quickly produce a fact sheet about a new public health threat to educate both those at risk and the medical professionals who may need to treat them.
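
As a minimal sketch of that kind of automated flagging, the rule below raises a notification when the latest weekly case count sits far above the historical baseline. The numbers and threshold are illustrative, not CDC’s actual surveillance logic:

```python
from statistics import mean, stdev

def flag_spike(weekly_counts: list[int], z_threshold: float = 3.0) -> bool:
    """Flag the most recent week if it sits more than z_threshold
    standard deviations above the historical mean."""
    history, latest = weekly_counts[:-1], weekly_counts[-1]
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and (latest - mu) / sigma > z_threshold

# Illustrative data: a stable baseline followed by a sharp jump.
counts = [12, 14, 11, 13, 12, 15, 13, 41]
if flag_spike(counts):
    print("Potential health threat: notify epidemiologists for review")
```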

For example, Layden discussed one test use case where AI is examining public cooling sites to identify areas at risk of spreading Legionella, a waterborne bacteria that causes Legionnaires’ disease.

Ensuring data privacy and reducing bias

Amidst a variety of potential use cases, Layden said DST is focusing on putting guardrails around the use of these tools.

“What we’re trying to do in the process is ensure that one, we establish guidance by which programs and scientists can have some basic playbook for using such tools, ensuring that people do it safely and securely,” she told The Federal Drive with Tom Temin. “[Two:] recognizing that we don’t want to create any risks to de-identification or information sharing that should not be shared. And then three, how to also factor in ethical and bias considerations.”

Data privacy and ethical and bias considerations are especially important when working with public health data. One major concern around AI tools is that bad actors can leverage them to violate the data privacy of patients and citizens by manipulating the tools to reveal personally identifiable information. That’s where de-identification and determinations about what data is appropriate to share come into play. But that data also has to be as equitable and diverse as possible so as not to introduce any biases and potentially create new underserved populations, or exacerbate the conditions of existing ones.

Picking the right teams and AI tools

That’s why Layden said DST encourages the use of multidisciplinary teams when working with public health data. She advocated for teams that include experts in the disease or other public health threat, people who understand the populations affected or at risk, and people who grasp the data tools and methodologies to perform advanced analytics.

“It is really a multidisciplinary team that needs to come together to understand what the question is that we’re trying to answer,” Layden said. “What are the considerations we need to factor in, as we understand the data that we’re using? And then what are the best tools to help answer that question? So not just using a new tool because it’s a new tool, but is it the best tool to answer the question at hand?”

Another consideration is that these tools evolve; after all, AI tools have been around for some time now, but generative AI only hit the spotlight about a year ago. As that evolution occurs, experts need to continuously reevaluate them: Are they still the best tool for the job? Has the nature of identifiable information changed in any way?

Sharing information and tools

The appropriate community also needs access to that information. Best practices and lessons learned can prevent other teams from making similar mistakes, or save them time in evaluating their own tools. Layden said that stakeholders need to continuously build out, test and validate that framework for it to keep doing its job.

The capabilities have to continue to evolve because the threats will, and public health professionals have to keep pace.

“One of the challenges in public health broadly — and not unique or not new — is bringing in the more advanced analytic capabilities, the workforce expertise,” Layden said. “We’ve also looked at ways to partner with academic and private partners, recognizing that our bandwidth, our capabilities to understand the full spectrum of tools and how they could be used … will be slower to build up those capabilities in-house. So how we can partner with experts either in academic or private is another way for us to build up the capabilities, our understanding, as well as expertise.”

One way CDC accomplishes that is through the use of shared tools. For example, Layden said more than half of state jurisdictions use a tool for case investigation that the CDC operates and maintains. Similarly, there’s a shared surveillance system for tracking emergency room data. And there’s a shared governance model to help support the development and sharing of even more tools.

One of the benefits of sharing tools like this is it encourages sharing data more broadly, and in the same formats, reducing the amount of work data scientists have to do to reconcile the data before they can begin analyzing it.

“So in my mind, public health, the more we can share, build up enterprisewide tools that can be used and leveraged appropriately is one step that we need to continue to take and to grow, but then also sharing the best practices,” Layden said.

DoD Cloud Exchange 2024: Splunk’s LaLisha Hurt on achieving digital resilience
Focus on three modernization musts to achieve cloud transformation: strategy, security and buy-in, says Splunk federal leader.

Military and civilian agencies have long struggled to make the jump to cloud computing. Deciding on the cloud approach and strategy that best aligns with their mission needs for today and tomorrow is no easy task. But more important, agencies continue to wrestle with modernization efforts amid concerns about potential security gaps and vulnerabilities the cloud introduces.

“It’s a tricky balance. The reason why it’s tricky is because organizations rely on various IT and security architecture applications and legacy systems implemented for their specific mission support. Another challenge is that many agencies struggle with having so many tools, having an influx of data coming in from various logs across all these disparate legacy systems — and they don’t integrate well. They don’t talk to one another,” said LaLisha Hurt, public sector industry advisor at Splunk.

Cloud security concerns persist for most federal agencies for a reason, Hurt said during Federal News Network’s DoD Cloud Exchange 2024.

In its 2023 CISO Report, for example, Splunk found that chief information security officers identified cloud applications and infrastructure as having the biggest security coverage gaps across industries, impacting business services, healthcare and technology at 71%, 64% and 64%, respectively, and manufacturing at 64%.

To address that problem within DoD, the Pentagon awarded the multibillion-dollar Joint Warfighting Cloud Capability contract to establish a common and secure cloud infrastructure. Last year, Chief Information Officer John Sherman instructed the military services to prioritize JWCC for their cloud modernization efforts. So far, less than 2% of the $9 billion contract has been utilized as concerns around security linger. 

Moving to the cloud, however, is essential to modernization efforts. Hurt noted that, in the end, it all goes back to the mission.

The California statewide automated welfare system, for instance, needed to ensure it delivered benefits for Californians in a highly secure and uninterrupted manner. The agency was able to replace three disparate legacy systems with one single cloud-based platform, which saved over $30 million in taxpayer dollars.

“While they also improved productivity and reduced risks, that’s really the mission that this particular entity was trying to solve for — safe, consistent access to benefits,” Hurt said. “And I think it’s similar for other agencies. They have their mission, and they’re looking for help to deliver on that.”

No transformation happens without collaboration

Cloud transformation starts with a strategy and gaining the support of various stakeholders to deliver on the strategy.

“I know that sounds simple, but people want to jump to the capabilities or technologies. But what’s that strategy that you’re trying to align to? And do you have buy-in from not only your leadership but the people that are going to be implementing it — your employees — which I think is equally important,” Hurt said.

Determining the right model depends on each agency or organization’s unique mission needs and on ensuring the model can scale as demands grow.

“So many customers are going cloud only. Some remain on-prem for unique mission needs. And then there are others that actually operate in a hybrid environment,” Hurt said. “And I don’t think there’s a right or wrong approach, as long as it serves your business needs. And also, as long as it allows you to scale in the future. That’s important.”

She continued: “The other thing I would say is to take a risk-based approach and ensure you have a strong inventory of assets, systems and classification prior to the migration. You might find that everything does not necessarily need to go to the cloud.”

Splunk spends the most time with customers conducting business value assessments to understand the pros and cons of moving to the cloud versus staying on premises, Hurt said.

“It goes back to the mission. What are the things that are mission-critical to your agency? What are the things that you care about most? And where do you want to house them? And what levels of security do you want to put around them? That will dictate whether you keep things on prem versus move to cloud,” she said. “Where are you trying to gain and obtain more efficiencies?”

It’s also important to expand participation in these conversations and bring in “not only your cyber teams but your infrastructure teams, your chief technology officer, your chief information officer,” Hurt said. “It’s really a cross-functional effort that should be considered when you’re building that cloud strategy.”

Discover more articles and videos now on Federal News Network’s DoD Cloud Exchange event page.

Federal Executive Forum Artificial Intelligence & Machine Learning Strategies in Government Progress and Best Practices 2024

Machine learning and artificial intelligence continue to play an important role in the evolution of agency people management, processes and technology. But how are strategies evolving to meet tomorrow’s mission?

During this webinar, you will gain the unique perspective of top government AI/ML experts:

  • Dimitri Kusnezov, Under Secretary, Science & Technology Directorate, Department of Homeland Security
  • Shane Barney, Chief Information Security Officer, US Citizenship & Immigration Services
  • Ramesh Menon, Chief Technology Officer, Defense Intelligence Agency
  • Matthew Graviss, Chief Data & Artificial Intelligence Officer, State Department
  • Katie Tierney, Area Vice President, Digital Services & Operations Management, BMC
  • Sujit Mohanty, General Manager, Public Sector, Field Engineering, DataBricks
  • Michael Hardee, Chief Architect, Law Enforcement & Justice, Red Hat
  • Moderator: Luke McCormack, Host of the Federal Executive Forum

Panelists also will share lessons learned, challenges and solutions, and a vision for the future.

DoD Cloud Exchange 2024: DISA’s Korie Seville on crafting cloud products that easily adapt to user need
The Hosting and Compute Center at DISA creates new services to lower the barrier to quick cloud adoption and scalability.

Rank Korie Seville among the Defense Department’s go-to guys for cloud computing smarts.

Seville’s title at the Defense Information Systems Agency might sound a bit cryptic: deputy chief technology officer for compute and senior technical adviser for J9 hosting and compute. But he’s clear about his two-hatted role helping Defense Department agencies succeed in their cloud deployments.

At the J9 — Joint Operations and Plans Directorate — level, “I basically act as an integration point between DISA’s hosting and compute directorate and the rest of the agency from a cloud computing perspective,” Seville said during Federal News Network’s DoD Cloud Exchange 2024. “At the external level, I basically work with other agencies’ CTO-level engineers and leaders within different agencies to advise them on their cloud portfolios, their migration strategies and their overall hosting and compute strategies.”

Seville said that cloud computing, as agencies move from simply hosting applications to reworking them into microservices, has enhanced DoD’s capability to distribute workloads and create greater resiliency. The evolution also improves computing outside the continental United States (OCONUS), a critical challenge for DISA and DoD generally.

Defense users gain “the capability to expand and get better services, no matter where they are across the world,” he said.

Tapping cloud effectively OCONUS

Asked about DISA’s signature cloud computing contracts, known collectively as the Joint Warfighting Cloud Capability, Seville said the program supports OCONUS needs by easing the buying process for cloud. JWCC “is just an acquisition vehicle,” he said. “It’s a way to purchase cloud, but it doesn’t necessarily by itself solve the OCONUS problem.”

Contractors on JWCC, though, do provide a variety of tactical edge platforms, ranging from modular data centers to backpack-sized computing units.

How teams want to use the cloud to transform how they deliver the mission is an intense focus for DISA and Seville. For instance, what if two deployed teams focused on the same area of responsibility want to collaborate, he said. How would that happen? How could it be done as an operational expense, “instead of having to put a large capital investment in to utilize these cloud capabilities?”

One answer is Stratus, a government-owned cloud product DISA has deployed in Hawaii, he said. A second option is DISA’s joint operational edge product that uses public clouds. Seville said DISA partners with the DoD chief information officer “to push public cloud capabilities to the OCONUS user community.” A new instance of that capability is under development for Japan and a couple of other locations.

Basically, it consists of one of the commercial cloud services providers, in this case Amazon, providing its hardware housed in DISA secure facilities and operated by government employees. Seville said DISA plans to add the other JWCC suppliers “to be able to get their enterprise-grade deployable solutions, put them in our facilities and have them there for consumption.”

Seville said his group partners closely with DISA’s program executive office for transport, which manages the network connections needed for computing nodes to communicate with one another.

“Their technical director and myself stay very connected. We basically sit together and share roadmaps,” he said, adding that sometimes “my roadmap is going this way, your roadmap is going that way.”

When that happens, the two offices work out “where can we meet and take advantage of some of the resiliency that each of us is building in to make our products operate better together,” Seville said. But they leave the choice of specific transport options to the users, he said.

Providing new DISA common services for the cloud

Another DISA cloud project now under development, and dubbed Olympus, focuses on common services that surround cloud workloads.

“These are things like name resolution, Domain Name System capabilities, certificates, network time — all of these things that are often overlooked in application deployment,” Seville said. “But they’re crucial to getting an application off the ground.”

Olympus will provide these services as needed. Two minimum viable products created so far for Olympus focus on core competencies:

  • Network connectivity and boundary protection
  • A basic suite of common services

The addition of common services elements will result in what Seville called a managed platform, “where the customers can just come in and drop their apps, and we remove the burden, or share the burden, of bringing all of those common services up and operational.”
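
To make the managed-platform idea concrete, here is a hypothetical manifest an application team might submit to inherit those common services rather than build them. Every key and value is invented for illustration; this is not DISA’s actual Olympus interface:

```python
# Hypothetical request an app team might hand the platform instead of
# standing up DNS, certificates and network time themselves.
app_manifest = {
    "app": "logistics-dashboard",
    "inherit_common_services": {
        "dns": True,                  # name resolution managed by the platform
        "certificates": True,         # issuance and rotation handled centrally
        "network_time": True,         # synchronized time source
        "boundary_protection": "default-policy",
    },
    "workload": {"image": "registry.example.mil/logistics:1.4", "replicas": 3},
}

# The platform, not the team, now owns every service marked inherited.
inherited = [k for k, v in app_manifest["inherit_common_services"].items() if v]
print(inherited)
```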

The basic goal is to help DISA’s customers meet cloud needs quickly by “really lowering the barrier for entry for getting started in cloud.” He pointed out that the Air Force’s Cloud One and the Navy’s Flank Speed programs provide similar services. But because those service-driven projects are focused on their respective organizations, “we designed Olympus to catch the customers that may have fallen through the cracks,” Seville said.

DISA hosts the pilot version of Olympus in the Microsoft Azure cloud. Seville stressed that the Hosting and Compute Center (HAC) takes an iterative approach informed by customer feedback when crafting products, Olympus included.

“When we develop any of our capabilities, we really try to get away from that five-year plan, 10-year plan, where we know exactly where we’re going to go and nothing can force us to deviate,” he said, and added, “The most important thing to us is our customers, the warfighters. They know their missions better than we do. For us to prescribe where we’re going to go doesn’t make sense if our goal is to support the warfighter.”

Ensuring ‘optionality’ in all DISA cloud offerings

HAC views providing choice as a foundational factor in helping DoD users implement the cloud capabilities they need to meet their specific situations and missions.

“One of the design tenets, and one of the tenets of our entire organization, has been optionality,” Seville said. “And so when I have an OCONUS user who’s trying to build out a capability, we’re going to provide them with a menu of options.”

He used the analogy of a pizza parlor menu, where a customer can choose from a variety of toppings for their pie: “Do they want a combination of tactical edge, operational edge and maybe some data center as a service to give them the ultimate level of resilience? Or do they want to go strictly tactical edge and just maintain local ownership of that computing capability?”

As cloud hosting has taken hold in DoD, Seville said he’s now seeing increased use of the elasticity and flexibility cloud computing offers. An important reason is that early estimates of cost savings from simply shifting workloads failed to pan out.

“People are starting to realize that taking advantage of elastic scale, taking advantage of serverless capabilities, that’s how you’re going to save that money,” he said. To get there, though, application owners will have to go the refactoring or redeveloping route. And he said users will also have to keep rationalizing their application sets, retiring those that won’t work in the cloud.

“There is an app refactor model that has to take place in order for you to effectively take advantage of elastic scale,” Seville said. DISA can partner with users redoing applications to help them fully realize cloud benefits.

By going to containerization and microservices for applications, Seville said, users will get closer to cloud interoperability and easily moving workloads among competing cloud providers. That vision of a “cloud-agnostic, multicloud, hybrid cloud, pick-up-an-app-and-move-it-wherever-I-want model really relies on that app rationalization, that app modernization framework.”

Discover more articles and videos now on Federal News Network’s DoD Cloud Exchange event page.

Generative AI: Start small but scale fast

Generative artificial intelligence is unlike any technology that’s come along in recent memory. One reason: You’d be hard pressed to find an application or process to which generative AI doesn’t apply. In some sense, it can do more than it cannot do.

That, plus the technology’s sudden emergence in media and at so many industry conferences and gatherings, has organizations worried. They want to avoid rapid obsolescence by failing to adopt generative AI right away.

David Knox, the chief technology officer for industrials, energy and government at Oracle, urges federal agencies to take a measured approach. He says you might be able to do anything with generative AI, but you can’t do everything. Knox recommends starting with what he called proving grounds — low-risk, low-complexity processes with which to try out AI.

Even in such entry use cases, it’s wise to test the work in an isolated “sandbox” environment while you evaluate the benefits and ensure the requisite security and privacy controls remain in place.

If the resulting AI-powered application does work as intended, agencies then need to be able to create “a path to production,” Knox says. That means knowing your compliance framework and requirements in advance, as with any new technology deployment.

Identifying the proving ground generative AI applications will enable users to operate in what Knox calls the “find/fail fast/fix” mode. For example, nearly every agency deals with human capital processes such as hiring, retention and performance management. Knox advises choosing a single process and using real data to test both the efficacy of generative AI and whether the resulting process retains those crucial compliance measures. Then, if it does, apply the process in a limited production scope before scaling to agency-wide use.

Given the wide potential of generative AI, nearly every federal process is a candidate for its application. A less intuitive but potentially high-payoff candidate, Knox says, would be capturing and preserving the institutional knowledge held by the generation of federal employees eligible to retire, the boomers and late millennials.

Such people “have an incredible amount of institutional knowledge of just what are the programs, what are the processes, how to get things done,” Knox says. The information may or may not be written down, and if it is, often no one knows where. Recording knowledgeable people’s answers can create an unstructured database to which generative AI is ideally suited. He cautions that unstructured knowledge capture can be risky, especially when using generative AI to do it. Knox advised a trust-first mindset, with human supervision, rather than using people’s answers to directly train AI.

Knox cited procurement as another internally facing function where such knowledge capture would have big payoff, asking senior practitioners about systems, policies, acronyms, norms of operating, compliance rails and other things to be aware of.

One externally facing area for applying generative AI is citizen-serving applications. Here, Knox said, agencies can use AI for what he termed document understanding. He defined that as going beyond optical character recognition and parsing data in individual fields, and beyond converting documents to digital images, by essentially turning documents into an interactive knowledge base.
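
A skeletal sketch of document understanding in that sense, with a naive keyword retriever standing in for a production embedding index and generative model; the documents, function names and data are illustrative only:

```python
def build_index(documents: dict[str, str]) -> dict[str, set[str]]:
    """Map each document to its lowercase token set (a stand-in for a
    real embedding or search index)."""
    return {doc_id: set(text.lower().split())
            for doc_id, text in documents.items()}

def ask(index: dict[str, set[str]], documents: dict[str, str],
        question: str) -> str:
    """Return the document that best overlaps the question's tokens;
    a production system would pass this context to a generative model
    to compose an answer."""
    q_tokens = set(question.lower().split())
    best = max(index, key=lambda doc_id: len(index[doc_id] & q_tokens))
    return documents[best]

docs = {"form-7": "benefit claim filing deadline is 90 days",
        "form-2": "address change requires proof of residence"}
print(ask(build_index(docs), docs, "what is the deadline for a benefit claim"))
```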

In all cases of AI deployment, Knox said, it’s important to keep in mind the human factor, because people often think AI will somehow replace them. Given the complexities of federal procurement, HR management, finances and accounting, grant-making and program management, Knox said, people will realize instead how AI will benefit them — not by replacing them, but rather by augmenting their decision-making and removing routine or repetitive tasks connected to their jobs.

“We didn’t get rid of people when we invented calculators,” he said. “We’re not going to do that with generative AI.”

The federal AI playbook: Mission transformation for the AI era
Learn how AI innovation has the potential to transform all areas of government.

Artificial Intelligence (AI) will likely be the most impactful technology of this era. Its impact on essential workflows, from content creation to search and analysis to decision support, is already being felt across the government.

Government initiatives—from the 2020 AI in Government Act to the Biden Administration’s recent AI Executive Order—speak to the fundamental tension that agencies must navigate as AI becomes more available and capable. Innovation vs. risk is always a difficult balance, but AI’s potential is only starting to be understood.

Download this ebook to learn more from Microsoft.

DoD Cloud Exchange 2024: Akamai’s Robert Gordon on streamlining cloud operations at scale
Managed service providers can tackle the coordination necessary across cloud providers, DoD agencies and a multitude of apps, the Akamai systems engineer says.

For the Defense Department, the benefits of hosting applications in the cloud come with some challenges. Chief among them? Avoiding the cost and time of repeatedly developing computing services common across all applications.

The simple answer is that DoD agencies should instead develop once and use many times, said Robert Gordon, director of engineering for Akamai Defense. The development and deployment of such services offers an ideal way to use managed service providers, he said. MSPs operate in a value-added manner between DoD agencies and primary commercial cloud service providers.

“On one side, there are the mission application teams that are trying to move their workloads to the cloud. On the other side, MSPs sit on top of the commercial clouds, and they try to figure out ways to be able to support these applications at massive scale,” Gordon said during Federal News Network’s DoD Cloud Exchange 2024.

DoD components sometimes have hundreds or thousands of cloud-hosted apps in what he called critical mass. Each, though, needn’t have its own unique services.

For example, every cloud application requires a user access mechanism that’s not related to the operation or logic of the app itself. Access solutions, Gordon said, can be difficult to engineer because of the many DoD rules around security and other characteristics.

Akamai “focuses on those common hard problems because the benefit of solving that problem is multiplied by hundreds or thousands of instances,” he said.

After access comes authentication, “and how it fits in with the zero trust initiative that’s sweeping through DoD and is totally tied in with the cloud is another aspect of this,” he said. “The mission application teams are on their own to try to figure out how to do it, unless there’s a common services layer” providing the service.

Such common services “are the things the MSPs should look for, the things that everyone’s going to have to do,” he added. “Everyone’s going to have to solve this problem.”

Taking advantage of common services at scale

Some common services occur on the back end of applications, such as database calls or network connections among apps, Gordon said. He named single sign-on systems that require connections from, say, an application in the Army to an application in the Defense Information Systems Agency.

“They may not have a plug into the Army, or Air Force or whatever DoD backend that has all the enterprise information,” Gordon said.

Plus, application owners typically face a complicated process to obtain access.

An agency’s tech or development staff might know how to write identity cloud service or Security Assertion Markup Language code, Gordon said. “But that’s only part of the puzzle. The back end is equally important,” he said. “You have no way of figuring out what the attributes are that you need to make your decisions. You have no way of enforcing authorization in a common way, using those attributes.”
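
The “attributes you need to make your decisions” that Gordon mentions map to attribute-based access control. A minimal sketch, assuming the attributes have already been extracted from an identity assertion upstream; the attribute names and policy are invented:

```python
def authorize(user_attributes: dict[str, str],
              required: dict[str, set[str]]) -> bool:
    """Attribute-based access control: every required attribute must be
    present with an allowed value. In practice the attributes would
    arrive in a SAML assertion or OIDC token from the identity provider."""
    return all(user_attributes.get(attr) in allowed
               for attr, allowed in required.items())

# Invented policy: the app requires an Army affiliation and at least
# a secret clearance.
policy = {"branch": {"army"}, "clearance": {"secret", "top-secret"}}
user = {"branch": "army", "clearance": "secret", "role": "analyst"}
print(authorize(user, policy))  # True
```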

Migrating data to the cloud and operating data exchanges also provide opportunities for use of common services, he said.

“Whether database access, or even system-to-system communication, most of these are big, complex systems with a lot of trading partners that are used to being able to FTP files to each other,” Gordon said.

That’s because everyone was on the same DoD information network. The cloud complicates those exchanges and communications connections because now systems use the internet and commercial clouds.

“This is another area where the MSPs provide common services to try to streamline that,” Gordon said. “And when they can’t provide common services, MSPs at least provide playbooks so that the application teams that need to do these things know what they need to do it in a compliant, secure and data-aligned way.”

Discover more articles and videos now on Federal News Network’s DoD Cloud Exchange event page.

Digitizing issue management: Enabling decisions for finance and risk
How is your agency organizing efforts around enterprise risk management and digitizing processes?

Officials at the Defense Department and the Department of Veterans Affairs are in very different places historically when it comes to audit readiness, but both agencies face similar challenges with legacy systems and data compatibility as they modernize their financial management processes.

The Pentagon has famously never passed an annual audit. DoD said it made incremental progress toward that goal last year. The Marine Corps made history as the first military branch to receive an unmodified audit opinion earlier this year.

Meanwhile, the VA last year achieved its 25th clean audit in a row. Edward Murray, the principal deputy assistant secretary for management at the VA, said agency officials have that record in mind as they carry out its large-scale financial system modernization program.

“When we did our major system migration, we felt we had a lot of turf to protect,” Murray said on Federal News Network. “We were told, ‘Be successful with your new system implementation, and don’t lose our clean audit opinion.’”

But Murray said the VA isn’t just trying to maintain its previous success. He said the modernization effort is also attempting to build on previous audit findings to address some of the agency’s longstanding gaps and weaknesses.

One of the VA’s biggest gaps, Murray said, has been the lack of integration between its acquisition and financial systems. The new system will integrate those two areas, such that auditors will be able to clearly link financial transactions back to procurement data and contracting actions.

The VA’s National Cemetery Administration recently became the first agency component to adopt that integrated system, Murray said.

“We’re looking to elevate where we were and streamline processes, improve internal controls, all those good things that you would expect once you’ve achieved a clean audit opinion level for some time,” Murray said. “Preserve it and grow it.”

Marine Corps audit success

The Navy and DoD as a whole, meanwhile, are taking a lot of lessons forward from the successful Marine Corps audit.

“It really highlighted some of the enterprise policy changes that we have to make to enable this,” said Alaleh Jenkins, the principal deputy assistant secretary of the Navy for financial management and comptroller, during the webinar.

“There are things that we do just because we have been always doing it,” she added.

A major factor is “change management,” Jenkins continued. The Navy is attempting to bring together several systems and processes under its Enterprise Resource Planning program. It’s a massive undertaking with numerous data and system integration challenges.

“The more interfaces we have with the legacy system, the more it complicates our situation for data quality,” Jenkins said. “It brought in a different level of attention to use of data analytics.”

DoD’s ADVANA big data analytics platform has been instrumental in helping military components bring their data together into one system, she said, including helping the Marine Corps address its audit issues.

“But it really came down to every chief warrant officer and the civilians in the Marine Corps and the leadership at all levels of the organization to get after it,” Jenkins said.

Financial data and digitization challenges

As DoD officials look to build on the Marine Corps’ success, they will have to address a long-standing challenge: the myriad IT systems across the department.

A recent DoD inspector general audit found the department operates at least 4,500 unclassified IT systems. The IG reported that DoD does not have a “complete or accurate” inventory of systems relevant to financial reporting.

Shawn Lennon, the finance deputy director and deputy chief financial officer at the Defense Logistics Agency, said managing data flows between disparate systems remains a major challenge for audit readiness.

“When you look at it from a financial reporting perspective, every time you transmit data between systems, there’s a risk that exists, that that data is not transmitted accurately, completely and timely between those two systems,” Lennon said.

Lennon also serves as director of DLA’s Financial Improvement and Audit Remediation effort. He said the logistics agency is undertaking several “digital business transformation” initiatives, including upgrading the warehouse management system used to track parts and materiel across the military services.

“That’s a really key part of it — doing that true business process reengineering, challenging your assumptions, looking for more efficient ways to go to standard, and working with your customers,” Lennon said. “We work every day with the military services in one way or another to try to find solutions that’ll work for them. And for us, number one: to drive cost down. And number two: to improve financial reporting outcomes to drive audit opinions.”

DLA is also implementing a new identity, credential and access management system. And it’s in the process of adopting the Treasury Department’s “Government Invoicing,” or “G-Invoicing,” system.

The adoption of the Treasury Department service will be crucial, Lennon said, to addressing a major financial management challenge across the federal government: unsupported intergovernmental account balances.

“DLA sells a lot of materials — from clothing and construction, to weapons system parts, to fuel — to the military services and others throughout the federal government,” Lennon said. “In the future, all of those transactions will flow through that system to ensure that both trading partners, the buyer and the seller, have the same information and that we stay in sync.”
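In principle, keeping both trading partners “in sync” reduces to buyer and seller agreeing on the same orders and the same amounts. Here is a minimal, hypothetical sketch of that two-sided match in Python; the record fields are invented and do not reproduce the actual G-Invoicing data standard.

    from decimal import Decimal

    def match_intergovernmental(buyer_records, seller_records, key="order_id"):
        # Pair buyer- and seller-side records and report unmatched or
        # disagreeing balances. Field names here are illustrative only.
        buyers = {r[key]: Decimal(r["amount"]) for r in buyer_records}
        sellers = {r[key]: Decimal(r["amount"]) for r in seller_records}
        mismatches = []
        for order_id in buyers.keys() | sellers.keys():
            b, s = buyers.get(order_id), sellers.get(order_id)
            if b != s:  # missing on one side, or amounts disagree
                mismatches.append((order_id, b, s))
        return mismatches

Every entry the function returns is, in effect, an unsupported intergovernmental balance: one side of the trade recorded something the other side cannot confirm.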

Similarly, as the Navy migrates data into its new ERP system, Jenkins said the department is working to ensure it has access to consolidated and timely data across functions like acquisition, HR and logistics. The Navy has created a special enclave of the DoD-wide ADVANA platform called “Jupiter” where a lot of those applications are becoming available. She said the Navy is spending a lot of time on “data cleansing” and ensuring the information is complete.

“Our data by itself, the finance data, it’s meaningless,” Jenkins said. “It is powerful when it’s connected with the rest of the data.”
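In practice, the data cleansing Jenkins mentions usually means running every inbound record through completeness and validity checks before it is loaded into the new ERP. A simplified sketch follows; the field names and rules are made up, standing in for whatever the Navy’s actual edit checks look like.

    REQUIRED_FIELDS = ("document_number", "fiscal_year", "amount", "fund_code")

    def validate_record(record):
        # Return the list of data-quality problems for one migration record.
        problems = [f"missing {f}" for f in REQUIRED_FIELDS if not record.get(f)]
        try:
            float(record.get("amount", ""))
        except ValueError:
            problems.append("amount is not numeric")
        return problems

    def cleanse(records):
        # Split records into loadable ones and ones needing remediation,
        # keeping the reasons alongside each rejected record.
        clean, dirty = [], []
        for record in records:
            problems = validate_record(record)
            (dirty if problems else clean).append((record, problems))
        return clean, dirty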

One of the initiatives the Navy recently established is the “Commanders Enterprise Resource Management Council.” The goal is to teach commanders how to use data analytics to improve budget execution and increase the department’s buying power, Jenkins said.

“Our data and analytics platform and automation is becoming part of our day-to-day journey,” she said. “And the more and more we can train our people to use that on a day-to-day basis, it helps us as a community and as an organization.”

Learning objectives:

  • Modernizing financial management systems for enhanced efficiency
  • Roadmaps and timelines for financial management systems transformation
  • Achieving and sustaining audit-readiness through system collaboration and automation

The post Digitizing issue management: Enabling decisions for finance and risk first appeared on Federal News Network.

DoD Cloud Exchange 2024: Army’s Leo Garciga on clearing obstacles to digital transformation https://federalnewsnetwork.com/cloud-computing/2024/03/dod-cloud-exchange-2024-armys-leo-garciga-on-clearing-obstacles-to-digital-transformation/ https://federalnewsnetwork.com/cloud-computing/2024/03/dod-cloud-exchange-2024-armys-leo-garciga-on-clearing-obstacles-to-digital-transformation/#respond Thu, 21 Mar 2024 11:30:43 +0000 https://federalnewsnetwork.com/?p=4933567 The Army CIO expects the service's new software development policy will bring better capabilities to soldiers faster.

The post DoD Cloud Exchange 2024: Army’s Leo Garciga on clearing obstacles to digital transformation first appeared on Federal News Network.


Leonel Garciga has been on a sprint with a bulldozer.

Since becoming chief information officer of the Army last June, Garciga has been clearing policy obstacles, built up over the course of decades, to help spur digital transformation from the private to the general to the service’s secretary.

For most, the job of CIO is a marathon. But Garciga needed to start at a sprinter’s pace to — in the words of Army Secretary Christine Wormuth — “start breaking through the bureaucracy of the department.”

Even so, he acknowledges that the marathon part of his job will begin once the policy obstacles are cleared. But for now, he continues to open the throttle on the bulldozer.

“If it’s a policy challenge, if it’s a standard operating procedure challenge, I’m the guy with the pen. Help me fix that. If you’ve got lessons learned inside industry or in the commercial space, bring those standard operating procedures, bring those policies over, bring those guardrails over. Let’s put it on paper, and let’s get it signed out. Don’t let that prevent us from delivering,” Garciga said during Federal News Network’s DoD Cloud Exchange 2024.

“That’s the big thing that I keep pushing. Instead of saying, ‘Hey, policy doesn’t let me …,’ tell me this is what the policy should say, and let’s get that signed out. Let’s work through the friction and get that done. I continue to tell folks like that’s where we need the most help. We need to make sure that we get that alignment done because right now you’ve got someone who likes moving really fast, and I’m willing to underwrite a significant amount of risk when it makes sense.”

Focusing on 5 software reforms for Army DevSecOps

In March, Garciga pulled out his bulldozer to topple the Army’s approach to software development. Wormuth issued a new agile software policy detailing five changes to reform what she called the institutional processes of the Army.

The software reforms include everything from changing the way the Army writes requirements to emphasizing flexible acquisition approaches and training the workforce in these methods.

Garciga said the policy changes will help the service streamline its ability to build contracts based on agile and DevSecOps methodologies.

“A really big push includes centralizing some of that work at an acquisition digital center of excellence, which will be focused around these core agile contracts that we want to get out the door to support software development efforts,” he said. “The next big piece is really changing our approach to requirements by taking the holistic view we’ve had before to write these large dissertation type requirements and scaling them down to capability needs statements. So what it really does is take that requirements process and bring it down to core functionality versus those [individual systems] and allowing teams to have a little bit more left and right limits as they move forward.”

These changes aren’t just for IT or development teams. Garciga said the acquisition and nonacquisition workforces, as well as the test and evaluation experts, must all move in the same direction to meet the Army’s digital transformation goals. Otherwise, he said, creating a modernized foundation to build off of will be more difficult.

The Army can’t just write a policy and expect change to happen, which is why Garciga said the new digital center of excellence at Aberdeen Proving Ground in Maryland will take the lead in procuring software.

“The center will include subject matter experts who understand software development, who can help customers really flesh out how they want to get from that contract, put it in place in the most agile way that really does include all those requirements for agile development, sprint cycles and all those things that you need expertise in,” he said.

“The other piece, which is a Step 2 that’s happening simultaneously, is a team the CIO’s office is standing up. It’s a very small cell, which is really focused on helping either big programs or really critical programs in the Army run through the wickets of a better software contract. Whether it’s legacy stuff that we have that may need some shaping to get the right agile contract in place or to get the right task orders in place, we would bring our expertise with some software development experts and some engineers to help the command or the program really reshape their contracting efforts in coordination with the center of excellence for digital contracting.”

Turning to industry partners for Army cloud assist

The software expert cell already is working with a handful of Army commands on specific and critical programs. Garciga said the next step is to create a governance structure to help manage expectations and data. He said that will come together this spring.

Garciga expects that the changes will help the service work better with existing and potential vendor partners.

“With the traditional contracting approach, we alienated some of our more leading edge partners because we were telling them to go backwards to deliver,” he said. “I think that this is going to give some flexibility to these companies to bring in some expertise and so they can more healthily compete in the environment. For some of the folks that have been supporting us a long time, are good partners who haven’t had the opportunity to take that next step, this is really going to give them a landing pad to accelerate some of those efforts.”

Along with the new software policy, Garciga has led efforts to update guidance on reciprocity of security authorizations, issue a software container policy and roll out a new software-as-a-service policy.

All of these efforts, of course, are underpinned by the use of cloud services. To that end, Garciga said his office is close to releasing the revamped cArmy platform, with cArmy 2.0 possibly launching in April.

The service added agility based on all the lessons learned and made the cloud platform a bit more user-friendly for Army partners, Garciga said.

“A lot of work is happening in that space. We’re working the AWS side to create a new landing zone. We’ll start to transition some of the existing customers into a new landing zone, which I’m excited about because it’s going to ease a lot of their pain and some of their challenges with just getting day-to-day operations done,” he said. “Then after that, we’ll move on to Microsoft Azure, and we are still looking at where we have opportunity with some of our other cloud service providers.”
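For a sense of what transitioning tenants into a new landing zone involves, a platform team might first verify that every account carries the baseline tags its templates expect. Below is a hypothetical readiness check using the AWS Organizations API; the required tag keys are assumptions, not an Army standard.

    import boto3

    def untagged_accounts(required_keys=("Environment", "Owner", "ImpactLevel")):
        # Enumerate every account in the organization and report the ones
        # missing baseline tags. The tag keys here are purely illustrative.
        org = boto3.client("organizations")
        missing = {}
        for page in org.get_paginator("list_accounts").paginate():
            for account in page["Accounts"]:
                tags = org.list_tags_for_resource(ResourceId=account["Id"])["Tags"]
                present = {t["Key"] for t in tags}
                gaps = [k for k in required_keys if k not in present]
                if gaps:
                    missing[account["Id"]] = gaps
        return missing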

Applying lessons from early Army cloud moves

The decision to update cArmy meant the service took a “tactical pause” over the last few months in moving workloads and applications to the cloud.

Garciga said the pause let the Army reevaluate its delivery model around cloud services.

“Like most traditional folks and enterprises who moved to the cloud, we raced in some areas, and we made some mistakes. We did some things that made sense at the time but don’t make as much sense now. And as new cloud services have become available in the regions across all our cloud service providers, it’s really caused us to rethink some of the technical work that’s been done,” he said.

“We made some decisions that made sense to do, like physically lifting and shifting a capability and just running the infrastructure as a service. It made sense at the time for the services that were available and for what we were trying to do to overcome some challenges that we had as an Army and in some of our server rooms. But we did that probably in the least optimized way. As we’re now two, three, four years down the road, we’re like, ‘Wow, that’s really suboptimized. Our costs are really high here.’ ”

That’s particularly true for some of the services and systems the Army moved to the cloud early on, Garciga said. The end result? The Army created new legacy technology debt in the cloud, he added.

The new cArmy platform should streamline the service’s ability to deliver core enterprise cloud services, reduce the number of trouble tickets the help desk receives and provide standardized templates for vendors and customers alike.

“You can be a little bit more predictable on what kind of capabilities you want to deliver and how you want them delivered. We are really focusing on some foundational things that will allow the acquisition community and our partners to understand what the environment looks like in a more streamlined way,” Garciga said.

“We will streamline onboarding services and really automate as much of the onboarding for customers as we can. We really want to deliver a lot of the information upfront. What does the environment look like? What do our images look like? What baseline managed services are we delivering as an Army to your tenant? Getting that out is hugely important. So our focus is going to be making sure that we make that available to all the folks that are coming into the environment. This will make it a little bit easier for folks to come in.”
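One way to deliver that information upfront is a machine-readable onboarding manifest for each tenant spelling out the environment, approved images and baseline managed services. The sketch below is purely illustrative; the image names, service names and structure are assumptions, not the actual cArmy onboarding format.

    import json

    def tenant_manifest(tenant_name, impact_level, region):
        # Assemble an illustrative onboarding manifest for one new tenant.
        # Every value here is hypothetical; a real platform team would
        # publish its own schema for images, guardrails and services.
        return {
            "tenant": tenant_name,
            "impact_level": impact_level,
            "region": region,
            "approved_images": ["hardened-linux-2024.03", "hardened-windows-2024.03"],
            "baseline_services": [
                "identity-and-access",
                "centralized-logging",
                "vulnerability-scanning",
            ],
        }

    if __name__ == "__main__":
        print(json.dumps(tenant_manifest("example-program", "IL5", "us-gov-west-1"), indent=2))

Publishing that kind of manifest before a tenant ever opens a ticket is what turns onboarding from a help desk exchange into a self-service step.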

Discover more articles and videos now on Federal News Network’s DoD Cloud Exchange event page.

The post DoD Cloud Exchange 2024: Army’s Leo Garciga on clearing obstacles to digital transformation first appeared on Federal News Network.
