Roundtables - Federal News Network https://federalnewsnetwork.com

Ask the CIO: Defense Logistics Agency https://federalnewsnetwork.com/cme-event/federal-insights/ask-the-cio-defense-logistics-agency/ Fri, 28 Apr 2023 15:39:07 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4551064 In this exclusive webinar edition of Ask the CIO, host Jason Miller and his guests from the Defense Logistics Agency will dive into the warehouse modernization system and future strategies at the DLA. In addition, Zebra Technologies' John Wirthlin will provide an industry perspective.

In this exclusive webinar edition of Ask the CIO, host Jason Miller and his guests from the Defense Logistics Agency and Zebra Technologies dive into the warehouse modernization system and future strategies at the DLA.

Click here for more on this discussion.

Learning Objectives:

  • Status of DLA’s Warehouse Modernization System
  • Reasons to Modernize
  • Lessons Learned
  • Workforce Readiness and Training
  • Industry Analysis

Complimentary Registration
Please register using the form on this page or call (202) 895-5023.

DISA pushes companies to adopt standards to ensure interoperability across zero trust architectures https://federalnewsnetwork.com/cme-event/federal-insights/ciso-handbook-defense-information-systems-agency/ Tue, 25 Apr 2023 14:32:35 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4551000 During this exclusive CISO Handbook webinar, moderator Justin Doubleday and guests Brian Hermann from the Defense Information Systems Agency and Christopher Day from Tenable will explore zero trust progress and strategy at DISA.

The success of the Defense Department’s zero trust push is inherently going to rely on the tools and services of contractors, who will help fill the gaps in the 45 security capabilities laid out in DoD’s zero trust strategy.

And for Brian Hermann, the director of cyber security and analytics for the Defense Information Systems Agency, one of the key things he needs from vendors amid DoD’s push to its “target” level zero trust architecture by 2027? Honesty.

“I think we also need everybody to be realistic about what their tools bring to the fight,” Hermann said on Federal News Network. “Because one way that I know that a vendor is not telling me the truth is if they tell me that their tool can hit the easy button for zero trust. That’s just not realistic.”

DISA is advancing DoD’s zero trust architecture push through its “Thunderdome” program, which successfully completed its prototype phase earlier this year. The agency is now looking to add other elements to the program. So far, Thunderdome has featured two primary capabilities: software-defined wide area networking (SD-WAN) and Secure Access Service Edge (SASE).

“Those capabilities proved themselves to be successful so that we achieved a decision to more broadly deploy those capabilities across this terrain,” Hermann said.

Thunderdome isn’t a one-for-one replacement for DISA’s Joint Regional Security Stacks, but the program is putting in place zero trust capabilities that replace the functionality of JRSS, Hermann explained. The SASE capability puts the security stack “much closer to the customer” than the JRSS entry points, he said.

“That’s a key capability for us to be able to make critical access control decisions. I call them fine-grained access control decisions, so we can leverage information about a user, about their device and make that access control decision,” Hermann said. “And we’re using it today right now with the folks that have been successfully piloting the capabilities as part of Thunderdome.”
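To make the idea of a fine-grained access control decision concrete, here is a minimal sketch in Python. It is purely illustrative and not DISA’s implementation; the attributes and policy rules are invented for the example, but they show how information about both the user and the device can feed a single allow-or-deny decision.

    from dataclasses import dataclass

    @dataclass
    class User:
        username: str
        mfa_verified: bool      # passed multifactor authentication
        clearance: str          # e.g. "secret"

    @dataclass
    class Device:
        device_id: str
        managed: bool           # enrolled in enterprise device management
        patched: bool           # meets the current patch baseline

    def access_decision(user: User, device: Device, sensitivity: str) -> bool:
        """Allow access only when both user and device attributes satisfy policy."""
        if not user.mfa_verified or not device.managed:
            return False
        if sensitivity == "high" and not (user.clearance == "secret" and device.patched):
            return False
        return True

    # Example: a cleared user on an unpatched laptop is denied a high-sensitivity resource.
    print(access_decision(User("jdoe", True, "secret"), Device("laptop-42", True, False), "high"))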

But that’s also where the industry piece comes back in. As the military services and other DoD components adopt zero trust capabilities, the key will be ensuring the various tools and services work together.

“I don’t necessarily care whether we use exactly the same tools, or whether we use exactly the same contracts — you’d like to try to make sure that we save money as we do this – but most importantly, I want to look for those places where we need interoperability,” Hermann said. “And it’s where, to be frank, industry is not necessarily as mature as we would like. With SASE, we’re concerned that if organizations do too many different things, the vendors’ tools don’t talk to each other yet. So we’re driving for vendors to work together and establish some standards.”

Hermann said that DISA Chief Technology Officer Steve Wallace and his team have traveled to Silicon Valley to discuss interoperability with many of the technology companies involved in the zero trust security space.

“Realistically, we are so large, so complex, it’s unlikely that one single tool is going to be the selection for the entire department,” Hermann said. “And so if you need to work together and everybody gets a piece of this, going forward, how do we make sure that these things don’t generate a bad user experience?”

From the major cloud providers on down to specialized tool vendors, DoD officials will be looking to stitch together a zero trust architecture that’s effective in securing data, but doesn’t make the user experience a nightmare.

Christopher Day, the vice president of strategic capabilities and programs and chief technology officer at Tenable, said there are some key things DoD officials and their industry partners will need to consider as they build that architecture out.

“When I’m looking at modern products, say through a procurement, those are some of the things I start to look for,” Day said. “Can I make that system talk to another system? How can I move data from that system? I’m not going to be locked into some proprietary formats. Things like that. If a vendor is trying to lock me in to a weird format, or something like that, I get pretty sketchy about that. And so I think anybody who’s looking to tie multiple products together, those are some of the things you want to look for.”

Learning objectives:

  • Zero trust and cyber initiatives at DISA
  • Achieving the target level of zero trust
  • Industry analysis

Complimentary Registration
Please register using the form on this page or call (202) 895-5023.

Ask the CIO: NASA Procurement Innovation https://federalnewsnetwork.com/cme-event/ask-the-cio/ask-the-cio-nasa-procurement-innovation/ Fri, 27 Jan 2023 18:25:11 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4411490 In this exclusive webinar edition of Ask the CIO, learn about how one of the most well-known agencies partners innovation and procurement to meet its mission.


NASA is reconstituting its data analysis team to improve its decision making around acquisition.

The idea of using data to drive decisions isn’t new for NASA. But with the move to centralize much of acquisition across the space agency, data analysis became more critical.

Karla Smith Jackson, the assistant administrator for procurement and the senior procurement executive at NASA, said this data-driven analysis approach is different this time than previous attempts.

“We’re looking at how do we use data to get these better acquisition outcomes? So what does that really mean? We want to leverage e-business tools. We know we need a new contract writing system. We’re in the middle of market research for that to be able to get what we’re hoping is a whole procure-to-pay system. But if not, we will just take the procure piece and let the CFO deal with the pay piece,” Smith Jackson said on Ask the CIO. “We’re going to integrate our grants, as well as our cooperative agreements, in the contract writing system. That’s one big thing that we’re looking at. We’re looking at, hopefully, integrating our acquisition forecasting tool with that tool, and then the planning and design of all of our e-business tools.”

Review pricing trends

Smith Jackson said the new tools would help contracting officers and others in the acquisition workforce evaluate proposals by analyzing rates and pricing, for example.

She said NASA has centralized its pricing function so that each center no longer has its own individual experts; instead, there are 35 to 40 specialists in one office focused on sole source awards, competitive acquisitions and commercial items.

“That gives us a lot of capability and flexibility from a data analytics to be able to look at pricing trends. We can look at rates based on the size of the business, based on geographical location or whatever segment of the marketplace. We are populating databases and dashboard now, to get some insight into those areas in support of better pricing,” Smith Jackson said. “We also have business system challenges being accounting systems and estimating systems that our industry partners might have. We’ll be working with the Defense Contract Management Agency as well as the Defense Contract Audit Agency to be able to get real time data to help some of our negotiators and industry partners get to better deals.”

NASA created a new division, Enterprise Services and Analysis, to bring new capabilities and tools to the acquisition community. It is led by Geoff Sage, who took over the role in August after spending four years as the NASA Federal Acquisition Regulation (FAR) Supplement manager.

One of the first areas the new office is taking on is procurement administrative lead time (PALT). The Office of Federal Procurement Policy issued a memo in January 2021 aimed at creating a common definition and providing guidance for how agencies can reduce their acquisition lead times.

“Our first issue was data integrity and data cleansing or cleanup. We had an average in fiscal 2021 of 325 days for award of a contract, which we know is not correct. That means data was input into a database that was not accurate. We know what happened was when we made an award for an IDIQ contract, every task order that we would order reached back to that initial award of the umbrella contract. So then you just have a protracted PALT,” Smith Jackson said. “We were able to clean that up and we went to an average of 128 days in 2022. That was just based on data cleansing and cleanup. With respect to that we’re now going to be able to baseline our PALT, be it a competitive procurement, a modification or a sole source procurement, and then be able to figure out where are the long poles in the tent? How do we get better? Where do we go to streamline?”
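The data cleanup Smith Jackson describes amounts to measuring each order against its own start date rather than the parent contract’s award date. The short sketch below, with invented records, illustrates the difference; it is a simplified illustration, not NASA’s actual PALT calculation.

    from datetime import date

    idiq_awarded = date(2019, 6, 1)   # umbrella IDIQ award

    task_orders = [
        {"id": "TO-01", "solicited": date(2022, 1, 10), "awarded": date(2022, 4, 2)},
        {"id": "TO-02", "solicited": date(2022, 3, 1),  "awarded": date(2022, 7, 15)},
    ]

    # Wrong: measuring every task order back to the umbrella award inflates PALT.
    inflated = [(o["awarded"] - idiq_awarded).days for o in task_orders]

    # Right: measure each order from its own solicitation date.
    corrected = [(o["awarded"] - o["solicited"]).days for o in task_orders]

    print(f"Inflated average: {sum(inflated) / len(inflated):.0f} days")
    print(f"Corrected average: {sum(corrected) / len(corrected):.0f} days")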

Data-driven market research

In addition to PALT, NASA is using data to improve its acquisition forecast to industry. Smith Jackson said while the forecast has already received kudos from vendors, her office wants to give contractors more capabilities to sort data by location, commodity or schedule.

A third area of focus around data is improving NASA’s market research.

“This goes to the heart of our ability to get more small businesses into our community, our industrial supply chain or our supplier base,” she said. “Sometimes a small business can’t hold a team together if you can’t get that requirement awarded in a timely fashion. By having a market research, you know who’s capable, who’s qualified and when we have ready requirements, we hope to attract more small businesses.”

Clean, real-time data will be less effective unless NASA has the systems to handle it, so Smith Jackson said another major priority is improving the agency’s procurement systems.

She said NASA needs a new contract writing system as the current one is going to sunset in 2028.

“Our expectation is we’ll complete the market research within the next 30-to-45 days, and the plan is to get a request for proposals (RFP) out in late summer,” Smith Jackson said. “The other thing is with respect to an e-business tool, we do expect to leverage best-in-class (BIC) vehicles. Once we know what’s out there, what we’re finding in the market research is important. We don’t want to duplicate the wheel of other activities or other agencies have something similar to what we need, we’re going to be leveraging those abilities and working with those agencies.”

Part of not reinventing the wheel is a plan to take more advantage of BIC, including the new OASIS+ vehicles from the General Services Administration.

Big plans for OASIS+

Smith Jackson said she plans to release a notice to industry strongly encouraging vendors to get on OASIS+, as NASA will spend a significant amount of money through that contract in the coming years.

“That will allow us to move our spend under management to best in class contracts, and then we’ll be able to leverage, not just a streamlined manner and method of doing work, but also streamline the proposal process for industry to be able to respond to NASA requirements,” she said. “That’ll be transformational for us. It’ll also be something we’ll experiment with first in the NASA Acquisition Innovation Launchpad (NAIL).”

Smith Jackson added the agency doesn’t plan on spending 100% of its services spend on OASIS+, but, through the NAIL, they will run a few test cases and see how it works for them.

NASA launched the NAIL earlier this month, modeling it after the Homeland Security Department’s Procurement Innovation Lab. Smith Jackson’s goal is to create a safe place for mission and acquisition experts to experiment with ways to reduce cycle time and to be more cost effective.

“The way the NAIL is going to be set up is at the headquarters, where I sit and where we’re going to basically manage the framework. We’re putting together a NASA Innovation Council and they’ll have a representative from each one of our 10 centers,” she said. “What we’re looking for is proposed ideas for what we call enterprise level innovation. We’re not looking for individual innovations at each center because we are going to have a testbed at each center, where we can experiment on a much smaller scale. But the things that will be proposed to the headquarters will be enterprise level. We’re looking at two or more centers that would be impacted. Ideally, the entire enterprise would benefit. We’re going to be looking for, as I mentioned, procurement innovations, but also we’re looking for program management innovations. That NAIL Innovation Council will be feeding us those. Now as a companion to that we’re hoping to have a NASA Industry Innovation Council, and that will feed us innovations that industry would like us to look at.”


Cyber leaders aim to embed zero trust principles in systems https://federalnewsnetwork.com/cme-event/federal-insights/speeding-up-the-move-to-zero-trust/ Tue, 15 Nov 2022 13:53:17 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4352877 As civilian and defense agencies work through the nuances of incorporating zero trust strategies, the question becomes: How can this process be sped up? During this exclusive webinar, moderator Justin Doubleday will discuss tools and techniques accelerating the move to zero trust with agency and industry leaders.

Duration: 1 hour
Cost: No Fee

Officials behind the Air Force’s marquee cloud development platform are planning to help customers with the daunting first steps toward a “zero trust” security posture.

The Defense Department and the rest of the federal government have a mandate to adopt zero trust in the coming years. The Air Force’s Platform One is trying to simplify that journey for its customers by embedding some zero trust capabilities into the custom software factories it builds through its “Big Bang” offering.

“Instead of trying to look at the maturity model, seeing the many things that you have to do, and getting overwhelmed thinking you have to eat the elephant all at once, eat it a little bit at a time,” Kevin Twibell, the chief information security officer for Platform One, said on a “Speeding Up The Move To Zero Trust” panel.

“Let’s start with what is your system doing or not doing? What’s your code doing or not doing?” Twibell continued. “It’s not just the infrastructure that’s zero trust. It’s what your application is doing or not doing or supposed to be doing. So that’s why we want to embed all of that, we want to be able to build a better Big Bang that people can deploy and get notable value out of, and those people also talk back to us again, that open source community, talk to us, let us know how it went.”

Platform One is also looking at scaling the Cloud Native Access Point, or CNAP, which provides secure, authorized access to Defense Department resources in a commercial cloud environment.

“How does this scale? How does this go to protect other DoD systems?” Twibell said. “How do we continue to evolve that? How do we continue to integrate a better [identity credentialing and access management] into it? How do we look at more data tracking or tagging that can be embedded into each one of those features? So between those two programs and more on the back end, we’re constantly trying to embed zero trust. We need to utilize those services. When you deploy them, you’re getting a piece of that. So it’s less that you have to figure out on the back end for your infrastructure.”

The U.S. Patent and Trademark Office, meanwhile, has been designated as one of the key customer service providers by the Biden administration. That means Jamie Holcombe, the chief information officer at USPTO, has to be especially concerned about protecting the patents and other data that traverse his agency’s networks.

“It’s really essential that we know who’s coming in, and we authenticate that,” Holcombe said, adding that the agency is introducing multifactor authentication for its users. “We’re forcing our new users to authenticate.”

One facet of zero trust is assuming your network will be breached, and architecting it accordingly, so attackers can’t move laterally once they break in. But it can be a surprising exercise for organizations to actually map the dependencies and connections within their enterprise, according to Gary Barlet, the federal chief technology officer at Illumio.

“They suddenly realize, ‘Hey, why does my web front end have a connection to my database that’s not supposed to be there?’” Barlet said. “And it’s amazing the number of people that discover that they’ve got all these interconnections that they were never aware of. So we encourage people to go in, get that visibility, and then start turning those connections off. If a web interface doesn’t need to talk to a database server, then you shut down those ports and protocols that are being used for that communication.”
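Barlet’s example lends itself to a simple illustration. The snippet below is only a sketch, not Illumio’s product logic: it compares observed network flows against a declared allowlist, with all hosts, ports and flows invented, and reports the connections that should be shut off.

    # Declared dependencies: which flows are actually supposed to exist.
    ALLOWED_FLOWS = {
        ("web-frontend", "app-server", 8443),
        ("app-server", "database", 5432),
    }

    # Flows observed on the network during a visibility exercise.
    observed_flows = [
        ("web-frontend", "app-server", 8443),
        ("web-frontend", "database", 5432),   # the web tier should not reach the database
    ]

    for src, dst, port in observed_flows:
        if (src, dst, port) not in ALLOWED_FLOWS:
            print(f"Shut down {src} -> {dst}:{port} (no declared dependency)")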

Learning objectives:

  • Approaches to zero trust in the cybersecurity space
  • Tools and techniques of zero trust
  • Cloud-based infrastructure and zero trust

Complimentary registration
Please register using the form on this page or call (202) 895-5023.

Federal Executive Forum Defense and Homeland Cloud Computing in Government https://federalnewsnetwork.com/cme-event/federal-executive-forum/federal-executive-forum-defense-and-homeland-cloud-computing-in-government/ Tue, 15 Nov 2022 12:18:04 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4352829 Defense and Homeland Security agencies are focused on the mission of protecting the United States. But how do cloud computing programs fit into the next level of this mission? During this webinar, you will gain the unique perspective of top government security experts from the Army, USCIS, DHS, Navy and industry leaders.


Duration: 1 hour
Cost: No Fee

Defense and Homeland Security agencies are focused on the mission of protecting the United States. But how do cloud computing programs fit into the next level of this mission?

During this webinar, you will gain the unique perspective of top government security experts from the U.S. Army, DHS, USCIS, U.S. Navy and industry leaders.

The following experts will explore what the future of cloud computing in government means to you:

  • Paul Puckett, Director, Enterprise Cloud Management Agency, U.S. Army
  • Shane Barney, Chief Information Security Officer, U.S. Citizenship and Immigration Services
  • Dr. Mark Lucas, Director, Cloud Computing Operations, Department of Homeland Security
  • Louis Koplin, Deputy Chief Technology Officer, DON CIO, Department of the Navy
  • Jonathan Alboum, Federal Chief Technology Officer, ServiceNow
  • David Kelly, Technology Fellow, Deloitte Consulting
  • Evong Chung, Senior Director, Solutions Architecture, Red Hat
  • Moderator: Luke McCormack, Host of the Federal Executive Forum

Panelists also will share lessons learned, challenges and solutions, and a vision for the future.

Registration is complimentary. Please register using the form on this page or call (202) 895-5023.

Modernizing Cybersecurity at the Defense Department https://federalnewsnetwork.com/cme-event/federal-insights/modernizing-cybersecurity-at-the-defense-department/ Wed, 05 Oct 2022 19:53:38 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4286099 Thunderdome must clear an operational assessment and red team tests. DISA also faces the hurdles of scaling a new security tool and processes across enterprise DoD networks. We talk with DISA’s Drew Malloy about the challenges ahead.

Duration: 1 hour
Cost: No Fee

The Defense Information Systems Agency’s zero trust security model, Thunderdome, is moving toward the end of its pilot stage with a fielding decision scheduled for January 2023. First, the prototype needs to pass some tests before DISA can move on to the challenge of scaling it across Defense Department networks.

DISA awarded the Thunderdome other transaction agreement to Booz Allen Hamilton in January 2022. The OTA was extended for six months in July to work on a prototype to meet the requirements of the classified Secure Internet Protocol Router Network (SIPRNet), in addition to the original unclassified capabilities.

Drew Malloy, technical director for DISA’s cybersecurity and analytics directorate, said Thunderdome is now in the “operational assessment” phase. The program will go through red teaming — where testers simulate adversarial activity — before going to a fielding decision in January.

Thinking beyond the perimeter to gain an end-to-end cyber perspective

The zero trust model uses commercial capabilities like Secure Access Service Edge and software-defined wide area networks.

“The SASE solution is really top of mind as one of the big ones. We have our SD-WAN component that has security kind of built into our SD-WAN product so that we have more of a security stack that we can take and push down to the customer edge — get it closer to the data so that it’s more performant,” Malloy said.

“Then, we are taking a look at the application and our application security stacks that we have, as well as looking at from a cyber situational awareness perspective, how we do defensive cyber operations in the cloud? We’ve been very network-centric in our defensive posture for a lot of what we’re doing,” he continued. “So we wanted to take a look at all of the telemetry that’s being thrown off of some of the things that we’re putting out as part of Thunderdome. How do we look at that from an end-to-end perspective? Now, we aren’t just defending the perimeter or defending an application stack, we’re actually looking at end-to-end, user session–based security.”

Malloy said beyond getting Thunderdome’s capabilities organized, certified and tested, his team is also focused on the task of scaling them.

“The high-level strategy has remained relatively consistent, but we’ve looked at the actual implementations and what they’re going to look like,” he said. “We have such a huge footprint. How many different sites are we going to have? How are we going to manage those sites? What’s the provisioning going to look like? What’s the sustainment tail going to look like? Things of that nature have really been top of mind for how we push things out.”

Thunderdome’s zero trust prototype is intended to eventually meet the security needs of DoD’s fourth estate, the agencies that aren’t a part of the military services. Malloy said DISA is also open to working with other mission partners.

“But as a department, we have a pretty consistent track record of not agreeing on what one single solution is,” he added. “So we wanted to operate with that as a design constraint in mind to say, ‘There are going to be other solutions out there. How do we make sure that we work well together? How do we interoperate?’ That comes down to things as basic as identity, credential and access management. And then, how do we federate that solution to make sure that there’s really that consolidated view of identity within the department? And then moving to some of the capabilities within Thunderdome itself, how do we make sure that we aren’t isolating ourselves and/or having to stand up duplicative systems in order to achieve the same goal?”

Defense organizations move toward ‘quasi-enterprise’ zero trust

The military services typically have their own set of priorities and programs, and that’s no different with DoD’s zero trust security push so far.

“You’re not going to look at the Air Force, Navy, Marines, Army and tell them, ‘Wait for DISA to solve this for you,’ ” said Gram Slingbaum, federal solutions engineer at CyberArk. “The identity programs that are coming out of all the different services, they’re kind of having to stand up their own. And so they’re all quasi-enterprise. It’s not DoD-wide, but it’s maybe for a specific branch.”

The Pentagon has set a goal of reaching zero trust maturity across the vast U.S. military enterprise by 2027, while the White House Office of Management and Budget’s Federal Zero Trust Strategy directs civilian agencies to work toward adopting a zero trust architecture by the end of fiscal 2024.

The deadlines have agency IT teams moving out quickly to establish zero trust capabilities that can one day be implemented across their enterprises, trying to balance both speed and scale.

“We’re seeing pockets where it’s been highly successful and then grows into a bigger program,” Slingbaum said. “We’re also seeing folks that are on a smaller scale but have lost funding because it wasn’t an enterprise program. They pushed up the chain and were told to wait in line until the larger system comes on. So there’s not a one-size-fits-all here.”

Learning Objectives:

  • Thunderdome and Zero Trust
  • What’s after Thunderdome?
  • Industry analysis

Complimentary Registration
Please register using the form on this page or call (202) 895-5023.

For DoD, software modernization and cloud adoption go hand-in-hand https://federalnewsnetwork.com/cme-event/federal-insights/pushing-forward-on-dod-software-modernization/ Mon, 26 Sep 2022 19:47:45 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4266255 During this exclusive webinar, moderator Jared Serbu and guest Lily Zeleke, acting DCIO for information enterprise, Office of the DoD CIO with the Department of Defense will discuss software modernization strategy at the Department of Defense. In addition, Cindi Stuebner, futurist and senior director, industry markets, defense business line at Pegasystems will provide an industry perspective.

Duration: 1 hour
Cost: No Fee

The software factories that have sprung up throughout the Defense Department – and their embrace of DevSecOps methodologies – are a central part of how the Pentagon is thinking about modernizing its software development practices. But from the point of view of DoD’s top enterprise IT official, their growing role in the department’s technology development ecosystem is, in some ways, more of an evolution than a revolution.

The department released its first-ever software modernization strategy in February, and officials are now finalizing a more detailed plan to implement it. The strategy emphasizes the need to speed up software delivery times, including by better coordinating its existing factories to share code and commercial development tools across service boundaries.

Lily Zeleke, the acting deputy DoD chief information officer for information enterprise, said DevSecOps and the factories have maintained their importance while the CIO’s office – together with the department’s acquisition and research and engineering leadership – have worked to refine the implementation guidance.

But she said it’s also important to bear in mind that DoD isn’t new to the software development game.

“Ultimately, these are places where capabilities are already being developed. All software factories and the DevSecOps practice do is accelerate what we need to do to modernize the capabilities that we need. They seem sort of a mystery, but they’re really not. They’re sort of an evolution of development that we already do,” Zeleke said in an interview with Federal News Network. “Most of the ones you see are actually very mission-specific, whether it’s shipboard or airframe related, etcetera. We want to evolve the processes to make sure that we’re talking about the mission, the functionality, and what the software factories fulfill … that’s where the focus of some of the implementation plan is, when we put out guidance, these things need to be more at the forefront for our mission.”

The new software modernization strategy also serves as the latest iteration of DoD’s cloud computing strategy. It says the department needs a multi-vendor approach to commercial cloud services, and that it’s still a priority to migrate systems to the cloud.

But the emphasis isn’t on moving to the cloud for its own sake. Zeleke says the department now sees cloud computing as, first and foremost, an “enabler” for its technology modernization efforts.

“For us to be able to do all of the things we’ve outlined in the software modernization strategy, we need the cloud capabilities to enable the accelerated and secure platforms. We need commercial-enabled services and the ability to sort of move at the pace of the threat,” she said. “Of course, there’s clearly cloud-related initiatives and activities that must take place, but it is imperative that the cloud is enabling what we’re trying to do: modern software practices. So they really sort of go hand-in-hand with the initiatives that we have.”

The department is preparing to award contracts worth up to $9 billion as part of its Joint Warfighting Cloud Capability (JWCC) procurement. Awards to up to four companies are now expected in December after the initial award date – March 2022 – was postponed.

The JWCC approach differs from DoD’s previous, ill-fated JEDI Cloud contract in several key ways. Not only does it envision multiple vendors instead of one, but its use won’t be mandatory for DoD components who’ve already established their own contracts with commercial cloud providers. At least not initially.

“DoD and the military services have been doing cloud for a very long time, which has actually informed what gaps and what potential urgent unmet needs we have, like at the tactical edge and [outside the continental U.S.]. I really believe JWCC and the military services’ cloud offerings bring something to the table that we all need. So JWCC is a complementary capability, and not something that is trying to take over what the services are doing already,” Zeleke said. “As the services run out their contracts and JWCC meets their needs, we certainly want to onboard them [to the new contract]. But I really honestly believe there’s so much that we need to do, based on everything we talked about in our software modernization strategy, that every single one of these cloud capabilities are going to be required.”

Meanwhile, DoD is trying to make sure the new cloud services it offers align with its future security models as the department evolves toward zero trust over the next five years. As part of the development of a forthcoming zero trust strategy, expected to be released any day now, the department has held talks with commercial cloud providers to make sure their environments can accommodate DoD’s model.

“I believe zero trust is the undergirding imperative to where we’re going with cloud and software based capabilities, and DevSecOps as the norm,” Zeleke said. “They go hand in hand: integrating cybersecurity with the process so you’re delivering secure at every stage, from end to end. Zero trust is not a widget you just put on to the cloud, it is really a conglomerate of cybersecurity requirements that are part of our system already. But that will evolve to make us fully compliant with zero trust.”

Learning Objectives:

  • Software modernization strategy
  • Cloud and JWCC
  • Industry analysis

Complimentary Registration
Please register using the form on this page or call (202) 895-5023.

Taking an API, cloud driven approach can lead agencies to better mission outcomes https://federalnewsnetwork.com/cme-event/federal-insights/maximizing-security-and-flexibility-in-the-orbit-of-a-cloud-migration/ Fri, 23 Sep 2022 14:55:04 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4261188 Granting access to, sharing and securing data in the cloud continues to challenge agencies across government. During this exclusive webinar, moderator Jason Miller will discuss cloud strategy, security and applications with agency and industry leaders.

Duration: 1 hour
Cost: No Fee

Agencies are expected to spend more than $18.6 billion a year on cloud services by fiscal 2024. That’s the latest estimate from market research firm Deltek.

That is up from $14.5 billion in 2022.

One other big trend Deltek found was the move of DevSecOps processes to the cloud.

All of this means agencies are putting more data into these instances and that could pose a series of new challenges.

Among those challenges agencies need to deal with is how to access that data and share it among the different cloud-based systems, and then how to secure that data at rest and in transit.

The use of application programming interfaces (APIs) is a big part of this puzzle. APIs can help agencies access data from disparate clouds to better drive decisions. APIs can bring together legacy systems that otherwise can’t talk or share their data.
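As a rough illustration of that pattern, the sketch below pulls records from two cloud-hosted services over hypothetical REST APIs and joins them into one view; the endpoints and field names are invented, and any real implementation would depend on an agency’s actual services and authentication.

    import requests

    def fetch_json(url: str, token: str) -> list:
        """GET a JSON collection from an API using a bearer token."""
        resp = requests.get(url, headers={"Authorization": f"Bearer {token}"}, timeout=30)
        resp.raise_for_status()
        return resp.json()

    def merged_view(token: str) -> list:
        # Two systems that otherwise cannot talk to each other, each exposed via an API.
        assets = fetch_json("https://cloud-a.example.gov/api/v1/assets", token)
        tickets = fetch_json("https://cloud-b.example.gov/api/v1/tickets", token)
        tickets_by_asset = {t["asset_id"]: t for t in tickets}
        # Join on a shared identifier so one combined view can drive decisions.
        return [dict(a, open_ticket=tickets_by_asset.get(a["id"])) for a in assets]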

But at the same time, a recent report from Gartner says APIs are well on their way to becoming the main attack vector in 2022.

Agencies must take steps to ensure their applications are secure and flexible to help unleash their data and workforce’s talent.

Gary Parker, a cloud architect at the U.S. Postal Service, said his organization’s focus on modernization is about redesigning and securing legacy applications.

“We are taking our enterprise legacy services, our big hitters, and we’re decomposing them and we’re rearchitecting them into microservices,” Parker said during the discussion Maximizing Security and Flexibility in the Orbit of a Cloud Migration. “We’re not focused on a single cloud. We started small with a conversational AI application for passport inquiries, and then, from there, we developed organizational policies and procedures. Then we moved on to what we call our get it right initiatives. These are our big initiatives that we need to focus on for delivering for America. Those would include, for example, our package tracking API work. We’re very close to having that ready. The COVID test kit initiative from the White House, we built that end-to-end in the cloud.”

A mobile training experience

The 19th Air Force is taking a bit of a different approach. It is not modernizing existing systems, but creating new approaches to pilot training.

Brian Kirk, the deputy pilot training transformation IT lead and senior software engineer for the 19th Air Force, said instead of providing training through a desktop application, the goal is to make it mobile.

“We have a lot of pilots that move around so having a static home base network isn’t real helpful for them as they move through the pilot training. We are building all of this strategy on an API-first methodology so that we can bring in applications and remove applications without affecting any of the other partners, programs and applications that we’ve got,” he said. “We’ve got it in its very infantile stages. At this point, we have introduced some students into the program, more of the content management side versus the full learning management aspect of it. But hopefully, this coming January, we will be pretty much running full steam in our first program.”

Kirk said the API-first approach is important because the Air Force is collecting pilot training data from multiple bases across the country. He said getting all of that in sync to verify the progress of pilot trainees couldn’t happen without the cloud and APIs.

“As we’re training these pilots, we’re flying to distant locations, not necessarily 100% of the time to another military base, so they will stay overnight at that location and come back the next day. Well, traditionally, the instructors have to write everything down and then when they get back to the home base, they find the computer that’s got the application on it and they insert it there,” he said. “We’re just trying to free up some of that so that you know the best memory is the freshest memory. Rather than waiting an entire day or several hours or even several days potentially to get that information into the system, we’re trying to make it available to them pretty much anywhere they’re at that they have an internet connection, and the cloud gives us that capability.”
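One common way to support that kind of disconnected workflow is to queue records locally and push them through an API whenever a connection is available. The sketch below is hypothetical (the endpoint and fields are invented, and the 19th Air Force’s actual system is not public), but it shows the basic store-and-sync pattern.

    import json
    from pathlib import Path

    import requests

    QUEUE = Path("pending_gradesheets.jsonl")
    API = "https://training.example.mil/api/v1/gradesheets"   # invented endpoint

    def record_gradesheet(entry: dict) -> None:
        """Append the grade sheet locally so capture never waits on connectivity."""
        with QUEUE.open("a") as f:
            f.write(json.dumps(entry) + "\n")

    def sync(token: str) -> None:
        """Push queued grade sheets to the cloud API once a connection is available."""
        if not QUEUE.exists():
            return
        for line in QUEUE.read_text().splitlines():
            resp = requests.post(API, json=json.loads(line),
                                 headers={"Authorization": f"Bearer {token}"}, timeout=30)
            resp.raise_for_status()
        QUEUE.unlink()   # clear the queue after a successful sync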

Real-time feedback to drive decisions

Lt. Col. Kim Hoffman, the division chief of innovation and technology for 19th Air Force, said the improved feedback is leading to better pilots.

“The cloud and the infrastructure that we’re building lets students and instructors go back and review those notes, review those grade sheets, review any videos from the virtual reality devices that we’ve used, our immersive training devices. They can view that real time on the road. They can even go in and practice the scenario before they’re back out there,” Hoffman said. “Now when they’re preparing for their return flight back home on Sunday, they’ve already gone through those motions and flown it in an actual simulated environment as opposed to just sitting there and running through it step by step. So being able to access this data in real time will help both the students and the instructors. It also feeds back into the system on that data analyst side of the house to see how many times this student practiced while they were at home on their own time, see how much better their grades are, or how worse they are. That feeds back into our syllabus and our courseware, the type of videos we are providing them and what kind of devices we’re leaning into for the next iteration.”

She said all of that data is helpful to improve all aspects of pilot training.

Alexis Bonnell, emerging technology evangelist for government at Google, said whether it’s the Air Force, the Postal Service or any other agency, the goal is not to move to the cloud, but to unleash their mission once they’re in the cloud.

“This move of API-first is really leaning into information flows versus repository mentalities. The way I think about it is that idea of catalyzing information, maybe even beyond controlling it,” she said. “I think we saw that for really the last 10 years on the commercial side, but I think now you’re really seeing public servants come out of the last three years and realize that they are going to have a higher rate of change than ever before. But more importantly, in that role of information steward, they’re going to have more access to more information than ever before, whether that’s inside or whether that’s outside the organization. So really this idea of how you use technology to be able to be curious, to be able to lean into those information flows.”

Unlocking creativity through the cloud

The use of APIs and other approaches to unlock the data is part of a move toward more creativity across the public sector.

Vint Cerf, the vice president and chief internet evangelist at Google, said agencies are discovering new ways to use data and make decisions because of what the cloud, APIs and other infrastructure technologies can provide.

“We encounter customers who have on-premise investments, including information that they want to keep there, so building systems that will allow interworking running loads on-premise and then running an expanded load, for example, in the cloud, or other ways of being helpful to our customers moving into a cloud environment,” Cerf said. “The idea that you’re stepping into a computing environment, which is very different from a closed world, is super important. At the same time, because you’re stepping into an environment which could be less closed then you’re accustomed to, we have other concerns like security and authenticity and protection of information and use of cryptography and the like, which need to be drawn into the architecture that is being used by our customers in order to literally weave a service that meets their needs, but also does so in a more expansive way.”

Learning objectives:

  • Cloud strategy for workloads and applications
  • Security considerations and the cloud
  • Use cases

Complimentary registration
Please register using the form on this page or call (202) 895-5023.

Building zero trust as IT devices continue to multiply https://federalnewsnetwork.com/cme-event/federal-insights/it-asset-management-in-the-era-of-zero-trust/ Mon, 12 Sep 2022 20:59:02 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4243126 During this exclusive webinar, moderator Scott Maucione and guest Steve Wallace, chief technology officer at the Defense Information Systems Agency will discuss the IT landscape and asset management in the era of zero trust. In addition, Tom Kennedy, vice president at Axonius will provide an industry perspective.

Duration: 1 hour
Cost: No Fee

Technological devices are proliferating as citizens and the government embrace new products to make their lives and work easier. With the expansion of the internet of things, more and more devices are pairing to networks and providing never-before-seen data.

While the advances are helping government and companies work more efficiently, the explosion of devices can be hard to wrangle and create potential security risks.

“Asset management seems on the surface like a real simple thing,” said Tom Kennedy, vice president of Axonius Federal Systems, during the strategy session IT Asset Management in the Era of Zero Trust, sponsored by Axonius. “But what we found is that the IT landscape has grown in complexity so much over the last dozen years or so, that most of those tools now give you fragmented, incomplete views of the enterprise.”

Kennedy said the change in the IT landscape can be equated to a slow boil. The IT world went from a fairly homogenous environment to one that added mobile devices, cloud and software-as-a-service applications.

“It really has gone from a very simple to a very complex and diverse environment pretty rapidly in the last dozen or so years,” he said. “The good news is all your asset management is out there, right? It’s just in silos and spread out and not coordinated.”

One area where the Defense Department and private industry are particularly concerned about those silos is the emergence of gray IT and rogue devices.

Gray IT refers to areas that are not properly watched over by enterprise systems, while rogue devices are ones that are not recognized by those systems but still show up on networks.

“If you think about most endpoint detection tools, they’re all agent based,” Kennedy said. “Those tools can tell you where your agent is installed, they can tell you all the devices that your agent is installed. But what about the devices that the agent is not installed on? Most government agencies have network access control systems, so they can tell you precisely all the devices that are on the network. But what about an unauthorized device that’s not on their network? If you think about the emergence of working from home and telework this has been exacerbated tremendously.”

Kennedy said these threats increase the need for a zero trust architecture within agencies. Zero trust architectures only allow needed permissions for devices, keeping most of them away from administrative access.

“People at government agencies are searching for ways to meet the Biden administration’s requirement for zero trust,” Kennedy said. “There’s no one set way that you have to do zero trust. It’s just a methodology that you’re trying to build into your organizational policies. If you don’t have 100% confidence that you know exactly where all your devices are across your enterprise, then it’s kind of hard to get going with zero trust. Many of our clients are at the very early stages of zero trust. They’re looking at cyber asset management as a foundational step to get a firm grasp on our state data and the device records out there. Then we can launch into identity management and other factors within zero trust.”

Kennedy said it’s important for organizations to get a grasp on their master device records first and then move to building out their architecture.
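In practice, the foundational step Kennedy describes is often just set arithmetic over the inventories each tool already has. The toy example below (not Axonius’ product, all data invented) merges an endpoint-agent inventory with a network access control inventory to build a master device record and surface devices with no agent installed.

    agent_inventory = {"host-001", "host-002"}                  # devices reporting an endpoint agent
    network_inventory = {"host-001", "host-002", "host-099"}    # devices seen by network access control

    master_record = agent_inventory | network_inventory         # every device known to any tool
    missing_agent = network_inventory - agent_inventory         # on the network, but unmanaged

    print(f"Devices in master record: {len(master_record)}")
    print(f"Devices without an agent: {sorted(missing_agent)}")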

Learning objectives:

  • Thunderdome overview
  • SOAR overview
  • The evolution of the IT landscape
  • Zero trust
  • Industry analysis

Complimentary registration
Please register using the form on this page or call (202) 895-5023.

Data forms the foundation of fraud detection programs https://federalnewsnetwork.com/cme-event/federal-insights/staying-ahead-of-fraud-waste-and-abuse/ Fri, 09 Sep 2022 15:13:54 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4239775 Few phenomena are more troubling for federal programs than the loss of dollars through fraud. Each year this form of waste results in billions of dollars of improper payments. Health care delivery programs are among the biggest targets. Watch on-demand as we talk with leaders from CMS, HHS, VA, and Optum Serve.

The post Data forms the foundation of fraud detection programs first appeared on Federal News Network.

]]>
Date: September 20, 2022
Time: 2:00 pm ET
Duration: 1 hour
Cost: 
No Fee

Few phenomena are more troubling for federal programs than the loss of dollars through fraud. Each year this form of waste results in billions of dollars of improper payments. Health care delivery programs are among the biggest targets.

Agencies have many tools to keep improper payment rates low and to prevent fraud and program abuse, rather than trying to recover lost funds after the fact.

Fraud and abuse prevention are central activities at the Centers for Medicare and Medicaid Services, part of the Health and Human Services Department. Its programs are more extensive than people often recognize. In addition to the basic Medicare and Medicaid programs themselves, CMS also encompasses Medicare Part D, the Children’s Health Insurance Program, and the marketplace for the Affordable Care Act.

In fact, for Medicare alone, CMS handles 1.2 billion claims per year. That’s according to Dara Corrigan, the CMS deputy administrator and director of its Center for Program Integrity. During a panel discussion on ways to stay ahead of fraud, waste and abuse, Corrigan said fraud prevention requires continuous attention and agility.

“Those who want to commit fraud are always trying to stay ahead of the system,” Corrigan said. “So if there is a new benefit, or there is a new challenge, or something unprecedented, like the pandemic, it opens up new doors for people to commit fraud.”

At the scale of CMS, keeping ahead of fraud requires timely and accurate program data, Corrigan said, as well as carefully calibrated data analytics applications.

“What we’re always trying to do with our data analytics,” she added, “is to have the most correct and accurate data in the same place at the same time. So that we can be using algorithms and analytics to try and see where the fraud is going or where it might be starting.” Pay-and-chase, Corrigan said, is never efficient.

Miranda Bennett, the assistant inspector general for investigations at HHS, underscored the importance of fraud prevention, noting that HHS spent $2.4 trillion through CMS in fiscal 2021.

“Fraud schemes shift constantly, and we work very diligently to stay ahead of those fraudsters,” Bennett said. Therefore it’s critical, she said, to use data and technology to detect fraud schemes as early as possible. That in turn enables the Office of Inspector General to make timely recommendations.

The Veterans Affairs Department, with nine million direct beneficiaries, more than 150 medical centers, and a $240 billion annual budget, shares many of the fraud and abuse challenges with CMS.

“It’s a task that we have to constantly be thinking about. What is coming down the pipeline,” said David Johnson, VA’s assistant IG for investigations, “as opposed to what’s already happened.” VA uses data analytics, social media and new tools “to detect fraud before it even occurs.”

Besides medical and disability claims frauds, Johnson said, the VA must also watch for employee embezzlement and drug diversion at its medical facilities, some of which resemble mini-cities.

Another issue is procurement, Johnson said.

“VA is one of the largest purchasers of goods and services, to supply our medical service, our medical system,” Johnson said. In acquiring pandemic-related supplies, “we ran into a number of vendors who either claimed to have PPE that they did not have, or they were trying to sell counterfeit or substandard goods.”

Start with data

The private sector healthcare industry faces fraud and abuse problems similar to those of federal organizations. Amanda Warfield, the vice president of program integrity at Optum Serve, said industry’s approach to combatting fraud also starts with data.

“There’s data in different places,” Warfield said. “Having the data all in one place gives you that visibility, to do trend analysis and to use technologies like artificial intelligence and machine learning to get predictive in nature, to be able to see where fraud schemes are moving.”

Often, when data is compared to program rules, anomalies will manifest themselves and call for deeper investigation, panelists agreed. Other times, the organization may receive a tip that will touch off an investigation, suggesting data sets to assemble for a specific purpose.

“One of our best sources are the beneficiaries of the program,” Corrigan said, people who comb their claims in detail or “they go to their physician and something seems off.” Calls then come into the agency’s 800 numbers or to the inspector general.

Another strategy is to view the program as a recipient or beneficiary would – or as a thief would. Almost like black-hat hacking, panelists said, pressure testing your own program can reveal potential vulnerabilities early enough to prevent fraud.

Agencies are finding that as fraud prevention becomes more and more a data-driven activity, it’s wise to make sure data officers and data scientists are part of the team. Bennett said IG auditors regularly work with data scientists “that can evaluate each data set and how best to marry it up with our technologies.”

The VA’s Johnson said his office also partners with what VA refers to as community care providers, private health care organizations that also provide medical services to veterans. Beyond that, he said, “We have actually formed a task force with the Department of Justice, the Veterans Affairs Health Care Fraud Task Force.” It’s been running for three years, and also includes “partnering with the strike forces that CMS and HHS OIG have already established.”

Johnson added, “One of the things that I’ve wanted to emulate over at VA is the robust data analytics programs that the strike forces use to detect potential fraud and try to use them, particularly in the growing community care programs.”

Such programs turn up clear fraud indicators, such as millions of dollars of billing in a single day from a single source, or a surgeon billing for two operations simultaneously on opposite coasts.
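Indicators like those can be written as plain rules over claims data. The sketch below, in Python, applies two such rules to hypothetical claim records; the fields, figures and thresholds are illustrative, not any agency’s actual criteria:

```python
from collections import defaultdict

# Minimal sketch of rule-based fraud indicators over hypothetical claims records.
# Each claim is (provider_id, date, amount, state); thresholds are illustrative.
claims = [
    ("prov-A", "2022-06-01", 4_800_000, "VA"),
    ("prov-A", "2022-06-01", 3_900_000, "VA"),
    ("prov-B", "2022-06-01", 12_000, "CA"),
    ("prov-B", "2022-06-01", 15_000, "NY"),   # same provider, same day, opposite coasts
]

daily_totals = defaultdict(float)
daily_states = defaultdict(set)
for provider, day, amount, state in claims:
    daily_totals[(provider, day)] += amount
    daily_states[(provider, day)].add(state)

for key, total in daily_totals.items():
    if total > 1_000_000:                 # single-source billing spike in one day
        print("High daily billing:", key, f"${total:,.0f}")
    if len(daily_states[key]) > 1:        # same provider billing from multiple states in one day
        print("Multiple states in one day:", key, sorted(daily_states[key]))
```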

Also helpful, Warfield said: Unique provider and biller identifiers, and the ability to do entity matching across private and public domains. She described entity matching as “technology designed to pair two different datasets together and boil it down to, ‘this is the same entity in this one and that one,’ so that you can do those further downstream types of analytics and reporting.”
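A bare-bones illustration of entity matching is name normalization plus a similarity score, as in the Python sketch below. Production systems would also weigh identifiers, addresses and tax IDs; the records and threshold here are invented:

```python
from difflib import SequenceMatcher

# Minimal sketch of entity matching across two hypothetical provider lists.
# Names alone are illustrative; real matching would use far more attributes.

def normalize(name: str) -> str:
    # Lowercase, strip punctuation and collapse whitespace before comparing
    return " ".join(name.lower().replace(",", " ").replace(".", " ").split())

public_records = ["Acme Health Services, Inc.", "Riverside Medical Group"]
claims_records = ["ACME HEALTH SERVICES INC", "Riverbend Medical Group"]

for a in public_records:
    for b in claims_records:
        score = SequenceMatcher(None, normalize(a), normalize(b)).ratio()
        if score > 0.9:   # treat near-identical normalized names as the same entity
            print(f"Likely same entity: {a!r} <-> {b!r} (score {score:.2f})")
```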

She emphasized the importance of a team approach, pairing technologists and data scientists with program people who know the rules and what it takes to support a criminal investigation.

“We have found that having … our IT and data folks, really partnering with the folks who are doing the investigations, is really critical.”

Learning objectives:

  • Challenges in Preventing Fraud, Waste and Abuse
  • Data Management
  • Use Cases

Complimentary Registration
Please register using the form on this page or call (202) 895-5023.

The post Data forms the foundation of fraud detection programs first appeared on Federal News Network.

]]>
CISA aims to provide agencies with dashboard of capabilities for identity management https://federalnewsnetwork.com/cme-event/federal-insights/ciso-handbook-icam-and-zero-trust/ Mon, 29 Aug 2022 15:04:03 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4220028 During this exclusive CISO Handbook webinar, moderator Justin Doubleday and guest Ross Foard, ICAM subject matter expert with the Cybersecurity and Infrastructure Security Agency will explore how ICAM factors into zero trust and other modern security practices. In addition, Bryan Murphy, senior director at CyberArk, will provide an industry perspective.

The post CISA aims to provide agencies with dashboard of capabilities for identity management first appeared on Federal News Network.

]]>
Duration: 1 hour
Cost: 
No Fee

The Cybersecurity and Infrastructure Security Agency is helping agencies with crucial identity, credential and access management capabilities through its Continuous Diagnostics and Mitigation program, among other efforts.

Ross Foard, ICAM subject matter expert at CISA, says the cyber agency has been developing strong authentication measures for agencies through the CDM program since 2017. The program has helped agencies develop a master user record that gives them a “comprehensive list or repository of all their users,” Foard said.

Those tools were developed for legacy environments initially, and with many agencies now modernizing and moving to cloud computing services, Foard said CDM is modernizing its master user record tools.

“These capabilities will be extended to be able to help CFO Act agencies move to the cloud,” Foard said.
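Conceptually, a master user record amounts to merging user attributes from many systems under a common identifier. The Python sketch below illustrates that idea with two hypothetical sources; it is not a depiction of CDM’s actual tooling:

```python
# Minimal sketch of consolidating user attributes from several hypothetical identity
# sources into one record per user. Source systems and field names are illustrative.

hr_system = {"jdoe": {"name": "Jane Doe", "status": "active"}}
vpn_accounts = {"jdoe": {"last_login": "2022-08-20"}, "xtmp": {"last_login": "2022-08-25"}}

master_user_record = {}
for source in (hr_system, vpn_accounts):
    for user_id, attributes in source.items():
        master_user_record.setdefault(user_id, {}).update(attributes)

# Accounts with no corresponding HR record stand out for review
orphans = [uid for uid in master_user_record if uid not in hr_system]
print(master_user_record)
print("Accounts without an HR record:", orphans)
```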

While agencies have typically used Personal Identity Verification (PIV) cards to authenticate users, they are also exploring new authentication mechanisms, like single sign-on technologies. Such capabilities have been encouraged by the Office of Management and Budget through the federal zero trust strategy.

“The really important point about these new single sign-on services that are cloud-based is they operate on modern protocols,” Foard said. “And those modern protocols are very important because they allow you to identify the strength of the person authenticating, but even after that, it differentiates between how that person gets access to different applications. It doesn’t replay a password. You have specific assertions that are sent to an application. And you can send with those assertions specific information about the user to make sure that you know what he can do when he gets those applications. So this modernizing of these protocols is really very important.”
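The assertion model Foard describes can be pictured with a short sketch. The Python below checks an already-verified, decoded set of claims against an application’s requirements; the claim names and values are illustrative rather than tied to any specific protocol:

```python
# Minimal sketch of the assertion idea: rather than replaying a password, the
# application receives claims about the user. The decoded claims below are
# illustrative; a real service would verify the assertion's signature first.

decoded_assertion = {
    "sub": "jane.doe@agency.gov",
    "aud": "grants-portal",          # which application the assertion was issued for
    "acr": "phishing-resistant",     # hypothetical claim describing authentication strength
    "groups": ["grants-reviewers"],
}

def authorize(assertion: dict, app: str, required_group: str) -> bool:
    # Accept only assertions issued for this application, backed by a strong
    # authenticator, where the user carries the group needed for the action.
    return (
        assertion.get("aud") == app
        and assertion.get("acr") == "phishing-resistant"
        and required_group in assertion.get("groups", [])
    )

print(authorize(decoded_assertion, "grants-portal", "grants-reviewers"))  # True
```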

In the past, agencies have relied on manual processes to allow privileged access to networks, sometimes resulting in overprovisioning, according to CyberArk Senior Director Bryan Murphy.

“When you move this to an automated process, it becomes very auditable,” Murphy said. “We can make sure that we’re compliant. We know when things are happening. We can trigger on the different things that happen. And it seems that we’ve shifted from years past, where we felt the manual control was the gatekeeper to keep the attackers away, where in reality, we need to leverage the artificial intelligence that we have and a lot of the automation we can put in place. Because these attackers aren’t working on our systems or accessing our systems during normal hours. They’re not very loud when they’re in our systems, and we’ve got to make sure that we constantly have protections that are looking at and sniffing out these types of scenarios.”
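Murphy’s point about off-hours activity lends itself to a simple automated check. The sketch below scans a hypothetical privileged-access audit log and flags sessions that begin outside normal working hours; the log format and the hours chosen are assumptions:

```python
from datetime import datetime

# Minimal sketch of an automated check over a hypothetical privileged-access audit
# log: flag sessions that begin outside normal working hours for follow-up.

audit_log = [
    {"user": "admin-svc", "start": "2022-08-24T03:12:00"},
    {"user": "jdoe-admin", "start": "2022-08-24T14:05:00"},
]

def off_hours(timestamp: str, start_hour: int = 7, end_hour: int = 19) -> bool:
    hour = datetime.fromisoformat(timestamp).hour
    return hour < start_hour or hour >= end_hour

for session in audit_log:
    if off_hours(session["start"]):
        print("Review off-hours privileged session:", session["user"], session["start"])
```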

Modern protocols and automation are key facets of OMB’s federal zero trust strategy. Agencies are now working toward implementing zero trust on their networks by the end of fiscal 2024.

“We know that people and devices are all over the place,” Foard said. “And that should not be a barrier to getting access to services. You just need to make sure that the devices are known and secure, and the people are known and secure when you give them access to those services.”

Learning objectives:

  • ICAM overview at CISA
  • How ICAM factors into zero trust
  • Industry analysis

Complimentary Registration
Please register using the form on this page or call (202) 895-5023.

The post CISA aims to provide agencies with dashboard of capabilities for identity management first appeared on Federal News Network.

]]>
Exploring AI-Powered Automation for IT Operations https://federalnewsnetwork.com/cme-event/federal-insights/exploring-ai-powered-automation-for-it-operations/ Tue, 23 Aug 2022 19:43:54 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4210380 Join moderator Tom Temin and technology experts from civilian and defense agencies in this exclusive two-day webinar as they discuss AI-powered automation for IT. IBM's Melissa Long Dolson will follow with an industry analysis.

The post Exploring AI-Powered Automation for IT Operations first appeared on Federal News Network.

]]>
Join moderator Tom Temin and technology experts from civilian and defense agencies in this exclusive two-day webinar as they discuss AI-powered automation for IT. IBM’s Melissa Long Dolson will follow with an industry analysis.

Part 1: Civilian Agencies

  • Current state of AI and RPA
  • Best practices for prioritizing AI and RPA
  • Architectural setup for AI and RPA
  • Industry analysis

Part 2: Defense Agencies

  • Current state of AI and RPA from a DoD perspective
  • Best practices for prioritizing AI and RPA
  • Tools and services to enable AI and RPA
  • Industry analysis

Complimentary Registration
Please register using the form on this page or call (202) 895-5023.

The post Exploring AI-Powered Automation for IT Operations first appeared on Federal News Network.

]]>
How to develop actionable information to protect federal supply chains https://federalnewsnetwork.com/cme-event/federal-insights/building-the-supply-chain-of-the-future/ Fri, 12 Aug 2022 14:44:52 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4193523 During this exclusive webinar, moderator Jason Miller and agency leaders will explore how organizations are approaching supply chain risk management and the data strategy behind this management.

The post How to develop actionable information to protect federal supply chains first appeared on Federal News Network.

]]>
Over the last five years, agencies have realized the importance of protecting their supply chains. What was a niche area for federal acquisition and cyber experts has grown into a full-fledged governmentwide effort.

There is the Federal Acquisition Security Council, which is still trying to unify more than 30 different initiatives across the government.

The National Institute of Standards and Technology kicked off the National Initiative for Improving Cybersecurity in Supply Chains about a year ago.

And of course, the Defense Department and Intelligence Community have their share of initiatives. These range from the Cybersecurity Maturity Model Certification (CMMC) to the Office of the Director of National Intelligence’s task force to standardize information sharing of counterintelligence risk information in the supply chain environment.

At the heart of all of these efforts, of course, is the data.

Agencies need better, more capable tools to sift through and analyze the information. They need a better approach to understand that information to drive decisions in real time.

NIST identified six critical success factors for any supply chain risk management (SCRM) program. These include integrating SCRM into acquisition, sharing supply chain information and ensuring ongoing capability implementation measures.

Matthew Halvorsen, the strategic program director for the National Counterintelligence and Security Center’s Supply Chain and Cyber Directorate for the Office of the Director of National Intelligence, said their goal is to find ways to develop actionable information on the threats to the government’s critical supply chain areas.

“We’re working in a couple of ways to help develop that piece of the puzzle. We’re looking for ways to develop new sources of information, increasing analytical capabilities to help understand those foreign threats and the capabilities to exploit those supply chains. We’re also looking at ways to help develop new processes to identify suspect or high risk vendors, products and software services that really pose a risk to our supply chain and our national economic future,” Halvorsen said during the discussion Building the Supply Chain of the Future. “We at the NCSC are really working now to help develop an integrated strategy for supply chain risk management and capabilities across the IC that really helps synergize those strategies from the government with the private industry, because the U.S. government doesn’t, generally speaking, own factories; we purchase it from private industry.”

ODNI’s supply chain risk management task force is developing an “integrated strategy” that will set baseline capabilities across the intelligence community and detail initiatives to continue to advance SCRM.

The Army Materiel Command is taking a more operational approach to supply chain risk management through its acquisition strategies.

Deacon Maddox, the director of supply chain management for the Army Materiel Command, said their organization is integrating data and tools into contracting activities to give the Army the best understanding possible.

These tools include digital twins and a data analytics platform.

“Our commanding general, Gen. Edward Daly, has undertaken an initiative to standardize some of our major procurement processes at our lifecycle management commands. This optimization effort that we have begun really lays the groundwork to do some of the supply chain risk management from a standardized way across the command that allows us to conserve resources where we can and allows us to be more efficient in how we manage the supply chain,” Maddox said. “But it also opens up opportunities for us to look at our own internal organic industrial base to supplement some of the supplies that may be at risk.”

The Army is using digital twins for individual weapon systems, breaking down each piece of a weapon system and modeling it.

“We have efforts ongoing right now to pilot this technology. Then there’s also a digital twin of our facilities where you can take a facility and create a digital model of it and then run efficiency scenarios through it,” Maddox said. “From a SCRM perspective, the digital twin allows you to anticipate where you may have problems in the future. If you’ve got sensor data coming off of your weapon systems that is feeding into this model, you’re able to understand what your future requirements are going to be with the lead times. It allows you to get ahead of those lead times so that you’re not waiting 18-to-24 months with deadlined systems and not having mission capable systems ready to go.”
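The lead-time arithmetic behind that point is easy to illustrate. The Python sketch below takes hypothetical predicted-need dates and supplier lead times and works out when each order would have to be placed; every figure is made up:

```python
from datetime import date, timedelta

# Minimal sketch of the lead-time idea: if a model predicts when a part will be
# needed, the order has to go out early enough to cover the supplier lead time.
# Parts, dates and lead times are all invented.

predicted_need = {"transmission-assy": date(2024, 6, 1), "radar-module": date(2023, 3, 15)}
lead_time_days = {"transmission-assy": 700, "radar-module": 120}   # roughly 23 vs 4 months

today = date(2022, 9, 1)
for part, needed_on in predicted_need.items():
    order_by = needed_on - timedelta(days=lead_time_days[part])
    status = "ORDER NOW" if order_by <= today else f"order by {order_by}"
    print(f"{part}: needed {needed_on}, lead time {lead_time_days[part]} days -> {status}")
```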

Retired Navy Rear Adm. John Polowczyk, the government supply chain leader at Ernst & Young, said public and private sector organizations need to have a deep understanding of their industrial base’s capabilities and capacities, the inherent risks, whether it be cyber hardening or foreign influence, as well as diminishing manufacturers and sources of supply.

“Some of this is enabled by industry 4.0 like technologies and processes. People are relooking their operating model where they have things and where they manufacture items. They certainly are working on alternate sources of supply and geographic diversity,” Polowczyk said. “That’s what bit us during COVID, where we were wholly reliant on Asia for manufacturing of personal protective equipment and a lot of other durable medical goods. You also need a resilient workforce, a trained workforce, because one day you’re operating this machine, and the next day, because of an illness or an outage, you’re able to operate that machine. You really have to have a resilient and agile workforce. Finally, there is all of the cyber hardening, the data sharing and securing those things in your ecosystem.”

The goal for most organizations is to manage and understand their supply chain all the way down through the lowest levels.

Polowczyk said there are commercial firms and some elements of the federal government who are using end-to-end visibility tools through commercial products or some homegrown things to really understand what their supply chains look like.

“I do think the data architectures and working across clearance systems, being able to have the visibility needed at all levels is a key in this area, which I don’t think we’ve solved yet,” he said. “I’ve always viewed the vendor vetting intelligence piece as critical, but maybe not on every box of pencils. But when you’re talking about weapons systems and some very critical sensitive technologies, I think that the blend of intelligence-based data, analytics and the tools that they’re using, and there are a suite of things that everybody, including EY, has data analytics and off the shelf tools in this area. How we blend that with the acquisition workforce who are just trying to get the deal done in the most cost effective manner for the federal government. That is where the hard part is.”

Halvorsen said the IC is looking for better tools to increase their visibility using publicly available data as well as sensitive information.

“One of the things we deal with at the NCSC when we talk to our civil agency sector quite a bit is the understanding that acquisition professionals, generally thinking outside of the IC, don’t work in a secure world, meaning they don’t have security clearances and aren’t working on classified systems. So analytical tools that really bring in that publicly available data are really something we are always looking at,” he said. “As part of the efforts with the FASC are tools that have that information sharing component so that we can share information across the federal enterprise to help each agency with their risk management decisions.”

Learning Objectives:

  • Supply Chain Risk Management Overview
  • Processes and Tools to Evaluate Supply Chain Risk Management
  • Data Strategy Towards Supply Chain Risk Management

Complimentary Registration:

Please register using the form on this page or call (202) 895-5023.

The post How to develop actionable information to protect federal supply chains first appeared on Federal News Network.

]]>
CMS looks to make ‘intentional’ investments in push to underlying zero trust pillars https://federalnewsnetwork.com/cme-event/federal-insights/ciso-handbook-centers-for-medicare-medicaid-services/ Thu, 11 Aug 2022 18:55:27 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4192019 During this exclusive CISO Handbook webinar, moderator Justin Doubleday and guest Robert Wood, chief information security officer at Centers for Medicare and Medicaid Services will explore how his agency is implementing zero trust and other modern security practices. In addition, David Chow, global chief technology strategy officer at Trend Micro, will provide an industry perspective.

The post CMS looks to make ‘intentional’ investments in push to underlying zero trust pillars first appeared on Federal News Network.

]]>
Duration: 1 hour
Cost: 
No Fee

The term “zero trust” can be a misnomer, suggesting a simple methodology for organizations to secure their networks.

In reality, zero trust is a concept built upon several underlying security technologies and methods that come together to form a more secure architecture. The federal zero trust strategy breaks down the concept into five pillars, based on the Cybersecurity and Infrastructure Security Agency’s zero trust maturity model.

Robert Wood, the chief information security officer at the Centers for Medicare and Medicaid Services, says his agency is trying to take that more granular approach to zero trust, rather than spreading funding evenly across every pillar at once.

“Where we’ve been trying to really be critical and intentional is finding the areas throughout the enterprise where we can make significant progress against the maturity model, that also have a lot of adoption, where the benefits of that investment are going to be felt by the consumers of this centralized service or the underlying like environment that they’re building on,” Wood said during a CISO Handbook discussion hosted by Federal News Network.

“Whether it’s a cloud infrastructure or data centers or whatever,” he continued. “And so finding those intersections where we can invest intelligently, and then see ourselves make these substantial jumps against the maturity model, where the benefits are going to be as far reaching as possible in the enterprise.”

Wood said the “nuanced” pieces of the zero trust approach – ranging from multifactor authentication to encryption and beyond – are going to benefit CMS’s enterprise as a whole.

“Zero trust is not everything,” he added. “There’s a lot of other stuff that any sensible security organization should be doing.”

Many organizations have been looking to modernize their identity solutions for authenticating and authorizing users on their networks, whether they be internal employees or external customers and partners.

For CMS, one challenge will be in dealing with how legacy identity and network solutions were built across different subcomponents.

“We as an agency are effectively living out a representation of what’s referred to as Conway’s Law, where the system that is built is a direct mapping of, not some ideal software architecture, but rather of the organizational architecture that is building it,” Wood said. “And so because we have different parts of the agency, even within the Office of Information Technology, building different parts of our identity setup to serve all of these different populations, you end up with this thing that’s pieced together.”

David Chow, the global chief technology strategy officer at Trend Micro, says the shift to a zero trust security architecture is a “monumental effort.” He applauded the maturity approach the federal government is taking to get there.

He said one key step agencies can take to start is to better understand the boundary of their networks, and then examine any critical security deficiencies within their systems. Automation and machine learning will also be key to identifying and addressing vulnerabilities efficiently, according to Chow.

“If the agency really wants to get to the high maturity level, I would say identity and access management is one huge area that needs to be focused on,” Chow said. “And then you have an automated response to any security violation or any security incident. That’s the second part that needs to be focused on. So that would be my recommendation to any CIO or CISO.”

Learning objectives:

  • Key Considerations for the Move to a Zero Trust Approach
  • Implementing Modern Security Practices
  • Industry Analysis


Complimentary Registration
Please register using the form on this page or call (202) 895-5023.

The post CMS looks to make ‘intentional’ investments in push to underlying zero trust pillars first appeared on Federal News Network.

]]>
Federal News Network’s Workplace Reimagined https://federalnewsnetwork.com/cme-event/federal-insights/federal-news-networks-workplace-reimagined/ Fri, 05 Aug 2022 18:47:19 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4183287 What does the future of how feds work hold? Find out from federal speakers leading the charge to reinvent both the office and the hybrid experience during this exclusive one-day event.

The post Federal News Network’s Workplace Reimagined first appeared on Federal News Network.

]]>

What does the future of how feds work hold? Find out from federal speakers leading the charge to reinvent both the office and the hybrid experience.

This exclusive one-day event looked at the way the government is reimagining the federal workplace right now. It’s not a new idea – to look at the “future” of how employees work and also how agencies recruit, hire, train and retain employees. But with the no-brakes adoption of a hybrid, multi-cloud environment over the past few years, those plans have morphed and accelerated.

Feds on the frontlines of critical initiatives and industry experts sat down for in-depth discussions with Federal News Network editors to discover:

  • How agencies are using and want to evolve technology to transform the experience of a hybrid workplace
  • What a future in-office workplace might look like compared to before the pandemic
  • Where efforts to change up government hiring and onboarding practices are leading

The post Federal News Network’s Workplace Reimagined first appeared on Federal News Network.

]]>