Automation | Federal News Network

Senate bill aims to bring federal records law into the age of ‘WhatsApp’ (Thu, 28 Mar 2024)

The legislation comes after recent federal records controversies where officials lost or deleted messages, like the missing Jan. 6 Secret Service texts.

The post Senate bill aims to bring federal records law into the age of ‘WhatsApp’ first appeared on Federal News Network.

Key Senate lawmakers are pushing to raise the stakes for government officials who delete texts or use personal online accounts to skirt federal records law.

Homeland Security and Governmental Affairs Committee Chairman Gary Peters (D-Mich.) and Sen. John Cornyn (R-Texas) are introducing the “Strengthening the Federal Records Act of 2024” today.

The bill would tighten disclosure requirements for “non-official messaging accounts” used to carry out government business, while also strengthening the ability of the National Archives and Records Administration to hold agencies accountable for complying with record-keeping rules.

“Federal agencies must maintain adequate records so that the American public can hold officials accountable, access critical benefits and services, and have a clear picture of how the government is spending taxpayer dollars,” Peters said in a statement. “We must also update the law to keep pace with rapidly changing technology and ensure that we are not sacrificing transparency as we embrace new forms of communication.”

The bill would prohibit federal employees from using “non-official” messaging applications to carry out government business unless the messages are backed up or otherwise saved in an official account.

Beyond texting, government officials have also increasingly turned to platforms like WhatsApp and Signal in recent years. Those “ephemeral” messaging applications allow users to permanently delete messages after a set amount of time.

“American taxpayers deserve a full accounting of federal records, including across all forms of digital communication,” Cornyn said. “This legislation would help make sure technological advancements do not hamstring the government’s ability to provide greater accountability and transparency for federal records.”

The proposed FRA reforms do not address record-keeping at the White House. Those practices are governed by a separate statute, the Presidential Records Act.

But the legislation comes after numerous federal record-keeping controversies at the agency-level in recent years. For instance, the Secret Service lost key text messages from the day of the Jan. 6 Capitol riot, reportedly due to an IT system update.

The Department of Homeland Security inspector general, who had been investigating the missing Secret Service texts, more recently admitted to lawmakers he routinely deletes texts off his government-issued phone.

And during a hearing held by the homeland security committee earlier this month, Republicans pointed to a National Institutes of Health official who had told colleagues he used his personal email account to avoid having his records pulled under a Freedom of Information Act request.

“Records are the currency of democracy,” Anne Weismann, a former Justice Department official and law professor at George Washington University, said during the hearing. “They are the way we hold government actors accountable. And we have seen too many examples, whether it’s at NIH, whether it’s at DHS, whether it’s the Secret Service, where federal employees are either willfully or unwittingly avoiding or contravening their record keeping responsibilities. And as a result, the historical record of what they’re doing and why they’re doing it, is incomplete.”

Certification requirements

Under the legislation, federal employees would also have to certify their compliance with record-keeping requirements before leaving an agency. Weismann pointed to reports that senior officials in the Trump administration may have deleted crucial messages regarding Jan. 6 before leaving government.

“If they had been required to certify upon leaving government that they had complied with their record keeping responsibilities, that might not have happened, or there would have been some ability to hold them accountable for what they did,” Weismann said during a hearing held by the homeland security committee earlier this month.

The legislation would expand a NARA program that automatically captures the email messages of senior agency officials.

The “Capstone” program would be expanded to automatically capture other forms of electronic messages, including through the “culling” of transitory messages and personal messages “as appropriate,” per the legislation.
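The legislation doesn't spell out how culling would work in practice. As a loose sketch only, an automated capture pipeline might separate record-worthy messages from transitory ones with rules along these lines (the categories and keyword rules here are hypothetical, not drawn from the bill or NARA guidance):

```python
# Hypothetical sketch of rule-based "culling" in a message-capture pipeline.
# The transitory markers below are illustrative only.

TRANSITORY_MARKERS = ("lunch", "running late", "see you at")

def is_transitory(message: str) -> bool:
    """Very naive check for messages with no record-keeping value."""
    text = message.lower()
    return any(marker in text for marker in TRANSITORY_MARKERS)

def cull(messages: list[str]) -> tuple[list[str], list[str]]:
    """Split captured messages into (retained, culled)."""
    retained = [m for m in messages if not is_transitory(m)]
    culled = [m for m in messages if is_transitory(m)]
    return retained, culled

retained, culled = cull([
    "Running late, start without me",
    "Approved the FY25 records schedule, memo attached",
])
```

In a real system the classification step would be far more careful, and the bill's "as appropriate" language suggests agencies would retain discretion over what counts as transitory.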

Justice Department referral

Peters’ and Cornyn’s bill would also require NARA to refer repeated violations of the FRA to the Justice Department, including cases where employees unlawfully remove or destroy records.

Weismann told lawmakers that NARA has been reluctant to refer violations of records laws to DOJ, especially in cases where records were allegedly destroyed. She said that's despite the fact that the Archives admits it doesn't have the resources or authorities to investigate and punish record-keeping violations on its own.

“[NARA] is not well equipped, they don’t have the investigative resources, for example, that the Department of Justice has, which is precisely why we think it’s so critical that the obligation to make that referral be made clear,” Weismann said.

The bill comes as federal agencies and NARA manage an increasing amount of electronic records. NARA will stop accepting permanent paper records from agencies starting this summer.

Numerous advisory committees and advocacy groups have warned that agencies have largely been unprepared to handle the growing influx of digital data over the past two decades, impacting everything from classified information sharing to FOIA processing.

The Peters-Cornyn legislation would also set up an “Advisory Committee on Records Automation” at NARA. The committee would be responsible for encouraging and recommending ways that agencies can take advantage of automation to ingest and manage their electronic records.

The bill has garnered the support of multiple advocacy groups, according to statements provided by the Homeland Security Committee. They include the Citizens for Responsibility and Ethics in Washington (CREW), Americans for Prosperity, Protect Democracy, Government Information Watch, and the Association of Research Libraries.

“Government records are ultimately the property of the American people and agencies are responsible for maintaining the emails, texts, and documents they create,” Debra Perlin, policy director for CREW, said in a statement. “The Strengthening Oversight of Federal Records Act would update and bolster our federal recordkeeping laws to account for changes in technology, and make it easier for organizations like ours to ensure that records are created and preserved during any administration.”

Generative AI: Start small but scale fast (Mon, 25 Mar 2024)

David Knox, the chief technology officer for industrials, energy and government at Oracle, urges federal agencies to take a measured approach to generative AI.

The post Generative AI: Start small but scale fast first appeared on Federal News Network.

Generative artificial intelligence is unlike any technology that's come along in recent memory. One reason: You'd be hard pressed to find an application or process to which generative AI doesn't apply. In some sense, the list of things it can do is longer than the list of things it can't.

That, plus the technology's sudden ubiquity in the media and at industry conferences and gatherings, has organizations worried: They fear that failing to adopt generative AI right away will leave them rapidly obsolete.

David Knox, the chief technology officer for industrials, energy and government at Oracle, urges federal agencies to take a measured approach. He says you might be able to do anything with generative AI, but you can’t do everything. Knox recommends starting with what he called proving grounds — low-risk, low-complexity processes with which to try out AI.

Even in such entry use cases, it’s wise to test the work in an isolated “sandbox” environment while you evaluate the benefits and ensure the requisite security and privacy controls remain in place.

If the resulting AI-powered application does work as intended, agencies then need to be able to create “a path to production,” Knox says. That means knowing your compliance framework and requirements in advance, as with any new technology deployment.

Identifying these proving-ground generative AI applications will enable users to operate in what Knox calls the "find/fail fast/fix" mode. For example, nearly every agency deals with human capital processes such as hiring, retention and performance management. Knox advises choosing a single process and using real data to test both the efficacy of generative AI and whether the resulting process retains those crucial compliance measures. Then, if it does, apply the process in a limited production scope before scaling to agency-wide use.
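A proving-ground run of that kind can be sketched as a small evaluation harness. Everything below (the stand-in AI step, the compliance check, the sample cases) is hypothetical scaffolding, not an Oracle or agency tool:

```python
# Illustrative "find / fail fast / fix" harness: run a candidate AI process
# over test cases and verify compliance before any production rollout.
# All names here are hypothetical.

def candidate_ai_process(case: dict) -> dict:
    # Stand-in for the generative AI step, e.g. drafting a job posting.
    return {"draft": f"Posting for {case['role']}", "pii_removed": True}

def compliant(result: dict) -> bool:
    # Stand-in for the agency's compliance measures (privacy, records, etc.).
    return result.get("pii_removed", False)

def proving_ground(cases: list[dict]) -> bool:
    """Return True only if every case passes; otherwise fail fast."""
    for case in cases:
        result = candidate_ai_process(case)
        if not compliant(result):
            print(f"FAIL: {case['role']}")  # fix before scaling further
            return False
    return True

ready_to_scale = proving_ground([{"role": "HR specialist"}, {"role": "analyst"}])
```

The point of the sketch is the gate, not the model: Nothing moves past the sandbox until every test case clears the compliance check.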

Given the wide potential of generative AI, nearly every federal process is a candidate for its application. A less intuitive but potentially high-payoff candidate, Knox says, would capture and preserve the institutional knowledge held by the wave of federal employees now eligible to retire, largely baby boomers.

Such people "have an incredible amount of institutional knowledge of just what are the programs, what are the processes, how to get things done," Knox says. The information may or may not be written down, and if it is, often no one knows where. Recording knowledgeable people's answers can create an unstructured database to which generative AI is well suited. He cautions that unstructured knowledge capture can be risky, especially when generative AI does the capturing. Knox advises a trust-first mindset, with human supervision, rather than using people's answers to directly train AI.

Knox cited procurement as another internally facing function where such knowledge capture would have a big payoff: asking senior practitioners about systems, policies, acronyms, norms of operating, compliance rails and other things to be aware of.

One externally facing area for applying generative AI is citizen-serving applications. Here, Knox said, agencies can use AI for what he termed "document understanding." He defined that as going beyond optical character recognition and parsing data in individual fields, and beyond converting documents to digital images, by essentially turning documents into an interactive knowledge base.
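As a loose illustration of that gap (the form fields and questions below are invented), classic field extraction yields key-value pairs, while "understanding" means being able to ask questions of the document:

```python
# Rough sketch of the step from field extraction to a queryable document.
# The form, fields and questions are made up for illustration.

def extract_fields(ocr_text: str) -> dict:
    """Classic OCR-style parsing: split 'Field: value' lines into a dict."""
    fields = {}
    for line in ocr_text.splitlines():
        if ":" in line:
            key, value = line.split(":", 1)
            fields[key.strip().lower()] = value.strip()
    return fields

def answer(question: str, fields: dict) -> str:
    """Toy 'interactive knowledge base': match a question to a field."""
    q = question.lower()
    for key, value in fields.items():
        if key in q:
            return value
    return "not found in document"

doc = "Applicant: Jane Doe\nBenefit type: disability\nFiling date: 2024-03-01"
fields = extract_fields(doc)
```

A real document-understanding system would use a language model rather than keyword matching, but the interaction pattern, asking the document questions instead of reading raw fields, is the same.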

In all cases of AI deployment, Knox said, it’s important to keep in mind the human factor, because people often think AI will somehow replace them. Given the complexities of federal procurement, HR management, finances and accounting, grant-making and program management, Knox said, people will realize instead how AI will benefit them — not by replacing them, but rather by augmenting their decision-making and removing routine or repetitive tasks connected to their jobs.

“We didn’t get rid of people when we invented calculators,” he said. “We’re not going to do that with generative AI.”

DoD Cloud Exchange 2024: Slack's Rob Seaman on powering productivity, collaboration (Sun, 17 Mar 2024)

Public and private sector organizations can reduce friction and make employees' lives easier by leaning into tools like Slack, says the company's Rob Seaman.

The post DoD Cloud Exchange 2024: Slack’s Rob Seaman on powering productivity, collaboration first appeared on Federal News Network.

When it comes to the speed of decision-making, the Army Software Factory offers an important use case. The organization leaned into automation and collaboration tools to disseminate information two to three days faster than through other means, like email or meetings.

This simple but real-life example is helping the Army, and the Defense Department more broadly, fill the communications gulf that can exist in organizations, especially as agencies continue to adjust to a hybrid workforce.

“What we are most excited about is actually seeing these collaboration technologies that have been so successful in the private sector make their way into the public sector,” said Rob Seaman, senior vice president of platform product at Slack, during Federal News Network’s DoD Cloud Exchange 2024.

In addition to the productivity benefits of a collaboration platform, GovSlack's recent FedRAMP High authorization means security-conscious public sector agencies can rest assured their data is well protected, Seaman said.

“There are a few key aspects of these collaboration technologies that can help with some of the larger agencies that are interconnected and geographically dispersed or may have people that are working both in the office and at home,” he said. “Some of the primary benefits we see from these collaboration technologies are alignment and speed, as well as the ability to get people together and aligned around a particular initiative or topic where they can work faster than you ever have been able to do before.”

How to stay connected without meetings

Seaman said employees can work together synchronously or asynchronously without missing a beat or feeling like they've been left out of a discussion.

He said executives at Slack, for example, encourage employees to write documents or record an audio or video clip in lieu of a meeting.

“We do this all the time, where instead of scheduling an all-hands call for the company, every other all hands we will actually do asynchronously, and our executives will just record clips, and then people can go in and watch them at two times speed whenever they like,” he said. “They can actually just read the transcripts instead of watching it if they aren’t in a place where they can listen to audio or it might be interruptive to what they have going on at home.”

Another benefit is the integration with third-party applications that collaboration and productivity tools bring, Seaman said.

How to reduce friction, increase agency speed

At Salesforce, Slack’s parent company, executives manage all of their approvals — from expenses to leave requests — right in Slack using the platform’s integration and automation capabilities.

"We've seen a reduction in the median time it takes to approve expense reports from 2.4 days to 1.7 hours, across 80,000 employees," Seaman said. "We see a ton of value in actually bringing the systems that your people need to use into where the communication is happening. When somebody needs to approve an expense report, a project brief or a creative brief, just bring it to where they're communicating. It also acts like a notification that may spark a conversation or require a human to take an action."
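Slack's published Block Kit format gives a sense of what such an in-channel approval looks like. The payload builder below is a hedged sketch: the block structure follows Slack's documented section/actions format, but the action IDs and workflow are invented, and actually posting it would go through the Slack Web API (e.g. `chat.postMessage`), which this sketch deliberately omits:

```python
# Hypothetical expense-approval message as a Slack Block Kit payload.
# Only the payload is built and inspected here; no network call is made.

def approval_blocks(requester: str, amount: float, report_id: str) -> list:
    return [
        {"type": "section",
         "text": {"type": "mrkdwn",
                  "text": f"*Expense report {report_id}* from {requester}: ${amount:,.2f}"}},
        {"type": "actions",
         "elements": [
             {"type": "button", "action_id": "approve",
              "text": {"type": "plain_text", "text": "Approve"}},
             {"type": "button", "action_id": "reject",
              "text": {"type": "plain_text", "text": "Reject"}},
         ]},
    ]

blocks = approval_blocks("jdoe", 412.50, "EXP-1042")
```

The approval request, the context and the one-click action all live in the channel where the conversation is already happening, which is the friction reduction Seaman describes.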

That integration with other software as a service applications is something any large organization in the public or private sector can take advantage of, he suggested. Too often organizations force employees to “context switch” between applications that don’t talk to one another, causing frustration and friction in their daily work, Seaman added. Slack has 2,700 apps that are integrated out of the box.

Seaman said authorizations like FedRAMP High, along with additional compliance features like application programming interfaces for e-discovery and data loss prevention tools, help engender confidence in the tools.

"Using tools like this allows you to achieve a higher level of alignment across your organization and all of your initiatives, which will ultimately make you faster as an agency. It allows you to embrace hybrid work," he said.

“One of the ways that you can achieve that is by bringing more and more of your systems into where the communication is happening. Don’t make people go search for tasks. Bring tasks they need to do to them, and allow them to quickly act on them. You’re going to be faster, you’re going to save money, and, ultimately, they’re going to be happier and more productive.”

Discover more articles and videos now on Federal News Network’s DoD Cloud Exchange event page.

The IRS launches Direct File, a pilot program for free online tax filing available in 12 states (Tue, 12 Mar 2024)

After weeks of testing, an electronic system for filing returns directly to the IRS is now available to taxpayers from 12 selected states.

The post The IRS launches Direct File, a pilot program for free online tax filing available in 12 states first appeared on Federal News Network.

NEW YORK (AP) — After weeks of testing, an electronic system for filing returns directly to the IRS is now available to taxpayers from 12 selected states.

The new system, called Direct File, is a free online tool. Taxpayers in the selected states who have very simple W-2s and claim a standard deduction may be eligible to use it this tax season to file their federal income taxes. The program will also offer a Spanish version, which will be available starting at 1 p.m. Eastern Time on Tuesday.

“This is a milestone,” said IRS Commissioner Daniel Werfel during a Tuesday press conference to announce the expanded availability of the program. Tax season officially began January 29 and the filing deadline is April 15.

"Direct File marks the first time you can electronically file a tax return directly with the IRS," Werfel said. "And you can't beat the price; it's free."

The Treasury Department estimates that one-third of all federal income tax returns filed could be prepared using Direct File, and that 19 million taxpayers may be eligible to use the tool this tax season. So far, roughly 20,000 people have participated in the pilot program, according to the IRS, which expects participation to grow to 100,000 filers in the coming weeks.

Certain taxpayers in Florida, New Hampshire, Nevada, South Dakota, Tennessee, Texas, Washington, Wyoming, Arizona, Massachusetts, California and New York can participate. Direct File can only be used to file federal income taxes; taxpayers in states that require filing state taxes will need to do so separately.
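The reported criteria (pilot state, simple W-2 income, standard deduction) can be sketched as a rough screening function. This is an illustration of the article's summary only, not the IRS's actual eligibility logic, which applies more detailed rules:

```python
# Simplified eligibility sketch based only on the criteria in this article.
# The real Direct File screener is more detailed.

PILOT_STATES = {
    "FL", "NH", "NV", "SD", "TN", "TX", "WA", "WY",  # no state income tax
    "AZ", "MA", "CA", "NY",                          # states in the pilot with income taxes
}

def may_be_eligible(state: str, w2_only: bool, standard_deduction: bool) -> bool:
    """Rough screen: pilot state, W-2 income only, standard deduction."""
    return state.upper() in PILOT_STATES and w2_only and standard_deduction

print(may_be_eligible("TX", True, True))   # True
print(may_be_eligible("OH", True, True))   # False: not a pilot state
```

Even in a pilot state, a filer with itemized deductions or non-W-2 income would fall outside the pilot and need another filing route.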

“Direct File will offer millions of Americans a free and simple way to file their taxes, with no expensive and unnecessary filing fees and no upselling, putting hundreds of dollars back in the pocket of working families each year, consistent with President Biden’s pledge to lower costs,” said National Economic Advisor Lael Brainard.

Werfel said one feature that improves usability for filers is a live chat that lets taxpayers interact with the IRS while they complete their taxes.

The Direct File pilot is part of the agency’s effort to build out a new government service that could replace some taxpayers’ use of commercial tax preparation software, such as TurboTax. It’s meant to be simple and provides a step-by-step walkthrough of easy-to-answer questions.

Derrick Plummer, a spokesman for Intuit, said in an email that Direct File “is not free tax preparation but a thinly veiled scheme that will cost billions of taxpayer dollars to pay for something already completely free of charge today.”


Several organizations offer free online tax preparation assistance to taxpayers under certain income limits and fillable forms are available online on the IRS website, but the forms are complicated and taxpayers still have to calculate their tax liability.

When asked whether the Direct File program will likely be built out and available in the 2025 filing season, Werfel said, "I don't want to prematurely reach a conclusion," but positive reports from users "have been encouraging."

___

Hussein reported from Washington, D.C.

___

The Associated Press receives support from Charles Schwab Foundation for educational and explanatory reporting to improve financial literacy. The independent foundation is separate from Charles Schwab and Co. Inc. The AP is solely responsible for its journalism.

New post for long-time advocate of better digital government (Mon, 04 Mar 2024)

The Volcker Alliance has added a prominent federal technologist to its board. She was the deputy U.S. chief technology officer and founded Code for America.

The post New post for long-time advocate of better digital government first appeared on Federal News Network.

The Volcker Alliance, a premier good-government group, has added a prominent federal technologist to its board: Jennifer Pahlka, the former deputy U.S. chief technology officer and founder of Code for America, a non-profit that helps government at all levels with digital challenges. For an update, the Federal Drive with Tom Temin spoke with her.

Interview transcript:

Tom Temin: And since starting Code for America, which you're no longer associated with, but it's kind of got a life of its own, what have you been up to?

Jennifer Pahlka: I had the brilliance to step down about six weeks before the shutdown for the pandemic. And my idea was to write a book, which I did. But with the pandemic chaos, I also ended up helping match technologists with governments who needed them during those first couple weeks of the pandemic. And that became something called U.S. Digital Response, which should not be confused with the United States Digital Service, which I helped stand up in the White House in 2013-2014. So that was an amazing ride; I'm still on the board of USDR as well. Then I went and wrote my book and did some other consulting during that time.
And now I'm a senior fellow at both the Niskanen Center and the Federation of American Scientists, and just excited to join the board of the Volcker Alliance, which I've held in such high regard for so long.

Tom Temin: And with respect to that helping government when the pandemic hit, it must have been somewhat satisfying to see the technology base already in place at the federal government that allowed it to quickly pivot, in most cases, to everybody teleworking, and in some sense not really missing a beat.

Jennifer Pahlka: I think there were so many things that government at all levels did incredibly well during the pandemic that we take for granted, because that's what we always do with government: If it goes well, we take it for granted, and if it doesn't, we are quite concerned. Yes, there were missteps too. So one of the other things I got pulled into in the first summer of the pandemic was a strike team, as they called it (I call it a task force because it sounds less violent), for the state of California's backlog of unemployment insurance claims. And that ended up being the first three chapters of my book. And I thought it was a great lesson for me to learn, and one I could share through the book, about what really underpins the problems of government technology. The technology looks like it's the problem, but there are much deeper dynamics underlying it that we need to grapple with.

Tom Temin: And tell us more about that book, because it sounds like something that the current crew and past crews and future crews ought to read, maybe to help the government keep moving along the digital journey.

Jennifer Pahlka: It's a book that I think a lot of people think is about technology. It has a QR code on the cover in a flag, so it's clear that we are talking about U.S. government here and some degree of patriotism.
But it's really about what I came to conclude is a driving force of our dysfunction in government technology, which is fundamentally this idea that policy is this thing over here, and the delivery, the implementation of that policy, is something separate that separate people do, and they don't really talk to the policymakers. And of course, we think of it as sort of a waterfall, a cascade, a linear process: Congress creates the law, it gets handed down to agencies and policymakers, and down at the bottom of this big waterfall you have the implementers. Well, it turns out that for many years people have been challenging those assumptions and realizing that those two things should not be thought of as separate. When we think of them as separate, we are causing ourselves much more pain than we really need to. I'm excited to see so many people start to grapple with these ideas and really put them into practice, and stop saying, let's fix this on the edge here, and instead go to the core cause.

Tom Temin: Yeah. So the implication is that good technology implementation really starts at the collaboration stage between people that make policy and people that have to implement policy.

Jennifer Pahlka: Yeah. I'll give you one example from the book, though there are several. This is a local and state level issue. But when I was at Code for America many years ago, we started on the problem of clearing criminal records, where legalization of marijuana in many states had meant that somebody who has a past marijuana felony should no longer have it on their record, so that they can access things like jobs and housing. But it's a year-long paperwork process that most people can't persist through. It's just so much sludge and paperwork, and really, there's no need for it.
And so we figured out that if these are just records in a database, you can query the database, find all the people who are eligible for that expungement, and clear them in bulk. So that's automatic expungement. But the problem is that some laws are written in such a way that there's no way to query the database and find all of those people. We had a law here in California called Prop 47 that was written to reclassify burglaries under $950, and I think it was in commercial locations. Well, you cannot query the database. You'd have to go looking in every single person's file, try to read the handwriting of the cop who took that case, see if they noted what kind of camera was stolen, and try to figure out if that was under $950. So these laws are simply unable to be automated, and therefore they're really going to have very little effect on people. But if you consult with the people who understand the implementation of the law before you write it, you might make different choices in what you actually make expungeable, so that the law can have a real effect. And that's a kind of thinking that is rare, but it's starting to grow. And I'm just really happy to see that we can get implementation all the way up front in the process instead of all the way at the end.

Tom Temin: We were speaking with Jennifer Pahlka. She's the former deputy U.S. chief technology officer and now author of Recoding America. And that example of where the law doesn't fit, really, with what the policy actually is, to say, in this case, it's okay to expunge these records.
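The bulk query Pahlka describes is straightforward when eligibility maps to structured columns; her Prop 47 example fails precisely because the deciding detail lives only in case-file narrative. A sketch with an invented records schema:

```python
# Illustrative bulk eligibility query, with a made-up convictions schema.
# When eligibility maps to structured columns, expungement can be automated
# in one statement; when it hinges on details buried in handwritten case
# files, no such query is possible.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE convictions (
    person_id INTEGER, offense TEXT, severity TEXT, expunged INTEGER DEFAULT 0)""")
conn.executemany(
    "INSERT INTO convictions (person_id, offense, severity) VALUES (?, ?, ?)",
    [(1, "marijuana possession", "felony"),
     (2, "burglary", "felony"),
     (3, "marijuana possession", "felony")])

# Find and clear everyone eligible under the (hypothetical) new statute, in bulk.
conn.execute("""UPDATE convictions SET expunged = 1
                WHERE offense = 'marijuana possession' AND severity = 'felony'""")
cleared = conn.execute(
    "SELECT COUNT(*) FROM convictions WHERE expunged = 1").fetchone()[0]
print(cleared)  # 2
```

This is the sense in which a law is or isn't "automatable": The eligibility condition either can or cannot be expressed as a WHERE clause over the data the state actually holds.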
This is something I was talking to Senator Warner about not long ago, with respect to artificial intelligence and the many ways that Congress could act on it by simply doing little blocking-and-tackling legislation to make things sync up, in the way you've just described for that particular function of expungement, in another domain. Which gets to the big issue: Congress can't even do those little things anymore, even when everyone agrees this is the policy, all we need to do is change clause A, subchapter one, paragraph B of this law, and it'll all match up. They don't even do that. That must be frustrating.

Jennifer Pahlka: Well, I think we're going to have opportunities for the kinds of legislation that I'm interested in, because this stuff is nonpartisan. There's no culture war at risk here. We're really just talking about stuff that benefits everyone. And I think there will be windows for it. But no, I can't fix congressional dysfunction.

Tom Temin: Anyone that could would be a genius and would be showered with a crown of America. Let me ask you this. Since you were in government, golly, a decade ago, and that's light years in a technology sense. Now we have a company called Nvidia, which has been around for ages. People used to be happy to get one of their cards in your computer because it could really rev up the gaming. Now they are worth a couple of trillion dollars thanks to artificial intelligence, which has come on the scene in a big way. What are your thoughts about good ways for government to bring this into the whole digital effort? Really, that's never-ending.

Jennifer Pahlka: Yeah. I used to work in the video game business. I ran the Game Developers Conference for eight years, and so I knew Nvidia back in the day, when it was gaming that drove their business.
Artificial intelligence is fascinating to me, and I think I came to it a little bit late because I had been fatigued by the hype around blockchain. But when I started really paying attention to it, I started to see that this is a profound opportunity to bring our government forward. And I'll tell you one thing that brought me along. I was visiting the Department of Labor in New Jersey about the spring before I did my book, and ChatGPT had just come out. Now, you had this great team at the [Department of Labor (DOL)] and also the new Jersey Office of Innovation, wonderful folks that they were working to make unemployment insurance just better every day. It's a fantastic strategy. They're not going for some big procurement, they're just bringing on people who know what they're doing and really fixing it week by week. One of the things that this designer was doing was rewriting the letters that people get, their emails that are letters you get in the mail about your claim about adjudication. And they're so hard to understand. They're written in legalese, and it's one of the reasons people don't reply. And then you get a longer backlog. Well, she had been rewriting them for a sort of eighth grade, ninth grade level and then going to the policy team and saying, is this still correct? And then if it is, let's bold the call to action, put it in big letters, then we read the letter and then people start to interact with unemployment insurance a lot easier.nn<strong>Jennifer Pahlka <\/strong>Well, ChatGPT had just come out and she was just feeding those letters into ChatGPT with the prompt, rewrite this so I can understand it better. Now. She still went to the policy team and checked it with them. This is not giving over decision making to AI. It's an AI as an assist to somebody. And I asked her, so is it really helping you? She said, I think we're getting through these letters about 4 or 5 times as fast as we used to. It's not an automatic process, it's an assist. 
And I thought, that is a fantastic use of AI. We should not be getting things like that. We should just let people do this. Now there are things that are going to need some review. And of course, the AI executive order has called out some of those things. I think we need to be really careful that the guards that are put in place through things like the AI EO for really risky applications don't get applied to these really low risk, high value applications. And the use of them to make letter simpler is just one example. There are dozens, hundreds of others where we really want to enable. We want to put our foot on the gas, not on the brakes, I think in, in those areas. The one area that I think we do need to have caution about, that is not the one that most people talk about, like we don't want to give over decision making to AI. I think that that is much less important than people think, and there's so many other applications. But we have such complex rules and regulations like unemployment insurance. We've been adding rules and regs for 90 years. We never take them away. So you've got thousands and thousands of pages of regs that cover this pretty simple program. I think a lot of people are excited about AI's ability to sort of manage that complexity. And I want people instead to be excited about AI's ability to help us simplify that complexity, not just like, ok, we'll be able to get through this, but here are some proposed simplifications that state legislatures and Congress ought to really take seriously, so that if you're trying to do unemployment insurance in the next downturn, you have 100 pages of regs to deal with, not 9,000.nn<strong>Tom Temin <\/strong>And I want to bridge to another use of AI and the generative AI, which is to make computer programs. And as someone whose endeavors usually have the word coding in them, one technologist said the other day, well, thanks to the generative AI, English is the new programing language. 
What's your thought on that?nn<strong>Jennifer Pahlka <\/strong>I think that is part of a trend that's been happening for a long time. Technology is sort of the ability to make things that other people can use has been democratized for a long time. AI is for the next phase in it. But go back to that time I talked about right when the shut down sent us all home and all these governments were reaching out to volunteers through USDR. It happened to be a time also when no code, low code tools were becoming available. And about a half of the requests that we got from state and local governments at the time. Can you stand up a form for emergency rental assistance, for example. Those were actually pretty easily filled, not by some fancy tech team, but by a couple of volunteers teaching local government officials themselves how to use something like Airtable. It's just that much more powerful than Excel. And you can really actually run a benefits program. That's not too advanced off of it. And I think that is part generally of how we're going to make all of these things that used to be very specialized, really possible for anybody to do.<\/blockquote>"}};

The Volcker Alliance, a premier good-government group, has added a prominent federal technologist to its board. She was the deputy U.S. chief technology officer and founded Code for America, a nonprofit that helps government at all levels with digital challenges. For an update, the Federal Drive with Tom Temin spoke with Jennifer Pahlka.

Interview Transcript: 

Tom Temin And since starting Code for America, which you’re no longer associated with but which has kind of got a life of its own, what have you been up to?

Jennifer Pahlka I had the brilliance to step down about six weeks before the shutdown for the pandemic. And my idea was to write a book, which I did. But with the pandemic chaos, I also ended up helping match technologists with governments who needed them during those first couple weeks of the pandemic. And that became something called U.S. Digital Response, which should not be confused with United States Digital Service, which I helped stand up in the White House in 2013-2014. So that was an amazing ride, I’m still on the board of USDR as well. Then went and wrote my book and did some other consulting during the time. And now I’m a senior fellow at both the Niskanen Center and the Federation of American Scientists, and just excited to join the board of the Volcker Alliance, which I’ve just had such high regard for so long.

Tom Temin And with respect to that helping government when the pandemic hit, it must have been somewhat satisfying to see the technology base that was in place already at the federal government that allowed it to quickly pivot, in most cases to everybody teleworking and in some sense not really missing a beat.

Jennifer Pahlka I think there were so many things that government at all levels did incredibly well during the pandemic that we take for granted, because that’s what we always do with government. If it goes well, we take it for granted, and if it doesn’t, we are quite concerned. Yes, there were the missteps too. So one of the other things I got pulled into in the first summer of the pandemic was a strike team, as they called it (I call it a task force because it sounds less violent), for the state of California’s backlog of unemployment insurance claims. And that ended up being the first three chapters of my book. And I thought it was a great lesson for me to learn, and one that I could share through the book, about what really underpins the problems of government technology. The technology looks like it’s the problem, but there are much deeper dynamics underlying it that we need to grapple with.

Tom Temin And tell us more about that book, because it sounds like something that the current crew and past crews and future crews ought to read, maybe to help the government keep moving along the digital journey.

Jennifer Pahlka It’s a book that I think a lot of people think is about technology. It has a QR code on the cover in a flag, so it’s clear that we are talking about U.S. government here, and some degree of patriotism. But it’s really about what I came to conclude is a driving force of our dysfunction in government technology, which is fundamentally this idea that policy is this thing over here, and the delivery, the implementation of that policy, is something separate that separate people do, and they don’t really talk to the policymakers. And of course, we think of it as sort of a waterfall, a cascade, a linear process: maybe Congress creates the law, it gets handed down to agencies and policymakers, etc. And down at the bottom of this big waterfall, you have the implementers. Well, it turns out that for many years people have been challenging those assumptions and realizing that those two things should not be thought of as separate. That when we think of them as separate, we are causing ourselves much more pain than we really need to. I’m excited to see so many people start to grapple with these ideas and really put them into practice, and stop saying let’s fix this on the edge here, and really go to the core cause.

Tom Temin Yeah. So the implication is that good technology implementation really starts at the collaboration stage between people that make policy and people that have to implement policy.

Jennifer Pahlka Yeah. I’ll give you one example from the book, though there are several. This is a local and state level issue. But when I was at Code for America many years ago, we started on the problem of clearing criminal records, where legalization of marijuana in many states had meant that somebody who has a past marijuana felony no longer should have it on their record, so that they can access things like jobs and housing. But it’s a yearlong paperwork process that most people can’t persist through. It’s just so much sludge and paperwork, and really, there’s no need for it. And so we figured out that if these are just records in a database, you can query the database, find all the people who are eligible for that expungement and clear them in bulk. So that’s automatic expungement. But the problem is some laws are written in such a way that there’s no way to query the database and find all of those people. We had a law here in California called Prop 47 that was written to reclassify burglaries under $950, and I think it was in commercial locations. Well, you cannot query the database. You’d have to go looking in every single person’s file and try to read the handwriting of the cop who took that case, and see if they noted what kind of camera was stolen, and try to figure out if that was under $950. So these laws are simply unable to be automated, and therefore they’re really going to have very little effect on people. But if you consult with the people who understand the implementation of the law before you write it, you might make different choices in what you actually make expungeable, so that the law can have a real effect. And that’s a kind of thinking that is rare, but it’s starting to grow. And I’m just really happy to see that we can get implementation all the way up front in the process instead of all the way at the end.
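The difference Pahlka describes, between a law that maps onto queryable fields and one that requires reading individual case files, can be made concrete with a small sketch. The schema and offense codes below are purely illustrative, not anything Code for America actually built:

```python
import sqlite3

# Hypothetical schema: convictions stored with structured, queryable fields.
# A law written against these fields can be automated in one statement; a law
# keyed to facts buried in handwritten case notes (e.g., value stolen) cannot.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE convictions (
        person_id    INTEGER,
        offense_code TEXT,
        expunged     INTEGER DEFAULT 0
    )
""")
conn.executemany(
    "INSERT INTO convictions (person_id, offense_code) VALUES (?, ?)",
    [(1, "MARIJUANA_FELONY"),   # eligible under the hypothetical new law
     (2, "BURGLARY"),           # eligibility turns on a fact stored in no column
     (3, "MARIJUANA_FELONY")],
)

# Bulk, automatic expungement: one query finds and clears every eligible record.
cur = conn.execute(
    "UPDATE convictions SET expunged = 1 WHERE offense_code = ?",
    ("MARIJUANA_FELONY",),
)
conn.commit()
print(cur.rowcount)  # 2 records cleared in bulk
```

The Prop 47 problem, in these terms, is that "burglary under $950" references a value that was never captured as a column, so no WHERE clause can find the eligible records.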

Tom Temin We were speaking with Jennifer Pahlka. She’s former U.S. Chief Technology Officer and now author of Recoding America. And that example of where the law doesn’t fit, really, with what the policy actually is, to say in this case, it’s okay to expunge these records. This is something I was talking to Senator Warner about not long ago, with respect to artificial intelligence and the many ways that Congress could act on that by simply doing little blocking-and-tackling legislation to make things in sync, in the way you’ve just described for that particular function of expungement, in another domain. Which gets to the big issue, that Congress can’t even do those little things anymore, when everyone agrees this is the policy, all we need to do is change clause A, subchapter one, paragraph B of this law, and it’ll all match up. They don’t even do that. That must be frustrating.

Jennifer Pahlka Well I think we’re going to have opportunities for the kinds of legislation that I’m interested in, because this stuff is nonpartisan. There’s no culture war at risk here. We’re really just talking about stuff that benefits everyone. And I think there will be windows for it. But no, I can’t fix congressional dysfunction.

Tom Temin Anyone that could would be a genius and would be showered with a crown of America. Let me ask you this. Since you were in government, golly, a decade ago, that’s kind of light years in a technology sense. And now we have a company called Nvidia, been around for ages. People used to be happy to get one of their cards in your computer because it could really rev up the gaming. Now they are worth a couple of trillion dollars thanks to artificial intelligence, which has come on the scene in a big way. What are your thoughts about good ways for government to bring this into the whole digital effort? Really, that’s never ending.

Jennifer Pahlka Yeah. I used to work in the video game business. I ran the Game Developers Conference for eight years, and so I knew Nvidia back in the day when it was gaming that drove their business. Artificial intelligence is fascinating to me, and I think I came to it a little bit late because I had been fatigued by the hype around blockchain. But when I started really paying attention to it, I started to see that this is a profound opportunity to bring our government forward. And I’ll tell you one thing that brought me along. I was visiting the Department of Labor in New Jersey about the spring before I did my book, and ChatGPT had just come out. Now, you had this great team at the [Department of Labor (DOL)] and also the New Jersey Office of Innovation, wonderful folks who were working to make unemployment insurance just better every day. It’s a fantastic strategy. They’re not going for some big procurement, they’re just bringing on people who know what they’re doing and really fixing it week by week. One of the things that this designer was doing was rewriting the letters that people get, the emails and letters you get in the mail about your claim, about adjudication. And they’re so hard to understand. They’re written in legalese, and it’s one of the reasons people don’t reply. And then you get a longer backlog. Well, she had been rewriting them for a sort of eighth grade, ninth grade level and then going to the policy team and saying, is this still correct? And then if it is, let’s bold the call to action, put it in big letters. Then people read the letter, and they start to interact with unemployment insurance a lot more easily.

Jennifer Pahlka Well, ChatGPT had just come out, and she was just feeding those letters into ChatGPT with the prompt, rewrite this so I can understand it better. Now, she still went to the policy team and checked it with them. This is not giving over decision making to AI. It’s AI as an assist to somebody. And I asked her, so is it really helping you? She said, I think we’re getting through these letters about 4 or 5 times as fast as we used to. It’s not an automatic process, it’s an assist. And I thought, that is a fantastic use of AI. We should not be gating things like that. We should just let people do this. Now, there are things that are going to need some review. And of course, the AI executive order has called out some of those things. I think we need to be really careful that the guards that are put in place through things like the AI EO for really risky applications don’t get applied to these really low risk, high value applications. And the use of them to make letters simpler is just one example. There are dozens, hundreds of others where we really want to enable. We want to put our foot on the gas, not the brakes, in those areas. The one area where I think we do need to have caution is not the one that most people talk about, like we don’t want to give over decision making to AI. I think that is much less important than people think, and there are so many other applications. But we have such complex rules and regulations, like unemployment insurance. We’ve been adding rules and regs for 90 years. We never take them away. So you’ve got thousands and thousands of pages of regs that cover this pretty simple program. I think a lot of people are excited about AI’s ability to sort of manage that complexity. And I want people instead to be excited about AI’s ability to help us simplify that complexity. Not just, OK, we’ll be able to get through this, but here are some proposed simplifications that state legislatures and Congress ought to really take seriously, so that if you’re trying to do unemployment insurance in the next downturn, you have 100 pages of regs to deal with, not 9,000.
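The workflow Pahlka describes, a model drafting and a person deciding, can be sketched in a few lines. The `call_llm` stub below stands in for whatever model API a team might use; the function names, sample letter, and review step are all illustrative, not New Jersey's actual system:

```python
# call_llm is a stand-in for any large language model API; it is stubbed here
# so the pipeline itself runs without network access or credentials.
def call_llm(prompt: str) -> str:
    return "Action needed: certify your continued eligibility by Friday."

def draft_plain_language(letter: str) -> str:
    """The model drafts a plain-language rewrite; it sends nothing on its own."""
    return call_llm("Rewrite this so I can understand it better:\n" + letter)

def queue_for_review(original: str, draft: str) -> dict:
    """Every draft goes to the policy team before use: AI assists, people decide."""
    return {"original": original, "draft": draft, "status": "pending_policy_review"}

legalese = ("Pursuant to applicable regulations, the claimant must certify "
            "continued eligibility prior to any disbursement of benefits.")
item = queue_for_review(legalese, draft_plain_language(legalese))
print(item["status"])  # pending_policy_review
```

The key design point is that nothing leaves the `pending_policy_review` state without a human sign-off, which is what keeps this an assist rather than automated decision making.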

Tom Temin And I want to bridge to another use of AI, and the generative AI, which is to make computer programs. One technologist said the other day, well, thanks to generative AI, English is the new programming language. As someone whose endeavors usually have the word coding in them, what’s your thought on that?

Jennifer Pahlka I think that is part of a trend that’s been happening for a long time. Technology, sort of the ability to make things that other people can use, has been democratized for a long time. AI is the next phase in it. But go back to that time I talked about, right when the shutdown sent us all home and all these governments were reaching out to volunteers through USDR. It happened to be a time also when no code, low code tools were becoming available. And about half of the requests that we got from state and local governments at the time (can you stand up a form for emergency rental assistance, for example) were actually pretty easily filled, not by some fancy tech team, but by a couple of volunteers teaching local government officials themselves how to use something like Airtable. It’s just that much more powerful than Excel. And you can really actually run a benefits program that’s not too advanced off of it. And I think that is part, generally, of how we’re going to make all of these things that used to be very specialized really possible for anybody to do.

The post New post for long-time advocate of better digital government first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/technology-main/2024/03/new-post-for-long-time-advocate-of-better-digital-government/feed/ 0
How to ask the right questions to define AI use cases https://federalnewsnetwork.com/federal-insights/2024/03/how-to-ask-the-right-questions-to-define-ai-use-cases/ https://federalnewsnetwork.com/federal-insights/2024/03/how-to-ask-the-right-questions-to-define-ai-use-cases/#respond Mon, 04 Mar 2024 15:29:08 +0000 https://federalnewsnetwork.com/?p=4911874 This approach also points more accurately to the right type of program to apply to the problem, whether RPA, a traditional AI algorithm or generative AI.

The post How to ask the right questions to define AI use cases first appeared on Federal News Network.

]]>

Discussions about how to get started in artificial intelligence tend to focus on use cases. Agency staff ask themselves, in effect, “What could we do with AI?” or with machine learning or robotic process automation.

Kathleen Featheringham, the vice president for AI and ML at Maximus, suggested a variation of that question, the answer to which will get agencies faster to real, measurable value from their AI investments.

AI success “is more about what are you struggling with? What are the problems, the mission elements, the things that you need to actually make things more successful,” Featheringham said during the series, Operational AI: Driving mission impact at scale. “Starting there helps really define what are good use cases.”

Featheringham said this approach also points more accurately to the right type of program to apply to the problem, whether RPA, a traditional AI algorithm or generative AI.

Program managers and agency executives needn’t bring AI expertise to the problem solving, just details of what they want to accomplish.

“It’s more about, hey, what are the issues, and then work with their technologists to be able to break it down for what type” of AI to use, Featheringham said.

Often, she said, organizations find that functions with rote activities, like sifting through large quantities of documents or images, provide the use cases with the fastest returns on investment. A useful approach comes from thinking as an assistant would, asking what would most aid a particular workflow and the person or people performing it.

“How many times you go and do searches in different databases,” Featheringham said. “Let’s say you have to search through seven different ones. Robotic process automation would be great for that.”
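The multi-database lookup Featheringham mentions is the textbook RPA shape: one identifier, the same search repeated across systems, results gathered in one place. A minimal sketch, with each stub function standing in for a real database connection or web session (all names here are illustrative):

```python
# Each stub represents one system a person would otherwise search by hand;
# in a real deployment these would be database queries or web sessions.
def search_hr(case_id):       return ["hr-record-17"]
def search_payroll(case_id):  return []
def search_benefits(case_id): return ["benefit-claim-3", "benefit-claim-9"]

SOURCES = {"hr": search_hr, "payroll": search_payroll, "benefits": search_benefits}

def search_all(case_id: str) -> dict:
    """Fan one query out across every system; results come back keyed by source."""
    return {name: fn(case_id) for name, fn in SOURCES.items()}

results = search_all("case-42")
print(sum(len(hits) for hits in results.values()))  # 3 hits total
```

Scaling from three stubs to seven real systems changes only the `SOURCES` table, which is why this kind of rote fan-out is such a fast RPA win.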

Augmenting people

Even AI used to create new outputs, such as analyses or summaries of existing documents, Featheringham said, would not replace people doing those functions because of the need to check the work of the algorithm.

She likened deployment of generative AI to a new employee, regardless of how well trained and educated, asked to turn in a work product.

“Would you just turn in what they gave you? Probably not,” Featheringham said. “Would you go through it, fact-check it, really refine it? Absolutely.”

It’s wise to ask people doing the day-to-day work about their obstacles and pain points. But Featheringham advised also observing people’s work directly.

“I’ve spent a lot of time going and standing behind people for, ‘show me how you do your job. What do you do?’” she said.

Featheringham added that experienced people may become so skilled at integrating multiple tasks and processes that the observer needs to ask why they do certain things in certain ways. That questioning can lead to a much deeper understanding of where to apply AI.

“You get the initial [ideas] of what they think are their needs,” Featheringham said. “But then you also really see it in action. With any of these types of emerging technologies, you can’t just go straight off requirements, you have to see how it’s going to be put into action, how the people would be interacting with it.”

One function Maximus has helped agencies with makes work easier for communications writers. By generating some of the pro forma elements of a piece from automated research, the AI lets writers spend more of their often-limited time crafting the creative piece in their own voices.

AI at scale

Beyond specific applications at the individual level, agencies must think about how to deploy AI in the context of modernization and at the enterprise level. That, in turn, Featheringham said, requires understanding how AI differs from traditional enterprise software.

“AI is not just the same as any traditional software application,” she said. “There are some nuances that come into play that have to be accounted for.”

Among them: the fact that AI requires training data, which the agency must curate to avoid bias. Another is that AI, by definition, changes its own logic as it learns.

Featheringham recommended a “ModelOps” approach, modeled after the secure development operations (SecDevOps) approach so many organizations have adopted. It lets organizations deploy regular, incremental software releases with reliable functional and security characteristics. The alternative is slow, expensive “bespoke” applications that require custom work for every use case.

The key to ModelOps lies “in how you can build the controls and measures into the systems so that you can bring different types of AI models and tools in safely,” Featheringham said. “You can actually monitor what they’re doing.”

ModelOps, when designed properly, ensures the agency can see and account for the way given inputs result in changing outputs as AI models learn and adapt.
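That visibility usually comes down to logging every prediction with a model version and comparing output distributions across versions for the same inputs. A minimal sketch (the toy model, version names, and drift measure are illustrative, not a ModelOps standard):

```python
from statistics import mean

# Log of every prediction, as a ModelOps pipeline might capture it.
log = []

def predict(version: str, x: float) -> float:
    # Toy model whose v2 release has drifted upward; stands in for any
    # learning model whose logic changes between releases.
    score = x * (1.1 if version == "v2" else 1.0)
    log.append({"version": version, "input": x, "output": score})
    return score

for x in [0.2, 0.5, 0.8]:
    predict("v1", x)
    predict("v2", x)

def mean_output(version: str) -> float:
    return mean(r["output"] for r in log if r["version"] == version)

# Same inputs, different outputs: the version-to-version gap is the signal
# a monitoring dashboard would surface and alert on.
drift = mean_output("v2") - mean_output("v1")
print(round(drift, 3))  # 0.05
```

Because the log records inputs alongside outputs, the agency can replay the same inputs against each release and account for exactly how the model's behavior changed.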

She added that ModelOps must also take into account that the different flavors of AI vary in the degree to which their performance is probabilistic — very high for generative but low for RPA, which is more deterministic.

Listen to the full show:

The post How to ask the right questions to define AI use cases first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/03/how-to-ask-the-right-questions-to-define-ai-use-cases/feed/ 0
For DoD financial management systems, a not-so-pretty picture https://federalnewsnetwork.com/defense-main/2024/02/for-dod-financial-management-systems-a-not-so-pretty-picture/ https://federalnewsnetwork.com/defense-main/2024/02/for-dod-financial-management-systems-a-not-so-pretty-picture/#respond Wed, 28 Feb 2024 20:25:49 +0000 https://federalnewsnetwork.com/?p=4905971 One reason the Defense Department can't get to a clean financial audit has to do with its multiple and outdated financial management systems. The DoD does have

The post For DoD financial management systems, a not-so-pretty picture first appeared on Federal News Network.

]]>
One reason the Defense Department can’t get to a clean financial audit has to do with its multiple and outdated financial management systems. The DoD does have a plan to modernize the systems, but the Office of Inspector General (OIG) finds a little trouble with how officials are going about it. For the latest, the Federal Drive with Tom Temin talked with OIG project managers Chris Hilton and Shelby Barnes.

Interview Transcript:

Tom Temin And fair to say, this was an audit, not so much of DoD finances, but of the systems that make up the financial network there and of their plans to modernize it. Is that a good way to put it?

Shelby Barnes Yes, I think that’s a great way to summarize what this audit was. We focused on the DoD financial systems specifically. We reviewed the systems that were subject to the Federal Financial Management Improvement Act. Essentially, this is a law that requires that systems capture data and record transactions properly. And the DoD has established goals to, as you said, modernize its systems environment and to update its systems, or stop using some of its old systems, by 2028. However, what we found in our audit was that goal wasn’t aggressive enough. And without a more modern systems environment, we found that the DoD will just continue to spend a lot of money on systems that don’t record those transactions properly.

Tom Temin And just to define the scope of this, it’s not just the Pentagon and the fourth estate agencies, but does this also include the armed forces and their often multiple financial systems?

Chris Hilton Yes. It definitely includes all of those systems and all those parts and pieces of the DoD. We looked at basically any plans related to maintaining the DoD’s IT system environment and how they impact the DoD financial statements. By the numbers, DoD’s IT environment contains over 400 systems and applications and over 2,000 interfaces. This complex environment contributes to many of the DoD challenges.

Tom Temin Right. And it’s not simply the multiplicity of them, but in some cases, the age of them and the fact that they can’t interoperate with one another in some cases. Fair to say?

Chris Hilton That is absolutely correct. I think some of the systems that the DoD still uses today are from the 1950s and 1960s and 1970s. Obviously, they weren’t necessarily always intended to produce financial statements. That’s a newer requirement. So those are some of the challenges that the department is dealing with.

Tom Temin Right. Because in the 1950s and 1960s, they could count the beans, so to speak, but they don’t meet what are considered contemporary standards for financial systems.

Chris Hilton Correct.

Shelby Barnes Yes. That’s correct.

Tom Temin Plus, there’s a certain cost in maintaining these old systems, and the multiplicity is a cost multiplier itself. Fair to say.

Chris Hilton That is fair to say. One of the highlights in our report is that the DoD maintains 37 purchasing systems throughout all its components and pieces. And obviously, that presents challenges: if you have a challenge across 37 systems, you have to have 37 corrective actions. So that does present significant challenges for the department.

Tom Temin Right. And you mentioned they have 400 systems with 200 interfaces. So that’s even beyond the purchasing systems.

Chris Hilton 2,000 interfaces. I wish it was 200.

Tom Temin Yeah, I didn’t write the third one down on my sheet here. OK, so we’ve got the full scope of that. And let’s talk about the scope of the plan. That is to say, what do they hope to do by 2028 at this point? What’s their vision for all of this?

Shelby Barnes Yeah. So that’s actually one of the things that we identified within our audit that wasn’t particularly clear. The DoD has multiple plans, all of which focus on a simplified systems environment. That is the department’s desire and that is the DoD’s goal. But what we found was that the plans didn’t clarify what systems the DoD plans to keep and what systems they plan to retire between now and 2028. And so that was one of the things that we highlighted within our report, that the DoD does need to clarify what systems it plans to update and modernize, and which of those systems it needs to stop using. And we recommended that they stop using them as swiftly as possible.

Tom Temin Right. It sounds therefore like the plan is more of a guidance to a future vision than a detailed modernization plan.

Shelby Barnes Yes, I would say that’s exactly what we found within our audit.

Tom Temin We’re speaking with Shelby Barnes and Chris Hilton. They are project managers in the Office of Inspector General at the Defense Department. And did you find that they’re putting sufficient resources against this modernization effort? And is it in the right place? That is, is it a CIO project? Is it a CFO project, or is it across different boundaries?

Chris Hilton I would say they are definitely putting a lot of resources in the area. I think our audit found that there was approximately $4 billion they spent in 2022 on these financial management systems. And I think that’s one of the challenges we identified: you’re spending so much on these systems that aren’t going to get you where you want to go. And if you just kind of do things as swiftly as possible, like Shelby mentioned, that will get the department to a lot better place.

Tom Temin I mean, is there a strategy to say, take within one of the armed services, for example, or in something like the [Defense Information Systems Agency (DISA)], which is a large component agency, and just consolidate within that component, which would maybe eliminate dozens. And then try to get the Air Force and the Army and DISA together. I’m just making that up, but that idea.

Shelby Barnes There definitely are goals. Each of the ones you mentioned, like the Army, Navy and Air Force, they all have their own goals. The plans that we were looking at were for the entire DoD. So I think that what you’re speaking about definitely exists at that individual component level. Our review just determined, at the entire DoD level, was the plan detailed enough to get the department where it wants to go?

Chris Hilton I would also add to that that there are significant initiatives there to move the department in the right direction, and there are indications that they’re doing so. I know, for example, the U.S. Marine Corps transitioned to a modern ERP in an effort to attain a clean audit opinion. So there is definitely traction there. I think one of the biggest things, talking about it being a CIO challenge or a CFO challenge or a military department challenge, is that it’s really a team effort. And this is one thing that Mr. Stephens, the deputy chief financial officer, has really focused on. This is a team effort within DoD. DoD is not going to get across the finish line without everyone pushing in the same direction. So that’s one thing that has been a laser focus of the department. It’s really like, this is a team effort, both horizontally across CIO and CFO, but also vertically down to the components and up to DoD.

Tom Temin And what were your major recommendations then?

Shelby Barnes So one of the most significant recommendations that we made was for the department to create a strategy where it basically determines, for all of its systems, whether it’s going to update a system or retire and stop using it. Essentially, we believe that the strategy is important because the DoD really needs to wrap its arms around what it has now, and it needs to determine what’s going to remain and get those systems updated so that they can start producing good and reliable data.

Tom Temin And these financial systems, are these a subset of the business systems that comprise the DoD? Because they’ve had several runs at business system modernizations over the years, at least the 20 years I’ve been looking at it closely. There have been several gambits to try to get their arms around the business systems. Financial systems, a subset here?

Chris Hilton Yeah, there are actually approximately 4,600 DoD IT systems, and only about 5% of them currently fall in the category of financial management systems. So it’s actually quite a small subset of the bigger DoD system environment. And obviously, DoD trying to get its arms around that environment is needed to produce good financial data and hopefully obtain an audit opinion.

Tom Temin And in general, on the plan they have, which doesn’t have the detail that you feel they do need, but their plan to 2028: is this basically an in-house effort, or do they have integrator support and programmer contractor support?

Chris Hilton It’s kind of a mixed bag. I mean, obviously there’s a lot of contractor support in this effort. So it is diverse, I guess, in how they’re addressing the issue.

Tom Temin All right. And would you say that this is an urgent set of recommendations, this audit and this publication?

Shelby Barnes I would say yes. We feel that this audit report and this recommendation is really imperative. We know that the DoD is working very hard and putting a lot of resources toward modernizing its systems. But we feel that some of the recommendations within this report are really going to put the department on the right track to modernize its system environment, maybe quicker, and that has a direct impact on so many things operationally, and then also the financial statement audit.

Tom Temin And your memorandum went to the secretary, the deputy secretary, the undersecretary, the comptroller, the CIO, the auditors, and so on of the different armed services. They know they’ve got a problem, fair to say.

Chris Hilton That’s fair to say.

Tom Temin And did they generally concur with your recommendations?

Chris Hilton Yes. Actually, we had 31 recommendations, quite a few. They concurred with all but one, and for the one that they didn’t concur with, we did ask for further comments. And I think we’re kind of headed in the right direction with that one as well. So they know it’s a problem.
That's one thing we did find during our audit was there's already a lot of efforts going forward. We're just making sure that they're best positioned to make maintain systems that produce good data, uses taxpayer dollars efficiently. And like Shelby said, obtain an audit opinion by 2028.nn<strong>Tom Temin <\/strong>And in the meantime, we could use a few years without continuing resolutions that might help.<\/blockquote>"}};

One reason the Defense Department can’t get to a clean financial audit has to do with its multiple and outdated financial management systems. The DoD does have a plan to modernize the systems, but the Office of Inspector General (OIG) finds a little trouble with how officials are going about it. For the latest, the Federal Drive with Tom Temin talked with OIG project managers Chris Hilton and Shelby Barnes.

Interview Transcript: 

Tom Temin And fair to say, this was an audit, not so much of DoD finances, but of the systems that make up the financial network there and of their plans to modernize it. That a good way to put it?

Shelby Barnes Yes. So I think that’s a great way to summarize what this audit was. We focused on the DoD financial systems specifically. We reviewed the systems that were subject to the Federal Financial Management Improvement Act. Essentially, this is a law that requires that systems capture data and record transactions properly. And the DoD has established goals to, as you said, modernize its systems environment and to update its systems or stop using some of its old systems by 2028. However, what we found in our audit was that goal wasn’t aggressive enough. And without a more modern systems environment, we found that the DoD will just continue to spend a lot of money on systems that don’t record those transactions properly.

Tom Temin And just to define the scope of this, it’s not just the Pentagon and the fourth estate agencies, but does this also include the armed forces and their often multiple financial systems?

Chris Hilton Yes. It definitely includes all of those systems and all those parts and pieces of the DoD. We looked at basically any plans related to maintaining the DoD’s IT system environment and how they impact the DoD financial statements. By the numbers, DoD’s IT environment contains over 400 systems and applications and over 2,000 interfaces. This complex environment contributes to many of the DoD’s challenges.

Tom Temin Right. And it’s not simply the multiplicity of them, but in some cases, the age of them and the fact that they can’t interoperate with one another in some cases. Fair to say?

Chris Hilton That is absolutely correct. I think some of the systems that the DoD still uses today are from the 1950s and 1960s and 1970s. Obviously, they weren’t necessarily always intended to produce financial statements. That’s a newer requirement. So those are some of the challenges that the department is dealing with.

Tom Temin Right. Because in the 1950s and 1960s, they could count the beans, so to speak, but they don’t meet what are considered contemporary standards for financial systems.

Chris Hilton Correct.

Shelby Barnes Yes. That’s correct.

Tom Temin Plus, there’s a certain cost in maintaining these old systems, and the multiplicity is a cost multiplier itself. Fair to say.

Chris Hilton That is fair to say. One of the highlights in our report is that the DoD maintains 37 purchasing systems throughout all its components and pieces. And obviously that presents challenges: if you have a challenge across 37 systems, you have to have 37 corrective actions. So that does present significant challenges for the department.

Tom Temin Right. And you mentioned they have 400 systems with 200 interfaces. So that’s even beyond the purchasing systems.

Chris Hilton 2,000 interfaces. I wish it was 200.

Tom Temin Yeah, I didn’t write the third one down on my sheet here. Ok, so we’ve got the full scope of that. And let’s talk about the scope of the plan. That is to say, what do they hope to do by 2028 at this point? What’s their vision for all of this?

Shelby Barnes Yeah. So that’s actually one of the things that we identified within our audit that wasn’t particularly clear. The DoD has multiple plans, all of which focus on a simplified systems environment. That is the department’s desire and that is the DoD’s goal. But what we found was that the plans didn’t clarify what systems the DoD plans to keep and what systems they plan to retire between now and 2028. And so that was one of the things that we highlighted within our report, that the DoD does need to clarify what systems it plans to update, to modernize, and which of those systems it needs to stop using. And we recommended that they stop using them as swiftly as possible.

Tom Temin Right. It sounds therefore like the plan is more of a guidance to a future vision than a detailed modernization plan.

Shelby Barnes Yes, I would say that’s exactly what we found within our audit.

Tom Temin We’re speaking with Shelby Barnes and Chris Hilton. They are project managers in the Office of Inspector General at the Defense Department. And did you find that they’re putting sufficient resources against this modernization effort? And is it in the right place? That is, is it a CIO project? Is it a CFO project, or is it across different boundaries?

Chris Hilton I would say they are definitely putting a lot of resources in the area. I think our audit found that there was approximately $4 billion spent in 2022 on these financial management systems. And I think that’s one of the challenges we identified, obviously, from the perspective of you’re spending so much on these systems that aren’t going to get you where you want to go in the current year. And if you just do things as swiftly as possible, like Shelby mentioned, that will get the department to a lot better place.

Tom Temin I mean, is there a strategy to say take within one of the armed services, for example, or in something like [Defense Information Systems Agency (DISA)], which is a large component agency, and just consolidate within that piece that component, which would maybe eliminate dozens. And then try to get the Air Force and the Army and DISA together. I’m just making that up, but that idea.

Shelby Barnes There definitely are goals. Each of the ones you mentioned, like the Army, Navy and Air Force, they all have their own goals. The plans that we were looking at were for the entire DoD. So I think that what you’re speaking about definitely exists at that individual component level. Our review just determined, at the entire DoD level, was the plan detailed enough to get the department where it wants to go?

Chris Hilton I would also add to that that there are significant initiatives there to move the department in the right direction, and there are indications that they’re doing so. I know, for example, the U.S. Marine Corps transitioned to a modern ERP in an effort to attain a clean audit opinion. So there is definitely traction there. I think one of the biggest things, talking about it being a CIO challenge or a CFO challenge or a military department challenge, is that it’s really a team effort. And this is one thing that Mr. Stephens, the deputy chief financial officer, has really focused on. This is a team effort within DoD. DoD is not going to get across the finish line without everyone pushing in the same direction. So that’s one thing that has been a laser focus of the department. It’s really like this is a team effort, both horizontally across CIO and CFO, but also vertically down to the components and up to DoD.

Tom Temin And what were your major recommendations then?

Shelby Barnes So one of the most significant recommendations that we made was for the department to create a strategy where it basically determines, for all of its systems, whether it is going to update a system or retire and stop using it. Essentially, we believe that the strategy is important because the DoD really needs to wrap its arms around what it has now, determine what’s going to remain, and get those systems updated so that they can start producing good and reliable data.

Tom Temin And these financial systems, are these a subset of the business systems that comprise the DoD? Because they’ve had several runs at business system modernizations over the years, at least the 20 years I’ve been looking at it closely. There have been several gambits to try to get at the business systems. Are the financial systems a subset here?

Chris Hilton Yeah, there are actually approximately 4,600 DoD IT systems, and only about 5% of them currently fall in the category of financial management systems. So it’s actually quite a small subset of the bigger DoD system environment. And DoD trying to get its arms around that environment is needed, obviously, to produce good financial data and hopefully obtain an audit opinion.
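As a rough check of the figures quoted above, a quick calculation (using the speakers’ approximate numbers, not exact counts from the report):

```python
# Back-of-the-envelope arithmetic on the figures quoted above: about 5%
# of roughly 4,600 DoD IT systems fall in the financial management
# category. Both inputs are approximations, as the speakers note.
total_it_systems = 4600
financial_share = 0.05

financial_systems = round(total_it_systems * financial_share)
print(financial_systems)  # about 230 financial management systems
```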

Tom Temin And in general, on the plan they have, which doesn’t have the detail that you feel it needs, their plan to 2028: is this basically an in-house effort, or do they have integrator support and programmer contractor support?

Chris Hilton It’s kind of a mixed bag. I mean, obviously there’s a lot of contractor support in this effort. So it is diverse I guess, in how they’re addressing the issue.

Tom Temin All right. And would you say that this is an urgent set of recommendations, this audit and this publication?

Shelby Barnes I would say yes. We feel that this audit report and its recommendations are really imperative. We know that the DoD is working very hard and putting a lot of resources towards modernizing its systems. But we feel that some of the recommendations within this report are really going to put the department on the right track to modernize its system environment, maybe quicker, and that has a direct impact on so many things operationally. And then also the financial statement office.

Tom Temin And your memorandum went to the secretary, the deputy secretary, the undersecretary, the Comptroller, the CIO, the auditors, and so on of the different armed services. They know they’ve got a problem, fair to say?

Chris Hilton That’s fair to say.

Tom Temin And did they generally concur with your recommendations?

Chris Hilton Yes. Actually, we had 31 recommendations, quite a few. They concurred with all but one, and for the one that they didn’t concur with, we did ask for further comments. And I think we’re headed in the right direction with that one as well. So they know it’s a problem. One thing we did find during our audit was that there are already a lot of efforts going forward. We’re just making sure that they’re best positioned to maintain systems that produce good data, use taxpayer dollars efficiently and, like Shelby said, obtain an audit opinion by 2028.

Tom Temin And in the meantime, we could use a few years without continuing resolutions. That might help.

The post For DoD financial management systems, a not-so-pretty picture first appeared on Federal News Network.

HHS takes step toward goal for better health information sharing (Tue, 27 Feb 2024)

Interview Transcript:

Tom Temin And there seem to be two parallel efforts that have been going on for some time. One is about the data. One is about getting more institutions to use electronic health records, which has been partially successful. But tell us more about TEFCA, what it is and what’s going on with it.

Micky Tripathi So let’s just break that down a little bit. First, about getting people to use electronic health record systems: we’ve actually had remarkable success over the last decade, owing to a lot of federal support in the way of incentives to provider organizations, as well as a lot of private contribution and sweat equity from physicians and those adopting systems. Now, 97% of hospitals and about 80% of physician offices across the country use a certified electronic health record. So not just any old electronic health record system, but an electronic health record system that’s certified by my office, the National Coordinator’s office. So we don’t actually have a big problem with respect to adoption of electronic health records among hospitals and physician offices. We’ve done a tremendous amount of work over a dozen years on the public and private side to get that in place. But what we’re trying to do now is make it as easy as possible for those systems to share information with each other in the best interest of patient care.

Tom Temin Got it. And that’s where the data interoperability piece comes in. And so TEFCA is all about the data. Fair to say?

Micky Tripathi It’s all about sharing data among those systems in a safe, reliable, accurate, privacy-protecting way.

Tom Temin Is the challenge there for getting the systems to maybe update or alter in such a way that the data becomes more interoperable? That is to say, if your gastroenterologist has one system and your eye doctor has another, why those two would ever need to mix, I don’t know. But the idea is that one practitioner could see what’s going on with another, again, at the micro level. And also, I guess for the research community, having interoperable data from multiple sources of systems would be really important.

Micky Tripathi Yeah. And certainly one part of it is making sure that the data is compatible, so that if I get information from another practice, from another provider, I actually can make use of it and not have to go through all sorts of expense and heroic efforts on my side to figure out what that data is. Because the minute you do that, we know that people will do what everyone does, and what you and I do in our regular lives, which is: well, I got it electronically, but it’s too hard to figure out, so just send me a fax, or let me just pick up the phone, or can you just mail it to me, it’s a lot easier. So what you need to do is ask, how do you make this electronic mode easier for people than the existing ways of doing it? So one part is the data itself, and I’m happy to report that’s a big part of what we’ve done with the electronic health records. As a part of those electronic health records that, as I said, cover the vast majority of hospitals and most physician offices, they’re required to support a minimum data set standard that we call the U.S. Core Data for Interoperability, USCDI. And that’s a minimum set of standardized data that covers most of the data you would think of, Tom, even though I assume you’re not a physician. But if you were, off the top of your head, going to say, what information do I think my doctor would want to have? Well, it’s your problems, your allergies, your medications, your lab reports, your results of imaging. That’s what’s in that data set. So that we’ve been able to accomplish. It’s absolutely not perfect, but there’s a lot of commonality there. So whether you’re in Nome, Alaska, or in Sarasota, Florida, you can have a pretty good expectation that the data you’re going to get out of an EHR system is roughly compatible. Again, not perfect, but pretty good.

Micky Tripathi The challenge is how do you connect up those systems, so that when I ask you for a record, let’s say electronically, I know you are who you claim to be? How do I know that you’re not Joe’s hacking shop trying to hack into medical records and then sell them on the dark web, and that you’re actually authorized to have that information? So there’s a difference between saying, well, you are a physician office, but how do I know that you actually see that patient? Because if I give that information to you and you actually don’t have a treatment relationship with that patient, that’s a violation of privacy from an ethical perspective, and it also could be a violation of federal as well as state law. So what these networks do is provide that overlay of governance and technical and policy requirements that gives everyone assurance that everyone on this network is a responsible actor, and if they don’t act responsibly there will be penalties and sanctions associated with it.

Tom Temin We are speaking with Dr. Micky Tripathi. He is National Coordinator for Health Information Technology at the Department of Health and Human Services. And who are the parties to TEFCA? I imagine the federal government is more of a convener, but also a party to it.

Micky Tripathi Yeah, and I think you said that right. The federal government is a convener right now. The direction that we got from the 21st Century Cures Act of 2016 was for ONC, my office, the Office of the National Coordinator, to help develop a nationwide network-of-networks interoperability model. And what does that mean, why do we say network of networks? The analogy I like to use is to think about the way cell phone networks, or ATM networks for that matter, work today in the market. Let’s take cell phones, because everyone’s very familiar with those. You’ve got AT&T and Verizon and T-Mobile and Sprint, and all of those are actually private commercial networks. But they are connected on the back end via network-like governance, technical specifications and expectations about how they exchange information, in a way that you and I experience as a single network. We don’t worry about: well, Tom, you bought an AT&T phone, I bought a Verizon phone, we’re not going to be able to talk with each other. We never worry about that. We go to Best Buy, or wherever you go, and you buy the cheapest, best phone for your needs, and you know it’s going to be connected with everything else. Right now in health care, we have hundreds of networks, literally. Some of them are state and local networks, some of them are nationwide networks, but they really don’t connect with each other. And what we want to be able to do, and the direction we got from Congress, was basically, not to put words in their mouth, to do for these clinical networks what cell phone networks have today: you purchase the system you want, you join the network you want, and you’ll have the assurance that you will be safely connected to every other network, and you don’t need to worry about that anymore.

Tom Temin And is the network technically encrypted, a VPN type of arrangement traveling over the internet, or are there still actual networks, like the value-added networks we used to have that predated the internet?

Micky Tripathi In the modern age, everyone has commodity internet. So basically a network is about establishing governance, and then establishing security protocols and technical infrastructure, like public key infrastructure, for example, to define what the network is. If you’re a part of that PKI infrastructure, using X.509 certs and all of that regular infrastructure, then you are a part of our network, and there are rules about who’s in and who’s out and what the rules of the road are. We’re not laying down T1 and T3 lines anymore. We can just use the commodity internet, but there is a security overlay, so only those who are a part of the network can actually exchange information with each other, in the same way that your banking information is highly protected even though you’re using commodity internet. There’s no special line between you and the bank; you’ve just got additional security provisions on top. We use the same set of security protocols for TEFCA, for this kind of network exchange, as well. But again, it’s a network of networks. The networks are already established networks. That’s the principle: we’re not starting from scratch to build these from the ground up. We’re saying these are networks that already have a significant number of participants, and they have to meet certain eligibility requirements, as well as technical performance requirements, to be considered a TEFCA network. And once they pass those tests, then they’re able to go live and connect with each other.

Tom Temin And first responders, that whole community, often generate the initial information on health when they respond to someone who might be injured or burned or whatever the case might be. Are those party to TEFCA also?

Micky Tripathi That’s a great example, actually, of some of the gaps that exist in the marketplace today, and that we want to be able to use TEFCA to help fill. As I said, there are literally hundreds of networks across the country that exchange information, and the private sector has actually done a fantastic job. Before joining the federal government in 2021, I was very much a part of that, sitting on the boards of some of these nonprofit networks, and so I saw firsthand how much they had accomplished. But the private sector can’t do it alone, because, as you pointed out earlier, the federal government and state governments are very involved in health care. They deliver health care, they pay for health care, they set the rules of the road for health care. So it’s very hard for the private sector on its own to solve all these problems. And that’s what TEFCA represents: really saying, all right, the private sector has taken this as far as it possibly can, it’s done a great job, but now we need public-private collaboration, with the power of federal government convening, to help bring that together and say, what are the other things we want to do?

Micky Tripathi One is what I described, which is connecting the networks together. The second side of things is that there are gaps the market itself hasn’t really solved and has difficulty solving. One, I should point out, is first responders. We’re actually working with a group of first responders who are already working on joining one of the approved networks, so that we do have the ability for ambulances and other first responders to share information with provider organizations. Another gap I would point to is public health, a huge gap. Right now, even after a pandemic, public health agencies were not able to connect to the nationwide networks that exist today, for a variety of reasons related to the complexity of regulatory frameworks and the fragmentation of jurisdictions and all of that. That’s nothing the private sector can solve on its own; that’s something for the federal government, ONC and the CDC, and jurisdictions working together. Then two more things I’ll point to. One is individual access. You as a patient ought to be able to access the network to get your own information. That seems like it ought to be fundamental. So we’re working very hard to say that’s what TEFCA needs to be able to support as well. There’s lots of complexity there, and that’s why, again, this public-private collaboration is needed. And then finally, payers. Health care payers have been excluded from these networks for a variety of competitive reasons that exist in the market. Again, we as the federal government have been saying: we understand there are competitive issues, but that can’t be what prevents us from getting to the higher level of health care interoperability that American citizens need. And so we’re going to break through that to fill that gap as well.

Tom Temin And so in a lot of ways, the banking and credit card systems and clearances, with a whole complexity behind all of that, or say the airlines, which have inter-airline clearance mechanisms and payment mechanisms going back decades, those are pretty good models too.

Micky Tripathi They are, absolutely. There are a lot of similarities there. I think one of the differences, and why we need more of this proactive public-private collaboration here, is that the federal government plays a unique and very large role in health care that’s somewhat different than in other industries. It’s involved in all those industries, so it’s not as if it’s not. And the other thing about health care is that it’s unbelievably fragmented, much more so than banking or airlines, which have a lot of consolidation to them. Health care is unbelievably fragmented. And so it takes something like the federal government to help convene everyone, to say, all right, we’re going to get everyone together and work with the states as well, to say we’re not stepping on states’ toes, but we need to have something that gives more system to our system. That’s the important role that the federal government plays in this.

Tom Temin And late last year, TEFCA updated from 1.0 to 1.1, which indicates the relative newness of the whole enterprise. But what changed recently?

Micky Tripathi 2.0 is just about to happen. So, not to get too wonky here, let me set the table so everyone knows what’s happened. When we came into this administration, we said that within a year we were going to get the TEFCA framework out for the public to react to and provide us comments back, with an eye towards going live. We were going to have networks step forward and say they want to do this voluntarily, because the 21st Century Cures Act didn’t give the federal government, my office, any budget or any new authority for TEFCA, and explicitly said that TEFCA has to be voluntary. So I have no ability to order anyone to join TEFCA, nor does Secretary Becerra. We have to make it a true public-private collaboration model that says: how can we work together to get to something that all of us want, and that the private sector sees as valuable? Otherwise they’re not going to invest their money. So within a year, we made available version 1.0 of this common agreement, which is a common contract that everyone across the country would sign if they want to participate in TEFCA exchange, so everyone knows the rules of the road. Again, whether you’re in Nome, Alaska, or Omaha, you know that if you have signed this agreement and you’re sharing information with a provider organization in Nome, Alaska, they’ve agreed to the same set of rules. You don’t have to worry: is there a different set of rules here that I don’t understand that are going to get me in trouble? So the next thing we did was invite private sector networks to step forward and join TEFCA as networks. And I’m really pleased to report that seven of them stepped forward a year ago and said they are committed to implementing TEFCA. Some of those are very well known, I think, to a lot of people. Epic, for example, a very large EHR vendor, stepped forward and volunteered to be one of these networks. The CommonWell Health Alliance, which covers Oracle Health, which is the VA’s system.

Tom Temin That’s the obvious one I was going to ask about.

Micky Tripathi Athenahealth, eClinicalWorks and Meditech are all under the CommonWell Health Alliance umbrella. eHealth Exchange has a number of federal government participants; the VA, for example, participates in that, as well as others. So significant networks stepped forward, those seven, and now, a year later, as of January, all seven of those are live, exchanging information with each other. And then 2.0. What the 2.0 common agreement does, which we’re going to release before the end of the first quarter here, the end of March, is upgrade the technical standards to allow API-based exchange, for those who are technically knowledgeable, which is a more modern way of exchanging information, in the same way that you download apps on your phone, to make it that easy and that convenient. That’s what TEFCA will support in this calendar year.

One of the biggest obstacles to streamlining information sharing in the health field is the data itself. For decades, various health information systems simply have not been compatible with one another. That makes things slower and less efficient for patients, health care practitioners, and the industry itself. Recently, Health and Human Services (HHS) updated something known as TEFCA, the Trusted Exchange Framework and Common Agreement. TEFCA is all about interoperability of health information. For the details, the Federal Drive with Tom Temin spoke with Dr. Micky Tripathi, the National Coordinator for Health Information Technology at HHS.

Interview Transcript: 

Tom Temin And there seem to be two parallel efforts that have been going on for some time. One is about the data. One is about getting more institutions to use electronic health records, which has been partially successful. But tell us more about TEFCA: what it is and what’s going on with it.

Micky Tripathi So let’s just break that down a little bit. First, about getting people to use electronic health record systems. We’ve actually had remarkable success over the last decade, owing to a lot of federal support in the way of incentives to provider organizations, as well as a lot of private contribution and sweat equity from physicians and those adopting systems. Now, 97% of hospitals and about 80% of physician offices across the country use a certified electronic health record. So not just any old electronic health record system, but an electronic health record system that’s certified by my office, the National Coordinator’s office. So we don’t actually have a big problem with respect to adoption of electronic health records among hospitals and physician offices. We’ve done a tremendous amount of work over a dozen years on the public and private side to get that in place. But what we’re trying to do now is make it as easy as possible for those systems to share information with each other in the best interest of patient care.

Tom Temin Got it. And that’s where the data interoperability piece comes in. And so TEFCA is all about the data. Fair to say?

Micky Tripathi It’s all about sharing data among those systems in a safe, reliable, accurate, privacy protecting way.

Tom Temin Is the challenge there getting the systems to update or alter in such a way that the data becomes more interoperable? That is to say, if your gastroenterologist has one system and your eye doctor has another, why those two would ever need to mix, I don’t know. But the idea is that one practitioner could see what’s going on with another, again at the micro level. And also, I guess, for the research community, having interoperable data from multiple sources and systems would be really important.

Micky Tripathi Yeah. And certainly one part of it is making sure that the data is compatible, so that if I get information from another practice, from another provider, I actually can make use of it and not have to go through all sorts of expense and heroic efforts on my side to figure out what that data is. Because the minute you do that, we know that people will do what everyone does, what you and I do in our regular lives, which is: well, I got it electronically, but it’s too hard to figure out, so just send me a fax, or let me just pick up the phone, or can you just mail it to me, it’s a lot easier. So what you need to do is say, how do you make this electronic mode easier for people than the existing ways of doing it? So one part is the data itself, and I’m happy to report that’s a big part of what we’ve done with the electronic health records. As part of those electronic health records that, as I said, cover the vast majority of hospitals and most physician offices, they’re required to support a minimum data set standard that we call the U.S. Core Data for Interoperability, or USCDI. And that’s like a minimum data set of standardized data that covers most of the data you would think of, Tom, even though I assume you’re not a physician. But if you were, off the top of your head, going to say, what information do I think my doctor would want to have? Well, it’s your problems, your allergies, your medications, your lab reports, your results of imaging. That’s what’s in that dataset. So that we’ve been able to accomplish. It’s absolutely not perfect, but there’s a lot of commonality there. So if you’re in Nome, Alaska, or in Sarasota, Florida, you can have a pretty good expectation that the data you’re going to get out of an EHR system is roughly compatible. Again, not perfect, but pretty good.
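The minimum-data-set idea Tripathi describes can be sketched in a few lines. This is an illustrative toy, not the official USCDI specification; the data class names below are a simplified, hypothetical subset of the categories he lists.

```python
# Illustrative sketch of a minimum-data-set check in the spirit of USCDI.
# The data class names are a simplified, hypothetical subset -- not the
# official USCDI specification.

REQUIRED_DATA_CLASSES = {
    "problems", "allergies", "medications", "lab_results", "imaging_results",
}

def missing_data_classes(record: dict) -> set:
    """Return the required data classes absent from an exported record."""
    return REQUIRED_DATA_CLASSES - record.keys()

record = {
    "problems": ["hypertension"],
    "allergies": ["penicillin"],
    "medications": ["lisinopril 10mg"],
    "lab_results": [{"test": "A1c", "value": 6.1}],
}

print(missing_data_classes(record))  # this record lacks imaging results
```

A receiving system can run a check like this before importing, rather than falling back to the fax machine when a record arrives incomplete.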

Micky Tripathi The challenge is how do you connect up those systems, so that when I ask you for a record, let’s say electronically, I know you are who you claim to be? How do I know that you’re not Joe’s hacking shop trying to hack into medical records and then sell them on the dark web, and that you’re actually authorized to have that information? So there’s a difference between saying, well, you are a physician office, but how do I know that you actually see that patient? Because if I give that information to you and you actually don’t have a treatment relationship with that patient, that’s a violation of privacy from an ethical perspective, and it also could be a violation of national as well as state law. So that’s what these networks do: provide that overlay of governance and technical and policy requirements that gives everyone assurance that everyone on this network is a responsible actor, and if they don’t act responsibly there’ll be penalties and sanctions associated with it.

Tom Temin We are speaking with Dr. Micky Tripathi. He is the National Coordinator for Health Information Technology at the Department of Health and Human Services. And who are the parties to TEFCA? I imagine the federal government is more of a convener, but also a party to it.

Micky Tripathi Yeah. And I think you said that right. The federal government is a convener right now. So the direction that we got from the 21st Century Cures Act of 2016 was for ONC, my office, the Office of the National Coordinator, to help develop a nationwide network-of-networks interoperability model. And what does that mean, why do we say network of networks? The analogy I like to use is to think about the way cell phone networks, or ATM networks for that matter, work today in the market. Let’s take cell phones, because everyone’s very familiar with those. You’ve got AT&T and Verizon and T-Mobile and Sprint, and all of those are actually private networks. If you think about it, they’re private commercial networks, but they are connected on the back end via network-like governance, technical specifications, and expectations about how they exchange information, in a way that you and I have the experience of it being a single network. We don’t worry about, well, Tom, you bought an AT&T phone, I bought a Verizon phone, we’re not gonna be able to talk with each other. We never worry about that. We go to Best Buy, or wherever you go, and you buy the cheapest, best phone for your needs, and you know it’s going to be connected with everything else. Right now in health care, we have hundreds of networks, literally. Some of them are state and local networks. Some of them are nationwide networks, but they really don’t connect with each other. And what we want to be able to do, and the direction we got from Congress was basically, not to put words in their mouth, but basically: do for these clinical networks what cell phone networks have today, so that you purchase the system you want, you join the network you want, and you’ll have the assurance that you will be safely connected to every other network. And you don’t need to worry about that anymore.

Tom Temin And is the network technically encrypted, a VPN type of arrangement traveling over the internet, or are there still actual dedicated networks, like the value-added networks we used to have that predated the internet?

Micky Tripathi Yeah, in the modern age everyone has commodity internet. So basically a network is about establishing governance, and then establishing security protocols and technical infrastructure, like public key infrastructure, for example, to define what the network is. If you’re a part of that PKI infrastructure, using X.509 certs and all of that regular infrastructure, then you are now a part of our network, and there are rules about who’s in and who’s out and what the rules of the road are. We’re not laying down T1, T3 lines anymore. We can just use the commodity internet, but there is a security overlay. So only those who are a part of the network can actually exchange information with each other, in the same way that your banking information is highly protected even though you’re using commodity internet. There’s no special line between you and the bank. You’ve just got additional security provisions on top. We use the same set of security protocols for TEFCA, this kind of network exchange, as well. But again, it’s a network of networks. The networks are already established networks. I mean, that’s the principle, that we’re not starting from scratch to build these from the ground up. We’re saying these are networks that already have a significant number of participants, and they have to meet certain eligibility requirements as well as technical performance requirements to be considered a TEFCA network. And once they pass those tests, then they’re able to go live and connect with each other.
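The security overlay Tripathi describes is, in practice, mutual TLS over a PKI: both sides present X.509 certificates issued under the network's trust chain. A minimal sketch of the server-side configuration using Python's standard `ssl` module follows; the file paths are placeholders, and a real participant would load the network's actual CA bundle and its issued certificate.

```python
# Sketch of the kind of security overlay described above: mutual TLS,
# where BOTH peers present X.509 certificates, so only credentialed
# network participants can connect over the commodity internet.
import ssl

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.verify_mode = ssl.CERT_REQUIRED          # reject peers without a client cert
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # no legacy protocol versions

# In a real participant node (placeholder paths, not loaded here):
# ctx.load_cert_chain("participant.crt", "participant.key")
# ctx.load_verify_locations("network_ca.pem")  # trust only the network's CA

print(ctx.verify_mode)
```

The key design point is `CERT_REQUIRED`: unlike ordinary web TLS, where only the server proves its identity, every requester on the network must prove membership before any record moves.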

Tom Temin And first responders. And that whole community often generates the initial information on health when they respond to someone who might be injured or burned or whatever the case might be. Are those party to TEFCA also?

Micky Tripathi That’s a great general example, actually, of some of the gaps that exist in the marketplace today, and that we want to be able to use TEFCA to help fill. So, as I said, there are a number of networks now. There are literally hundreds of networks across the country that exchange information. And the private sector’s actually done a fantastic job. Before joining the federal government in 2021, I was very much a part of that, sitting on the boards of some of these nonprofit networks. And so I saw firsthand how much they had accomplished. But the private sector can’t do it alone because, as you pointed out earlier, the federal government and state governments are very involved in health care. They deliver health care, they pay for health care, they set the rules of the road for health care. So it’s very hard for the private sector on its own to solve all these problems. And so that’s what TEFCA represents, really saying: all right, the private sector has taken this as far as it possibly can, and it’s done a great job. But now we need public private collaboration, with the power of federal government convening, to help bring that together, to say, what are the other things we want to do?

Micky Tripathi One is what I described, which is connecting the networks together. The second side of things is that there are gaps the market itself hasn’t really solved and has difficulty solving. One is, I should point out, first responders. So we’re actually working with a group of first responders who are already working on joining one of the approved networks, so that we do have the ability for ambulances and other first responders to share information with provider organizations. Other gaps that I would point to, though: public health, a huge gap. Right now, even after a pandemic, public health agencies were not able to connect to the nationwide networks that exist today, for a variety of reasons related to the complexity of regulatory frameworks and the fragmentation of jurisdictions. Nothing the private sector can solve on its own; that’s something that takes the federal government, ONC and the CDC, and jurisdictions working together. Last thing I’ll point to, or two things. Another one is individual access. You as a patient ought to be able to access the network to get your own information. That seems like it ought to be fundamental. So we’re working very hard to say that’s what TEFCA needs to be able to support as well. There’s lots of complexity there, and that’s why, again, this public private collaboration is needed. And then finally, payers. Health care payers have been excluded from these networks for a variety of competitive reasons that exist in the market. Again, we as the federal government have been saying, you know what, we understand there are competitive issues, but that can’t be what prevents us from getting to the higher level of health care interoperability that American citizens need. And so that’s what we’re going to do: break through that to fill that gap as well.

Tom Temin And so in a lot of ways, the banking and credit card systems and clearances and there’s a whole complexity behind all of that. Or say the airlines have inter airline clearance mechanisms and payment mechanisms going back decades. Those are pretty good models too.

Micky Tripathi They are, absolutely. There’s a lot of similarities there. I think one of the differences, and why we need more of this proactive public, private sort of collaboration here, is that, unlike those other industries... well, the federal government is involved in all those industries, so it’s not as if it’s not. But the federal government plays a unique and very large role in health care that’s somewhat different than in other industries. And the other thing about health care is that it’s unbelievably fragmented, much more so than banking or airlines, which have a lot of consolidation to them. Health care is unbelievably fragmented. And so it takes something like the federal government to help convene everyone, to say, all right, we’re going to get everyone together and work with states as well, we’re not stepping on states’ toes, but we need to have something that gives more system to our system. That’s the important role that the federal government plays in this.

Tom Temin And late last year, the TEFCA group updated from 1.0 to 1.1, which indicates the relative newness of the whole enterprise here in TEFCA. But what changed recently?

Micky Tripathi 2.0 is just about to happen. So, not to get too wonky here, let me just set the table so everyone knows what’s happened. When we came into this administration, we said within a year we are going to get the TEFCA framework out to the public for the public to react to and provide us comments back, with an eye towards saying we’re going to go live. We’re going to have networks that step forward and say that they want to do this voluntarily, because the 21st Century Cures Act didn’t give the federal government, my office, any budget or any new authority for TEFCA, and explicitly said that TEFCA has to be voluntary. So I have no ability to order, nor does Secretary Becerra have the ability to order, anyone to join TEFCA. So we have to make it a true public private collaboration model that says, how can we work together to get to something that all of us want and that the private sector sees as valuable? Otherwise they’re not going to invest their money. So within a year, we made available version 1.0 of this common agreement, which is a common contract that everyone across the country would sign if they want to participate in TEFCA exchange. So everyone knows the rules of the road. Again, if you’re in Omaha, you know that if you have signed this agreement and you’re sharing information with a provider organization in Nome, Alaska, they’ve agreed to the same set of rules. You don’t have to worry about, is there a different set of rules here that I don’t understand that are going to get me in trouble? So the next thing we did is we said, now we invite private sector networks to step forward and join TEFCA as networks. And I’m really pleased to report that seven of them stepped forward a year ago and said we are committed to implementing TEFCA. Some of those are very well known, I think, to a lot of people. Epic, for example, a very large EHR vendor; they stepped forward and volunteered to be one of these networks. The CommonWell Health Alliance, which covers Oracle Health, which is the VA’s system.

Tom Temin That’s the obvious one I was going to ask about.

Micky Tripathi Athenahealth, eClinicalWorks, Meditech are all under the CommonWell Health Alliance umbrella. eHealth Exchange, which has a number of federal government participants; the VA, for example, participates in that as well as others. So significant networks stepped forward, those seven, and now a year later, as of January, all seven are live, exchanging information with each other. And then 2.0. What the 2.0 common agreement does, which we’re going to release before the end of the first quarter here, the end of March, is upgrade the technical standards to allow API-based exchange, for those who are technically knowledgeable, which is a more modern way of exchanging information, in the same way that you download apps on your phone; make it that easy and that convenient. That’s what TEFCA will support in this calendar year.
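In health IT, API-based exchange generally means HL7 FHIR-style REST calls: a search request against a resource endpoint, answered with a JSON "Bundle." A hedged sketch of that shape follows; the endpoint and patient ID are hypothetical, and the response is a canned stand-in so no network call is made.

```python
# Sketch of API-based exchange in the FHIR style: a RESTful search
# request plus a JSON "Bundle" response. The endpoint and patient ID
# are hypothetical; the Bundle is a canned stand-in for a live reply.
import json
from urllib.parse import urlencode

base = "https://example-network.example/fhir"   # hypothetical endpoint
query = urlencode({"patient": "12345", "category": "laboratory"})
request_url = f"{base}/Observation?{query}"

bundle = json.loads("""
{
  "resourceType": "Bundle",
  "type": "searchset",
  "entry": [
    {"resource": {"resourceType": "Observation",
                  "code": {"text": "Hemoglobin A1c"},
                  "valueQuantity": {"value": 6.1, "unit": "%"}}}
  ]
}
""")

# Pull the human-readable test names out of the search results.
results = [e["resource"]["code"]["text"] for e in bundle["entry"]]
print(request_url, results)
```

The contrast with older document-query exchange is that each data element arrives individually addressable, which is what makes the "download an app, pull your own record" experience possible.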

 

The post HHS takes step toward goal for better health information sharing first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/workforce/2024/02/hhh-takes-step-toward-goal-for-better-health-information-sharing/feed/ 0
3D printing might be saving naval manufacturing https://federalnewsnetwork.com/commentary/2024/02/3d-printing-might-be-saving-naval-manufacturing/ https://federalnewsnetwork.com/commentary/2024/02/3d-printing-might-be-saving-naval-manufacturing/#respond Fri, 23 Feb 2024 20:08:51 +0000 https://federalnewsnetwork.com/?p=4900158 While many may still think of 3D-printed objects as better suited for a toy store or a scientific lab, the ability to produce components using military-grade materials has taken shape in recent years, much to the Navy’s benefit.

The post 3D printing might be saving naval manufacturing first appeared on Federal News Network.

]]>
More than any other military branch, the Navy’s ability to conduct maritime operations depends largely on the readiness of its vessels, in particular the health of its submarines. Along with this comes the knowledge that, without the ability to make complex repairs or service these vessels periodically, projecting or expanding undersea power becomes nearly impossible.

Unfortunately, bottlenecks in the submarine industrial base have regularly reared their heads in recent years, largely because the production capabilities this industry requires are often hard to come by. This isn’t surprising when you take into account that the submarine industrial complex has shrunk by more than 70% since the 1980s, and that’s one reason the Navy has begun to look toward additive manufacturing, also known as 3D printing.

While many may still think of 3D-printed objects as better suited for a toy store or a scientific lab, the ability to produce components using military-grade materials has taken shape in recent years, much to the Navy’s benefit. Parts large and small can now be produced on demand, in some cases consolidating hundreds of parts together, but without full buy-in from decision-makers, the technology may not reach its full potential.

A new breed of resilience

Despite that contraction in the submarine industrial complex, the Navy plans to build two Virginia-class submarines and one of the much larger Columbia-class every year starting in FY 2026, effectively five times the work performed today (one Virginia-class per year). This represents an enormous commitment from the Defense Department, one that would be difficult to achieve using traditional approaches.

Recently, HII’s Newport News Shipbuilding division and General Dynamics Electric Boat sourced a component from a 3D printing company that they plan to integrate onto the Virginia-class fast-attack submarine USS Oklahoma.

Because the fleet’s original equipment manufacturers and integrators implemented copper-nickel and other marine-grade alloys, HII and GDEB created a deck drain made of those same alloys using additive manufacturing. This is one recent, salient example of how 3D printing affords the ability to quickly ramp up production capacity by fabricating, in high-performance structural alloys, the essential parts that keep vessels at sea for long periods of time.

Demands for relentless innovation

Between schedule delays due to the COVID-19 pandemic and supply chain related delays due to a lack of suppliers, the Navy and its manufacturers are seeing significant setbacks in their ability to deliver submarines on the cadence originally agreed upon. To make matters more complicated, the service says the high-priority Columbia-class submarines cannot fall behind schedule.

Additive manufacturing is being tested to fill that gap. In November 2023, the Navy sought to expand the supply chain for submarine parts by supporting companies explicitly interested in demonstrating capabilities in metal additive manufacturing. The Additive Manufacturing Center of Excellence has been described as the service’s only path to building the two submarine classes on time.

The Navy has already used additive manufacturing to print for several applications, usually small repair pieces needed for ships at sea: circuit covers, radio knobs and other items that would be difficult and expensive to access while deployed. However, this is a far more encompassing endeavor, one that seeks to produce parts of much greater scale for the most demanding structural applications. The advent of the Additive Manufacturing Center of Excellence itself is a great sign of what lies ahead for the marriage of this emerging technology and submarine-related applications.

It’s an equally good sign that the Navy has faith in these applications. The director of the Navy’s submarine industrial base program told reporters that metal additive manufacturing could increase capacity by 15-20% while improving quality and cutting production time by as much as 90%.

Securing dominance in manufacturing

Already, there have been breakthroughs that make additive manufacturing a more comprehensive solution for the Navy and other maritime production industries, most notably the ability to print using specific alloys tailored to specific vessels. Another vital breakthrough still required is the capability to test 3D-printed parts and structures in reference environments, ensuring a steady downward trend in faulty parts that could stall production at best and endanger servicemen and servicewomen at sea at worst.

Transformational innovations in additive manufacturing for these applications are being developed by emerging startups and the private sector, everything from artificially intelligent systems controlling intricate weld pool dynamics to significantly higher deposition rates capable of printing at scales far in excess of what’s been observed in traditional applications. Currently, additive manufacturing is being employed to bolster certain parts installed in naval submarines, though the Navy is poised to nurture the technology to ensure that more integral, structural components can also be manufactured additively in the near future.

As the Navy invests in additive manufacturing, more opportunities will likely emerge for similar industries to take advantage of these developments, such as additional alloys or additional 3D printing methods now in development. As demand continues to skyrocket and supply through more traditional approaches continues to wane, additive manufacturing will become an increasingly effective and immediate answer, one that can continue protecting nations through the capabilities of a new breed of advanced manufacturing.

Christian LaRosa is CEO and co-founder of Rosotics, an additive manufacturing firm that is rapidly developing the technology to print large-scale components.

The post 3D printing might be saving naval manufacturing first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/commentary/2024/02/3d-printing-might-be-saving-naval-manufacturing/feed/ 0
How agencies can boost productivity using tools they already have https://federalnewsnetwork.com/federal-insights/2024/02/how-agencies-can-boost-productivity-using-tools-they-already-have/ https://federalnewsnetwork.com/federal-insights/2024/02/how-agencies-can-boost-productivity-using-tools-they-already-have/#respond Tue, 20 Feb 2024 20:05:02 +0000 https://federalnewsnetwork.com/?p=4896183 Collaborations tools like Slack can boost productivity by serving as an entry point to streamlining business processes with built-in automation and AI.

The post How agencies can boost productivity using tools they already have first appeared on Federal News Network.

]]>
Federal agencies have been trying to figure out how to do more with fewer people and limited budgets for years now, and that likely won’t stop any time soon. That means they need to figure out how to maximize their productivity. One way to do that is by making work simpler; collaboration tools that proliferated during the pandemic have already begun that process, bringing people and systems together in secure, trusted environments that act as a single repository of knowledge. And with the ongoing push to integrate artificial intelligence into more tools, processes and missions, agencies have the opportunity to simplify the work even more.

“These repositories that tie together different applications and data create a wealth of information,” said Milena Talavera, senior vice president of engineering and infrastructure at Slack. “In the future, people can leverage AI to answer questions and summarize that information, as well as use recaps to catch up once they’re back from vacation. There’s a lot of added productivity for workers, both from the core functionality to future potential.”

Streamlining work

Comprehensive collaboration suites have the ability to meet workers wherever they are, from their desks at headquarters to their workspaces at home to out in the field. But at the moment, agencies are largely invested in discrete tools, and often don’t have the budgets or capability to switch wholesale. So they’re looking for ways to build a single entry point, because working across many tools adds complexity, which is the enemy of productivity.

For example, a federal contract manager needs to communicate both internally with their team and externally with the bidders. On top of that, contracts need to go through a series of iterations or approvals, many of which often take place in separate tools. This leads to what’s known as the “swivel chair effect,” meaning the employee is pivoting between tools and copying information manually, much the way they’d do physically in a swivel chair between two screens. Any time this happens, it increases the chances of data loss, mismatched versions, or any number of other human errors getting introduced into the process, to say nothing of the time lost.

“Tools like Slack can understand this problem and solve for it by enabling team communication, contractor collaboration and process automation within a single platform,” Talavera said. “Rather than jumping between applications, stakeholders can message, meet, co-edit documents, route deliverables for approval, and automate repetitive tasks all in one place.”

Applying automation

Slack’s application programming interface allows other tools to connect with Slack and seamlessly transfer data, making all of this possible. Not only does this remove the potential for human error in those data transfers, it also saves time and increases productivity by enabling automation across tools. Agencies need to ask themselves where the repeated tasks are to identify opportunities to apply automation. Approvals are a common scenario where different tools, like existing approval systems, can be connected and streamlined – or something entirely new can be added in their place. That leads to major productivity boosts, where various approvals can all be handled in the same common interface all at once. Agencies can also add automated triggers, where when a certain condition is met, the next task is automatically kicked off.
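The trigger pattern described here, where meeting a condition automatically kicks off the next task, can be sketched generically. This is a toy event dispatcher, not Slack's actual workflow API; the event and task names are hypothetical.

```python
# Minimal sketch of the automated-trigger pattern described above:
# when an event fires, every registered follow-on task runs.
# Generic illustration only -- not Slack's actual workflow API.

handlers = {}   # event name -> list of follow-on tasks
log = []        # record of what the automation did

def on(event):
    """Decorator: register a task to run when `event` occurs."""
    def register(task):
        handlers.setdefault(event, []).append(task)
        return task
    return register

def emit(event, **data):
    """Fire an event, kicking off each registered task in order."""
    for task in handlers.get(event, []):
        task(**data)

@on("contract_approved")
def notify_team(contract_id):
    log.append(f"posted approval notice for {contract_id}")

@on("contract_approved")
def route_to_signature(contract_id):
    log.append(f"routed {contract_id} for signature")

emit("contract_approved", contract_id="GS-0042")
print(log)
```

The point of the pattern is that the approval itself, the team notification and the signature routing no longer require a human to swivel between tools: one condition fires, and the downstream steps chain automatically.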

These tools also have a lot of potential as generative AI is more commonly explored and integrated into existing platforms. One early entry point Talavera mentioned is summarizing information. Collaboration tools like Slack often have significant amounts of information generated in their various channels. When a new employee is added to a working group, it can be time-consuming to play catch up on all that information. Generative AI can extract and summarize key points.

Another potential early opportunity to apply generative AI is augmenting existing search functions. Currently, to answer a question, you have to search for a keyword, and then manually sort through the keyword matches the search function returns until you find the answer. Generative AI can sort through all of those itself much more quickly, summarize them, and return an actual answer based on the data in the tool.
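The contrast between keyword search and a generated answer can be sketched in a few lines. The messages are invented, and the "answer" step is a trivial stand-in for a real generative model, which would synthesize rather than concatenate.

```python
# Contrast sketch: keyword search returns raw matches the user must sift
# through; a generative layer would instead return one synthesized answer.
# The summarization here is a trivial placeholder for a real model.

messages = [
    "The VPN cutover is scheduled for Friday at 5pm.",
    "Reminder: timesheets due Thursday.",
    "VPN credentials will be emailed after the cutover.",
]

def keyword_search(query: str) -> list:
    """Classic search: return every message containing the keyword."""
    return [m for m in messages if query.lower() in m.lower()]

def answer(query: str) -> str:
    """Stand-in for a generative answer built from the search hits."""
    hits = keyword_search(query)
    return " ".join(hits) if hits else "No relevant messages found."

print(keyword_search("vpn"))   # raw matches the user must read through
print(answer("vpn"))           # one combined response
```

Even this toy shows the productivity difference: the user asks once and reads one response, instead of opening each keyword hit in turn.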

Don’t recreate the wheel

Two major challenges agencies face as they work to integrate AI into their operations are their already strained budgets, and a lack of AI expertise in the federal workforce, largely due to difficult hiring processes and stiff competition from the private sector. With those limitations in mind, the fastest and most effective way agencies can begin seeing productivity increases from AI tools is as a consumer, rather than devoting their limited resources to building bespoke solutions that already have industry analogues.

“Federal agencies can benefit from out-of-the-box solutions without needing extensive technical expertise,” said Talavera. “The beauty is that a lot of these technology tools, including Slack, are working to build in AI capabilities, so agencies don’t have to recreate the wheel and build their own models.”

The post How agencies can boost productivity using tools they already have first appeared on Federal News Network.

]]>
https://federalnewsnetwork.com/federal-insights/2024/02/how-agencies-can-boost-productivity-using-tools-they-already-have/feed/ 0
The journey of HHS transforming to zero trust https://federalnewsnetwork.com/cybersecurity/2024/01/the-journey-of-hhs-transforming-to-a-zero-trust/ https://federalnewsnetwork.com/cybersecurity/2024/01/the-journey-of-hhs-transforming-to-a-zero-trust/#respond Tue, 30 Jan 2024 21:25:08 +0000 https://federalnewsnetwork.com/?p=4871581 HHS is moving toward a zero trust architecture, collecting information on where it may be vulnerable and refining its approach.

The post The journey of HHS transforming to zero trust first appeared on Federal News Network.

]]>

The Department of Health and Human Services is in the process of moving toward a zero trust architecture. Currently, the agency is collecting information on where it may be vulnerable and refining its strategy for this new approach to security.

HHS undertook an exercise to identify systems and all the facets within the zero trust model, a painstaking process for each system. HHS then used the information to create a maturity model that it applied to identify where it may be falling short.

“We want to make sure that there’s not built in security compromises that we can identify from manufacturing to implementation,” said La Monte Yarborough, the chief information security officer at HHS. “We want to ensure that we’re weighing against our legacy technologies to ensure that they’re still, while they remain in service, capable of being patched appropriately, being updated appropriately and they can handle transforming into a zero trust paradigm.”

He said for each system, HHS must measure and manage the risks to both cybersecurity and mission areas as they implement zero trust.

“Zero trust, in a lot of ways, is a compilation of discrete activities that were being done already in some facet or form. Now, the exercise is to try to discern or evaluate how mature those processes are, and figuring out a way by which you can coalesce them in a manner that you could reasonably define as a zero trust architecture,” Yarborough said on Federal Monthly Insights — Zero Trust. “We refer to the system model and that helps us from a roadmap perspective to evaluate and assess where we are individually, and then evaluate the types of resources that we need to apply to move the dial. We go over scar tissue, lessons learned, etc., and we get the opportunity to identify any particular pain points that we could help each other out in moving forward.”

Yarborough said once his team figures out where there are gaps in zero trust coverage, they can devise a plan based on the challenges, whether it’s technology, personnel, funding or all of the above.

“That data is helpful for us because then we can begin to have the conversations on where we’re going to get those resources, and we can make the respective appeals to those who could potentially provide us those resources,” he said.

Additionally, he said HHS can lean on industry, particularly its cloud service providers, to take advantage of their inherent capabilities.

“It requires a certain level of understanding of the behavior of the cloud service provider. It requires being very specific in contract language and service level agreements with respect to the behavior of the cloud service provider,” Yarborough said.

Yarborough said the challenge HHS faces when it comes to zero trust is as much a culture change as a technology problem.

“People who do the same thing every day within the cybersecurity sphere look at it through a particular lens. You have to truly think differently in some ways to fully appreciate this new concept. And that might require some external eyes to come in and help facilitate our direction,” he told the Federal Drive with Tom Temin. “I think it would be a good idea to make room for the possibility of, perhaps, leveraging external talent, who comes in without a particular bias to the environment that these operators have operated in for years, just to make sure that the perspective is accurate.”

Like with any new initiative, educating the workforce remains a key factor in successfully moving to zero trust.

“I think along this journey we’re going to have to expect or provide some level of training because I think zero trust as a paradigm is also a mindset. We’re accustomed to doing things in certain ways. We’re accustomed to having access in the ways that we have them now, and they may change somewhat at the end of this journey. So it’s not only the technical things that we implement. It’s not only the logical things that we implement, but it’s our mindset, our perspective,” Yarborough said. “It’s a constant, dynamic kind of activity. When we hear that term [zero trust], I try to disabuse people of this notion that we’re starting from scratch and trying to build toward something.”

HHS is also being cautious about how to approach zero trust for non-human entities. Yarborough said artificial intelligence and machine learning technologies are powerful, but his concern is leveraging the technology without violating copyrights and intellectual property rights.

“We’re doing our best to first appreciate the capacity of AI and ML technologies. So we’re trying to understand their capacity, their ability to help us do our work. And we’re still doing that discovery process or assessment on how available do we make these technologies — maybe not just to our technical folks, but to our everyday users,” Yarborough said. “We have to be mindful of what we can do to defend ourselves from those who will come after us, leveraging those technologies.”

 


]]>
https://federalnewsnetwork.com/cybersecurity/2024/01/the-journey-of-hhs-transforming-to-a-zero-trust/feed/ 0
Unlocking Zero Trust: Strategic approaches and cybersecurity modernization https://federalnewsnetwork.com/cme-event/federal-insights/unlocking-zero-trust-strategic-approaches-and-cybersecurity-modernization/ Tue, 30 Jan 2024 18:25:16 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4871196 How do AI, cloud computing and analytics fit into creating a successful zero trust strategy?

The post Unlocking Zero Trust: Strategic approaches and cybersecurity modernization first appeared on Federal News Network.

]]>
January was the two-year anniversary of the Office of Management and Budget’s zero trust strategy.

Over those 24 months, agencies faced, and had to overcome, many competing cyber and other priorities as they worked to ensure a resilient and trusted operating environment.

Many agencies started off their zero trust journeys with the “low-hanging fruit” of identity and access management. But now that they are in year three of this journey, agency cyber leaders must continue to mature their zero trust programs, take even more advantage of enterprise services and tools, and apply a more strategic approach to how they secure their networks, systems and data.

That strategic approach is more than just cyber; it’s a pathway to digital transformation and IT modernization. In fact, OMB expects the implementation of zero trust concepts to help agencies better envision their future technology infrastructure.

As the 2024 zero trust deadline quickly approaches, agencies still have the longest section of this marathon to push through.

Wayne Rodgers, the zero trust lead and senior cybersecurity manager at the Education Department, said the agency is getting ready to move into the next phase of its journey after spending much of the last 18 months transforming its cybersecurity infrastructure.

“Right on the horizon is software-defined wide area network (SD-WAN). That way we can extend our zero trust policy to users when they’re on-premise as well as things that are on-premise like printers and such, right there on site, that don’t necessarily have a secure access service edge (SASE) agent,” said Rodgers on the discussion Unlocking Zero Trust: Strategic Approaches and Cybersecurity Modernization. “We already have automated tenants in response to use cases and brought a lot more efficiency to our cybersecurity operations center (SOC) analysts. We also implemented a tier three zero trust program management office, which has been instrumental in updating policy when it comes to zero trust. They really were the programmatic execution arm of making sure that we could implement and deploy SASE in a very fast timeframe. We ended up being able to deploy SASE within the period of three months, and we were able to migrate all users and cut off legacy virtual private networks (VPN) two months after that.”

Education’s improved cyber defenses

Rodgers said Education also has deployed endpoint detection response (EDR) on all laptops and soon it will deploy the technology on all servers.

“We have seen threat hunt capabilities increase and our visibility analytics have increased,” he said. “I always give an example: We were able to find malware retroactively from a couple months prior to deploying that. So the visibility analytics have been great from EDR. We already had an identity credential access management (ICAM) solution. We’ve since migrated to a new one and we have a new identity provider and we’re working toward automating lifecycle management of users.”

The idea of better and even cheaper cyber capabilities is a key tenet of zero trust.

The Office of Personnel Management is leaning on the cloud to improve its cyber posture.

James Saunders, OPM’s chief information security officer, said through the move to zero trust the agency reduced the number of cyber tools it was using and redirected those funds to zero trust technologies.

“About 70% of the zero trust technologies you need are now coming from a platform versus point solutions. What I mean by that is the bigger vendors in the space can pretty much give you everything that you need. We were able to eliminate one-off solutions that are hard to integrate and hard to automate,” Saunders said. “Now the platform covers most of it. We do still have point solutions where we feed into different mechanisms like security orchestration, automation and response (SOAR) use cases for bringing that data in and allowing the machine to make decisions and do a lot of the lower level work.”

OPM met MFA mandate

Saunders said OPM’s cyber analysts now have more time to work on more complex security efforts and don’t have to stitch together multiple point solutions.

All of this work also helped OPM meet the White House’s mandate to move to phishing-resistant multi-factor authentication by Dec. 31.

Saunders said all of these efforts so far in the zero trust journey have been about building trust.

“Other agencies that partner with us, share data with us and citizens who interact with our services need to know that we’re constantly doing our due diligence and continuously raising our cybersecurity bar,” he said. “To me, that’s what zero trust is about, building trust. Starting with zero, but eventually you build on it and we’re trusting that OPM is doing the right thing to protect our environment.”

Similar to OPM, the Nuclear Regulatory Commission also is consolidating its cyber tools to move faster toward meeting the goals of the zero trust architecture.

“We’ve done a major focus on identity and access management. We’ve changed our identity provider, as a number of agencies have. We’re also looking now specifically at datasets and artificial intelligence use cases as well because that’s all going to be part of our big approach to zero trust as we move forward,” said Jonathan Feibus, NRC’s CISO and director of the Cyber and Infrastructure Security Division. “We’re trying to figure out how we can get all of these tools, all of these requirements built into our development platforms, how we can get all of these transitional tools into the cloud and how we can use everything that we are doing, in terms of cybersecurity, in terms of oversight, in terms of governance, and in terms of access to data and other tools across our environment to help us with the zero trust journey.”

Platforms and integration

OPM, NRC and Education are following a similar path as many public and private sector organizations in their move to zero trust.

Felipe Fernandez, the chief technology officer at Fortinet Federal, said the strategy is focused on creating a comprehensive security approach based on different use cases, including accessing data, providing access to remote workers and providing services to citizens.

“What we’re doing is creating a platform that’s automatically integrating with other tool sets that are typically found in the environment,” Fernandez said. “We had toolset sprawl that really grew and it started to become cumbersome for most customers trying to do cybersecurity. That was really driven by strategic approaches, such as defense-in-depth or a multi-layered approach to security. What we discovered is that really just becomes cumbersome and results in operational inefficiencies because the practitioners need to learn all these tools. So this natural reduction of vendors, tool sets and capabilities into single platforms is what’s really helping drive these zero trust architecture initiatives.”

Learning objectives:

  • Modernizing security through zero trust
  • Navigating the zero trust culture shift
  • Applying automation in security


]]>
How to leverage data, automation for HR transformation https://federalnewsnetwork.com/cme-event/federal-insights/how-to-leverage-data-automation-for-hr-transformation/ Thu, 25 Jan 2024 20:13:33 +0000 https://federalnewsnetwork.com/?post_type=cme-event&p=4866158 Learn from HR and technology leaders about how to transform human capital management

The post How to leverage data, automation for HR transformation first appeared on Federal News Network.

]]>
Is your agency making the most of its HR data? Are you using analytics and automation to distill insights and improve decision-making?

In our new ebook, learn from HR and technology leaders about how to transform human capital management:

  • Helena Wooden-Aguilar at EPA
  • Robyn Rees at Interior
  • Tanisha Lewis at the Metropolitan Washington Airports Authority
  • Matt Cornelius at Workday


]]>
Federal records officers see need for automation, collaboration across agencies https://federalnewsnetwork.com/technology-main/2024/01/federal-records-officers-see-need-for-automation-collaboration-across-agencies/ https://federalnewsnetwork.com/technology-main/2024/01/federal-records-officers-see-need-for-automation-collaboration-across-agencies/#respond Fri, 19 Jan 2024 12:20:04 +0000 https://federalnewsnetwork.com/?p=4857626 The Federal Records Officers Network is a self-organized community that looks to lead on federal records issues, like digitization.

The post Federal records officers see need for automation, collaboration across agencies first appeared on Federal News Network.

]]>

When Ron Swecker took on the role of records officer at the Department of Transportation in 2012, he had little background in records and information management.

Swecker had managed information systems and worked on several e-gov initiatives. But in order to learn more about his new role, he attended trainings offered through the National Archives and Records Administration, as well as industry conferences and seminars on records management.

After meeting other records specialists at other agencies, Swecker and his colleagues realized they lacked a central community where they could discuss best practices, training, and more about the federal records management profession. So they decided to start one.

The first meeting of the Federal Records Officers Network, or the FRON, occurred in June 2013 with about 15 members. Today, the FRON has approximately 350 members from across federal agencies and the military departments.

“We’ve grown considerably over that time and continue as the word gets out,” Swecker said. Swecker is now a records and information program manager at the Securities and Exchange Commission. He’s also co-chairman of the FRON.

The goal of the group is to advance the records and information management profession. The network offers training and mentorship opportunities, while also providing a venue to share best practices and feedback on federal recordkeeping issues. Membership in the FRON is open to anyone with a .gov or .mil email address.

As the FRON has grown over the last decade, agencies have also been grappling with the transition away from paper to fully electronic recordkeeping, with records officers and management specialists at the vanguard of that push. Agencies now have a deadline of June 2024 for when NARA will stop accepting permanent records that aren’t digital.

“The records management community . . . has the same challenges that every other area within the federal government has,” Swecker said. “And those are budgetary constraints. So I hear a lot of, we just don’t have the budget to move fast enough or to meet specific deadlines. So that has always been a challenge.”

As federal records management becomes a digital endeavor, FRON sees the need for agencies to shift toward “multidisciplinary” records and information management teams that include traditional records specialists and IT personnel with a background in information management, Swecker said.

The FRON is also working with members of the federal Chief Data Officers Council, as requirements for electronic recordkeeping, such as metadata standards, and CDO responsibilities increasingly overlap.

“I think that collective effort is what’s necessary to move forward,” Swecker said.

“We need to move on beyond just digitization and provide automation into the processes themselves,” he continued. “And in managing those records. I think that’s where the convergence of the traditional records and information management discipline comes together with other disciplines like data management. We all have to work collaboratively to be able to automate as much of the processes and managing records as possible. I think that’s going to be the greater challenge. And that’s going to certainly take some time, but that’s the direction that everyone seems to be moving towards.”


]]>
https://federalnewsnetwork.com/technology-main/2024/01/federal-records-officers-see-need-for-automation-collaboration-across-agencies/feed/ 0
Industry Exchange Cyber 2024: Tanium’s Sam Kinch on taking advantage of automation at the endpoint https://federalnewsnetwork.com/federal-insights/2024/01/industry-exchange-cyber-2024-taniums-sam-kinch-on-taking-advantage-of-automation-at-the-endpoint/ https://federalnewsnetwork.com/federal-insights/2024/01/industry-exchange-cyber-2024-taniums-sam-kinch-on-taking-advantage-of-automation-at-the-endpoint/#respond Wed, 17 Jan 2024 18:29:43 +0000 https://federalnewsnetwork.com/?p=4855671 Endpoint management and security is becoming more automated. But it requires using the most current data about network activity.

The post Industry Exchange Cyber 2024: Tanium’s Sam Kinch on taking advantage of automation at the endpoint first appeared on Federal News Network.

]]>

Because so many cybersecurity breaches start with users and their devices, anything that improves endpoint management will bolster an agency’s cybersecurity protections.

That’s the idea behind the emerging technology of autonomous endpoint management.

Sam Kinch, director of technical account management at Tanium, likens autonomous endpoint management to self-driving vehicles programmed to deal with whatever they encounter in traveling from Point A to Point B.

“We want to get to that point in the same way with endpoint management, whether it’s patching or compliance, or even threat response,” Kinch said during Federal News Network’s Industry Exchange Cyber 2024.

The goal? “To take that whole enterprise and automate it to the point where I can have fewer people” managing endpoints, Kinch said. That makes more of the cyber team available for “more prioritized business processes, rather than the mundane, day-in and day-out tasks,” he added.

This level of automation can detect when an endpoint goes offline errantly, prompting submission of a trouble ticket — sometimes even before a user is aware of a problem. In a larger sense, Kinch said, a high level of automation can rein in what Tanium’s chief technology officer refers to as the “suburban sprawl” of cybersecurity operations.
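The offline-detection scenario can be sketched as a simple heartbeat check; the check-in window, endpoint names and ticket format here are illustrative, not any vendor's actual telemetry:

```python
# Sketch: heartbeat monitoring that files a trouble ticket automatically when
# an endpoint misses its check-in window -- often before the user notices.

CHECKIN_WINDOW = 300  # seconds an endpoint may go without reporting in

def find_offline(endpoints, now):
    # endpoints maps name -> timestamp of last successful check-in.
    return [name for name, last_seen in endpoints.items()
            if now - last_seen > CHECKIN_WINDOW]

def file_tickets(offline):
    # Stand-in for a call into the agency's ticketing system.
    return [{"endpoint": name, "summary": f"{name} missed heartbeat"}
            for name in offline]

endpoints = {"laptop-001": 1000, "laptop-002": 1290}
tickets = file_tickets(find_offline(endpoints, now=1400))
print(tickets)
```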

That sprawl stems from “a massive shift in the amount of data that’s been absorbed by businesses as part of their daily business practices,” Kinch said. There’s also a wider variety of data, all of which adds complexity. Deployment of hyperconverged workloads in commercial computing clouds adds to the levels of data and complexity too, he said. Still another factor: the large numbers of people permanently teleworking at least some of the time. Kinch pointed to several independent studies that indicate “70% of successful breaches source from the endpoint.”

“What we have are businesses that have really grown up very fast and haven’t brought their IT infrastructures along with them,” he said. “That’s where automation really shines. You bring in automation to reduce the complexity — reducing those mundane tasks.”

Gathering the right data for endpoint cyber management

As machine learning and artificial intelligence increasingly power automation, Kinch said it’s important to understand and control the data feeding the algorithms. He again used the analogy of an autonomous car. Suppose the car reaches a stop sign at which it stopped yesterday and then proceeded to go straight. If on a subsequent day construction blocked the road straight ahead, the car must have relevant data to find a different course.

“The most important thing that makes that AI effective is the real-time nature of data, having data that is not days or weeks old,” Kinch said.

The same thinking applies to automating the patching of endpoint software. Kinch said that many of the thousands of software components on a given device don’t automatically update themselves. AI-driven automation can take over both the patching and the validation of patches, which reduces the chances of breaking something while simultaneously speeding up the patch process.

“If I had real-time data, I would know exactly what endpoints I would have to update,” Kinch said. “I could roll it through a patch update cycle through test rings. Maybe it’s a dozen devices, and then 100 devices, and then the rest of the enterprise.”

It’s an AI use that takes the human out of the equation. “The automation, not the human, is pushing the patches out, checking to see if the patches applied successfully and monitoring the CPU to see if there are any issues going on,” he said. “Did the patch apply successfully?”
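The ring-based rollout Kinch describes, a dozen devices, then 100, then the rest, can be sketched as follows; the ring sizes and the `validate()` check are illustrative stand-ins for real health telemetry:

```python
# Sketch of a phased patch rollout through test rings, halting automatically
# if any ring fails post-patch validation.

def rollout(devices, ring_sizes, validate):
    remaining = list(devices)
    patched = []
    for size in ring_sizes:
        ring, remaining = remaining[:size], remaining[size:]
        patched.extend(ring)          # push the patch to this ring
        if not validate(ring):        # did the patch apply cleanly?
            return patched, False     # stop the rollout on a failed ring
        if not remaining:
            break
    return patched, True

devices = [f"host-{i}" for i in range(150)]
patched, ok = rollout(devices, ring_sizes=[12, 100, len(devices)],
                      validate=lambda ring: True)  # stand-in health check
print(len(patched), ok)  # 150 True
```

With real-time inventory data feeding `devices`, only the endpoints that actually need the update would enter the rollout.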

He described another example of how automation can work in an organization with hundreds of thousands of deployed endpoints. Suppose endpoint monitoring detects a device executing Mimikatz, notorious shareware capable of extracting user passwords and credentials from system memory. Both criminal hackers and security professionals use it.

“I have artificial intelligence and automation looking at it, going, ‘Hey, Mimikatz isn’t normal. It shouldn’t have fired. Let me say with about a 95% chance that is nefarious activity,’” Kinch said.

The system would then halt the Mimikatz process, quarantine the affected device and analyze the string of events.

“Within seconds, you’ve already deduced down that you need to isolate that endpoint,” Kinch said. “And you have no human in the loop. I think that’s what we’re trying to get to.”
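A minimal sketch of that no-human-in-the-loop response logic follows; the process denylist, confidence threshold and action names are illustrative, not a real EDR product's API:

```python
# Sketch: when a denylisted process fires and the model's confidence that the
# activity is nefarious clears the threshold, halt it, isolate the endpoint
# and capture the event chain -- with no human in the loop.

SUSPICIOUS = {"mimikatz.exe"}
CONFIDENCE_THRESHOLD = 0.90  # e.g. the "about a 95% chance" Kinch describes

def respond(event):
    actions = []
    if (event["process"].lower() in SUSPICIOUS
            and event["confidence"] >= CONFIDENCE_THRESHOLD):
        actions = [f"halt:{event['process']}",
                   f"quarantine:{event['endpoint']}",
                   "collect:event-chain"]
    return actions

actions = respond({"process": "mimikatz.exe", "endpoint": "ws-117",
                   "confidence": 0.95})
print(actions)
```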

Discover more tips and tactics shared by cybersecurity experts on our Industry Exchange Cyber event page.


]]>
https://federalnewsnetwork.com/federal-insights/2024/01/industry-exchange-cyber-2024-taniums-sam-kinch-on-taking-advantage-of-automation-at-the-endpoint/feed/ 0