IT Innovation Insider - Federal News Network

Managing, securing a hybrid cloud environment begins, ends with data

Agencies will continue to exist in a multi-cloud world for the foreseeable future.

But what is changing is how agencies need to protect that multi-cloud environment from cyber threats.

Experts say three trends emerged around cybersecurity during the pandemic. First, the acceleration of digital services increased the threat surface, putting assets and infrastructure at risk. Second, the risk to systems and data expanded with more employees working remotely. And third, there was a global surge in malicious cyber attacks.

While agencies initially turned to virtual private networks and other security tools, moving forward agencies will need more sophisticated tools and approaches to deal with the complexity of their networks and the need to continually improve their security posture. This has to happen while reducing infrastructure friction and accelerating incident response and mitigation times.

One of the ways agencies can stay ahead of threats and ensure their infrastructure meets mission needs is through advanced data analytics.

Chip George, the vice president for U.S. public sector at Nutanix, said agencies must continue to improve their cyber postures and deliver on mission needs by taking advantage of the data from their security operations centers and other tool sets.

“This stress on the underlying network, if they’ve gone multi-cloud and multi-site and work from home, is caused by all that data,” George said on the Innovation in Government show sponsored by Nutanix. “That takes away from what they want to do, which is address the threats and focus on the actionable intelligence. What we’re trying to convey is that you’ve got to figure out a model that brings together, or converges, a bunch of the major things you’re attacking in the data center, simplifies that, and lets people focus on what they want to focus on, which is the cybersecurity threats.”

Juliana Vida, the chief technical advisor for public sector at Splunk, said by turning to advanced data analytics capabilities, organizations and people will be more productive even while managing through the new complexity of remote work and through the acceleration of moving to the cloud.

“Think about all the different management challenges: the configuration management, the application rationalization and everything that has to go along with them. So when we talk about that complexity, observability or visibility is another benefit that a full data platform like Splunk provides. We help organizations see not just into one of those clouds in that multi-cloud environment, but across all of those clouds. Other platforms aren’t going to help customers look into a competing cloud vendor’s environment and help find where they can make tweaks or adjustments, or where there might be cybersecurity challenges,” Vida said. “A neutral data platform like Splunk can help customers look across those different cloud environments provided by different cloud vendors, and give a full view of where there can be tweaks, where applications can be moved and changed, where microservices can be incorporated and so on. It really does help agencies be more efficient and effective in how they work in these multi-cloud environments.”

The data also can help agencies move toward a zero trust framework. By collecting data from a variety of sources, agencies can begin to segment their networks.

“As part of zero trust, you want that almost firewalled experience down at the virtual machine level,” George said. “There, all the extra data you brought in is now separated and walled off in that micro-segmented way, and hopefully implemented and managed in as easy-to-consume a way as possible. That is just as important, if not more important, as you get to a hybrid cloud.”
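Micro-segmentation is easiest to picture as a default-deny allow-list enforced per workload rather than at the network perimeter. The sketch below is a minimal illustration of that idea; the segment names, ports and rules are invented for the example and do not represent any particular product’s policy engine.

```python
# Minimal sketch of micro-segmentation as a default-deny allow-list.
# Segment names, ports and rules are hypothetical, for illustration only.

ALLOWED_FLOWS = {
    ("web-tier", "app-tier", 8443),   # web servers may call the app tier
    ("app-tier", "db-tier", 5432),    # the app tier may query the database
}

def is_allowed(src_segment: str, dst_segment: str, port: int) -> bool:
    """Default deny: traffic passes only if an explicit rule exists."""
    return (src_segment, dst_segment, port) in ALLOWED_FLOWS

# Lateral movement that skips a tier is blocked by default, which is what
# buys defenders time to respond and recover after a compromise.
print(is_allowed("web-tier", "app-tier", 8443))  # True
print(is_allowed("web-tier", "db-tier", 5432))   # False: no direct path
```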

Vida added that with the expanded attack surfaces created by remote work, as well as the complexity of hybrid cloud, it becomes more important for employees to be authenticated not just at the front door, but at each new point on the network.

“Each one of those points, every one of those assets, must be verified and must be put through some kind of analytics to decide: Should they be trusted? Should they be allowed to come in? That kind of helps give people a framework for what we are talking about when we talk about a zero trust framework, and from that perspective, data truly is the key element,” she said. “If you’re not protecting it, shielding it and leveraging it in the right way all throughout an ecosystem or an environment, there’s going to be a breach somewhere and you’re going to lose control. So for Splunk, specifically, in the zero trust framework, there is constant monitoring, there is automation and orchestration, there is cybersecurity and network monitoring.”
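One way to make “verified at each point, not just the front door” concrete is to score every access request against identity, device and behavior signals before granting it. The sketch below is a toy illustration; the signals, weights and threshold are all invented assumptions, not any vendor’s algorithm.

```python
# Toy per-request zero trust check: every access is scored, not just the
# initial login. Signals, weights and the threshold are invented.

def trust_score(request: dict) -> int:
    """Combine identity, device and behavior signals into one score."""
    score = 0
    if request.get("mfa_verified"):
        score += 40
    if request.get("device_compliant"):   # patched, agency-managed device
        score += 30
    if request.get("known_location"):
        score += 20
    if not request.get("unusual_hours"):
        score += 10
    return score

def authorize(request: dict, threshold: int = 80) -> bool:
    """Grant this one request only if the combined signals clear the bar."""
    return trust_score(request) >= threshold

req = {"mfa_verified": True, "device_compliant": True,
       "known_location": False, "unusual_hours": True}
print(trust_score(req))  # 70
print(authorize(req))    # False: re-challenge or deny, despite a valid login
```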

The other side of the data coin is disaster recovery and continuity of operations.

George said if a breach happens and an agency has to recover, the cloud gives agencies the ability to keep a safe copy of their data stored away from the threats, and the ability to fail over immediately.

“The fact is you need to prepare for it. People know the cloud has helped because sometimes it acts as an off-premise disaster recovery site. Sometimes you’re failing over from one cloud to another cloud,” he said. “All of those issues are complex. What makes it easier is if you can have the same operating system that does all of those core functions, like server, storage, networking and virtualization, and have that same OS in the cloud that you can recover to right away. That’s where we’re seeing people go. That’s what we’re seeing people ask about, especially with these big, data-intensive apps, like Splunk, that we’re working with.”
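The “immediate failover” George describes usually reduces to a health probe that promotes an already replicated site when the primary stops answering. Here is a deliberately simplified sketch of that control flow; the site names, status table and promotion step are placeholders, not a specific product’s API.

```python
# Toy disaster recovery failover check: if the primary site fails its
# health probe, promote the replica. Site names, the status table and the
# promotion step are placeholders, not a specific product's API.

site_status = {"on-prem-primary": False, "cloud-replica": True}  # primary down

def healthy(site: str) -> bool:
    # In practice: probe a heartbeat endpoint; here we read a canned table.
    return site_status.get(site, False)

def active_site(primary: str, replica: str) -> str:
    """Return the site that should serve traffic right now."""
    if healthy(primary):
        return primary
    if healthy(replica):
        print(f"{primary} unreachable; failing over to {replica}")
        return replica
    raise RuntimeError("no healthy site available")

# Because both sites run the same operating stack, the replica can take
# over immediately instead of waiting for a rebuild.
print("serving from:", active_site("on-prem-primary", "cloud-replica"))
```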

How the pandemic jump-started the move to desktop-as-a-service

Over the last nine months, the coronavirus has jump-started agency IT modernization.

Agencies had to pivot on a dime to make sure hundreds of thousands of employees could work remotely. Agencies also leaned heavily on emerging technologies like artificial intelligence and data analytics in the cloud to help find solutions to a range of other challenges.

As agencies move into calendar year 2021, they must consider how to take the momentum the pandemic created around IT modernization and continue to be innovative in how they solve challenges.

Chip George, the vice president of U.S. Public Sector at Nutanix, said there are ways for agencies to continue to evolve the technology that supports their mission and citizen services.

One of the biggest trends that emerged was the use of desktop-as-a-service, George said. Nutanix, which received its authorization under the Federal Risk and Authorization Management Program (FedRAMP) just before the pandemic hit, saw a steady increase in the use of DaaS as part of agencies’ move to public and private cloud services over the last year.

“In some of the studies we’ve seen, McKinsey went and looked and asked during the pandemic, ‘Did you move slower or faster, given this stress of how to put everyone at home, in terms of implementing cloud services and desktop-as-a-service?’ They basically found people moved 40 times faster [toward modernization] than they expected in getting those things out there in the cloud,” George said on the IT Innovation Insider show, sponsored by Nutanix.

Desktop-as-a-service is similar to virtual desktop infrastructure (VDI), providing a scalable and flexible approach to ensure capacity for remote workers.
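The scalability argument is easy to see with back-of-the-envelope math: capacity follows concurrent sessions instead of a worst-case estimate bought up front. All of the figures in this sketch are illustrative assumptions, not vendor sizing guidance.

```python
# Back-of-the-envelope DaaS capacity planning. All figures are
# illustrative assumptions, not vendor sizing guidance.
import math

SESSIONS_PER_HOST = 40   # assumed desktop-session density per host

# Normal operations, a surge event, and full-agency telework.
for users in (1_200, 9_500, 18_000):
    hosts = math.ceil(users / SESSIONS_PER_HOST)
    print(f"{users:>6} concurrent users -> {hosts:>3} hosts")

# With a consumption model, the agency pays for 30 hosts on a normal day
# and 450 only during full telework, instead of owning 450 year-round.
```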

George said part of the reason desktop-as-a-service caught on was the broader acceptance of the “pay-by-the-drink,” or consumption, model.

He said agencies grew more comfortable buying software-as-a-service based on their employees’ actual needs rather than a best-guess estimate.

“We did see that play out even with this accelerant of the pandemic, where people are driving to this consumption model that is subscription based. They are moving some of these workloads and applications into the cloud, an all-subscription-based approach that provides agility for plenty of government agencies,” he said. “They will spend less. They can also budget more carefully. And certainly, we saw it again this year with the federal budgets and continuing resolution, a lot of times it is much more stable during those continuing resolution months because you know exactly what you’re spending.”
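George’s budgeting point is, at bottom, simple arithmetic: paying for seats as they are used can beat a five-year upfront buy sized for a guess. The prices and seat counts below are made up purely to show the comparison.

```python
# Made-up numbers comparing a five-year upfront license buy against a
# subscription sized to actual use. Prices are illustrative only.

SEATS_BOUGHT = 10_000
SEAT_PRICE_5YR = 900            # assumed perpetual license + 5 years support
upfront_cost = SEATS_BOUGHT * SEAT_PRICE_5YR

SUB_PRICE_PER_SEAT_YEAR = 220   # assumed annual subscription rate
seats_used_each_year = [6_000, 7_500, 8_000, 8_200, 8_500]
subscription_cost = sum(seats * SUB_PRICE_PER_SEAT_YEAR
                        for seats in seats_used_each_year)

print(f"upfront buy:  ${upfront_cost:,}")       # $9,000,000 for unused headroom
print(f"subscription: ${subscription_cost:,}")  # $8,404,000, tracking real use

# Spending that tracks use is also easier to defend during a continuing
# resolution, because each month's bill is known in advance.
```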

At the same time, George said agencies also realized during the pandemic that not every application or workload was suited for the cloud or this consumption model.

Agencies are getting better at recognizing that some applications and workloads are better suited to remain on-premise or come back on-premise after being put in the cloud initially.

“We certainly saw that both the move to cloud and fully baked software-as-a-service are getting a lot of investment as agencies hurried to make applications more available and more scalable because of the demands of the pandemic,” he said. “But some of those were looked at and folks said, ‘I think we can serve that better on-premise.’ We saw a significant investment from our federal and our defense agencies hustling to deliver cloud-like capabilities, but doing it from on-premise. They were getting the things they wanted out of cloud, which is speed and agility and scale, but doing it on-premise.”

George said it’s no surprise that this hybrid cloud set-up will continue to be the way agencies run their infrastructure in the near future.

He said one of the main reasons agencies will stay in this mixed environment is cybersecurity.

The use of micro-segmentation of the network will become more important.

“Basically we’re saying if you get into this data, it’ll be harder to get to the other data because you’ve segmented all your data, and you can’t traverse around very easily,” George said. “It also gives you time to respond and time to recover.”

George added agencies must continue to make remote access to data, applications and systems easy and secure as the IT modernization journey continues to evolve.

“The lift and shift is exactly what people are trying to avoid. That speaks to how hard it can be to get to cloud. But some of those applications don’t lend themselves to that,” he said. “So the first thing they should do is improve the underlying infrastructure of that application, and then give yourself the chance to say, ‘Did that achieve the speed and agility I was looking for? Or even if it did, can I now move it to the cloud much more easily because I’ve converged and improved the operations and management of that underlying infrastructure on-premise?’ So improve what you’ve got, look at the apps that are appropriate to go to the cloud for cost concerns, then move them, and it won’t be lift-and-shift anymore, and it won’t create silos of completely different operations.”
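George’s sequencing, improve the underlying infrastructure first and then decide app by app, can be written down as a small decision helper. The criteria and names below are invented for illustration; this is not an official migration rubric.

```python
# Hypothetical helper expressing the 'improve first, then move what fits'
# sequencing described above. Criteria and names are invented.

def migration_plan(app: dict) -> str:
    if not app["infra_modernized"]:
        return "modernize the underlying infrastructure first"
    if app["tightly_coupled_legacy"]:
        return "keep on-premise (poor cloud fit)"
    if app["bursty_demand"] or app["cloud_cheaper"]:
        return "migrate to cloud (no longer a lift-and-shift)"
    return "keep on-premise, revisit next budget cycle"

apps = [
    {"name": "public web portal", "infra_modernized": True,
     "tightly_coupled_legacy": False, "bursty_demand": True, "cloud_cheaper": True},
    {"name": "mainframe payroll", "infra_modernized": True,
     "tightly_coupled_legacy": True, "bursty_demand": False, "cloud_cheaper": False},
]
for app in apps:
    print(f"{app['name']}: {migration_plan(app)}")
```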

Understanding why the consumption model for cloud services makes sense

Back in January, the General Services Administration set out for comment a draft acquisition letter detailing how buying consumption-based cloud services could work through the schedules program.

In the letter, GSA said this approach would provide cost transparency without burdening contractors with additional transactional price reporting requirements.

It also would promote cost efficiency by reducing the need to lock into long-term contracts in markets where falling prices are anticipated.

This consumption model approach also would give agencies greater flexibility to take advantage of technology improvements and better support cybersecurity.

Here we are nearly 10 months later, and the status of that acquisition letter is unclear. Did GSA shelve it, or is it still working through the regulatory process?

Either way, the move to a consumption-based model for cloud services is happening with or without a new policy.

Greg O’Connell, the senior director of federal at Nutanix, said there are several reasons why agencies are realizing the value of this pay-by-the-drink approach.

First, he said, is the better understanding that 75% of all cloud workloads are predictable, meaning agencies know when there will be usage spikes and when there will be downtime.

Second, he said, the coronavirus pandemic proved the value of having access to this type of approach, which was one of the big lessons learned over the last nine months.

O’Connell said for both of these reasons the combination of using the consumption model and a hybrid cloud sets agencies up for success.

“The federal enterprise represents a variety of these consumption models that have really emerged out of necessity,” O’Connell said on the IT Innovation Insider show, sponsored by Nutanix. “There are many instances where it’s unknown when there will be a burst requirement. But I think agencies also recognized there are services that they put or were looking to put into the cloud that were predictable but not really cost effective for the cloud.”

Rick Harmison, the federal cloud economist at Nutanix, said that cost is one of the biggest risks in moving to the cloud.

“In a case where it’s mission critical and it’s running at a fairly stable level, we’ve found it’s better to have it in a private data center or a private cloud versus out in the public cloud,” he said. “Security and cost optimization are factors that go into that burst-versus-non-burst workload decision.”
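That burst-versus-steady decision comes down to comparing the cost of owning capacity sized for the peak against renting capacity only for the hours of the spike. The rates below are invented to show the shape of the comparison, not real pricing.

```python
# Invented rates comparing owned (private cloud) capacity sized for the
# peak against public cloud priced per hour of actual use.

HOURS_PER_YEAR = 8760
OWNED_COST_PER_NODE_YEAR = 6_000   # assumed amortized hardware + operations
CLOUD_COST_PER_NODE_HOUR = 1.50    # assumed on-demand rate

def yearly_costs(nodes_steady: int, nodes_peak: int, peak_hours: int) -> str:
    """Own enough for the peak, or run steady nodes all year and rent the burst."""
    owned = nodes_peak * OWNED_COST_PER_NODE_YEAR
    cloud = (nodes_steady * HOURS_PER_YEAR
             + (nodes_peak - nodes_steady) * peak_hours) * CLOUD_COST_PER_NODE_HOUR
    return f"owned ${owned:,.0f} vs cloud ${cloud:,.0f}"

# Stable, predictable workload: owning wins.
print("steady:", yearly_costs(nodes_steady=10, nodes_peak=12, peak_hours=200))
# Spiky workload: paying only for the burst wins.
print("spiky: ", yearly_costs(nodes_steady=2, nodes_peak=40, peak_hours=100))
```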

A recent Nutanix and Market Connections survey found agencies remain committed to this hybrid approach. Of the 150 government decision makers surveyed across civilian agencies, the Defense Department and the intelligence community, six out of 10 said they are considering or already have moved application workloads back on-premise or to a private cloud from the public cloud.

“It’s more than economics. There are cost overruns and unexpected expenses associated with public cloud, but there are also specific issues around the data privacy and sovereignty risks of public cloud that have come into play, as well as control over applications,” O’Connell said. “There are still a lot of legacy applications and services in government, and there is a concept called data gravity that has to do with certain applications running on-premise and other servicing applications needing to be nearby that aren’t really capable of being supported in the cloud today. It gets back to the burden of reengineering these legacy applications.”

Harmison said another factor in the decision of which type of cloud is best is the idea of perishability versus non-perishability.

“It’s kind of like having home exercise equipment, an asset you bought for your home, versus a gym membership. If I don’t use that gym membership, I essentially lose its value. If I have a physical piece of equipment, a server or a data center, it’s still there. If I have to push out a project or am not able to stand up a workload, I still have those servers and equipment, and they are still available to me in a month or whenever I need them. I think that’s what you are seeing in the cloud,” he said. “Agencies are aligning procurement, development and other financial aspects that the government and a lot of our private customers are still working through. This use-it-or-lose-it concept in the cloud is a big thing, so cost optimization comes into play to make sure you are using all the resources in the cloud you are paying for.”

Harmison said the cloud is good for workloads and applications that will change and evolve in the short term as opposed to legacy systems that are more stable.

He said using on-premise data centers or private clouds helps control runaway costs or underutilization of compute or storage.

O’Connell said another important consideration is the need for flexibility and diversity during a time of budget uncertainty.

“We are seeing this more and more with the complexity of these contracts and these relationships with managed services, on-premise, private cloud, hybrid cloud and multi-cloud environments and the like. This concept of subscription-based services allowing an operational expense (OpEx) option gives agencies much-needed flexibility to purchase what they need, when they need it, and provides a degree of stability at a time when budget instability is an annual part of government business due to the continuing resolutions we keep seeing,” he said. “In addition, in this COVID environment, the complexities that you are layering on in terms of contracting availability and delivering services under these conditions all come into play.”

Harmison added the pandemic has shown the cloud can also be the big enabler everyone expected it to be nearly a decade ago. He said agencies can take on projects more quickly because of the agility and limited capital investment.

Basically, what it comes down to, Harmison and O’Connell say, is that moving to the cloud makes sense for the right workloads, and relying on the consumption model will help reduce the risks of cloud services.

Understanding why taking a multi-cloud approach ensures current, future agility, scalability

The coronavirus pandemic let the telework genie out of the bottle. Sure, federal employees and contractors worked remotely before the national emergency. But the surge of teleworkers put a strain on agency technology infrastructure like never before.

Agencies realized quickly that modern, flexible IT was the difference between getting employees up and running in days versus weeks versus months.

And as the pandemic continues and other challenges emerge, agencies will need to be ready and equipped for a broad-scale work-from-home/anywhere approach.

Over the last several months, working from home revealed some interesting things: Federal employees can work from anywhere and remain productive, and even happy. At the same time, remote working expanded the cyber threat surface agencies face.

It’s clear from the pandemic that agencies are reshaping the way they work and maintain continuity of operations and cybersecurity. Agencies need to sustain this level of agility and security well beyond the COVID-19 emergency.

Agencies can take the lessons learned from the pandemic and continue to apply them to their broader mission areas.

Dan Fallon, the senior director of federal engineering at Nutanix, said agencies have to continue to think about how to ensure their infrastructure continues to move toward one that is resilient and secure.

“The coronavirus lockdown forced agencies to expand overnight so they had to use the elasticity of the public cloud environment and had to have that agility to be able to shift, which isn’t an easy thing,” Fallon said on the IT Innovation Insider. “The bottom line is this is what the cloud was designed for and agencies took advantage of it. Now as we move to the long-term sustainment, agencies need to evaluate what is the most cost-effective way to run the service. They may need to shift a service back on-premise or move a service to a different cloud as part of streamlining the budgets. In the beginning it was about getting operational, but now cost becomes more of a concern.”

Part of the architecture Fallon talks about is the use of multiple and hybrid cloud approaches. It’s clear from the past few months that agencies that were already well into moving applications to the cloud fared better than those behind the curve.

Greg O’Connell, the senior director of federal at Nutanix, said the pandemic showed whether agencies’ business continuity strategies were robust enough.

“Agencies need to be confident in their infrastructure and cloud strategies, even stress testing them and sharing those results to build confidence with their internal customers,” he said. “This is generating some hard questions: Is the public cloud model really scalable and resilient under conditions like this? Are the cloud providers maintaining excess capacity at the scale needed right now? Ensuring the supporting infrastructure services are robust enough to maintain access to these cloud platforms is paramount.”

O’Connell said this resilience and scalable approach is all about agencies having “cloud optionality,” meaning the ability to move workloads, applications and data across or between platforms.

“It’s back to this idea of not putting all your eggs in one basket,” he said. “We have seen remote workers demanding greater capacity from their network, storage and services. Working remotely is no longer a choice. It’s a mandate so they are expecting more. We are seeing things like spot pricing shifting to surge pricing with some of the cloud providers where costs can quickly escalate due to the work from home mandate.”

Fallon said within the concept of cloud optionality is the recognition that agencies are no longer just behind the “safety” of their data center.

“There are a lot of standards like FedRAMP and other security policies to set that baseline for cloud services. So that helps agencies know that the underlying service meets a very high security bar,” he said. “Those are the things agencies need to think about as they design for a multi-tenant environment, which means service level agreements are important. Maybe they need to buy a reserved instance so they have dedicated horsepower and will not be impacted by a surge event. It may be a little more expensive, but they don’t want their email or critical database going down.”
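The reserved-instance trade-off Fallon mentions has a clean break-even: commitment pricing wins once a workload runs more than a certain fraction of the time. The rates below are invented for illustration.

```python
# Invented rates: when does a reserved instance beat on-demand pricing?

ON_DEMAND_PER_HOUR = 0.40
RESERVED_PER_HOUR = 0.26   # assumed effective rate with a 1-year commitment

# Reserved capacity is billed whether or not it runs, so it wins once a
# workload runs more than reserved/on-demand of the time.
breakeven = RESERVED_PER_HOUR / ON_DEMAND_PER_HOUR
print(f"reserve if the workload runs more than {breakeven:.0%} of the time")  # 65%

# An email server or critical database runs 100% of the time, so the
# 'little more expensive' dedicated capacity is the safer, cheaper call.
```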

Fallon said the surge of remote working also expanded each agency’s threat surface, which makes the multi-cloud approach more important because it spreads the risk more broadly.

As remote working continues for the near future, Fallon said agencies must continue to ensure their applications and data are agile enough to meet employees’ needs.

“Zero trust will continue to grow. It was already becoming a hot topic before COVID-19, but it’s only going to accelerate because that really allows the federal security team to know that they can give the flexibility to the end user if they have that zero trust posture across their enterprise IT,” Fallon said.

O’Connell added the pandemic will continue to move employees to the edge, which will, in turn, continue to drive the multi-cloud strategy.

Agencies need to rethink what telework preparedness means

This coronavirus pandemic is challenging our day-to-day status quo from all points of view.

Federal employees and contractors are working remotely in ways and in numbers that are unprecedented.

A recent Federal News Network survey of federal employees found 77% of the more than 1,000 respondents say they are teleworking today because of the coronavirus pandemic. Of those, about 47% said they didn’t already telework before the emergency. This means we have a whole lot of people working from home who aren’t used to it or aren’t happy about it.

All of this means federal employees and managers, and their industry partners, need to consider how to make it all work from technology, people and security perspectives.

Chris Howard, the vice president of U.S. public sector at Nutanix, said there are ways that government can move forward on much-needed telework and related infrastructure initiatives while maintaining mission critical operations as the coronavirus pandemic reshapes how the government works in both the short and long terms.

“For a lot of the customers we have been talking to, and other folks in government, even if they thought they had a work-from-home capability, it was always meant for very short instances and for only a portion of the staff teleworking versus, in some cases, 100% of the staff,” Howard said on the Innovation in Government show. “The infrastructure that was built for that smaller concurrent user base is difficult to support. There are also networking challenges. There are security challenges. I think the preparedness was not designed for an instance exactly like this.”

Many agencies put their network through a “stress test” before telling a majority of employees to work remotely. From that analysis, many agencies made changes to ensure the network could handle the traffic.

But Howard said it’s difficult to spin up more capacity quickly, especially these days when infrastructure components can be more difficult to find and install.

He said as agencies move forward with network and infrastructure modernization designs, they will have to prepare and plan for the worst case scenarios in a different way than ever before.

Dan Fallon, the senior director of federal engineering at Nutanix, said agencies are expanding existing capabilities either on-premise or in the cloud.

He said there are things like networking and security changes that can’t be done overnight so agencies tend to stick with and improve what they are using today.

“We’ve seen some innovative and creative ways to get around overloading the virtual private network or existing virtual desktop infrastructure,” Fallon said. “We’ve seen agencies do things like a simple virtual desktop in the cloud just to get them the ability to get inside the secure corporate network. Agencies also are off loading things like typical office technology to relieve the virtual desktop and save that capacity for more high-end users so you are splitting the workloads to ease demand on the current infrastructure.”

Howard said two major trends have emerged for how agencies are dealing with the pandemic. Some agencies are expanding their work-from-home capabilities, which tend to be on-premise. Others, however, want to move to a desktop-as-a-service capability where everything is cloud based.

“It’s easier to spin up. You can rent it on a consumption model,” Howard said. “We have a lot of customers who are looking at what solutions are available through the cloud because they don’t have the wherewithal to set up an on-premise telework solution and then still take care of the challenges around bandwidth and security.”

Like most agencies and contractors, Nutanix went to 100% telework a few weeks ago. Fallon said that decision made company security executives refocus on the idea that the edge of the network is now each employee’s home and device.

“The cyber criminals are not sleeping,” he said. “They are continuing to target the federal networks. There is definitely a need for users to continue to be educated because a lot of it is on the user. Then there are technical solutions: Do as much as you can to separate the user’s home environment from the work session. The challenge with your home device is you just aren’t thinking about it as much.”

He said VPN and virtual desktop solutions are some of the ways agencies can secure their networks and applications and separate their environments.

The coronavirus pandemic is opening the eyes of many agency and contractor executives about what the future of work looks like.

Howard said the pandemic will spur innovation in both technology and processes whether that’s an expansion of desktop-as-a-service or virtual desktops or something else.

“I think there will be an increasing demand for multi-cloud and an open architecture that can shift across clouds to deal with the need for additional capacity and expand service rapidly,” Fallon said.

Why 2020 is the year the consumption model for IT takes hold

As agencies enter 2020, cybersecurity remains at the top of many agency and industry executives’ priority list.

But it’s not alone by far. Agencies received good news in December when the Homeland Security Department released more details about the Trusted Internet Connections (TIC) 3.0 implementation. This will make it much easier for agencies to securely move to the cloud.

In fact, when we look back at 2019, and really over the last two years, the Office of Management and Budget updated nearly every major IT policy and took some initial steps toward implementation. This includes Cloud Smart, expanded cybersecurity requirements for high value assets, the federal data strategy, identity management and much more.

All of these policies set the roadmap for agencies to move more quickly toward IT modernization.

Chris Howard, the vice president of U.S. public sector at Nutanix, said the biggest change in the federal sector will come from the focus on supply chain security and the acceleration of the move toward a consumption model for cloud services.

“The requirements I’m seeing from the government are still the same. For the most part, it’s about how do they bring scale, ease of use, simplicity, the consumption model and a whole bunch of different things into their environment, whether that’s on-premise or in the cloud,” Howard said on the IT Innovation Insider show. “I foresee we are still dealing with the same priorities and requirements, but there will be some slight alterations around how things are procured and a lot more inspection of where the software and the hardware come from.”

Howard said the “pay as you grow” model is coming more to the forefront of acquisition strategies across the government. He said agencies need to get into the model of paying for what they need versus the traditional way of buying too many licenses or seats and potentially wasting money.

“Agencies should no longer go out and buy five years’ worth of stuff today. If you can do it based on funding and contractual policies, pay for it exactly when you need it,” he said. “That’s where products need to be built so they can scale easily. You have to be able to do things in real time during production hours that don’t require maintenance windows. As long as the technology allows for that pay-as-you-grow approach, that is, by far, the best situation for the government to be in.”

At the same time, the focus on cybersecurity, particularly within the supply chain, will impact agencies and vendors alike.

Dan Fallon, the senior director of federal engineering at Nutanix, said agencies are doing a better job with the day-to-day security of networks and data, especially with things like patch management and the ability to react quickly to threats.

“Now it’s on to the next phase of things that are maybe a little more remote in terms of how they are attacked,” he said. “Supply chain is big and broad, and it’s probably a little less susceptible than a public facing website that has a vulnerability. But if someone compromises your supply chain, the impact can be huge. That’s the next shift.”

Fallon said the success and evolution of the Federal Risk and Authorization Management Program (FedRAMP) is both a possible model for how the government can address supply chain threats and a way of setting agencies’ expectations of vendors in this cybersecurity area.

“The important thing is just because we see positive trends, now is not the time to back off of investments because the threats are continuing to grow,” he said. “Agencies now have visibility of what’s vulnerable. That was step number one. A lot of agencies are now past that point where they can start reacting and be much more proactive to threats.”

Howard added that the IT-as-a-service or consumption model doesn’t just make sense from a business standpoint; it also gives agencies more tools and help with cybersecurity. He said the vendor plays a bigger role in securing the infrastructure-, platform- or software-as-a-service.

One way to pay for this consumption model approach is to shift spending from only development or modernization to using operations and maintenance funding for these managed services.

“Throughout the years, there’s been this debate on which is easier and which has more funding, the capex or the opex. Now it has shifted toward ‘we want to do this either as a managed service or we want to pay for exactly what we use.’ This is a big year to see that shift,” Howard said. “We have been talking about this for five years and it’s been a slow progression. But this year we are already seeing a lot of requirements where we are being asked to do consumption-based models or capacity-services types of contracts. What is the reason behind that? I don’t know if there is more money from an O&M perspective, or if it’s just that the government feels, in being fiscally responsible, that it gets better value out of the spending.”

Howard said this new model of consumption should be more economical and more valuable to the customer because there is nothing sitting idle.

Fallon said there are several pieces that have to fall into place for the consumption model to take hold, and the evolution of technology with the move to public cloud has helped open the door to change.

“The product has to match the budget and consumption model. If the product doesn’t fit, and can’t be scaled easily and granularly, then it’s hard to meet that,” he said. “I would look for products that have the flexibility to run across all the environments, and software licenses that can be transferred from on-premise to the cloud.”

How secure-by-design reduces the cyber burden on agencies

The Office of Management and Budget’s latest Federal Information Security Management Act report to Congress showed that agencies faced more than 31,000 cybersecurity incidents in 2018, a 12% decrease from 2017. At the same time, 2018 also marked the first year since the creation of the major incident designation that agencies reported no major incidents.

That was the good news.

The bad news is agencies continue to struggle with basic cyber hygiene. OMB released the latest data on the IT modernization cross-agency priority goal that showed 70% of all civilian agencies have implemented hardware asset management, which was up 5% over the previous report card.

OMB also says 83% of all civilian agencies have implemented software asset management, which was up 4% over the previous report card.

So while the numbers are improving, they show agencies continue to struggle with some basic cyber defenses.

Dan Fallon, the senior director of federal engineering for Nutanix, said the way to close many of those cybersecurity gaps is to adopt the concept of inherent security or secure by original design.

An easy way to understand this concept, Fallon said, is to go back to the 1990s when people used to debate whether Apple or Windows computers were better.

“There is more inherent security built into the OS for Apple. You don’t have to install a bunch of extra security agents on a Mac. The operating system, built on a Unix foundation, has more security built into the underlying system,” Fallon said on the IT Innovation Insider. “For example, if you are logged in as an administrator, that means if a bad guy gets access, they can take advantage of it. Apple prompts you when you do things like install an application. That is a simple example. Windows has gotten better, but only a couple of versions ago. The Apple devices are much better at prompting, so at least the users are aware.”

He said while that debate still happens today, this concept of secure by design or security in the architecture is growing among all vendors.

Among the reasons for this shift, albeit a slow shift, is the ever-growing threat to systems and data.

Fallon said both agencies and vendors see the value of finally making the old adage, that it’s “better to build in security than bolt it on after the fact,” a reality.

“Secure by design takes more of the burden off agencies. It saves them time and money by the vendor taking more responsibility in the product so your security policies and features are in the system when you take it out of the box,” he said. “That’s a huge cost and time savings. It means more rapid time to deployment to meet mission.”

To put this concept into action, Fallon said vendors must move away from implementing security guidelines from the National Institute of Standards and Technology or the Defense Information Systems Agency as a “custom” add-on after the vendor has installed the product.

Where agencies and vendors need to go is toward the model of a software-as-a-service provider in the cloud, where the product is secure out of the box.

“There is a lot of human error in someone going through a spreadsheet line-by-line trying to match security controls. But if the product has it built in, you know every product across your data center has that underlying baseline,” Fallon said. “We know all our commercial customers will not want to meet the 14-character DoD password guidance. We treat the system as wanting to have the same security baseline, and we give the commercial customers all that great security posture the federal government requires because it benefits them as well. There are some things that go a little above and beyond, like some of the DoD rules around password guidance and warning banners. We actually have a check box that turns on those features. You’re constantly balancing security versus usability.”
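Fallon’s “check box” describes a common secure-by-design pattern: ship one hardened baseline for everyone, and layer the stricter DoD-only rules as an opt-in profile. The sketch below illustrates that pattern; the setting names and values are invented, apart from the 14-character password rule he cites.

```python
# Hypothetical secure-by-default configuration with an opt-in DoD profile.
# Setting names and values are invented, except the 14-character password
# guidance cited above.

BASELINE = {
    "min_password_length": 8,    # hardened default shipped to all customers
    "admin_requires_mfa": True,
    "tls_min_version": "1.2",
    "login_banner": False,
}

DOD_PROFILE = {
    "min_password_length": 14,   # stricter DoD password guidance
    "login_banner": True,        # mandatory warning banner
}

def effective_config(dod_mode: bool) -> dict:
    """Everyone gets the hardened baseline; DoD mode layers stricter rules."""
    config = dict(BASELINE)
    if dod_mode:
        config.update(DOD_PROFILE)   # the 'check box' in code form
    return config

print(effective_config(dod_mode=False)["min_password_length"])  # 8
print(effective_config(dod_mode=True)["min_password_length"])   # 14
```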

The Federal Risk and Authorization Management Program (FedRAMP) has helped push the needed mind shift toward security by design.

“When you look at on-premise and the private cloud, that’s a big reason why you are seeing more products offer more security features built in. They have to keep up with the cloud model,” Fallon said. “But even in the cloud, it’s still a big shared responsibility. The cloud vendor doesn’t own 100% of the security stack. They are responsible for the baseline infrastructure.”

Fallon said agencies also have to put some tools in place when using the commercial cloud to improve the governance of tracking the security requirements from NIST or DISA.

“You now have this proliferation of cloud services, so there is a lot more to track and it’s outside your data center. It’s even more important for that safety net to catch those security misconfigurations,” he said. “Smart governance and continuous monitoring go hand-in-hand. It’s no longer the old days of a once-every-three-years security audit. That goes out the window in cloud. It’s a continuous lifecycle, daily, if not almost by the minute. It’s clear there needs to be more monitoring and more asset tracking, and the ability to respond. If an asset is out of patch compliance, or a port is misconfigured and open to the internet, how do I reconfigure it within minutes or seconds so it’s now closed?”
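That by-the-minute lifecycle is essentially a reconcile loop: poll asset state, diff it against policy, remediate in the same cycle. Everything in the sketch below, the asset list, the policy and the close_port stand-in, is a placeholder rather than a real scanner’s API.

```python
# Placeholder sketch of a continuous-monitoring cycle: poll asset state,
# diff it against policy, remediate immediately. No real scanner API here.

POLICY = {"allowed_open_ports": {443}}

assets = [
    {"name": "web-vm-01", "open_ports": {443}},
    {"name": "db-vm-07", "open_ports": {443, 3389}},  # RDP exposed: drift
]

def close_port(asset: dict, port: int) -> None:
    # Stand-in for the real remediation call (firewall rule, SDN API, etc.).
    asset["open_ports"].discard(port)
    print(f"{asset['name']}: closed port {port}")

def reconcile(inventory: list) -> None:
    """One monitoring cycle: detect drift from policy and fix it now."""
    for asset in inventory:
        for port in sorted(asset["open_ports"] - POLICY["allowed_open_ports"]):
            close_port(asset, port)

reconcile(assets)  # run every minute, not once every three years
```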

Paying for cloud services requires agencies to endorse the consumption model approach

Since 2010, when the Obama administration launched its cloud-first policy, the momentum of moving applications and systems to public, private or hybrid cloud services has been growing.

Eight years after the cloud-first policy, agency spending on cloud computing services surged to $4.1 billion, according to analysis from Bloomberg Government. BGov says cloud spending grew by 9% among civilian agencies and by almost 30% among defense agencies from fiscal years 2017 to 2018.

The growth curve is expected to continue on the upward climb as the Trump administration finalizes its cloud smart strategy as well as new contracts for common back-office cloud services and shared services.

At the same time, there are a host of challenges to consuming cloud services in a way that is accessible and immediate.

Chris Howard, the vice president of the U.S. public sector for Nutanix, said as agencies continue to move to the cloud, the way they use and pay for these services needs to be front and center.

“We are trying to bring the characteristics of the cloud to wherever the customer is going to deploy those IT assets, whether it’s in their own data center, a contractor-owned data center or a public cloud company. It’s the consumption model that is important,” Howard said on the IT Innovation Insider show. “One of the benefits of the cloud we’ve realized is when you want an application or workload spun up in the cloud, it’s very easy and it’s fast to market. The agility and the speed to bring those applications so the users can access them is one of those characteristics. Another characteristic is how you consume it and how you pay for it, where you pay for what you use and that’s it.”

Howard said there are certain applications, ones with spikes and troughs in their usage, for which it makes sense to take advantage of the cloud and the consumption model that comes with it. At the same time, it may make sense to modernize some applications but keep them on-premise.

Dan Fallon, the senior director of engineering for the U.S. public sector at Nutanix, said this is why application rationalization is so important, and a growing trend across government.

“Part of what agencies can do is get the small wins and move the apps that are easy, the external-facing things like web servers; email is a classic one. But when you get into the apps that are on servers, still physical or even still on mainframes, it could be a huge undertaking and a lot of budget dollars may be required. Those are the ones you leave for last,” Fallon said. “There is a growing trend of moving to containers when doing the cloud migration, but that does introduce extra complexity.”
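Fallon’s “small wins first” ordering is, in effect, a difficulty-scoring exercise. The sketch below makes that explicit with invented apps and weights; it illustrates the sequencing idea, not an official rationalization method.

```python
# Invented difficulty scoring to order application migrations easiest-first,
# echoing the 'small wins' sequencing described above.

apps = [
    {"name": "public website", "external_facing": True, "mainframe": False, "coupling": 1},
    {"name": "email", "external_facing": True, "mainframe": False, "coupling": 2},
    {"name": "logistics app", "external_facing": False, "mainframe": False, "coupling": 4},
    {"name": "core finance", "external_facing": False, "mainframe": True, "coupling": 5},
]

def migration_difficulty(app: dict) -> int:
    score = app["coupling"]                      # 1 (standalone) to 5 (deeply coupled)
    score += 3 if app["mainframe"] else 0        # mainframe rehosting is a big lift
    score -= 1 if app["external_facing"] else 0  # web-facing apps move easily
    return score

for app in sorted(apps, key=migration_difficulty):
    print(f"{migration_difficulty(app):>2}  {app['name']}")
# Lowest scores migrate first; the mainframe system is left for last.
```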

As agencies take on these more difficult or complex applications, Howard said approaches such as managed services or shared services are starting to gain popularity. The Trump administration recently issued a new strategy for how agencies should move to back-office shared services, naming four agencies to lead human resources, financial management, grants management and cybersecurity.

“The main trend is agencies want out of the hardware business,” he said. “There are a couple of different ways to do that, and it doesn’t mean just a lift-and-shift to the cloud. Managed services can be accomplished in a lot of different ways. It can be done on-premise at the customer’s site, in a co-location site or in the cloud. I think it’s about how you consume and what color of money makes the most sense for you. A lot of people are more flush with operations money versus capital investment money, and that’s a big driver.”

Operational money, known as OpEx, is used mainly to keep legacy systems up and running. Howard said agencies can use that type of funding to pay for managed or shared services more easily.

“No matter what kind of contract you enter into, whether a managed service or shared service, you still want to make sure you have the flexibility to make change. You don’t want to be locked into something for 5 or 10 years that doesn’t give you the same level of innovation or cost protection,” Howard said. “No matter what you go into, and the consumption model generally allows it, allows you to turn it off and pivot when you have to so that is the key takeaway on what would make a successful contract.”

Fallon added there also is a technical side that agencies should consider along with the contracts piece. He said departments also should make sure they can easily move their apps and data between clouds or between a cloud and on-premise.

As IoT devices, AI grow, are agencies ready to benefit from computing at the edge?

The internet of things, or connected devices, and artificial intelligence are quickly emerging in the federal sector. These technologies, if we can even call them emerging anymore, are impacting the federal market in a big way.

Over the last few years, the use of connected devices has grown from sensors on networks to sensors in the field that measure agriculture output. It’s all about bringing computing to the edge.

At the same time, there are security concerns that come with it. The National Institute of Standards and Technology will release updated guidance on adopting IoT and addressing security concerns in the coming months.

Agencies have to understand how to harness these opportunities, address the challenges that come with them and, maybe most importantly, take advantage of the power of the technology evolution to bring services, compute power and data to the tactical edge.

Jason Langone, a senior director for IoT and AI at Nutanix, said agencies are recognizing more and more that much of the data they use is generated in the field, where their employees are meeting their mission, and that the old approach of sending that information back to a centralized processing center isn’t working.

“The way developers have been developing applications has moved from legacy middleware apps to containerized applications that are much easier to move out to the edge. And everything is IP-connected now and has the ability to send data,” Langone said on the IT Innovation Insider. “We are collecting this data. What can we now do with it, and how can we make smart correlations to take intelligent actions?”
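The collect-correlate-act loop Langone describes typically means reducing raw sensor streams on the device and shipping only summaries or alerts over a constrained link. The sketch below is a toy version of that pipeline; the readings and threshold are invented.

```python
# Toy edge pipeline: aggregate raw sensor readings locally and ship only
# summaries and alerts upstream, where bandwidth is scarce. Data invented.
from statistics import mean

raw_readings = [71.8, 72.1, 71.9, 98.6, 72.0]  # one sensor, one time window

def summarize(window: list, alert_above: float = 90.0) -> dict:
    """Turn a raw window into the small payload actually worth sending."""
    return {
        "mean": round(mean(window), 2),
        "max": max(window),
        "alert": any(value > alert_above for value in window),
    }

payload = summarize(raw_readings)
print(payload)  # {'mean': 77.28, 'max': 98.6, 'alert': True}
# Five readings in, one small dict out: the correlation happens at the
# edge, and only actionable information crosses the constrained link.
```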

Greg O’Connell, a director for Nutanix, said while devices have generated data at the edge for years, the difference is the underlying infrastructure, such as cloud services, can move or process that data quickly, letting users make decisions in near-real time.

O’Connell said research finds that devices and applications at the edge will generate 40 times more data by 2020 than what’s currently being generated.

“With all of this data comes the requirement to manage and process the data,” he said. “There is a broad range of examples that span organizations and agencies within government that are absolutely flat-footed yet need to adopt edge-based capabilities. To give you an example, there is an Air Force program office that is responsible for flight suits and helmets for pilots. We gather terabytes of information on military jets by the minute, yet to date, we gather zero physiological information on the pilots themselves. This is a classic example of IoT and edge computing where if we could better collect information with the sensors and process it in real time…we could take advantage of that to protect the pilots.”

Langone said agencies and developers must keep in mind the challenges of deploying apps to the edge, given that in some cases there is low bandwidth or connectivity. He also said an additional challenge is that the number of devices employees use in the field could, in some cases, reach into the hundreds of thousands, which adds more complexity to the effort.

“There are a couple of things to think about. One is the sensor data: Where does that live at the edge, and how is it encrypted, as well as the machine learning logic that is delivering the value?” Langone said. “If that edge device were to grow legs and walk away, or to be stolen, how do we ensure that we’ve lost nothing?”
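The usual answer to a device that “grows legs” is encrypting data at rest with a key held somewhere else. A minimal sketch, assuming the third-party cryptography package (pip install cryptography) and a hypothetical key service; a production deployment would keep keys in a TPM, HSM or remote key management service.

```python
# Minimal sketch of encrypting sensor data at rest on an edge device,
# using the third-party 'cryptography' package (pip install cryptography).
# In production the key would live in a TPM/HSM or a remote key service,
# never on the same disk as the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # stand-in for a key fetched from a key service
cipher = Fernet(key)

reading = b'{"sensor": "o2-sat", "value": 97.4}'
stored_blob = cipher.encrypt(reading)  # this is what actually hits the disk

# If the device walks away, the blob is useless without the external key.
assert cipher.decrypt(stored_blob) == reading
print("encrypted at rest:", stored_blob[:16], b"...")
```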

This is why Langone and O’Connell recommend agencies deploy IoT devices and AI only after they know what problem they are trying to solve. The technologies and devices have to be part of a larger business solution.

“One of the working relationships I’ve seen succeed is when the chief data officer is fielding requirements from the business or mission side. They typically understand they have a problem with something. And the CDO is often responsible for developing that strategy and ultimately deploying the solution to solve that problem,” Langone said. “When that connection is not functioning in an agency, those things are in a void and it’s difficult to come up with something specific to solve.”

O’Connell said agencies need to address these challenges today because the growth of IoT, AI and machine learning will contribute trillions of dollars to the U.S. economy over the next 10 years and create tens of millions of new jobs.

Why a mindset change is needed to deal with the IT rebellion

Disruption is a word that is used commonly when it comes to technology, especially over the last decade.

The ever-growing challenge around cybersecurity has been and continues to be a disruptor.

The cloud, many said, was the ultimate disruptor. Until it wasn’t.

In the federal market, there are companies who are supposed to be disruptors, changing how agencies buy and use technology.

For Nutanix CEO Dheeraj Pandey, disruption isn’t a technology or a company, rather it’s a mindset.

“At the core of this is the velocity and agility requirements of our customers. People want to move fast because everything around them, their customers, their consumers and even their adversaries, like the hackers around the world, is moving very fast,” Pandey said on the IT Innovation Insider. “And when you have such a high-velocity environment, people want to look for ways to consume technology and infrastructure as fast as they can. That also means they don’t have time for specialists. They cannot go to too many teams of people, one doing storage, one doing networking, one doing compute, one doing virtualization and yet another one doing servers and applications.”

Instead, the need for velocity and agility is forcing agencies to move from what Pandey called a “highly fragmented infrastructure” to one that is hyper-converged, where services and people are centered around a multi-cloud environment.

“I think there is this rebellion happening right now in the IT industry, where we have too many people doing too many specialist, niche things. We need to step back and ask whether we can simplify technology and have more people use technology to meet their needs,” he said. “What happened to personal computing is the same thing that is happening to enterprise computing. If you go back in time 10 years ago, before the introduction of smartphones, we used to have 50 devices that we’d interact with, not the least of which were music players, cameras, video cameras, GPS devices, flashlights, and I can go on and on talking about devices. Then they all converged as pure applications running on a common operating system, whether Android or iOS, and that’s exactly what is happening in enterprise computing as well.”

The combination of velocity and agility as disruptors, the convergence of services, usually in the cloud, and the growing use of artificial intelligence, machine learning and automation is forcing agencies and industry alike to shift their thinking about how they serve customers or meet their mission.

Pandey said hyper-convergence around a multi-cloud approach helps push data and compute power to the edge, whether it’s through mobile devices or how services are consumed by an organization’s customers.

“At the end of the day, the network is the enemy because the amount of data we are producing is just enormous. Data has immense gravity,” he said. “You really want the applications to move to the data rather than the data to move to a large cloud data center itself. That is what is causing this demand for dispersing computing to where people are, to where machines are and to where the operations are.”

The end goal, in many ways, is to make infrastructure invisible to the user and consumer in such a way that it doesn’t matter whether the agency owns or rents the servers and cloud instances. Pandey said hyper-convergence makes that happen.

“Many organizations like the Navy, for example, have this view that they need to have a cloud at the edge in these battleships. They need to have extremely space-efficient, power-efficient and skillset-efficient infrastructure that can be used by application folks,” he said. “Then they have remote offices, branch offices, and then they have large core data centers. And finally, they are also now scratching the surface of what renting from a secure public cloud service looks like.”

Pandey said hyper-convergence and cloud give users more power, and thus the agility and velocity to meet customer needs.

“At the core of all of this is how do you democratize technology and democratize computing and bring it to anyone at a click of a button,” he said. “That is what hyper-convergence aims to do. Bring all this computing power at the click of a button to folks who really run applications because that’s where the business logic runs. These are the people who have deadlines, budgets and heads roll if applications are not available, not reliable or not fast enough.”

DoD’s drive toward better tactical capabilities begins with simplicity, capacity
https://federalnewsnetwork.com/it-innovation-insider/2018/11/dods-drive-toward-better-tactical-capabilities-begins-with-simplicity-capacity/ (Nov. 14, 2018)

Part 2 of the interview:


Recently, the Defense Department participated in the Enhanced Logistics Base, or ELB, demonstration in Norway. The goal of the exercise, called Trident Juncture 2018, was to test, refine and further develop existing and new capabilities while coordinating and integrating with NATO and other partners.

The exercise demonstrated the future capabilities of autonomous and automated systems within military logistics. The integrated Enhanced Logistics Base covered all aspects of future logistics in a military-civilian demonstration, including a fully integrated autonomous and automatic logistics stream.

It sounds like a fascinating effort, one that can show the potential of technologies like remotely operated machine guns, cubed storage and field-made 3D printers.

But none of these great technologies will work to their full capacity without data and connectivity.

It’s imperative for DoD to ensure warfighters can access data from anywhere, at any time.

One way that’s starting to happen is the increased use of cloud computing services, which many see as critical to maintaining the nation’s military advantage.

Today and tomorrow, cloud services can help transform the warfighter’s ability to meet their mission in a safe and secure manner.

Add to that emerging capabilities like artificial intelligence and machine learning, and the potential to make warfighters better and faster is huge.

Maj. Gen. David Bassett, the Army’s program executive officer for command, control and communications–tactical (PEO-C3T), said the Army is following a halt, fix and pivot strategy.

“We will halt efforts which we know will not get us to our end state. We will make changes, fix some programmatic efforts in some new capabilities that we know we can bring to the field quickly. We will pivot to a new process for experimenting and delivering technology as well as a new set of capabilities that will get us to the network that we know we need in the future,” Bassett said on the IT Innovation Insider show. “It will not happen overnight, but we’ve been on that path and have begun experimentation.”

The Army is doing this across four lines of effort:

  • Unified transport, which is about putting the communication infrastructure in place to get data from point A to point B, both in the tactical space and back to enterprise systems.
  • Mission command systems and moving to a common operating environment where the Army doesn’t have systems that are stovepipes, but can leverage software to give soldiers a common operating picture that works across battlefield applications and reduces the number of servers and amount of infrastructure needed in the field.
  • Interoperability across services and with allied partners.
  • Making command posts more deployable, more survivable and more capable.

“Across all four of those lines of effort, we have efforts underway, both programmatic and experimentation,” Bassett said. “We want to learn from immediate soldier feedback so we can move toward a model where we don’t necessarily start with a set of requirements that were written in a schoolhouse somewhere, but rather get equipment quickly into the hands of soldiers, be able to leverage what technology can deliver and make much quicker decisions about what we can field across the force.”

Part 1 of the interview:

Retired Lt. Gen. Stephen Boutelle, the former Army CIO and now a visiting fellow at MITRE, said the tactical edge means so many different things to each of the services that a generic approach will not work for all of them.

“It’s really important to define the environment,” he said. “As we look at it, we have to look at the lowest level of the tactical edge all the way up to the enterprise.”

Scot Susi, the director of DoD for Nutanix, said the military, and for that matter any organization that works in an austere environment, must get away from cobbling together systems that are difficult to maintain and complex to use.

“We need to give the folks in the field a simple interface, things that they are used to interfacing with like the iPhone or iPad, that make it as simple as possible and reduce the number of moving parts,” he said. “That way there are fewer things to break, and when they do break, they are easier to fix in the field without having to send a highly trained, highly paid service engineer to completely rebuild an entire application stack.”

Bassett added that the Army used to layer on functionality after functionality, which added to that complexity. Now the Army is changing its process to bring simplicity and usability to the forefront.

“Some of that can be helped along the way by systems that employ artificial intelligence to help abstract away some of that complexity and help commanders turn all that battlefield data into actionable intelligence,” he said. “Some of it is about managing those functions and making sure the things we deliver work together well. We are figuring out how we can leverage commercial capability but not utterly rely on it so we can operate in that congested and contested environment. It’s absolutely at the heart of where we are trying to go with this network modernization.”

Cybersecurity as a shared responsibility
https://federalnewsnetwork.com/it-innovation-insider/2018/10/cybersecurity-a-shared-responsibility/ (Oct. 17, 2018)

With October ushering in Cybersecurity Awareness Month, agencies and industry must remember that the challenges they face are among the most common for organizational and personal security.

The separation between work life and personal life is increasingly less distinct, and with more digital natives in the workforce than ever before, cybersecurity is emerging as a fully shared responsibility. This means there are important roles and obligations for everyone, not just the cyber team.

With so much at stake, organizations can’t afford to assume that someone else is handling cyber defense. Instead, they need to remember they are only as secure as their weakest link.

Dan Fallon, the director of engineering for Nutanix, said this idea of shared responsibility becomes even more important as agencies move more applications and data to the cloud, especially a hybrid cloud.

“We are looking at what they are doing around data security and data encryption, and what they are doing to automate across whatever clouds they are looking at,” Fallon said on the IT Innovation Insider, sponsored by Nutanix. “We are focusing on the basics there. What are their compliance standards? There are a lot of different standards that are ever-changing. We are showing them how we can check the box with a product that takes a security approach that is more built-in than bolted-on.”

David Reber, the director of cybersecurity for Nutanix Frame, said because agencies will be operating in a private data center and public cloud for the foreseeable future, they need to have automated security checks and a way to provide visibility to make rapid response decisions when there is a problem.

“How you get an enterprise cloud view across both ecosystems in a unified manner tends to be one of the biggest challenges for end users,” he said. “How do we automate security checks while the developers are rolling code or capabilities out to the workforce? This way you get real-time feedback.”

Reber said the dev/sec/ops model lets agencies balance security and compliance with agility and speed.

The sharing of responsibility means agencies and vendors alike have to start with a consistent baseline that includes a predictable infrastructure. Then the automated rules can kick in to alert chief information officers or chief information security officers if a device or application has fallen out of compliance.
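That baseline-plus-automated-rules idea can be sketched in a few lines: record the approved configuration once, compare what each device reports against it, and surface any drift. The settings and values below are hypothetical, invented purely for illustration.

    # Hypothetical baseline-drift check: compare a device's reported
    # configuration against an approved baseline and report deviations.
    BASELINE = {
        "disk_encryption": True,
        "os_patch_level": "2018-10",
        "firewall_enabled": True,
    }

    def drift(reported):
        """Return a finding for every setting that deviates from the baseline."""
        return [
            f"{key}: expected {expected!r}, found {reported.get(key)!r}"
            for key, expected in BASELINE.items()
            if reported.get(key) != expected
        ]

    device = {"disk_encryption": True, "os_patch_level": "2018-06",
              "firewall_enabled": False}
    for finding in drift(device):
        print("OUT OF COMPLIANCE -", finding)   # what would alert the CIO/CISO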

As work and personal lives continue to merge, Reber said the need to have a shared responsibility perspective becomes even greater. He said email remains the biggest attack vector for bad actors and most employees don’t do enough to protect themselves in their personal lives.

Reber said agencies need to understand where the line in the sand is drawn between cloud or on-premise vendor security and a department’s responsibility to protect their data and systems.

“The real truth is whether your vendor can outline specifically what they do for you: here is how they help and here is where you take control and ownership. You need to define that and make sure you educate your users in that personal responsibility area as well,” he said. “If you are using bring-your-own-device and stuff like that, you need to make sure there is good education for everybody, specifically at the top of your organization. They tend to be the busiest. They tend not to take the training or, if they do, it tends to be ad hoc. But they are targeted the most. Their names are a Google search away from being targeted. They and their families are at risk.”

Reber said executives have to realize that cybersecurity is a constant effort around training, people, process and technology, and it starts with them.

Fallon added that as agencies continue to shift their security model toward continuous monitoring, the automation of security standards will further remove human error from the discussion about how to deal with known and unknown vulnerabilities.

Hybrid cloud is changing the one-size fits all mindset
https://federalnewsnetwork.com/it-innovation-insider/2018/08/hybrid-cloud-is-changing-the-one-size-fits-all-mindset/ (Aug. 14, 2018)

If one thing is true, agencies are excited about hybrid cloud.

Research firm Gartner said in 2017 that 75 percent of all IT managers were using hybrid cloud to meet their needs. Adoption of hybrid cloud has increased by 13 percent year over year, while overall cloud adoption has increased by 2 percent across the government.

This means there is a huge appetite for a multi-cloud approach, and it’s only getting bigger.

A new survey from Nutanix of federal agencies found 20 percent of all respondents are using a multi-cloud approach, and of them, 75 percent say it’s working well or very well.

Chris Howard, the vice president of public sector for Nutanix, said the results once again reinforce the fact that one size doesn’t fit all when it comes to cloud services.

“There have to be different approaches. Everyone has different use cases. There is an openness to evaluate the best place to run applications and data sets,” Howard said on the IT Innovation Insider program. “As technology has evolved, customers have become more open. I’ve definitely seen a shift. But that doesn’t mean there isn’t still a legacy mindset in some agencies, or a mindset that some one-size-fits-all approach exists, which isn’t going to be the case. Hopefully the government, as time progresses, continues to expand its use of public cloud, on-premise and hybrid cloud technologies.”

Howard said agencies also have evolved how they write solicitations, asking less for specific vendors and just highlighting the need for the flexibility and agility of the cloud.

It was no surprise that security remains the top concern about cloud computing, but 44 percent recognized that using multiple clouds makes them more secure.

Howard said agency security requirements around compliance, data sovereignty and sensitivity levels also are driving the decision to move a multi-cloud approach.

“The workload itself was driving whether they ran it at one cloud or another or kept it on-premise,” he said. “Security drove a lot of the on-premise specific workloads, but now people are looking at security and evaluating a multi-cloud approach. The security requirements the application has are actually dictating which cloud they go with now. That’s a big change from 2016 when it was ‘go to the cloud.’ You could check a box by putting an application in the cloud, but there was less thought around security and the multi-cloud approach.”

Another big driver of the multi-cloud approach is an agency’s desire to better understand and control its costs.

Howard said cost wasn’t initially a main factor driving moves to the cloud, but now, five years later, agencies want to optimize costs when putting applications in the cloud.

“You should be able to move whether that’s on a weekly or monthly basis based on where the cost is going to be most beneficial,” he said. “That’s where the multi-cloud approach is really winning out because it enhanced competition, innovation and gives us a better measurement of cost because now we have true competition among these cloud vendors.”
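The comparison Howard implies is simple in principle: price the same workload in each environment and pick the cheapest, re-evaluating as rates and usage change. The sketch below uses invented hourly rates; real cloud pricing adds egress, storage, licensing and discount dimensions that this ignores.

    # Toy placement comparison with invented per-vCPU-hour rates.
    RATES = {
        "on_premise": 0.030,
        "cloud_a": 0.045,
        "cloud_b": 0.041,
    }

    def run_cost(env, vcpus, hours):
        """Compute the cost of a workload in one environment at these toy rates."""
        return RATES[env] * vcpus * hours

    def cheapest(vcpus, hours):
        """Pick the environment where the workload costs least this period."""
        return min(RATES, key=lambda env: run_cost(env, vcpus, hours))

    # A steady 16-vCPU workload running a full month (~730 hours):
    print(cheapest(16, 730))   # -> 'on_premise' at these made-up rates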

Howard said in the 2017 survey, 50 percent of respondents said cost savings would be greater if it were easier to control and manage costs with the public cloud, while 45 percent of respondents expressed concern that it was easy for costs to escalate quickly when using a public cloud.

By 2018, Howard said, 49 percent of respondents said the cloud cost what they thought it would, while 25 percent said the cloud cost more than they expected it would.

At the same time, 72 percent of respondents are working toward better management and optimization of their costs in the cloud.

“We feel 25 percent of your applications are best suited for the cloud because they are unpredictable workloads; they scale up real quick and have to scale down,” he said. “But 75 percent of your applications are better suited for on-premise, assuming you have an efficient infrastructure. As for this panacea that the cloud was going to be cheaper, I think most people realize the cloud is not a cost saver. There are so many more reasons to use the cloud. It’s really about how you optimize it.”

How to free agencies from vendor lock-in with the cloud
https://federalnewsnetwork.com/it-innovation-insider/2018/07/how-to-free-agencies-from-vendor-lock-in-with-the-cloud/ (July 12, 2018)

On the IT Innovation Insider show, Nutanix details a new concept called “Freedom” that promotes agility and flexibility in the cloud.

The move to the cloud was supposed to end many of the age-old concerns with on-premise data centers. Changing hardware, updating software and, maybe most importantly, ensuring agencies didn’t get caught in the dreaded “vendor lock-in” were part of the great promise of the cloud.

While the first two concerns seem to be taken care of, the threat of agencies becoming beholden to one cloud vendor remains a real challenge.

“We see this concept of vendor lock-in in requests for proposals with restrictive language,” said Chris Howard, vice president of public sector for Nutanix, on the IT Innovation Insider program. “The most noticeable was the Department of Defense’s initial path with its JEDI acquisition where it wanted to go to a single cloud. That alone would be pretty significant lock-in.”

Howard said agencies don’t just need to move off legacy technology, but move off of a legacy mindset about how they buy and manage technology.

To help facilitate that change in mindset, Nutanix is promoting a concept called “Freedom.”

“At the heart of this campaign is about how do we give our customers the freedom to build and modernize the data centers they always wanted to build or the freedom to run the workloads that they want to run where they want to run them, whether it’s on their own private cloud or the public cloud,” said Ben Gibson, the chief marketing officer for Nutanix. “It’s about the freedom to make those decisions, freed up with simplicity and with the knowledge in terms of what’s the most cost effective cloud platform to run different applications on.”

Gibson said the “Freedom” initiative is really about the commoditization or electrification of cloud services—no matter what service or application an organization has, it can be plugged in and played on any cloud.

And this concept becomes even more important as agencies continue to implement a hybrid cloud approach.

“We like to talk about ‘one-click.’ With one click, you can manage applications and application mobility across different cloud environments,” Gibson said. “Also, it’s about making informed choices. Every workload has cost implications depending on which cloud it runs on. The more we can provide visibility into that kind of information, the smarter our customers become, the more informed decisions they make, and ultimately they can impact both the top and bottom line for their organization’s operations.”

There are five key ideas within the “Freedom” concept:

  • Freedom to build—Modernizing your data center environment to simplify your architecture and reduce costs and other resource demands.
  • Freedom to run the applications where you choose—This means having application mobility across different environments.
  • Freedom to cloud—Almost a brokering environment where you can make decisions based on cost and performance requirements.
  • Freedom to invent—Gives IT professionals more time to think of new applications or innovate rather than maintain legacy systems.
  • Freedom to play—“Part of the promise we’d like to see, from our experience with our customers, is that they have some better work-life balance. They are not being called in on weekends over some kind of availability issue; instead, because they have radically simplified their private cloud and moved into hybrid cloud environments, they have time to have fun,” Gibson said.

Gibson said the reason cloud lock-in remains a challenge for agencies is that every public cloud has its own set of application programming interfaces (APIs), its own set of security implementations and other features that may be hard to break free from.

“We think this is an opportunity for an IT organization to reclaim some strategic control over what in many cases has become a bit of an uncontrollable environment, with a lot of different organizations firing up a new workload in a new public cloud platform at any given time,” he said.
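One common way to reclaim that control is a thin abstraction layer: application code targets a neutral interface, and each provider’s APIs hide behind an adapter that can be swapped without a rewrite. The sketch below uses in-memory stubs in place of real provider SDKs, purely to show the shape of the approach.

    # Sketch of insulating application code from provider-specific storage
    # APIs. Both adapters are in-memory stubs, not real SDK calls.
    from abc import ABC, abstractmethod

    class ObjectStore(ABC):
        @abstractmethod
        def put(self, key: str, data: bytes) -> None: ...
        @abstractmethod
        def get(self, key: str) -> bytes: ...

    class CloudAStore(ObjectStore):
        def __init__(self):
            self._blobs = {}   # stand-in for provider A's SDK
        def put(self, key, data):
            self._blobs[key] = data
        def get(self, key):
            return self._blobs[key]

    class CloudBStore(ObjectStore):
        def __init__(self):
            self._blobs = {}   # stand-in for provider B's SDK
        def put(self, key, data):
            self._blobs[key] = data
        def get(self, key):
            return self._blobs[key]

    def archive(report: bytes, store: ObjectStore):
        """Application code sees only the neutral interface."""
        store.put("monthly-report", report)

    # Swapping providers is a constructor change, not an application rewrite.
    for store in (CloudAStore(), CloudBStore()):
        archive(b"...", store)
        assert store.get("monthly-report") == b"..."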

Howard said the “Freedom” concept also will help agencies as they continue to push toward technology modernization.

“We want you to be able to run your application in any cloud you want with the freedom to move it anytime you want based on security, based on cost, based on governance or for whatever reason,” he said. “You need to have the freedom to move and be flexible.”

Hybrid cloud means understanding predictable, unpredictable apps
https://federalnewsnetwork.com/it-innovation-insider/2018/05/hybrid-cloud-means-understanding-predictable-unpredictable-apps/ (May 2, 2018)

Chris Howard, the vice president of public sector for Nutanix, said agencies need help in managing dual cloud environments.

The IT Innovation Insider is a monthly show focused on the emerging technologies and trends that are moving federal agencies toward a more secure and modernized future. In June, we will focus on best practices and stories about customers successfully living in a hybrid cloud world from the .Next conference in New Orleans, Louisiana.

Over the last seven years, two kinds of agencies emerged when it came to cloud computing—the early adopters and the uphill accelerators.

The early adopters were the ones that hoped for savings from moving to the cloud and moved things like email or collaboration to these off-premise service providers.

The uphill accelerators are those who started slowly, unsure of the security of cloud and if the buzz around savings was more than marketing speak, but over time have hastened their move to the cloud.

Chris Howard, the vice president of public sector for Nutanix, said new data shows those federal IT customers in the early adopter group have started to pull back from putting everything in the cloud.

The data also showed the uphill accelerators now, seven years after the cloud-first mandate from the Office of Management and Budget, are more forward-leaning when it comes to putting data and applications in the cloud.

“They are taking a different approach to the early adopters that we’ve seen before,” Howard said on the IT Innovation Insider, sponsored by Nutanix.

As for the early adopters pulling back, Howard said it’s now about understanding what systems or applications are cloud-ready.

“Based on the study, I don’t think you are seeing a wholesale pull back, but when people first got into the cloud, they tried to put everything into the cloud,” he said. “The way we look at the cloud is 25 percent of your applications are unpredictable or elastic enough to where it makes 100 percent sense to put them into the cloud. It could be a 30-day big data job or some end of month financial run. It doesn’t make any sense to buy infrastructure and run it. It makes more sense to put it into the cloud, use that cloud for that 30 days or 15 days that you need it, and then pull back that data. Where we see the benefit to the on-premise and where people may be pulling back is when they put, what we call, predictable workloads in the cloud, workloads where you know what you are getting and you are running full-time for three or five years.”
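Howard’s predictable-versus-unpredictable split can be made concrete with a simple statistic: a workload whose utilization barely varies hour to hour is a candidate to keep on-premise, while a spiky one fits the cloud’s pay-per-use model. The classifier below is hypothetical; the 0.5 threshold is invented for illustration.

    # Hypothetical classifier: flag a workload as a cloud candidate when its
    # utilization is highly variable. The 0.5 threshold is invented.
    from statistics import mean, stdev

    def is_cloud_candidate(hourly_utilization):
        """High relative variance suggests an elastic, pay-per-use fit."""
        cv = stdev(hourly_utilization) / mean(hourly_utilization)
        return cv > 0.5

    steady = [62, 60, 64, 61, 63, 62]   # e.g. a full-time production system
    spiky = [5, 4, 6, 95, 90, 5]        # e.g. a month-end financial run

    print(is_cloud_candidate(steady))   # False -> keep on-premise
    print(is_cloud_candidate(spiky))    # True  -> burst to the cloud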

Howard said if those predictable workloads are done in the agency’s data center or in the agency’s private cloud, and done efficiently, then those activities will be less or at least equal in cost, and maybe more secure and have better governance.

All of this movement around cloud is leading agencies toward having to manage a dual environment with on-premise and off-premise cloud instances.

The General Services Administration says in its 2017 Hybrid Cloud Almanac that a recent Gartner survey of federal IT managers found 75 percent indicated plans to implement a hybrid cloud solution by the end of 2017. The biggest challenges to managing a hybrid cloud include a lack of resources and the expertise in the workforce. GSA says agencies should consider several factors as they implement hybrid cloud, including integration of different clouds using application programming interfaces (APIs), cloud management and orchestration frameworks, and the organizational impact of hybrid cloud, which is another way to say agencies need to have the right people resources because hybrid cloud is not a typical IT project.

“Any customer out there wants choice and they don’t want to be reliant on one specific technology and it’s the same with cloud. They want to be able to put data into all the big cloud providers out there,” Howard said. “The challenge with that is now you have siloes of cloud. So that’s where multi-cloud management is really coming into play. It’s a hard problem to solve.”

He said the federal data center consolidation and optimization effort, as well as the initiatives around IT modernization, are all leading agencies to understand that they need to manage their assorted clouds in a new way.

Howard said two other related trends Nutanix is seeing from its agency customers are the adoption of a software-driven IT environment, which is leading to the use of automation to move away from “low-value” work, and the impact of Internet of Things devices, which give agencies new insights into both data and applications.

“The ability to significantly reduce the footprint of your data centers through software is one easy way to see modernization and consolidation,” he said. “Software can automate a lot of the environment. The automation is good because it requires less human interaction, and therefore your people are modernizing themselves because they are no longer worried so much about the infrastructure and the hardware piece, and they are focusing more on the application, the service-level agreements to the customers and uptime for the agency.”

Howard said these and other trends are leading the way for agencies to get off legacy systems and provide better services.

“A lot of agencies are at the point where they know they have to make a change and just need to know they can manage all of this change through a single management plane,” he said. “Everyone wants to get rid of siloes. The agencies who are willing to take a leap, even just for a workload or a certain use case, will see the benefits and will expand more quickly into the software world.”
