GSA Explored Emerging Technologies and Acquisition Innovation in 2019
Further streamlining of federal procurement processes is likely to come in 2020.
As the chief acquisition agency for the federal government, the General Services Administration demonstrated innovative thinking in its approach to procurement in 2019. It recognized that it can learn from industry best practices and continued shortening the acquisition timeline to ensure that the tools the government needs to serve its citizens arrive as soon as possible. Moreover, it led by example in thoughtfully implementing emerging technologies to reduce costs and streamline business processes across the agency.
One technology GSA has implemented successfully is a hybrid cloud infrastructure, enabling the agency to save on IT expenses and respond quickly to changes in its needs. In September, Deputy CIO Beth Killoran shared that GSA saves $8.5 million annually by closing its data centers entirely and relying on its cloud environment.
The hybrid cloud environment, she added, gives GSA the freedom to increase and decrease its data storage and processing capacity in tune with the demands of the acquisition schedule. Furthermore, if something breaks, as happened in September, the CIO's office can restore functionality for employees and their customers in a matter of hours rather than days or weeks.
GSA has established itself on the front lines of artificial intelligence (AI) and robotic process automation (RPA) development and implementation for the government. Until his November departure for the private sector, Senior Advisor to the CFO Ed Burrows was co-chair of the governmentwide RPA Community of Practice, and the CFO's office continues to be a leader in the effort.
“The objective was to bring together agencies who wanted to learn about RPA, but also that were successful in RPA,” CFO Gerard Badorrek explained at the AI & RPA CXO Tech Forum in December. “The two things we wanted to do were — one, we wanted to solve problems that were common across government and come up with solutions. And the second is educating agencies [about RPA].”
There are 33 different applications for RPA at GSA today, Badorrek added, with another 40 use cases under review for future implementation. His office is also working to respond to other agencies that see GSA as a model to follow for automation.
“What we’re working on right now is a playbook to come up with solutions to these challenges depending on what stage of [RPA] maturity you’re at,” Badorrek said.
In May, GSA further outlined its plan to develop an e-commerce portal for government acquisition. Phase two took place during 2019, focusing on supply chain and customer data processes with the agency's government and industry partners. In 2020, GSA expects to expand the pilot program and issue a multi-award contract for the marketplace.
Led by its Centers of Excellence, GSA is shortening its procurement timelines, increasing communication with stakeholders and finding innovative solutions to current acquisition hurdles. The Acquisition Center of Excellence is a small office, but it is using transparent, open-source options to identify best practices.
“We put all of our procurement documents on GitHub, and one of the things you can do is look at our … evaluation factors,” said Acquisition Lead Omid Ghaffari-Tabrizi in May. “It was feedback we got from small business that are specialists in key areas [on the RFI] … it said to come up with that challenge question and scenario question that relates to soft skills focused on getting stakeholder buy-in.”
That industry engagement is driving GSA efforts to consolidate the number of federal procurement schedules and break down barriers in acquisition.
FedRAMP certification is another process receiving special attention, both at GSA and on Capitol Hill. As Congressman Gerry Connolly explained in August, FedRAMP certification should take “six months and a quarter of a million dollars,” but recently has been a much longer and more expensive process. The delay not only keeps cutting-edge tools from reaching the federal marketplace, but also prevents many small and medium-sized companies from entering the public sector entirely, as they cannot afford the costs.
In a CyberCast episode in October, Connolly reiterated his plans to introduce legislation to codify FedRAMP into law, establishing a baseline set of risk management requirements for all agencies to follow and providing support for the program.
“We’re trying to help expedite the process and return it to its original goal,” Connolly said. “One thing we do is the presumption of adequacy, so that if you’ve been certified over here, we presume you are qualified to be certified over there.”
Connolly added that the bill would include approximately $25 million “toward increased capacity for FedRAMP.”