“Army Futures Command has picked Austin Community College District as the home of its software factory to collaborate with student software developers at a time when the service is hungry to bring software developers to its ranks as it modernizes in a digital age.
The factory will offer training in new technologies such as data science and artificial intelligence, and the curriculum will be developed by the college and AFC leadership, with help from global software development companies.”
“The factory “will be the first of its kind and will provide a training pipeline for soldiers and ACC students,” a Sept. 17 Army press release states. “The factory is designed to help students rapidly scope and solve real-life problems through advanced software development processes.”
Through collaboration between Army soldiers, the students at the college and the greater community, “it’s going to force us to think differently about how we think about the future,” Army Futures Command Commander Gen. Mike Murray said in the statement. “There’s nothing but goodness here in terms of bringing fresh ideas to solve problems.”
The entity is the first soldier-led software factory in the Army and “the vision is to develop a pathway to two- and four-year degrees and connect soldiers and students with industry partners,” the statement reads.
After a nationwide search, the Austin-based college, located in the same city as Futures Command headquarters, was chosen based on its reputation for “being a feeder for talent, innovation of its advanced ACCelerator learning lab and launch of its recent bachelor’s degree in software development,” according to the statement.
Software development has become central to the way the Army wants to modernize going forward. It is playing a key role in the development of new weapons systems. Software factories, most notably the Air Force’s Kessel Run program, have become popular in recent years across the department as a way to quickly boost capabilities.
The Army is in the middle of a major campaign of learning called Project Convergence at Yuma Proving Ground, Arizona, which is seeking to connect assets across the battlefield to fight together to shorten the decision cycle and the kill chain against near-peer adversaries.
Murray told Defense News in a recent interview there are three key technologies today that when paired together in novel ways can provide a strong advantage against possible conflict. Those technologies, which are entirely reliant on software development, are artificial intelligence, autonomy and robotics in the air and on the ground.
“To make those three work in a digital environment, you have to have an underlying robust and resilient network,” Murray said, “and you have to have a data architecture and the data and the talent to put all that together.”
A recent ground robotic combat vehicle exercise at Fort Carson, Colorado, is another snapshot highlighting the need to focus on software development.
As the Army ventures into developing robotic vehicles that don’t just do the dull, dirty and dangerous work, “the biggest thing is going to be software development, improving autonomous and automation software,” Lt. Col. Chris Orlowski, the service’s robotic combat vehicles product manager, said at a recent AUVSI unmanned systems defense conference.
“Teleoperation is nice, it works okay if you’ve got the right radios and the right environments, but long term, when those environments become contested, I think teleoperation will be less viable and we will have to really push the automation and autonomy on these platforms,” he said.
The software factory will open in January with 30 soldiers and civilians. More than 15,000 service members expressed an interest in participating within the first week after an internal announcement.”
“With many agency leaders facing a “use-it-or-lose-it” deadline for utilizing their budgets before the end of the fiscal year, the use of OTAs may be a critical and needed weapon for agencies to acquire the resources needed to execute on their fiscal 2020 initiatives.”
“The federal government’s year-end fourth quarter spend has become an annual ritual in Washington, D.C., every September, similar to the anticipation of kids returning to school and the start of fall sports.
It’s more than just the numbers that have revealed this new reality for fourth quarter spending. The entire contracting market has felt it. Government contractors anticipate the release of RFPs over a typical “federal summer” with contract awards to follow in the fall.
In 2018, the year-end spend was referred to as a spending spree of “historic proportions,” with nearly 40 percent of contracts awarded in the last quarter of the government’s fiscal year. Even in a normal year, 2020 was expected to continue the trend, with final-quarter spending predicted to increase. Of course, 2020 has been anything but normal. And it appears this year’s final-quarter spend will surpass the previous ones, but with good reason.
The Impact of COVID-19
Suffice it to say, 2020 has had a few more challenges than previous years for federal executives. The COVID-19 pandemic has upended all facets of life, business and government. With so much uncertainty throughout the year, many agencies were not able to move forward with some initiatives as quickly as they hoped, and thus did not spend as much as anticipated in the spring and summer months.
Combined with the now-annual Q4 spending frenzy, that means this year’s fourth quarter will be unlike any before when it comes to government spending. In IT spending alone, Bloomberg Government estimates that nearly $28 billion will be spent by agencies in the fourth quarter of fiscal 2020.
An example of the spending push comes from the Department of Veterans Affairs (VA). The VA has spent only $1.7 billion on IT contracts so far this fiscal year, compared to its $7.8 billion IT budget request, which leaves a massive $6.1 billion that must be spent in the final quarter. Furthermore, the CARES Act appropriated the VA an additional $2.2 billion to modernize its IT and electronic health records systems.
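The VA figures above come down to simple subtraction, which can be sanity-checked directly. A minimal sketch, assuming only the dollar amounts cited in this article (the variable names are illustrative):

```python
# Figures cited in the article, in billions of dollars.
it_budget_request = 7.8   # VA's fiscal 2020 IT budget request
spent_to_date = 1.7       # IT contract spending through the first three quarters
cares_act_funds = 2.2     # additional CARES Act appropriation for IT/EHR modernization

# The "massive" remainder that must be spent in the final quarter.
remaining_q4 = it_budget_request - spent_to_date
print(f"Left to spend in Q4: ${remaining_q4:.1f}B")                      # $6.1B
print(f"With CARES Act funds: ${remaining_q4 + cares_act_funds:.1f}B")   # $8.3B
```

This confirms the $6.1 billion gap the article cites, and shows the CARES Act money pushes the VA's potential year-end IT spend past $8 billion.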
OTAs May Be the Answer
While GSA’s push toward best-in-class (BIC) governmentwide acquisition contracts has created preferred paths to market, and those BICs often lend themselves to short turnaround times for making awards, OTAs appear to be on the rise as another alternative.
The use of other transaction authority (OTA) agreements has soared in recent years, as they allow agencies to enter into contracts more quickly than some traditional procurement methods. OTAs have been most popular within the Department of Defense, which aims to field capabilities faster. The department’s spending through OTAs has increased from about $1 billion in fiscal 2015 to $7.8 billion in fiscal 2019, according to Bloomberg Government data.
However, the use of OTAs could quickly expand to other agencies in response to COVID-19 and the need to utilize funding in a timely manner to keep pace with the increased demands of helping the country get past this pandemic.
According to federal market analyst Chris Cornillie, this may also become the new normal for procurement moving forward.
“We expect that HHS will continue to use OTAs as it meets the needs of the COVID response, and we may see HHS normalize OTAs as a larger part of their R&D portfolio in future years,” he said.”
“The cyber workforce is one of the most challenging cybersecurity issues, and one of the key topics the Cyberspace Solarium Commission addressed in its final report on developing a new strategy to defend the United States in cyberspace. In its March 2020 report, the commission makes clear that its recommendations rest on the people that will inevitably work to implement them. While demand for cyber talent will grow in the coming years from the current level of more than half a million unfilled jobs, the U.S. government and private industry must, as a matter of national and economic security, eliminate the systemic barriers that prevent employers from tapping into women as a source of potential cybersecurity talent.
The math is simple and points to a clear solution: We must recruit more women into cybersecurity. America needs women in the workforce, especially during times of heightened national security challenges. During World War II, Rosie the Riveter was crucial to the war effort, maintaining continuity of manufacturing as men deployed to war. The government undertook immense efforts to get women into the workforce.
The current dearth of cyber talent at a time when the number of cyberattacks continues to increase on a day-to-day basis should encourage the government to embark on a similarly ambitious agenda of enticing women into the cybersecurity workforce to fill these vacancies and strengthen our defenses when and where we need it most.
We are currently letting a trillion-dollar economic engine sit idle. “If women participated in the U.S. workforce at the same rate as men, it would add $4.3 trillion to the American economy by 2025.” Cybersecurity is already a discipline limited by national security imperatives, so hiring more women expands the pipeline and increases the percentages of U.S. citizens contributing to this ecosystem. This huge, untapped population mixed with a dearth of cyber talent leads to a logical solution for boosting the economy and protecting America in cyberspace.
According to Priscilla Moriuchi, director of strategic threat development at Recorded Future, “the wider variety of people and experience we have defending our networks, the better our chances of success,” and systemic barriers are preventing women from participating fully in the cybersecurity field.
To resolve this imbalance, the United States must reinvent Rosie the Riveter for a new age: a “Claire the Coder” program, inspiring women to serve in the nation’s cyber ranks. Such a program will create the same “can do” spirit Rosie the Riveter inspired in the generation of women in World War II.
If this program is to be a success, the United States must create affordable solutions to provide child care to enable more women to join the workforce — the No. 1 concern of families in double-income households. A major success factor during the push in WWII was the federal program for child care, equally important now as we strive to strengthen participation in the workforce and invigorate our economy. As the pandemic makes glaringly obvious, there can be no functional economy and full workforce participation without adequate child care.
With proper, targeted investments, the U.S. government can incentivize women to join the national cybersecurity workforce and infuse the industry with their diverse viewpoints and skills. We can capitalize on the talent emerging from U.S. educational systems, where women make up 60 percent of graduates. The U.S. government and private industry can invest in programs that promote cybersecurity and computer skills in women and girls. Public and private sectors can reduce stereotypes of women in the workforce (and inspire women to serve) by exposing them to highly successful female cybersecurity leadership — such as the women who served on our commission, Ms. Suzanne Spaulding and Dr. Samantha Ravich.
Leveraging women’s perspectives and bringing balance to the cybersecurity workforce is not only the right thing to do — it’s the smart thing to do. Let the commission’s report encourage the United States to embark on what is necessary — a Rosie the Riveter for cyber programs — and make a significant investment in cybersecurity by bringing women into the workforce. The time to act is now.”
“Now in its third year, the once-dreaded department-wide financial audit is firmly part of the Pentagon’s culture, Deputy Secretary of Defense David Norquist said Sept. 10.
“Perhaps no change highlights the difference in attitude more than the audit,” he said at the Defense News Conference.
For more than 20 years under multiple administrations, the Pentagon declined to carry out a full-scope financial statement audit, he noted. The attitude was, “Why should we conduct an audit until we are certain we can pass?” he said.
“It doesn’t work that way,” he added. “No one wants to get audited, but if you want to improve, the most important step you can take is to start the audit.”
Auditors, for example, have identified unused or mislabeled inventory that could be put back in the system to save money. In two cases, the department saved $81 million and $53 million at two different bases by labeling items correctly. Additionally, Defense Logistics Agency supply discrepancy reports have resulted in $287 million in back orders being filled, he said.
“I recognize that this is not as dramatic as directed energy or hypersonics, but sometimes changing the management culture is one of the hardest and equally important changes the department has to make,” Norquist said.
Savings were a result of auditors filling out reports called “Notice of Findings and Recommendations” that identify areas of corrective actions and are used to track progress. Last year, the department closed out about 615 of the recommendations, which was about 25 percent of the total filed, Norquist said. He expects about 20 percent to be closed by the end of this fiscal year despite the challenges posed by the COVID-19 crisis.
More accurate data is also a cost saver, and manually inputting data is inefficient, the department has learned, he noted.
“When data passes automatically between systems — and the quality control is up front — the accuracy goes up and the costs go down,” Norquist said. For example, the Navy saved $65 million by creating an automated feed to the Defense Finance and Accounting Service, he said.
The audit has also uncovered “dormant accounts” with little activity, identifying potential savings that could be reinvested in other programs. The result has been $2.6 billion in readjusted funds, he added.
The audit will ultimately bring more accurate data to the department, resulting in the ability to carry out the same kind of data analytics now common in the private sector, Norquist said.”
“The Defense Information Systems Agency awarded its first other transaction authority production contract for $199 million to By Light Professional IT Services to support its cloud-based internet isolation (CBII) pilot program, the agency announced August 19.
DISA previously indicated it wanted to scale the program from the initial 100,000 users to 3.5 million, as the Defense Department has embraced telework at unprecedented levels this year to cope with the COVID-19 pandemic.
DISA is moving its own users first, before other DOD organizations, Sherri Sokol, CBII’s program manager, told reporters Sept. 3.
Steve Wallace, systems innovation scientist for DISA’s Emerging Technology Directorate, said the agency will look at bandwidth savings and the number of cybersecurity vulnerabilities avoided to measure the program’s success.
Wallace said the tool allows DISA to actively see how much bandwidth a user is taking up and allows for a “much deeper view into the user’s interactions with websites and the internet,” including domains or sites that were safe when initially clicked but later developed vulnerabilities from third parties working in the background of a site.
“If that site goes from good to bad, we will have already isolated that user’s interactions with the site,” Wallace said.
CBII also helps manage content downloads, with an eye to reducing network congestion as well as providing safe surfing. “When a user clicks on a PDF or an Office document or something like that, the CBII renders that document remotely and then if the user chooses to, then they can download it to their machine,” Wallace said, adding that such downloads to the endpoint device dropped 70% with the tool.
In addition to new tracking capabilities, DISA is pleased with its first foray into OTAs with a program that required swift changes during a global emergency.
Sokol said being able to prototype with different organizations in DOD helped shape requirements and would be something, combined with the OTA use, DISA wants to keep using in the future.
“Some of our requirements changed because of the stress that was being put on the network and we were able to work through that with the vendors to meet our needs based on what was going on with the pandemic,” said Vanessa McCollum, DISA’s chief contracting officer for emerging technologies in the Defense Information Technology Center.”
That’s really not a surprise. The surprise is how much more they cost than we’ve been told.
It might help to think of the nation’s post-9/11 wars in Afghanistan and Iraq like a pair of icebergs. The Pentagon has a web page that tells us how much we’ve each paid for the wars. But that only tells us how much of those icebergs we can see above the waves. While it includes totals for war fighting, it doesn’t track the Pentagon’s bigger war budget, interest paid on money we’ve borrowed to fight the wars, veterans’ care, and other ancillary costs. There’s a whole lot more hidden beneath the waves. The real issue isn’t whether the cost of war is high; the issue is why the U.S. government keeps underestimating it, and why U.S. citizens and taxpayers keep tolerating it.
The cost versus benefit of the nation’s post-9/11 wars was highlighted December 9 when the Washington Post began publishing a blockbuster series detailing how poorly the war in Afghanistan is going. The series is based on more than 400 internal government interviews that the Post largely pried from the congressionally created and independent Special Inspector General for Afghanistan Reconstruction under the Freedom of Information Act. The stories show how U.S. government officials have misled the American public over the past 18 years by publicly declaring how well the war was going while privately acknowledging the opposite.
It echoes much of the analysis on Afghanistan we’ve done regularly here at the Military Industrial Circus (May 2017’s “What kind of military willingly walks onto a perpetual treadmill when the chance of prevailing is next to nil?”) about the rampant truth-fudging (August 2017’s “One can only take the constant spinning for so long before becoming dizzy and cynical over can-do officers who can’t-do.”), the hiding of key indicators about the war’s progress from the American people who are paying and dying for it (November 2017’s “When things are going well, there’s no shutting up the Pentagon.”), and the blindness of our national leaders through three administrations (last March’s “American hubris is always amazing to see, especially in hindsight.”).
For those too young to remember, the nation’s seemingly never-ending post-9/11 wars began as an invasion of Afghanistan. It was designed to crush its Taliban-run government for offering sanctuary to Osama bin Laden and al Qaeda prior to the 9/11 attacks. But it quickly morphed into a “Global War on Terrorism” that has involved U.S. military action in about 80 nations. In 2003, the U.S. also invaded Iraq, arguing—wrongly as it turned out—that Baghdad had weapons of mass destruction and played a role in the 9/11 attacks.
The global war on terrorism has killed 7,028 Pentagon personnel, both military and civilian, since 9/11 (at least 7,800 others, employed by private U.S. contractors, have also died in Afghanistan and Iraq.) But its mission creep has also created a non-nuclear chain reaction: The U.S. repeatedly decided it needed more troops, which has led to more veterans. Many of those heroes thankfully have survived wounds that would have killed them in prior wars. But that will boost the cost of their care for decades to come. The Department of Homeland Security, which the government cobbled together from existing agencies in 2003, was padded out with its own bureaucracy. The State Department and the U.S. Agency for International Development got their own off-budget accounts too. And the federal government began borrowing money to pay for all this.
You might think, as a taxpayer, that you could just wander over to defense.gov and look up the cost of those two wars. After all, they’ve been the Pentagon’s focus, fiscally and otherwise, for nearly 20 years. But you’d be wrong. The Pentagon, whether reporting on wars or weapons, is remarkably opaque when it comes to spelling out how much they cost. So outsiders have had to step in to make cents of how much our recent wars have cost.
Even more amazingly, after nearly 20 years of war, keeping track of how much the U.S. is spending on the wars may be getting tougher. “In some instances, DOD, State Department and Department of Homeland Security budgets are opaque,” notes a recent report by the Costs of War Project, which consists of a team of about 50 experts. “Indeed, because of recent changes in budgetary labels and accounting at DOD, DHS, and the State Department, understanding the costs of the post-9/11 wars is potentially even more difficult than in the past.”
Those interested in minimizing war’s costs will limit their ledger to what the Pentagon actually is spending on combat. A more complete accounting will add in additional military spending routinely ladled into Pentagon coffers during wartime. A still-fuller accounting will add veterans’ care, homeland security, and interest on the money we’ve borrowed to fight the war.
There’s a lot of wishful thinking involved when the U.S. is thinking of going to war. If the government were simply sloppy and slipshod, its estimates would be both low and high. But invariably, they are low, which suggests there’s a motive to the math: Low-balling the cost of war makes it more likely war will happen.
The bureaucratic imperative of how the Pentagon buys its wars and weapons is the “buy-in,” a rosy projection designed to show that the conflict or hardware is a relative bargain. Yet once the war or hardware has achieved escape velocity, its price begins escalating.
The Pentagon argues the nation’s investment in any particular piece of shiny new weapon has grown so massive that abandoning the effort would send those sunk costs spinning down the drain. Likewise, war costs soar because of mission creep—rebuilding Afghanistan instead of simply ousting the Taliban following the attacks of September 11, 2001, for example—and concern that pulling out before achieving victory would mean the lives of those Americans already killed in the effort would have been wasted.
Of course, no one can predict the final cost of a war before it has begun. Yet before it begins the government tends to speak of a war’s monthly cost. In Iraq, for example, that led to an early claim that the war would cost $2 billion a month, totaling perhaps $50 billion. Those relatively low numbers, in Pentagon terms anyway, grease the skids to war.
But watch how they grow.
The litany of minimized post-9/11 war-cost estimates is long. It got off to an ignoble start when one White House official suggested the Iraq war might cost more than his finger-crossing political masters wanted to admit. In September 2002, White House economic adviser Lawrence Lindsey played the skunk at the Garden of Eden party (Iraq has several sites vying to be the biblical paradise) when he suggested the Iraq war’s cost to the U.S. could range between $100 billion and $200 billion. He tried to gussy up his then-exorbitant estimate: “The successful prosecution of the war,” he argued in the Wall Street Journal, “would be good for the economy.”
Nonetheless, Lindsey was unceremoniously combat-booted from the White House three months later. Mitch Daniels, the director of the White House’s Office of Management and Budget at the time, said the war’s cost couldn’t be estimated. But he declared Lindsey’s estimate was “likely very, very high.”
By January 2003, two months before the invasion of Iraq, then-Defense Secretary Donald Rumsfeld uncharacteristically deferred to Daniels’ bean counters when it came to projecting the war’s cost. “Well, the Office of Management and Budget has come up with a number that’s something under $50 billion for the cost,” said Rumsfeld, who seemingly rarely embraced others’ views when he believed strongly in his own.
In April 2003, just after the U.S. invaded Iraq, the Pentagon said the Iraq war would cost about $2 billion a month. But three months later, Rumsfeld raised lawmakers’ eyebrows when he doubled its estimated monthly cost to $3.9 billion (along with nearly $1 billion a month for Afghanistan).
The avarice avalanche had begun.
By July 2006, nearly five years after the 9/11 attacks, the Government Accountability Office (GAO) said Congress “has appropriated about $430 billion to DOD and other government agencies for military and diplomatic efforts in support of GWOT [the Global War on Terrorism].” (You know you’ve reached the Big Time in Washington when your pet project rates its own acronym.) That translated into about $7.4 billion a month.
But the numbers were squishy. “GAO’s prior work found numerous problems with DOD’s processes for recording and reporting GWOT costs, including long-standing deficiencies in DOD’s financial management systems and business processes, the use of estimates instead of actual cost data, and the lack of adequate supporting documentation,” top U.S. Bean Counter David Walker (officially known as the Comptroller General of the United States, the position that runs the GAO), told a congressional panel. “As a result, neither DOD nor the Congress reliably know how much the war is costing.”
“[N]either DOD nor the Congress reliably know how much the war is costing.”
DAVID WALKER, COMPTROLLER GENERAL OF THE UNITED STATES
That’s quite a statement coming from the congressional Bookkeeper-in-Chief.
By 2014, the Congressional Research Service said that the U.S. had spent $1.6 trillion “for military operations, base support, weapons maintenance, training of Afghan and Iraq security forces, reconstruction, foreign aid, embassy costs, and veterans’ health care for the war operations initiated since the 9/11 attacks.” That worked out to about $10.3 billion a month.
But even that eye-watering sum misses the mark. The Costs of War Project has spent the past decade pawing through government documents to try to tote up the post-9/11 wars’ total cost. Its latest calculation, released in November, says the U.S. will have spent $5.4 trillion on the global war on terrorism by the end of the current 2020 fiscal year, along with an additional $1 trillion for veterans’ care beyond that. That’s about $20,000 per American.
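The burn rates and per-capita figure quoted throughout this piece can be reproduced with back-of-the-envelope division. A minimal sketch, assuming the dollar totals reported above; the month counts and U.S. population are rough approximations I've supplied for illustration, not numbers from the article's sources:

```python
def monthly_rate_billions(total_dollars, months):
    """Average spending per month over the period, in billions of dollars."""
    return total_dollars / months / 1e9

# GAO, July 2006: ~$430 billion appropriated since 9/11 (~58 months elapsed).
print(f"GAO-era rate: ${monthly_rate_billions(430e9, 58):.1f}B/month")    # ~$7.4B

# CRS, 2014: $1.6 trillion since 9/11 (~156 months elapsed).
print(f"CRS-era rate: ${monthly_rate_billions(1.6e12, 156):.1f}B/month")  # ~$10.3B

# Costs of War Project: $5.4T through FY2020, plus ~$1T in future veterans' care.
total = 5.4e12 + 1.0e12
us_population = 327e6  # assumed approximate U.S. population
print(f"Per American: ${total / us_population:,.0f}")  # roughly $20,000
```

Each division matches the figure the article reports, which is a useful check on how the monthly-cost framing compounds into trillions over two decades.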
“There are many hidden or unacknowledged costs of the United States’ decision to respond to the 9/11 attacks with military force,” the group, run out of Brown University’s Watson Institute for International and Public Affairs, says on its website. “We aim to foster democratic discussion of these wars by providing the fullest possible account of their human, economic, and political costs, and to foster better informed public policies.” The group’s work is largely funded by the Carnegie Corporation, the Colombe Foundation, the Open Society Foundations, and Boston and Brown universities.
“We go to war with optimistic assumptions” of duration, cost, and casualties, says Neta Crawford, head of Boston University’s political science department and one of the Costs of War Project’s leaders and author of its latest study. “Most people believe that force is effective, but the history of war is that [winning] doesn’t happen at least half the time,” Crawford told POGO.
And it isn’t just fusty academics who feel that way. “No government-wide reporting consistently accounts for both DOD and non-DOD war costs,” advises an April report from the Congressional Research Service. Not only hasn’t the government been able to win its post-9/11 wars; after nearly two decades it can’t tell us how much it has spent failing to do so.
Put that in your howitzer and light it.
Something to keep in mind the next time the Pentagon predicts a war is going to cost $2 billion a month.”
“The Defense Department released a new directive for its overarching policy guidelines governing its buying practices.
DOD issued Directive 5000.01, the overarching guidance that focuses on the roles and responsibilities for DOD’s acquisition process, the department announced Sept. 9. The update also includes new tenets implemented by the Adaptive Acquisition Framework, a streamlined set of pathways aimed at speeding up the buying and delivery of what DOD needs.
Ellen Lord, DOD’s acquisition chief, has made it a priority to overhaul the defense acquisition practices during her tenure by reforming the DOD 5000 series instructions with a simplified rewrite to improve the process of buying everything from software to services.
“We have a much more flexible way of doing business now codified in policy,” Lord said at the Defense News virtual conference Sept. 9 regarding the move that comes after previous updates in the last year to mid-tier and urgent acquisition policies.
Lord said the changes to 5000.01, paired with the already released 5000.02 instructions that address the use of the Adaptive Acquisition Framework pathways, lay a foundation for more flexible acquisition in DOD and can help get technology fielded to the warfighter faster.
“Now we have software policies where we can move in a much more modern way,” she said.”
“Ubiquitous application of cyber business interruption (cyber BI) coverage across the Department of Defense’s (DoD’s) supply chain would materially improve supply chain resiliency and reduce unwanted supply chain behaviors.”
“One way the Pentagon could improve the cybersecurity of its supply chain would be to standardize insurance policies that cover cyberattacks, according to a new report to be released today from the Foundation for the Defense of Democracies.
Cyber disruptions can lead to a freeze in contractor operations, a failure to perform under contract or the need to find replacement parts due to supply chain disruption. However, “due to a lack of Defense Federal Acquisition Regulation Supplement requirements, many critical DIB businesses still lack this important coverage,” the report states.
What’s more, traditional insurance models are based upon factors, such as personally identifiable information or health information, that don’t always apply to defense companies.
Thus, getting insurers, defense companies and the DoD itself on a level playing field is of the utmost importance, Trevor Logan, one of the report’s authors, told C4ISRNET.
He said a contractor may spend a lot of money on a cyber insurance policy only to later discover a breach wasn’t covered in the policy.
“That’s pretty scary stuff,” he said. “There’s not a standardization behind how this process is laid out, but it is ultimately what is going to get us better cybersecurity at the company level, which is better for all of us.”
The report recommends DoD study how cyber insurance across the industrial base improves the security and resiliency of the supply chain.
Logan said potential disruptions can hurt small and medium-sized businesses the most because those companies may have to cease manufacturing while notifying the insurance broker of a cyber incident. That puts companies in a rough spot, jeopardizing their ability to fulfill contract requirements and hurting the reliability DoD needs to get equipment, parts and systems to the operators that need them.
Logan also noted that there needs to be greater clarity about how cyber insurance policies relate to the forthcoming Cybersecurity Maturity Model Certification (CMMC), a tiered cybersecurity framework that grades companies on a scale based on the level of classification and security necessary for the work they’re performing.
“There are unclear and sometimes conflicting standards and models for cybersecurity, leaving companies – especially the small and medium-sized enterprises critical to the DIB – confused and uncertain,” one of the report’s key findings states. “Some insurance companies are seeking to underwrite to Cybersecurity Maturity Model Certification (CMMC) guidelines. DoD could advocate for this approach (or others) to be included in cyber insurance underwriting and could help socialize these guidelines to the broader DIB.”
He said clarifying how cyber insurance impacts underwriters’ assessments as it relates to CMMC compliance would be helpful.
The Pentagon’s top artificial intelligence office released a request for information Aug. 28 outlining its interest in a new acquisition approach for standardizing how AI tools are developed and procured.
The JAIC is “considering” starting a competition for a 501(c) nonprofit manager, or managers, of its prototype “Artificial Intelligence Acquisition Business Model,” which looks to use other transaction authorities to purchase AI products more quickly.
“The JAIC will therefore prototype a new AI Acquisition Business Model to assess the potential for non-FAR-based contracts mixed with FAR-based contracts to meet JAIC requirements,” the solicitation states.
The JAIC’s prototype business model could deliver “AI capabilities through meaningful market research/front-end collaboration and optimal teaming arrangements of both traditional and non-traditional companies for AI product procurement,” the RFI said. If the plan moves forward, the JAIC would also “explore the possibilities of using the model to enable agile AI acquisition processes to the DoD at scale.”
The JAIC is the Defense Department’s main hub for artificial intelligence and is responsible for increasing adoption of AI across the department. It works with the services and combatant commands to develop AI tools that have practical use.
To meet the military’s needs, the JAIC uses the traditional government contracting process, known as Federal Acquisition Regulation-based contracts, and works with the General Services Administration, the Defense Information Systems Agency and the Defense Innovation Unit. The traditional acquisition strategy currently in use is unlikely to be sufficient to help the JAIC carry out its mission, the RFI stated.
“To scale this strategy to other DoD service requirements or respond to emergent requirements such as COVID-19 is challenging and may not be the most efficient use of acquisition tools,” the RFI read.
JAIC’s goals are to streamline awards while maintaining flexibility between FAR and non-FAR awards, and to maximize competition while minimizing restrictions, the RFI explained.
The JAIC recently awarded major contracts through DISA and GSA. In May, it awarded a five-year contract with an $800 million ceiling to Booz Allen Hamilton through the GSA for its new joint war-fighting national mission initiative, though JAIC officials have repeatedly noted that the value of the contract won’t reach $800 million.
The JAIC also announced a $106 million contract award to the consulting firm Deloitte for its Joint Common Foundation, a critical element for sharing datasets and AI tools across Department of Defense components.
The JAIC has said for several months that it needs its own acquisition authority to be effective. Before he retired in June from his position as JAIC director, Air Force Lt. Gen. Jack Shanahan called on Congress to give the center its own acquisition authority. He said on a webinar in late May that the center’s lack of acquisition tools will hinder the organization’s ability to increase AI use across the DoD.
“It’s not going to be fast enough as we start putting more and more money into this capability development,” Shanahan said, speaking on a webinar hosted by the AFCEA Washington, D.C., chapter. “We need our own acquisition authority. We have to move faster.”
The solicitation outlined six “high-level goals” for the prototype AI Acquisition Business Model.
- “Maximize outreach to non-traditional (e.g., small business) industry and academic partners.”
- “Create an acquisition model that is utilized by the Services and DoD agencies.”
- “Maximize use of automated processes (e.g., online portal for requirements definition, collaboration, source selection, and performance monitoring).”
- “Facilitate integration and transition to Acquisition programs of record (PoR) using agile and DevSecOps practices.”
- “Increase use of agile methods for training, tools, and policy development.”
- “Maximize utilization of the JAIC’s Joint Common Foundation (JCF) AI Development Platform.”
For every one of the Defense Advanced Research Projects Agency’s wild successes, there seems to be a plethora of wild failures: projects like mechanical elephants or telepathy research. What makes DARPA unique is its ability to work outside the red tape of bureaucracy to innovate.
DARPA isn’t subject to the same acquisition rules as other agencies, which means it has fewer restrictions on the scientists and innovators it can hire and the salaries it can offer.
Here are some of the more interesting projects to come out of DARPA’s “high-risk, high-reward” environment.
1. Plant-eating robots
Perhaps the most aptly named project on this list, the Energy Autonomous Tactical Robot program sought to create robots that could feed off plants just as animals do. EATR would have enabled robots to remain in surveillance or defensive positions without resupply much longer than humans or robots with more limited power sources.
“We completely understand the public’s concern about futuristic robots feeding on the human population, but that is not our mission,” Cyclone Power Technologies CEO Harry Schoell said in a press release.
Before the project stopped development in 2015, its engineers estimated that EATR would be able to travel 100 miles for every 150 pounds of biomass consumed.
2. Houses that repair themselves
Imagine soldiers fashioning buildings and fortifications out of lightweight scaffolds instead of plywood, two-by-fours, and heavy sandbags. Then, those scaffolds quickly begin to fill in with durable material all on their own. And when that material is damaged, it grows right back to where it was.
That’s the goal of DARPA’s Engineering Living Materials program – to create building materials that can be grown where needed and repair themselves when damaged. As researchers make progress with 3D printed organs and tissues, DARPA hopes to use similar technologies to create hybrid materials that can shape and support the growth of engineered cells.
“Instead of shipping finished materials, we can ship precursors and rapidly grow them on site using local resources. And, since the materials will be alive, they will be able to respond to changes in their environment and heal themselves in response to damage,” project manager Justin Gallivan said.
3. Lab-grown blood
Blood pharming is the process of creating red blood cells from cell sources in a lab rather than inside a human body. DARPA’s Blood Pharming program was projected to increase the efficiency of production and lower the high costs associated with growing red blood cells.
If completely successful, the program would have greatly increased access to transfusable blood for soldiers and hospitals around the world and reduced the risk of disease transmission during a transfusion.
The program was successful in decreasing the cost of synthetic blood from over $90,000 down to under $5,000 per unit, a 2013 press release stated, but new information has not been released since, and the program was not listed in recent budget documents.
4. Cyborg insects
Unmanned aerial vehicles may be all the rage, but they’re clunky and require people to design and assemble every piece. What if there were a way to piggyback sensors on flying creatures for free?
Within a few years of launching its cyborg insect research, DARPA-funded researchers had developed interfaces capable of controlling insects’ actions. And if plain old spy bugs weren’t wild enough, the insects eventually received nuclear power as well.
In 2009, Cornell engineers revealed a prototype of a radioisotope-powered transmitter for the cyborg insects. Nickel-63 isotopes would provide ample power to the sensors and transmitters the bugs might carry while remaining harmless to humans.
5. Brain implants for PTSD
DARPA doesn’t just focus on cool gadgets for fighting wars. The agency also funds research on solutions for the negative effects war can have on soldiers.
The Systems-Based Neurotechnology for Emerging Therapies program is tasked with creating “an implanted, closed-loop diagnostic and therapeutic system for treating, and possibly even curing, neuropsychiatric illness,” according to a DARPA press release.
Basically, the program wants to make a brain implant that will help soldiers struggling with PTSD, traumatic brain injuries, anxiety, substance abuse, and more.
Because of the ramifications of such a device, the SUBNETS program includes dedicated ethics experts to help researchers create a safe piece of neurotechnology.
6. Robotic infantry mules
[Photo: The Legged Squad Support System (LS3) walks around the Kahuku Training Area on July 10, 2014, during the Rim of the Pacific 2014 exercise. U.S. Marine Corps photo by Sgt. Sarah Dietz]
Heavy lifting is one of the largest challenges affecting troops’ health and performance. Recognizing the effect the weight of soldiers’ loads can have on them, DARPA began working with robotics company Boston Dynamics to create the Legged Squad Support System.
Capable of carrying 400 pounds, the LS3 is intended to deploy with an infantry squad. DARPA’s website states the program’s goal as “to develop a robot that will go through the same terrain the squad goes through without hindering the squad’s mission.”
7. Nuke-propelled spaceship
DARPA also invests in researching space travel. Project Orion was a program launched in 1958 to research a new means of spaceship propulsion. This hypothetical propulsion scheme relied on detonating nuclear bombs behind a craft to push it forward and was theoretically capable of astonishing speeds.
However, DARPA officials were worried about nuclear fallout, and when the Partial Test Ban Treaty of 1963 outlawed detonations of nuclear weapons in outer space, the project was dropped.
8. Mechanical elephants
In the 1960s, DARPA began researching vehicles that would enable troops and equipment to move more freely in the dense terrain of Vietnam.
Following in the footsteps of Hannibal before them, DARPA researchers decided that elephants could be the right tool for the job. They began one of the most infamous projects in DARPA history: the quest for a mechanical elephant. The end result would be capable of transporting heavy loads with servo-actuated legs.
When the director of DARPA heard of the project, he immediately shut it down, hoping that Congress wouldn’t hear of it and cut the agency’s funding, according to New Scientist.