Tag Archives: technology

Army Turns To Industry For Network Overhaul

Standard
Army-command-post

Army Command Post

“BREAKING DEFENSE”

“The Army today has about 20 different software “baselines,” with different units and offices using inconsistent and often incompatible programs, often because their hardware is too old to handle anything better.

The resulting patchwork of networks is expensive to operate and difficult to secure against cyber attack. So the service wants to upgrade everyone to a single, consistent, up-to-date baseline within two years.

Want to sell information technology to the US Army? Then you need to write this down: Paul.A.Ostrowski.mil@mail.mil. That’s the email of the general seeking industry’s input — historically something of a struggle for the service — as the Army reviews and overhauls its networks.

The Army’s long-term goal: a single unified network connecting everything from the home base to the battlefield, easy for the service to upgrade, easy for soldiers to use amidst the stress of combat, and hard for enemies to take down. The Army’s immediate question for industry: Can you build it?

Lt. Gen. Ostrowski, the director of the Army Acquisition Corps, wants you to write him if you want in on a series of roundtables the Army is holding with selected companies, hosted by the federally funded Institute for Defense Analyses (IDA). One roundtable was personally led by the Army Chief of Staff, the hard-charging, wisecracking Gen. Mark Milley, who is taking a hands-on role in the review he launched in May.

“Who’s in charge? The Chief’s in charge…. he and the Secretary of the Army,” Ostrowski said at yesterday’s Association of the US Army conference on networks. Those top leaders have brought together the Army’s Chief Information Officer/G-6 (chief signals officer), the Army resourcing staff (G-8), the Training & Doctrine Command that brainstorms future warfare concepts and writes requirements for new systems, and the acquisition officials who buy them.

“What’s different is the involvement of the leadership,” said Army CIO Gary Wang, who’s leading the review for Gen. Milley. While the Pentagon bureaucracy does plenty of reviews, he told me, “oftentimes it’s delegated down to a much lower level.” This time, though, the severity of the Army’s “financial constraints” has gotten the Chief of Staff and Acting Army Secretary Robert Speer personally involved, Wang said.

There’s another reason Wang didn’t mention: the savage criticism in Congress of the Army’s flagship battlefield network, WIN-T. Gen. Milley himself said the network is too “fragile” and “vulnerable” for future battles against high-tech adversaries like Russia or China, because its transmissions are too easily detected and then jammed or hacked.

Beyond WIN-T

“WIN-T’s our current network,” Ostrowski said when I asked him about the system. “We’re an Army that has to fight tonight, and WIN-T will be very much part of that. Period. That gets that off the table.” Then he moved on to other topics — notably not saying what this review would mean for WIN-T in the future.

But this review goes well beyond WIN-T, Milley and Speer have emphasized. It covers all the Army’s networks, both for combat units and back-office business operations. The crucial issue, Ostrowski said, is “how do we simplify the network? Right now we have a lot of parts and pieces. We’ve gone out and bought a lot of stuff that’s incredible in terms of its capabilities, but we’ve got to simplify: We’ve got to make this soldier-intuitive; we’ve got to make it soldier-maintainable and soldier-operable.”

The Army today has about 20 different software “baselines,” with different units and offices using inconsistent and often incompatible programs, often because their hardware is too old to handle anything better. The resulting patchwork of networks is expensive to operate and difficult to secure against cyber attack. So the service wants to upgrade everyone to a single, consistent, up-to-date baseline within two years.

What’s more, cybersecurity in the narrow sense is not enough. The Army can’t just focus on hackers sending malicious code over the internet: It also has to worry about electronic warriors jamming, triangulating, or eavesdropping on radio transmissions. That’s a uniquely military problem. Yes, civilian mobile phones also rely on radio — that’s what “wireless” means — but only to reach the nearest cell tower, which is often plugged into fiber optic cable; battlefield wireless networks rely on long-distance radio, which is much more vulnerable.

A Daunting Task

So what does the Army want from its future network, and therefore from industry?

First and most fundamentally, Ostrowski told the AUSA conference, the review is driven by rapidly evolving threats, because the network needs to be ready to go to “fight and win our nation’s wars” against those threats. The Army must stand ready “to deploy rapidly, anywhere, anytime, to shape, prevent, and win, against any foe in any domain — domain being cyber, space, air, land, or maritime — and any environment — environment being megacity, desert, jungle, arctic.” So the network must be able to operate, and the soldiers using it must be able to reliably communicate, in all those conditions, under attack by any of those threats, and on the move, without stopping to set up radio antennas or lay fiber optic cables.

To that end, the network must be “simple and intuitive,” Ostrowski said, easy for soldiers to operate without extensive training or constant tweaking. Soldiers must be able to keep it running without relying on legions of industry Field Service Representatives (FSR), as was often the case in Afghanistan and Iraq.

The network must also be easy to upgrade as technology changes, without having to start the whole laborious procurement process over again, and without being locked into one company’s intellectual property that no other firm can touch. “I will tell you up front, that if you’re going to bring proprietary solutions to the table, don’t come,” Ostrowski said. Instead, the network must be built on open standards, allowing any company to offer upgrades, just as any company that meets Apple’s standards can sell apps for the iPhone.

Just as the network has to be open to different companies’ products, Ostrowski continued, “it has to be accessible to our allied partners,” allowing friendly nations’ networks to connect with ours.

Finally, the network must be secure against cyberattack, resilient to the damage of those attacks that do get through, and able to transmit its wireless signals in a way the enemy cannot easily detect. (The technical terms are Low Probability of Detection (LPD) and Low Probability of Intercept (LPI)).

This is a daunting list of desiderata, but engineers from both the Army and “numerous companies” are already “whiteboarding” how they would achieve them, Ostrowski said. “My name and number (are) up there,” he said, pointing to his slides. “I need you to let me know if you want to play.”

Who’s facilitating all this interaction? The Institute for Defense Analyses (IDA), a federally funded research & development corporation that Congress had already chartered to study the Army network, said Maj. Gen. Peter Gallagher, who works for CIO Wang as director of architecture, operations, networks, and space. Gallagher told me he doubted whether he’d ever seen a review this intensive, adding that the full-court outreach to industry was “something Gen. Milley personally directed.”

“We rely on industry for everything we do,” Gallagher said simply.”

http://breakingdefense.com/2017/07/army-chief-milley-turns-to-industry-for-network-overhaul/

 

 

 

DoD Is Buying Fewer Commercial Items. Oops!

Standard

DOD Fewer Commercial Items DIUX-poster

“BREAKING DEFENSE”

“One constant in the acquisition reform debate of the last two decades … “buy more commercial items in a commercial fashion, and do it quickly and cheaply.”

But a report by the Government Accountability Office analyzing a decade of data from the federal acquisition database finds that the Pentagon’s purchases of commercial items have declined since 2007.

Now, nobody argued that you could buy F-35s or ships that way, but as competitors such as China and Russia fielded weapons in double-quick time, and as software and computer hardware became increasingly important to a weapon’s effectiveness, speeding up purchases and lowering their costs grew in importance.

To build bridges with the commercial sector and to ensure the military sped up its adoption of technology advances — especially in software and commercial IT — former Defense Secretary Ash Carter created the Strategic Capabilities Office and the Defense Innovation Unit Experimental, fondly known as the DIUX. They were supposed to help accelerate the purchase of commercial technology, bolstered by a raft of legal and policy changes over the last decade.

“The data now supports what was long suspected — that the purchase of commercial items was declining. The question is why? The answer can likely be found in the overreaction to the perceived contracting abuses of the Iraq War.

“While commercial items and the Iraq War shouldn’t be linked, they became so in the so-called ‘war on profits’ that was initiated early on in the Obama Administration,” Greenwalt argues. “In a typical overreaction applied to a different set of circumstances….the DOD bureaucracy, instead of going after bloated cost-type contracts and moving to a more fixed-price, commercial-like, performance-based contracting approach, decided to do the opposite and rein in commercial item contracts, where profit margins are traditionally higher.”

Part of the problem appears to be that Pentagon acquisition officials just don’t know much about buying commercially. To cope with that, the Defense Contract Management Agency (DCMA) created six Commercial Item Centers of Excellence staffed with engineers and price/cost analysts to advise contracting officers in how to determine what can be bought commercially.

“According to DCMA officials,” the GAO report says, “experts at these centers began reviewing cases in June 2016 and since then have examined 437 cases that contained approximately 2300 items. They recommended that the contracting officer make a determination that an item was commercial in 94 percent of the cases reviewed.”

But Greenwalt isn’t really optimistic, even though he pushed hard to make sure the acquisition community had the policy and legal tools to buy more commercially.

“The linkage between higher profits and higher risks and performance that occurs on commercial item contracts was forgotten in order to keep as many traditional cost-type programs (with somewhat reduced fees) going during a budgetary downturn,” he says. “Congress acted in the last two NDAAs to try and roll back this situation, but since none of the rules to implement new commercial item legislation have been enacted yet, it is doubtful we will see much improvement soon in the statistics.”

http://breakingdefense.com/2017/07/dod-is-buying-fewer-yes-fewer-commercial-items-oops/

 

The U.S. And North Korea – Warpath Paved With Rational Decisions?

Standard

Stratfor U.S. and N. Korea War

“STRATFOR”

“Neither wants war; each side strongly prefers an alternative path to resolve the core issues underlying the crisis.

Yet their differing strategic imperatives and desired end states leave little room for compromise.

War is rarely the first option for countries trying to preserve or enhance their strategic positions. The United States and North Korea alike would rather avoid a conflict on the Korean Peninsula, which would be complicated and costly for all parties involved.

As North Korea draws closer to achieving long-range missile capabilities, something it sees as a security guarantee, the United States faces mounting pressure to act. But as Washington tries to coerce North Korea to end its quest for more sophisticated arms, Pyongyang feels compelled to accelerate its nuclear weapons and missile development. Each country is merely acting to preserve its interests. But their interests are driving them closer to a physical confrontation.

The Rational Assumption

Geopolitics teaches us to assume rationality on the part of actors on the international stage. The assumption doesn’t suppose that individual leaders are somehow beyond the influence of emotion, misinformation or miscalculation. Rather it acknowledges the deeper forces at work, from the interactions of place and people that shape national characteristics and strategic culture to the systems and structures that develop in countries over time. No leader operates free of these constraints and compulsions. Though they still have leeway to shape their policies and actions, leaders, as individuals and as a collective group, do so within limits defined in large part by the environments in which they emerged. The rationality we assume from leaders is not universal; it is the product of their place and time under the influence of factors such as history, geography and economics.

The key, then, is to understand what guides the rationality of a country’s leadership, on an individual level and in the government as a whole. After all, no one individual rules a country, since no single person could extend power over an entire population without the help of intermediaries. And each layer of leadership adds another set of constraints to the exercise of power. Disagreements arise in governments and in the populations they preside over. But the forces that influence the options available to leaders are far larger than the concerns of the individual. It is an analyst’s job to understand and explain these factors, and a policymaker’s job to take them into account when considering how to achieve a desired outcome.

Even so, it is sometimes simpler in international relations to assume one’s adversaries are crazy. They don’t follow the desired path or react in the anticipated way, so they must be acting irrationally. If one makes the wrong assumptions of an adversary (or even of an ally), however, the response to a given action may be far from what was intended.

Of course, understanding the other side doesn’t guarantee the desired outcome, either. Irreconcilable differences in interests and perceptions of risk can get in the way of compromise. The most viable solution often is to constantly adjust one’s actions to manage these contradictions, even if they prove insurmountable. At times, though, the differences can be so intractable as to drive nations into conflict if each side’s pursuit of contrary interests leads to fear and insecurity for the other. Moves by one nation to constrain the threatening behavior it perceives from another then perpetuate the cycle of action and reaction. In the case of North Korea and the United States, the contradiction in their interests is growing ever starker as Pyongyang accelerates its nuclear weapons program and nears its goal of developing a missile capable of striking the continental United States.

As Pyongyang draws closer to the deliverable long-range nuclear weapon it has long pursued, Washington will be forced to decide whether to accept North Korea as a nuclear-armed state and live with that reality or to take the necessary steps to disarm it.

A Mutual Misunderstanding

Misunderstandings, misapplied assumptions and mismatched goals have characterized relations between the United States and North Korea for decades. Washington expected — or at least hoped — that North Korea would collapse on its own under the force of economic and social pressures. The evaluation misjudged the country as the Asian equivalent of an Eastern Bloc state waiting for the Soviet Union’s demise to break free from the shackles of a foreign-imposed power structure. North Korea hasn’t collapsed. In fact, in times of trouble, its neighbors (and even the United States) have helped stabilize the government in Pyongyang for fear that the consequences of the country’s failure would be more dangerous than the risks entailed in its survival. North Korea, meanwhile, considered itself a fixture on the United States’ target list, a remnant of the Cold War that Washington was trying to toss on the ash heap of history.

The two have had many opportunities for some form of reconciliation over the years. Time and again, though, progress has run afoul of perceived threats, diverging commitments, changing priorities, domestic politics and even extraregional events. As Pyongyang draws closer to the deliverable long-range nuclear weapon it has long pursued, Washington will be forced to decide whether to accept North Korea as a nuclear-armed state and live with that reality or to take the necessary steps to disarm it. The cost of action is high, but so is the perceived threat of inaction.”

STRATFOR – On a Warpath Paved With Rational Decisions

Kill The Open Internet and Wave Goodbye to Consumer Choice

Standard
kill net neutrality

Image: Dan Wasserman Tribune Media Services “The Week”

“WIRED”

“By Terrell McSweeny (@TMcSweenyFTC) a commissioner of the Federal Trade Commission and Jon Sallet (@jonsallet), the former general counsel of the Federal Communications Commission. Both are alumni of the antitrust division of the Department of Justice.”


“Since the Bush administration, both Republican and Democratic FCC chairs have emphasized that they would take action to protect the open internet, and they have done so.

An Open Internet has worked for America, creating a virtuous circle of innovation, trust, adoption, and further innovation. That circle should not be broken.

The Net Neutrality debate can seem complicated. But at its heart, the issue rests on two simple realities: First, for more than a decade, the status quo in the US has been an open internet that supports thriving innovation among websites, apps, and new digital services. Second, innovators and consumers are dependent on a few large broadband providers that serve as gatekeepers to the internet.

In 2015, the FCC adopted its Open Internet Order to guarantee that consumers aren’t blocked or manipulated when they use their broadband connections and ensure that competition from the internet isn’t artificially squelched. The two goals work hand in hand, because residential broadband connections are the pathways on which consumers travel to the modern world and through which the content and services of the internet reach residential users.

Two years later, the new majority at the FCC has announced that it intends to undo the 2015 order. That includes the prohibitions on blocking, throttling, and paid prioritization. But the FCC has also proposed eliminating the General Conduct rule, which protects competition.

The FCC would be mistaken to unravel a bipartisan approach that has worked. Since the Bush administration, both Republican and Democratic FCC chairs have emphasized that they would take action to protect the open internet, and they have done so. Like a police officer keeping a watchful eye at a busy intersection, the FCC’s presence has both stopped and deterred harm to consumers, competition, and innovation.

The threat to the open internet is real because competition in US broadband markets is limited, to the extent that it exists at all. About 90 million US households subscribe to the kind of broadband that runs on wires to their homes. The top four providers—two cable and two telecom—together claim three-quarters of all residential customers.

Of course, consumers can only choose among the broadband networks that reach them. Roughly 21 percent of US census blocks have no high-speed landline broadband provider, and 37 percent have only one option. This is no choice at all. For downloading data at 100 Mbps, 88 percent of the country has either no option or just one provider.

In rural America it’s much worse: More than half of rural census blocks have no choice of a high-speed broadband provider, which condemns them to slow speeds for any service they can get. Even where there are choices, the FCC has found that consumers face significant costs in switching between broadband providers. Moreover, broadband providers have the ability to target content creators selectively, making it harder for consumers to understand why they’re having trouble accessing certain content.

So it’s clear that most US consumers depend upon a few big players in order to access the internet. Therefore, the critical question is whether these companies have the incentive and ability to harm consumers and competition. That is, are they motivated to control what kinds of innovations come to consumers? And do they have the tools to do so? Both the FCC and the Department of Justice have recognized in recent proceedings that the answers are yes and yes.

Broadband providers have the power and the motivation to curb any competition that uses their networks in order to reach consumers. And we know that eliminating competition—via mergers, for example—risks consumers paying higher prices and receiving lower quality products and services. It doesn’t seem like a coincidence that the so-called new Golden Age of TV has flourished at a time when Amazon, Hulu, Netflix, and other services are producing popular, award-winning shows in direct competition with more established players.

Here’s why there’s a problem: The big broadband companies also supply video programming, which means that those firms’ revenues are directly threatened when consumers use their broadband connections to access competing video providers. The incentive for broadband companies to discriminate against online video providers will only grow stronger as the market becomes more competitive, as it has recently with the arrival of services that carry live television channels just like traditional cable operators.

When reviewing the proposed (and ultimately failed) merger of Comcast and TimeWarner Cable, economists at the Department of Justice concluded that the merged firm’s power would likely reduce competition in the video and broadband markets, leaving consumers with fewer choices, higher prices, and lower quality. And when the Department of Justice considered a proposed (and ultimately successful) merger of Charter Communications and TimeWarner Cable, it recognized the ability of cable and telephone companies to take action against new video competition and limited the new company’s ability to seek terms in programming contracts that could harm online video providers.

The 2015 Open Internet Order set forth 16 pages of economic and technological analysis to support the conclusion that “broadband providers (including mobile broadband providers) have the economic incentives and technical ability to engage in practices that pose a threat to Internet openness by harming other network providers, edge providers, and end users.”

Some argue that using traditional antitrust rules can get the same job done, and just as well. While the two of us both believe strongly in the importance of antitrust enforcement, these laws cannot duplicate the kind of prospective, industry-wide rules contained in the 2015 Open Internet Order.

Supreme Court Justice Anthony Kennedy faced precisely this argument when he wrote the majority opinion in the Supreme Court case upholding requirements that cable systems carry broadcast stations. He wrote that regulation could be preferred to antitrust because of “the considerable expense and delay inherent in antitrust litigation, and the great disparities in wealth and sophistication between [TV stations and cable systems],” as well as the burden of bringing a case, which would require “considerable expense and delay.” All of this is even more true in disputes between large broadband providers and their customers. That’s why open internet rules make sense: They let the industry know what is required while giving consumers an avenue of relief at the FCC that doesn’t require long and expensive antitrust litigation.

The economic facts are telling, but that’s not all. Consumers should be able to use their broadband connections to access the lawful content of their choosing. The FCC is reconsidering whether broadband providers should be given the new freedom to block or interfere with the ability of consumers to express their thoughts or to listen to the views they want to hear. And that threatens the kind of free speech on which America was built. In 1776, Thomas Paine didn’t need the permission of any other content creator or distributor to circulate Common Sense. But without rules prohibiting blocking, throttling, and the like, broadband providers would gain the power to limit what unpopular content flows over their networks—to the detriment of consumers and democracy. One challenger to the 2015 Open Internet Order argued exactly this to the DC Circuit: that the rules violated its right to block legal but unpopular content.

An Open Internet has worked for America, creating a virtuous circle of innovation, trust, adoption, and further innovation. That circle should not be broken.”

https://www.wired.com/story/kill-the-open-internet-and-wave-goodbye-to-consumer-choice/

 

 

 

 

Neutrality Matters

Standard
Net Neutrality CNN dot com

Image:  CNN.com

“WIRED”

“In a time when there are too few companies with too much power, we need net neutrality now more than ever.

Getting rid of Title II would lead to even more centralization, handing more power to the largest Internet companies while stifling competition and innovation.

Next month, Amazon, Netflix, and dozens of other companies and organizations will host a “day of action” aimed at saving net neutrality as we know it. The Federal Communications Commission, meanwhile, is on the verge of revoking its own authority to enforce net neutrality rules, and the country’s biggest telecommunications companies are cheering along. The future of the internet is on the line here, but it’s easy to be cynical about the conflict: What does it matter which set of giant corporations controls the internet?

Under the current net neutrality rules, broadband providers like Comcast and Charter, and wireless providers like AT&T and Verizon, can’t block or slow down your access to lawful content, nor can they create so-called “fast lanes” for content providers who are willing to pay extra. In other words, your internet provider can’t slow your Amazon Prime Video stream to a crawl so you’ll keep your Comcast cable plan, and your mobile carrier can’t stop you from using Microsoft’s Skype instead of your own Verizon cell phone minutes.

If the Trump administration gets its way and abolishes net neutrality, those broadband providers could privilege some content providers over others (for a price, of course). The broadband industry says it supports net neutrality in theory but opposes the FCC’s reclassification of internet providers as utility-like “Title II” providers, and that consumers have nothing to worry about. But it’s hard not to worry given that without Title II classification, the FCC wouldn’t actually be able to enforce its net neutrality rules. It might be less alarming if the internet were a level playing field with free and fair competition. But it’s not. At all.

If you want to search for anything online, you’ve got to go through Google or maybe Microsoft’s Bing. The updates your Facebook friends share are filtered through the company’s algorithms. The mobile apps you can find in your phone’s app store are selected by either Apple or Google. If you’re like most online shoppers, you’re mostly buying products sold by Amazon and its partners. Even with the current net neutrality laws there’s not enough competition—without them, there will be even less, which could stifle the growth and innovation that fuels the digital economy.

Fast lanes or other types of network discrimination could have a big impact on the countless independent websites and apps that already exist, many of which would have to cough up extra money to compete with the bigger competitors to reach audiences. Consider the examples of Netflix, Skype, and YouTube, all of which came of age during the mid-2000s when the FCC’s first net neutrality rules were in place. Had broadband providers been able to block video streaming and internet-based phone calls in the early days, these companies might have seen their growth blocked by larger companies with deeper pockets. Instead, net neutrality rules allowed them to find their audiences and become the giants they are today; without net neutrality, they could even potentially become the very start-up killers that would’ve slowed or stopped their own earlier growth. Getting rid of net neutrality all but ensures that the next generation of internet companies won’t be able to compete with the internet giants.

The end of net neutrality could also have wide-ranging implications for consumers. Amazon, Netflix, YouTube, and a handful of other services may dominate the online video market, but without net neutrality, broadband providers might try to make it more expensive to access popular streaming sites in an attempt to keep customers paying for expensive television packages. “[Net neutrality] protects consumers from having the cost of internet go up because they have to pay for fast lane tolls,” says Chris Lewis, vice president of the advocacy group Public Knowledge.

Lewis also points out that there are a few other consumer-friendly protections in the FCC’s net neutrality rules. For example, the FCC rules require internet service providers to disclose information about the speed of their services, helping you find out whether you’re getting your money’s worth. They also force broadband providers to allow you to connect any device you like to your internet connection, so that your provider can’t force you to use a specific type of WiFi router, or tell you which Internet of Things gadgets you can or can’t use.

“The Internet is as awesome and diverse as it is thanks to the basic guiding principle of net neutrality,” says Evan Greer, campaign director for Fight for the Future, one of the main organizers of the net neutrality day of action, which will take place on July 12 and try to raise awareness about net neutrality across the web.”

https://www.wired.com/story/why-net-neutrality-matters-even-in-the-age-of-oligopoly/

National Geospatial Intelligence Agency (NGA) To Offer Data to Industry for Partnerships

Standard
NGA Federal News Radio

NGA Headquarters – Image:  “Federal News Radio”

“BREAKING DEFENSE”

“The idea: offer companies chunks of the “wonderland” of unclassified NGA data so they can use them to build new products or to test algorithms key to their products.

It’s a bold and rare move by a large and largely secretive government agency.

The top two leaders of the National Geospatial Intelligence Agency, Robert Cardillo and Susan Gordon, met with Anthony Vinci, now NGA’s director of plans and programs, to discuss ways to get more value from the agency’s incredibly valuable pools of data.

Using The Economist‘s description of data as the oil of today — the most valuable commodity in our economy — Vinci argued the agency must deploy it and help pay the American people back for the investment they have made in building the agency. If data is the new oil, Vinci said companies should “turn it into plastic,” adding value.

Cardillo told reporters NGA would create a B corporation — in effect a non-profit government company — and hire an outsider to run it.

This, I think it’s fair to say, is not a slam dunk. Culturally, it will be challenging, Vinci admitted. “It’s straightforward, but it sort of breaks every rule we have in the IC (Intelligence Community).” The IC doesn’t share data and it doesn’t partner with outsiders, except for allied and friendly governments when needed.

This approach may sidestep the whole process of generating a requirement for an intelligence system. “I don’t think that’s how problems can be solved any more,” Vinci said. The current system, which can be circumvented if an urgent need exists, is generally slow and restrictive, one that the Pentagon and the IC are increasingly trying to amend.

I spoke with three senior industry officials who listened to Vinci’s presentation, and they were hopeful but cautious. All three said they thought the new effort could yield unexpected and useful returns on taxpayers’ investments in the data.

The biggest obstacle may be Congress. Although NGA would not be making money from the data sharing and it would not be releasing any data that could help our enemies, it would be sharing a government resource that voting taxpayers paid for and over which lawmakers have oversight. Whether the products resulting from the data would be licensed back to NGA, or allowed to generate profits for companies, is all still to be determined.

“That’s part of what we’re trying to figure out,” Vinci told me: “taxpayers paid for this data, and how can we get that value back to them.”

http://breakingdefense.com/2017/06/nga-to-offer-data-to-industry-for-partnerships/

 

VA Will Shift Medical Records To DOD’s “In-Process” Electronic Medical Records System

Standard

 

Veterans Gaming the System

Image:  Military Times

Total Investment To Date Now Projected at Nearly $10 Billion

“MILITARY TIMES”

VA has already spent more than $1 billion in recent years in attempts to make its legacy health record systems work better with military systems.

The military’s health record system is still being put in place across the Defense Department, more than three years after the acquisition process began. The initial contract topped $4.6 billion, and its cost has risen in recent years.

Shulkin did not announce a potential price tag for the move to a commercial electronic health records system, but said that a price tag of less than $4 billion would likely be “unrealistic.”


“Veterans Affairs administrators on Monday announced plans to shift veterans’ electronic medical records to the same system used by the Defense Department, potentially ending a decades-old problematic rift in sharing information between the two bureaucracies.

VA Secretary David Shulkin announced the decision Monday as a game-changing move, one that will pull his department into the commercial medical record sector and — he hopes — create an easier-to-navigate system for troops leaving the ranks.

“VA and DoD have worked together for many years to advance (electronic health records) interoperability between their many separate applications, at the cost of several hundred millions of dollars, in an attempt to create a consistent and accurate view of individual medical record information,” Shulkin said.

“While we have established interoperability between VA and DOD for key aspects of the health record … the bottom line is we still don’t have the ability to trade information seamlessly for our veteran patients. Without (improvements), VA and DoD will continue to face significant challenges if the departments remain on two different systems.”

White House officials — including President Donald Trump himself — hailed the announcement as a major step forward in making government services easier for troops and veterans.

Developing implementation plans and cost estimates is expected to take three to six months.

But Shulkin did say VA leaders will skip standard contract competition processes to more quickly move ahead with Millennium software owned by Missouri-based Cerner Corp., the basis of the Pentagon’s MHS GENESIS records system.

“For the reasons of the health and protection of our veterans, I have decided that we can’t wait years, as DOD did in its EHR acquisition process, to get our next generation EHR in place,” Shulkin said.

Shulkin for months has promised to “get VA out of the software business,” indicating that the department would shift to a customized commercial-sector option for updating the health records.

The VA announcement came within minutes of Trump’s controversial proposal to privatize the nation’s air traffic control system. The president has repeatedly pledged to make government systems work more like a business, and in some cases hand over public responsibilities to the private sector.

Shulkin has worked to assure veterans groups that his efforts to rely on the private sector for expertise and some services will not mean a broader dismantling of VA, but instead will produce a more efficient and responsive agency.

He promised a system that will not only be interoperable with DOD records but also easily transferable to private-sector hospitals and physicians, as VA officials work to expand outside partnerships.

Shulkin is expected to testify before Congress on the fiscal 2018 budget request in coming weeks. As they have in past hearings, lawmakers are expected to request more information on the EHR changes then.”

http://www.militarytimes.com/articles/va-share-dod-electronic-medical-records-decision

 

 

4 Ways to Protect Against the Very Real Threat of Ransomware

Standard
ransomware-495934588-s

“Getty Images”

“WIRED”

“You’re still largely on your own when it comes to fighting ransomware attacks, which hackers use to encrypt your computer or critical files until you pay a ransom to unlock them.

Ransomware is a multi-million-dollar crime operation that strikes everyone from hospitals to police departments to online casinos.

It’s such a profitable scheme that experts say traditional cyberthieves are abandoning their old ways of making money—stealing credit card numbers and bank account credentials—in favor of ransomware.

You could choose to cave and pay, as many victims do. Last year, for example, the FBI says victims who reported attacks to the Bureau enriched cyber extortionists’ coffers by $24 million. But even if you’ve backed up your data in a safe place and choose not to pay the ransom, this doesn’t mean an attack won’t cost you. Victims of the CryptoWall ransomware, for example, have suffered an estimated $325 million in damages since that strain of ransomware was discovered in January 2015, according to the Cyber Threat Alliance (.pdf). The damages include the cost of disinfecting machines and restoring backup data—which can take days or weeks depending on the organization.

But don’t fear—you aren’t totally at the mercy of hackers. If you’re at risk for a ransomware attack, there are simple steps you can take to protect yourself and your business. Here’s what you should do.

First of All, Who Are Ransomware’s Prime Targets?

Any company or organization that depends on daily access to critical data—and can’t afford to lose access to it during the time it would take to respond to an attack—should be most worried about ransomware. That means banks, hospitals, Congress, police departments, and airlines and airports should all be on guard. But any large corporation or government agency is also at risk, including critical infrastructure, to a degree. Ransomware, for example, could affect the Windows systems that power and water plants use to monitor and configure operations, says Robert M. Lee, CEO at critical infrastructure security firm Dragos Security. The slightly relieving news is that ransomware, or at least the variants we know about to date, wouldn’t be able to infect the industrial control systems that actually run critical operations.

“Just because the Windows systems are gone, doesn’t mean the power just goes down,” he told WIRED. “[But] it could lock out operators from viewing or controlling the process.” In some industries that are heavily regulated, such as the nuclear power industry, this is enough to send a plant into automated shutdown, as regulations require when workers lose sight of operations.

Individual users are also at risk of ransomware attacks against home computers, and some of the suggestions below will apply to you as well, if you’re in that category.

1. Back Up, as Big Sean Says

The best defense against ransomware is to outwit attackers by not being vulnerable to their threats in the first place. This means backing up important data daily, so that even if your computers and servers get locked, you won’t be forced to pay to see your data again.

“More than 5,000 customers have called us for help with ransomware attacks in the last 12 months,” says Chris Doggett, senior vice president at Carbonite, which provides cloud backup services for individuals and small businesses. One health care customer lost access to 14 years of files, he says, and a community organization lost access to 170,000 files in an attack, but both had backed up their data to the cloud so they didn’t have to pay a ransom.

Some ransomware attackers search out backup systems to encrypt and lock, too, by first gaining entry to desktop systems and then manually working their way through a network to get to servers. So if you don’t back up to the cloud and instead back up to a local storage device or server, these should be offline and not directly connected to desktop systems where the ransomware or attacker can reach them.

“A lot of people store their documents in network shares,” says Anup Ghosh, CEO of security firm Invincea. “But network shares are as at risk as your desktop system in a ransomware infection. If the backups are done offline, and the backup is not reachable from the machine that is infected, then you’re fine.”

The same is true if you do your own machine backups with an external hard drive. Those drives should only be connected to a machine when doing backups, then disconnected. “If your backup drive is connected to the device at the time the ransomware runs, then it would also get encrypted,” he notes.
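To make the offline-backup advice above concrete, here is a minimal sketch of the idea in Python — my illustration, not something from the article. The source folders, mount point, and retention count are assumptions chosen for the example; the point is simply that the destination drive is attached only while the script runs and detached afterward, so ransomware that runs later never sees it.

```python
"""Minimal offline-backup sketch: copy important directories to an external
drive that is attached only while this script runs, then prune old snapshots.
Paths and retention count are illustrative assumptions, not recommendations."""

import shutil
import sys
from datetime import datetime
from pathlib import Path

SOURCES = [Path.home() / "Documents", Path.home() / "Pictures"]  # assumed sources
BACKUP_ROOT = Path("/mnt/offline_backup")  # external drive, mounted only for backups
KEEP_SNAPSHOTS = 7                         # how many dated snapshots to retain


def run_backup() -> Path:
    if not BACKUP_ROOT.exists():
        sys.exit("Backup drive not mounted; attach it, run the backup, then detach.")
    snapshot = BACKUP_ROOT / datetime.now().strftime("%Y-%m-%d_%H%M%S")
    for src in SOURCES:
        if src.exists():
            # copytree preserves the directory structure inside the dated snapshot
            shutil.copytree(src, snapshot / src.name)
    return snapshot


def prune_old_snapshots() -> None:
    snapshots = sorted(p for p in BACKUP_ROOT.iterdir() if p.is_dir())
    for old in snapshots[:-KEEP_SNAPSHOTS]:
        shutil.rmtree(old)


if __name__ == "__main__":
    print(f"Snapshot written to {run_backup()}")
    prune_old_snapshots()
```

A real deployment would add verification of the copied files and alerting when a backup is missed, but the connect-copy-disconnect rhythm is the part that blunts a ransomware attack.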

Backups won’t necessarily make a ransomware attack painless, however, since it can take a week or more to restore data, during which business operations may be impaired or halted.

“We’ve seen hospitals elect to pay the ransom because lives are on the line and presumably the downtime that was associated, even if they had the ability to recover, was not considered acceptable,” says Doggett.

2. Just Say No—To Suspicious Emails and Links

The primary method of infecting victims with ransomware involves every hacker’s favorite bait—the “spray-‘n’-pray” phishing attack, which involves spamming you with emails that carry a malicious attachment or instruct you to click on a URL where malware surreptitiously crawls into your machine. The recent ransomware attacks targeting Congressional members prompted the House IT staff to temporarily block access to Yahoo email accounts, which apparently were the accounts the attackers were phishing.

But ransomware hackers have also adopted another highly successful method — malvertising — which involves compromising an advertiser’s network by embedding malware in ads that get delivered through web sites you know and trust, such as the malvertising attacks that recently struck the New York Times and BBC. Ad blockers are one way to block malicious ads, and patching known browser security holes will also thwart some malvertising.

When it comes to phishing attacks, experts are divided about the effectiveness of user training to educate workers on how to spot such attacks and right-click on email attachments to scan them for malware before opening. But with good training, “you can actually truly get a dramatic decrease in click-happy employees,” says Stu Sjouwerman, CEO of KnowBe4, which does security awareness training for companies. “You send them frequent simulated phishing attacks, and it starts to become a game. You make it part of your culture and if you, once a month, send a simulated attack, that will get people on their toes.” He says with awareness training he’s seen the number of workers clicking on phishing attacks drop from 15.9 percent to just 1.2 percent in some companies.

Doggett agrees that user training has a role to play in stopping ransomware.

“I see far too many people who don’t know the security 101 basics or simply don’t choose to follow them,” says Doggett. “So the IT department or security folks have a very significant role to play [to educate users].”

3. Patch and Block

But users should never be considered the stop-gap for infections, Ghosh says. “Users will open attachments, they will visit sites that are infected, and when that happens, you just need to make sure that your security technology protects you,” he says.

His stance isn’t surprising, since his company sells an end-point security product designed to protect desktop systems from infection. The product, called X, uses deep learning to detect ransomware and other malware, and Ghosh says a recent test of his product blocked 100 percent of attacks from 64 malicious web sites.

But no security product is infallible—otherwise individuals and businesses wouldn’t be getting hit with so much ransomware and other malware these days. That’s why companies should take other standard security measures to protect themselves, such as patching software security holes to prevent malicious software from exploiting them to infect systems.

“In web attacks, they’re exploiting vulnerabilities in your third-party plug-ins—Java and Flash—so obviously keeping those up to date is helpful,” Ghosh says.

Whitelisting software applications running on machines is another way Sjouwerman says you can resist attacks, since the lists won’t let your computer install anything that’s not already approved. Administrators first scan a machine to note the legitimate applications running on it, then configure it to prevent any other executable files from running or installing.
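As a rough sketch of what whitelisting means in practice — my illustration, not a tool mentioned in the article — the Python snippet below hashes executables and flags anything whose SHA-256 digest is not on a pre-approved list. The allowlist file and scan directory are assumed names; actual enforcement is done by operating-system policy that an administrator configures, not by an after-the-fact scan like this.

```python
"""Toy application-allowlist check: compare SHA-256 hashes of executables in a
directory against an approved list. Illustrative only; real whitelisting is
enforced by the operating system before a program runs."""

import hashlib
import json
from pathlib import Path

ALLOWLIST_FILE = Path("approved_hashes.json")  # assumed: {"sha256 hex": "app name"}
SCAN_DIR = Path("C:/Program Files")            # assumed scan location


def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def scan(scan_dir: Path, allowlist: dict) -> list:
    """Return executables whose hash is not in the approved list."""
    unapproved = []
    for exe in scan_dir.rglob("*.exe"):
        if sha256_of(exe) not in allowlist:
            unapproved.append(exe)
    return unapproved


if __name__ == "__main__":
    approved = json.loads(ALLOWLIST_FILE.read_text())
    for exe in scan(SCAN_DIR, approved):
        print(f"NOT APPROVED: {exe}")
```

The design choice worth noting is the default-deny posture: anything not explicitly approved is treated as suspect, which is the inverse of signature-based antivirus.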

Other methods network administrators can use include limiting systems’ permissions to prevent malware from installing on systems without an administrator’s password. Administrators can also segment access to critical data using redundant servers. Rather than letting thousands of employees access files on a single server, they can break employees into smaller groups, so that if one server gets locked by ransomware, it won’t affect everyone. This tactic also forces attackers to locate and lock down more servers to make their assault effective.

4. Got an Infection? Disconnect

When MedStar Health got hit with ransomware earlier this year, administrators immediately shut down most of the organization’s network operations to prevent the infection from spreading. Sjouwerman, whose firm distributes a 20-page “hostage manual” (.pdf) on how to prevent and respond to ransomware, says that not only should administrators disconnect infected systems from the corporate network, they should also disable Wi-Fi and Bluetooth on machines to prevent the malware from spreading to other machines via those methods.
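As an illustration of that “disconnect first” step, here is a hypothetical quarantine script for a Linux host — my sketch, not MedStar’s or KnowBe4’s procedure — that takes every non-loopback network interface down and blocks the Wi-Fi and Bluetooth radios. It assumes root privileges and the standard iproute2 and rfkill utilities.

```python
"""Quarantine sketch for a Linux machine suspected of ransomware infection:
bring every non-loopback interface down and block Wi-Fi/Bluetooth radios.
Requires root; uses standard iproute2 and rfkill commands."""

import json
import subprocess


def list_interfaces() -> list:
    # `ip -j link show` prints interface details as JSON on modern iproute2
    out = subprocess.run(["ip", "-j", "link", "show"],
                         capture_output=True, text=True, check=True).stdout
    return [link["ifname"] for link in json.loads(out)]


def quarantine() -> None:
    for ifname in list_interfaces():
        if ifname == "lo":
            continue  # keep loopback so local tooling still works
        subprocess.run(["ip", "link", "set", "dev", ifname, "down"], check=False)
        print(f"took {ifname} down")
    # Block the radios as well, per the 'disable Wi-Fi and Bluetooth' advice
    subprocess.run(["rfkill", "block", "wifi"], check=False)
    subprocess.run(["rfkill", "block", "bluetooth"], check=False)


if __name__ == "__main__":
    quarantine()
```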

After that, victims should determine what strain of ransomware infected them. If it’s a known variant, anti-virus companies like Kaspersky Lab may have decryptors to help unlock files or bypass the lock without paying a ransom, depending on the quality of the encryption method the attackers used.

But if you haven’t backed up your data and can’t find a method to get around the encryption, your only option to get access to your data is to pay the ransom. Although the FBI recommends not paying, Ghosh says he understands the impulse.

“In traditional hacks, there is no pain for the user, and people move on,” he says. But ransomware can immediately bring business operations to a halt. And in the case of individual victims who can’t access family photos and other personal files when home systems get hit, “the pain involved with that is so off the charts…. As security people, it’s easy to say no [to paying]. Why would you feed the engine that’s going to drive more ransomware attacks? But … it’s kind of hard to tell someone don’t pay the money, because you’re not in their shoes.”

https://www.wired.com/2017/05/ransomware-meltdown-experts-warned/

 

Pentagon Networks of Expendable Platforms

Standard

33817-DARPA-Swarm-oldsite

Photo: DARPA’s swarming concept (DARPA)

“NATIONAL DEFENSE MAGAZINE”

“Teams of lower-cost, unmanned systems that don’t need to return from battle will be critical for future warfighting, the head of the Pentagon’s Strategic Capabilities Office said March 28.

Potential adversaries are developing new military technologies that are putting expensive U.S. military platforms and personnel at greater risk, William Roper noted at an Air Force Association conference in Arlington, Virginia.

“Increasingly we’re going to ask our designers, including those in industry, to help us shift all of the dangerous jobs in combat — as many of them as we can do in an ethical way — to machines that can take the brunt of at least that initial edge of conflict so that … we have the maximum number of our operators returning home safely,” he said.

Much of the technology required already exists, he said.

The Strategic Capabilities Office, also known as the SCO, has partnered with Defense Department research laboratories and other organizations on a number of projects along these lines.

One, called Perdix, demonstrated the ability of a fighter jet to launch a swarm of autonomous drones capable of performing intelligence, surveillance and reconnaissance missions.

Another, called Avatar, is a robotic “wingman” concept that would pair unmanned aircraft with a manned fighter. Doing so would reduce the number of pilots in harm’s way. The SCO is working on a similar concept for the Army, Roper said.

The office also has a program aimed at creating “a ghost fleet of expendable boats” that could team with U.S. Navy vessels, he said.

These types of systems offer an advantage over most of today’s platforms, he noted.

“All the things we build are expensive, and if they take off it’s our expectation that they come home and land,” he said. “That hasn’t been an issue until now” when there is greater concern about fighting advanced adversaries.

Requiring a high level of survivability is a huge constraint for system designers and operational planners, Roper said. Manned platforms have to be protected and refueled. They also require more maintenance and sustainment. That translates to higher costs for the Defense Department, he added.

Using relatively low-priced robotic systems to perform high-risk missions would provide greater operational flexibility and lower the costs of a loss or mishap, he said.

“There’s a reason why we don’t take fine china and crystal to have picnics anymore,” he said. “Once you’ve used paper plates and Dixie cups, you’re not coming back from that. It makes it a completely different experience. We haven’t had that equivalent in the military.”

Advances in autonomy, teaming technologies, artificial intelligence and machine learning are enabling a greater reliance on robots, Roper noted.

“I think you’re going to see that more and more,” he said.  “Making a team of things perform a function that only an expensive thing would have done in the past.”

Despite these advances, humans will not be completely cut out of the loop. Nor will the Pentagon cease to buy high-ticket equipment, Roper said. But the role of high-priced, manned platforms could change.

“What I think … our high-end tactical systems will become is less weapon-slingers and they’ll become more like command hubs,” he said.

Roper likened the human warfighter of the future to an NFL quarterback running an offense. “They’re the ones that call the audibles … and it’s the team [of robots] that runs the play that has been picked,” he said.

This manned-unmanned teaming concept is driving much of what the Strategic Capabilities Office is working on, he said.

While machines are becoming smarter and more capable, they still have limitations, he noted. “Autonomy is very good at making brute force elegant,” Roper said. “But it’s very difficult for it to make strategic choices especially outside of the data set on which it’s built.”

Machines are more likely to fail when presented with a decision that they haven’t been programmed to make, he said.

“What that tells me is that I’m going to need people connected to the machine to help make choices when it’s that thing that hasn’t been seen before,” he said. “People are great at …  quickly being able to think strategically [and] get down to action in a way that’s cognizant of the risks that are being taken.”

http://www.nationaldefensemagazine.org/blog/Lists/Posts/Post.aspx?ID=2465

We Need to Audit the Pentagon

Standard

videopentagon575

“THE PROJECT ON GOVERNMENT OVERSIGHT (POGO)”

“In 1994 Congress passed legislation requiring every federal agency to be auditable.

Since then every agency has complied—except for the Department of Defense.

“We have known for many years that the Department’s business practices are archaic and wasteful, and its inability to pass a clean audit is a longstanding travesty,” Chairs John McCain (R-AZ) and Mac Thornberry (R-TX) of the Senate and House Armed Services Committees said recently in a joint statement. “The reason these problems persist is simple: a failure of leadership and a lack of accountability.”

The Department’s… inability to pass a clean audit is a longstanding travesty

Increasing Pentagon spending under these circumstances is the opposite of fiscal responsibility. In fact, giving the Pentagon $54 billion and finding out why later is bad budgeting.

Both the Republican and Democratic party platforms included the need to audit the Pentagon, and Congress should resist calls to give more money to an agency they know to be irresponsible with taxpayer dollars.

You can learn more about the seemingly endless saga surrounding the Pentagon’s utter failure to get a clean audit opinion here.”

http://www.pogo.org/straus/issues/defense-budget/2017/pentagon-audit-needed-oversight.html