DHS’s Napolitano Resigns: The Impact

Secretary Leaving to Head University of California System

By Eric Chabrow, July 12, 2013. Follow Eric @GovInfoSecurity


The resignation of Janet Napolitano as homeland security secretary could have an adverse impact on the nation’s cybersecurity policy, at least temporarily, considering the posts of deputy secretary and deputy undersecretary for cybersecurity remain vacant.

“It is never good when leadership positions are vacant, as there is a loss of momentum for any initiatives under way,” says Karen Evans, who served as de facto federal chief information officer in the George W. Bush White House.

Napolitano announced on July 12 that she is leaving as homeland security secretary to become chancellor of the University of California system. She plans to leave the department by early September, according to a letter she sent to staff.

Deputy Secretary Jane Holl Lute and Deputy Undersecretary for Cybersecurity Mark Weatherford resigned this spring [see DHS’s Mark Weatherford Resigning]. Unlike the secretary and deputy secretary positions, Weatherford’s former job does not require Senate confirmation. That means a candidate for that job could be named before Napolitano departs.

“They have candidates for all the positions and are very close to announcing selections,” says James Lewis, senior fellow at the Center for Strategic and International Studies, a Washington think tank. “All the selections are good.”


Meeting Milestones, Remaining Calm

Patricia Titus, the onetime chief information security officer at DHS’s Transportation Security Administration, says the acting officials are performing excellently in their temporary roles. “The career employees have been meeting milestones, remaining calm and carrying on with the work at hand,” Titus says. “Filling these key roles will be imperative; however, there certainly are still great people working hard.”

Rand Beers, undersecretary for national protection and programs, is acting deputy secretary; Bruce McConnell, senior counselor for cybersecurity, is filling Weatherford’s old job temporarily; and Bobby Stempfley, deputy assistant secretary, is the interim assistant secretary for cybersecurity and communications, replacing Michael Locatis, who resigned in January [see DHS Losing a Senior Cybersecurity Leader].

Jay Carney, the president’s press secretary, says he has no names to float for a Napolitano replacement. “The president will be very deliberate about looking at potential successors for that very important position,” he says.


Cybersecurity Focus

During Napolitano’s tenure, cybersecurity became a national priority and a core mission at DHS. Here’s how Napolitano put it in her third annual address on the state of homeland security delivered in February:

“There’s perhaps no better example of how DHS has evolved to address new and evolving threats than the issue of cybersecurity. The cyber realm wasn’t even a major focus of the early department. Now, it is one of our five core mission areas.”

Sen. Jay Rockefeller, the West Virginia Democrat who has cosponsored comprehensive cybersecurity legislation, says he worked closely with Napolitano, “particularly when few people were thinking about this issue. It took no time for her to understand the enormous consequences of inaction.”

Cybersecurity should play a significant role during confirmation hearings for Napolitano’s and Lute’s successors. Questions from senators could focus on the role of government and DHS in protecting the mostly privately owned national critical infrastructure.

“However, I don’t expect that any serious nominee will have trouble with these questions, in light of the conventional wisdom about the importance of the private sector leading the way,” says Allan Friedman, research director of the Brookings Institution’s Center for Technology Innovation.

Another line of questioning could focus on recent disclosures about National Security Agency programs collecting information on American citizens. “We can expect questions about how they will work with the defense and intelligence community, but I would be surprised if this became a serious political obstacle,” Friedman says.


A Top Spokesperson

Since her confirmation as secretary in 2009, Napolitano has been one of the administration’s top spokespersons on cybersecurity. That’s especially true in testimony before Congress, where, because of executive privilege, the White House’s cybersecurity coordinator and other Obama advisers will not testify.

But being a face for administration cybersecurity policy doesn’t mean the policy is where it should be. “The big problems at DHS are lack of authorities and lack of a clearly defined mission,” CSIS’s Lewis says.

Brookings’ Friedman picks up on that theme, saying the absence of any clear strategic program in cybersecurity could be seen as a failure or a success.

“On one hand, there is no clear sign or accomplishment to demonstrate progress made in the last four years,” Friedman says. “On the other hand, the failure of large initiatives was largely a failure of legislation, and DHS has been instrumental at coordinating numerous initiatives across the federal government. In retrospect, small improvements and programs might be seen as better than a failed comprehensive plan.”


New Blood

Larry Clinton, chief executive of the industry group Internet Security Alliance, points out that Napolitano endorsed the administration’s earlier position that the government could regulate the cybersecurity of critical private businesses in certain instances, a position groups like his opposed and the White House eventually abandoned.

The administration’s new position, reflected in President Obama’s cybersecurity executive order and backed by Napolitano, would have the federal government, working with industry, develop cybersecurity best practices that critical infrastructure operators could voluntarily adopt, an initiative being carried out by the National Institute of Standards and Technology [see Man Behind the Cybersecurity Framework ].

“It may be a good thing to make the change now so a new person who is more committed to the new model can be brought on and help it through its current formative stage and remain in place as it is implemented,” Clinton says.

Day-to-day activities are not likely to be adversely affected by the vacancies, but high-level relations could be, says Evans, who now heads the U.S. Cyber Challenge. “There are good career staff in place who will continue the administration’s priorities,” she says. “But you do need the leadership there in the meetings with the DoD and intelligence community to ensure the balance of all the agencies as well as the private sector.”


Web industry officials balk at domain expansion plan

By Craig Timberg, Published: July 14


A plan to dramatically expand the number of possible Web addresses — by adding more than 1,000 new domains such as “.buy,” “.casino” and “.gay” — could cause widespread disruption to Internet operations, say some industry officials.

Efforts to augment existing domains such as “.com” and “.gov” have been underway for several years and are entering a critical new phase as industry officials meet at an international conference that began Sunday in Durban, South Africa. By summer’s end, the new domains could be going live at a pace of 20 or more each week.

The plan has touched off a scramble among investors eager to gain control of the virgin Internet real estate, potentially worth billions of dollars in annual licensing fees. But a vocal group of critics is calling the speed and scale of the expansion reckless, given its possible impact on the Internet’s global infrastructure, which relies on interactions among computer networks owned by companies, universities and individual users.

Particularly troubling is the possibility of widespread “name collisions” that could happen when domains used by internal corporate computer systems — such as “.corp” or “.home” — get assigned to the Web more broadly. This could cause systems to fail, blocking access to e-mail or other internal programs, and also could open sensitive information to theft, some experts say.
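The collision risk described above can be illustrated with a minimal sketch: an audit that flags internal hostnames whose private suffix matches a newly delegated public top-level domain. The suffix list and hostnames below are illustrative examples drawn from the article, not from ICANN’s actual application data.

```python
# Minimal "name collision" audit sketch: flag internal hostnames whose
# top-level label matches a newly delegated public gTLD, meaning lookups
# that once stayed inside the corporate network could now resolve publicly.

NEW_GTLDS = {"corp", "home", "med", "center"}  # examples cited in the article

def collision_risks(internal_hosts):
    """Return hosts whose top-level label would now resolve publicly."""
    risks = []
    for host in internal_hosts:
        tld = host.rstrip(".").rsplit(".", 1)[-1].lower()
        if tld in NEW_GTLDS:
            risks.append(host)
    return risks

hosts = ["mail.example.corp", "files.intranet.home", "www.example.com"]
print(collision_risks(hosts))  # the two private-suffix names are flagged
```

A real audit would also have to account for search-list behavior and unqualified names, which is part of why critics argue the impact is hard to bound in advance.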

“This could affect a million enterprises,” said Danny McPherson, chief security officer for Verisign, which is based in Reston and manages several of the most popular existing domains. “It could absolutely break things.”

McPherson and other security experts say the nonprofit group that oversees the designation of Web addresses, the Internet Corporation for Assigned Names and Numbers (usually known by its acronym, ICANN), has not done enough study on the impact of the new domain names and does not have procedures in place to respond quickly if systems malfunction. Among those posing risk could be domains such as “.med” or “.center” that might be critical to the functioning of medical systems or emergency-response networks.

Similar concerns have been expressed by the Association of National Advertisers, which represents hundreds of major companies, and the Internet commerce site PayPal, which issued a letter in March saying, “The potential for malicious abuse is extraordinary, [and] the incidental damage will be large even in the absence of malicious intent.”


Defenders of the plan have called such fears overblown, arguing that the potential problems have been long understood and will be resolved before new domains are approved. Because the new domains will be released gradually, over the course of months, there will be time to manage problems as they arise, said Jeffrey Moss, chief security officer for ICANN.

“It’s not like it’s a runaway train without recourse,” Moss said. “We’re not going to do anything that harms the security or stability of the Internet.”

U.S. officials who oversee Web security issues through the Commerce Department’s National Telecommunications and Information Administration expressed confidence in the management of the domain program, issuing a statement saying, “We would expect these issues to be discussed and resolved within the ICANN multistakeholder process.”

Whoever wins control of the new domains will be allowed to sell licensing rights for the resulting new Web addresses, typically for annual fees, with a portion going to fund ICANN, which is based in Southern California. Just bidding for a domain costs $185,000.

Donuts Inc., an investment group that made the largest number of bids, with 307, said Verisign’s criticism of the process for launching the new domains was a result of self-interest. Verisign controls the popular “.com” and “.net” domains — giving it a degree of market power that could be diluted if new ones gain widespread acceptance.

“ICANN was created in large part to break Verisign’s monopoly over domain names,” Donuts spokesman Mason Cole said in a statement. “Now that the organization is on the verge of achieving that goal, it’s not surprising that Verisign is uncomfortable.”

Verisign officials say they support the program for adding new domains but believe the rollout should proceed more cautiously than currently planned.

The stakes are high in an era when a large and growing share of the world’s economic activity happens over the Internet. Even traditional brick-and-mortar businesses use online systems to communicate, manage inventories and interact with customers. Many also count on the security of networked computer systems to protect lucrative intellectual property and other valuable strategic information.

Moss, the ICANN security chief, acknowledged that some internal corporate systems will malfunction as new domains are created, and he said it would be the responsibility of company officials to resolve these problems.

“We want everything to work, and we’re going to try to make everything work, but we can’t control everybody’s networks on the planet,” he said.

Moss said the number of domains likely to cause problems is a “really, really small number.”

But critics have said it is irresponsible for ICANN to approve new domains before it knows the extent of the problems they would create and has plans in place to fix them. The cost of repairing systems — or the loss of security — would be borne by private companies that in most cases have little to gain from the hundreds of new Internet domains.

In addition to expressing such security concerns, corporate leaders have been complaining that the sheer number of new domains will cause a sharp rise in fraud and abuse as criminals buy up Web addresses intended to deceive consumers. Already, many companies are attempting to defend against this by acquiring many different Web addresses that include their corporate names. But that will become far more difficult, they say, with hundreds of new domains, including “.corp,” “.inc” and “.sucks.”

“If everything ran perfectly, this would extraordinarily transform the Internet,” said Dan Jaffe, executive vice president of the Association of National Advertisers. “There is every reason to believe that, as of now, there could be serious problems.”





New Data Link Enables Stealthy Comms    

Defense News

Jul. 14, 2013 – 04:36PM

By AARON MEHTA         


WASHINGTON — Pentagon officials have long identified the F-35 joint strike fighter as key to the future of America’s defense, in large part due to stealth capabilities that should allow the plane to travel in contested environments that older fighters would struggle to penetrate.

The problem is, these planes need to talk to each other without sacrificing stealth. To tackle that problem, the F-35 has incorporated Northrop Grumman’s Multifunction Advanced Data Link (MADL), a system that’s undergoing testing in the California desert.

MADL is a digital waveform designed for secure transmission of voice and data between F-35s, with the potential of linking F-35s to ground stations or other aircraft, Northrop said.

Think of the system as a computer. The communications, navigation and identification (CNI) system on an F-35 can manage 27 different waveforms, including MADL. The data comes through the antenna, is turned into digitized bits, and is crunched by the on-board systems to get the relevant information to the pilots.

The system will be included in the 2B software package that the US Marine Corps’ F-35B jump-jet variant and the US Air Force’s F-35A conventional version will use when they reach initial operating capability in 2015 and 2016, respectively. It also will be included in all international versions of the jet. The US Navy’s F-35C carrier variant is expected to reach IOC in 2019 with the block 3F software, which will incorporate MADL and other capabilities.

What makes MADL more than just a communications tool is its ability to connect with other planes and automatically share situational awareness data between fighters. The more planes in the network, the more data is shared and the more comprehensive the picture that is formed.

Picture a group of jets flying in formation. The pilot farthest to the right will have a different situational awareness picture than the pilot on the left. But once they’re networked, all the information is automatically shared among the pilots.

Prior to takeoff, planes would be designated with partners to form the network. When a plane gets within range, the network is automatically created.

“Like on your computer, your network into the local area, we’re building that network in the sky and it’s keeping up with all the dynamics and spatial changes,” said Bob Gough, director of CNI technology and programs at Northrop. “MADL has the smarts to keep up with all of that and keep the network in place so they can share the same data.”

Gough declined to say how close jets need to be to trigger the network link, but did say tests have shown “very fast” acquisition times once within range.

Live flight system tests at Edwards Air Force Base, Calif., began late last year and have continued throughout this year. Initially, the tests involved networking a pair of planes, but recently, test pilots began regularly flying four-plane networks. Those tests are proceeding smoothly, said Joe DellaVedova, a spokesman for the Pentagon’s F-35 Joint Program Office.

“MADL testing is performing as planned,” DellaVedova wrote in an email. “Development of the advanced data link is currently tracking to deliver the phased capability expected by the end of development.”

The system is designed for plane-to-plane communications only, something Gough expects to continue in the near term. But he did not rule out experimenting with data transfer to other terminals.

“We have postulated MADL terminals on ships and we have built a MADL test ground station, so it could be done,” he said. “But it’s more about the logistics of where F-35s will be flying and how close to the ground they would be. It would be mission-scenario dependent, but it’s all technically possible.”

In the long term, Northrop hopes to expand the technology to other fifth-generation planes. That’s not a new idea; in 2008, MADL was slated to go on the F-22 Raptor fighter and B-2 bomber. But it never went on those jets, something the former Air Force chief of staff, Gen. Norton Schwartz, blamed on the technology’s lack of maturity during congressional testimony in 2009.

“We believe as the flight test program matures, it will be more likely” to end up on other platforms, Gough said.

That could include using MADL to communicate between fifth-generation fighters like the JSF and fourth-generation fighters, such as an F-16. Gough said he hopes to begin research on fifth-to-fourth generation data transfers “as soon as” next year.



Where’s the Best Place to Put a Wind Farm?

Scientific American

Sunday, July 14, 2013


If you want to generate electricity from the sun or wind, it makes sense to go where it’s sunniest and windiest. But part of the reason to generate electricity from such renewable resources is to cut back on the pollution that comes from burning fossil fuels. And if you take that into account, the best places change.

Researchers at Carnegie Mellon University totaled up the health and environmental benefits of renewable electricity across the U.S. Such benefits range from decreased respiratory disease thanks to less soot all the way to mitigation of climate change.

And when you consider all those factors, a solar panel in New Jersey delivers more overall benefits than photovoltaics in far sunnier Arizona.

The difference comes down to what the renewable power replaces. For example, a wind turbine in West Virginia that cuts down on coal burning avoids 33 times as much health and environmental damage as would the same wind turbine in California. The analysis is in the Proceedings of the National Academy of Sciences.

So where’s the best place to put renewable power? According to this study, we need to be building wind farms in Indiana. They’ll fit in nicely among the corn.

—David Biello


Climate change will disrupt energy supplies, DOE warns

Wendy Koch, USA TODAY 8:42 a.m. EDT July 12, 2013


U.S. energy supplies will likely face more severe disruptions because of climate change and extreme weather, which have already caused blackouts and lowered production at power plants, a government report warned Thursday.

What’s driving these vulnerabilities? Rising temperatures, up 1.5 degrees Fahrenheit in the last century, and the resulting sea level rise, which are accompanied by drought, heat waves, storms and wildfires, according to the U.S. Department of Energy.

“It (climate change) is a very serious problem and it will get worse,” says Jonathan Pershing, who oversaw the report’s development. While impacts will vary by region, “no part of the country is immune,” he says. He adds that climate change is exacerbating extreme events.

“Sea level rise made Sandy worse,” Pershing says, noting that it intensified flooding. When the superstorm slammed the East Coast last year, it took down power lines, damaged power plants and left millions of people in the dark.

The report comes one week after President Obama, describing climate change as a threat to future generations, called for action to address the problem “before it’s too late.” He said he aims to cut heat-trapping greenhouse gas emissions from new and existing power plants.

Echoing other research, the DOE report makes the case for why such reductions are needed. It says coastal power plants are at risk from sea level rise and power lines operate less efficiently in higher temperatures.

“The report accurately outlines the risks to the energy sector in the United States” and should serve as a “wake-up call,” says Jennifer Morgan, deputy director of climate and energy at the World Resources Institute, a non-profit that advocates for sustainability.

The report cites prior climate-related energy disruptions. Last year in Connecticut, the Millstone Nuclear Power Station shut down one reactor because the temperature of water needed to cool the facility — taken from the Long Island Sound — was too high. A similar problem caused power reductions in 2010 at the Hope Creek Nuclear Generating Station in New Jersey and the Limerick Generating Station in Pennsylvania.

Reduced snowpack in the Sierra Nevada mountains last year cut California’s hydroelectric power generation 8%, while drought caused the U.S. Army Corps of Engineers to stop the transport of oil and coal along the Mississippi River, where water levels were too low, according to the report. Also, in September 2010, water levels in Nevada’s Lake Mead fell to a 54-year low, prompting a 23% loss in the Hoover Dam’s generation.

While climate change is not the sole cause of drought, climate scientists say rising temperatures can exacerbate it by causing more moisture to evaporate from the soil. They say those temperatures, which the third federal National Climate Assessment says could rise 3 degrees to 10 degrees Fahrenheit by 2100, will contribute more to drought in the future.

In Texas, which is suffering a three-year drought that now affects 87% of its land, conflicts are arising over the water-intensive process of extracting oil or natural gas from shale deposits, known as hydraulic fracturing, or fracking. In 2011, Grand Prairie became the first in the state to ban city water for fracking. Other municipalities have restricted water use for that purpose.

Nationwide, 47% of fracking wells are in water-stressed areas, according to a report in May by Ceres, a Boston-based non-profit that promotes corporate sustainability.

The DOE report cites research indicating that nearly 60% of current thermoelectric power plants, which need water cooling to operate, are located in water-stressed areas.

It says higher temperatures will boost the demand for air conditioning, which could threaten energy security by forcing the nation’s power system to operate beyond ranges for which it was designed. It cites a study by DOE’s Argonne National Laboratory that found such peak demand, given current population levels, will require additional electricity equal to 100 new power plants.

The dire tone of the DOE report, while warranted, can “give a reader a sense of fatigue,” says Joe Casola, a senior scientist at C2ES, formerly the Pew Center on Global Climate Change. Yet he says it also points to solutions such as water-efficient technologies and protection for energy infrastructure.

“It’s technologically within our means to address some of these issues now,” Casola says. “There are a lot of things we can do.”

DOE’s Pershing agrees. “It’s a problem we need to work on,” he says. He notes that the billions of dollars in losses already incurred from climate-related disasters show the need for additional measures.



Combat air forces to resume flying

from Air Combat Command Public Affairs


7/15/2013 – JOINT BASE LANGLEY-EUSTIS, Va. — Combat Air Forces units from multiple commands will begin flying again July 15 after many stopped flying in April of this year due to sequestration.

The restored flying hour program represents $208 million of the $1.8 billion reprogramming allocation authorized by Congress. The money reinstates critical training and test operations for the CAF fleet across the Air Force for the remainder of FY13. This impacts not just Air Combat Command units, but also CAF units assigned to United States Air Forces Europe and Pacific Air Forces.

For ACC, the restored flying hours will be allocated to combat aircraft and crews across the command’s operational and test units, including the Air Warfare Center’s Weapons School, Aggressors and the Thunderbirds aerial demonstration team.


While the return to the skies means a return to crucial training and development for pilots, navigators, flight crews, mission crews and maintainers, the leader of the Air Force’s CAF fleet cautions that this is the beginning of the process, not the end.

“Since April we’ve been in a precipitous decline with regard to combat readiness,” said Gen. Mike Hostage, commander of Air Combat Command. “Returning to flying is an important first step but what we have ahead of us is a measured climb to recovery.”

“Our country counts on the U.S. Air Force to be there when needed–in hours or days, not weeks or months,” General Hostage said. “A fire department doesn’t have time to ‘spin up’ when a fire breaks out, and we don’t know where or when the next crisis will break out that will require an immediate Air Force response.”

The restoration of flying hours only addresses the next two and a half months of flying, up until October 1.

“This decision gets us through the next several months but not the next several years,” the general said. “While this paints a clearer picture for the remainder of FY13, important questions remain about FY14 and beyond. Budget uncertainty makes it difficult to determine whether we’ll be able to sustain a fully combat-ready force.”

Additionally, the restoration comes at a cost to future capability, including reduced investment in the recapitalization and modernization of the combat fleet.

“We are using investment dollars to pay current operational bills, and that approach is not without risk to our long-term effectiveness,” General Hostage said. “We can’t mortgage our future. America relies on the combat airpower we provide, and we need to be able to continue to deliver it.”


Why the enterprise can’t shake its email addiction

Forget new (and better) technologies — email is as entrenched in the business world as it’s ever been. Here’s why we can’t break free.

Howard Baldwin

July 15, 2013 (Computerworld)


Atos CEO Thierry Breton caught a lot of flak last year when he announced he wanted his employees to give up email, but he may have been onto something.

Kids these days don’t use email — digital market research company comScore found that use of Web-based email dropped 31% among 12- to 17-year-olds and 34% among 18- to 24-year-olds in the period between December 2010 and December 2011.

And consumers in general are also off email. The Radicati Group, which tracks use of email and other messaging media, projects the number of consumer emails will decrease by 3% to 4% each year between 2012 and 2016 (see chart, below right).

Then again, there was a reason Breton came in for so much derision: Enterprise email isn’t going anywhere. Or, more precisely, enterprise email usage isn’t going anywhere but up. Radicati is projecting the number of business emails to increase by 13% every single year between now and 2016.
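Radicati’s 13% figure compounds quickly. The back-of-the-envelope calculation below indexes the 2012 business-email volume at an arbitrary 100 (a placeholder, not a figure from the report) and applies 13% annual growth through 2016:

```python
# Compounding Radicati's projection: business email volume growing 13% a
# year. The 2012 baseline of 100 is an index value, not a reported figure.

volume = 100.0  # index the 2012 volume at 100
for year in range(2013, 2017):
    volume *= 1.13
    print(year, round(volume, 1))
```

At that rate, volume grows by more than 60% over four years (1.13⁴ ≈ 1.63), which is the boom the rest of this section is reacting to.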

For businesspeople, that means more time scrolling through the inbox (not only on PCs and laptops but now on tablets and smartphones) clicking past newsletters, social media notifications and spam in search of the messages they truly need to do their jobs, and then later filing, archiving and retrieving those messages.

For IT, that means more complaints from users about storage limits being too low (especially when Google lets them keep everything), as well as worries about security, archiving, retention, e-discovery, deletion and syncing mail between mobile devices. And then there’s the cost: In 2010, Gartner estimated that the various costs tied to email add up to $192 per user per year.

Why do we subject ourselves to this madness? Because for all its aggravations, email works. “It’s still an efficient way of communicating, almost in real time,” says Phil Bertolini, CIO of Michigan’s Oakland County, who’s responsible for 10,000 email boxes.

“It does what it’s designed to do quite well, which is allow us to securely communicate on a one-to-one or one-to-few basis,” says Rob Koplowitz, an analyst at Forrester Research.

Simply put, we may hate email, but we can’t work without it. But CIOs and messaging experts agree that something must change if enterprise email volume is going to boom the way Radicati’s numbers indicate. Email is going to have to get more sophisticated and, at the same time, easier to use. And the people doing the using, who often make life harder for themselves, need to evolve, too.


Why We Love Email

We love email because it’s useful and ubiquitous. It keeps us connected and updated without requiring sender and recipients to be online at the same time, thanks to its asynchronous nature. Everyone doing business today can reasonably be expected to have an email address, whereas only some people use alternative tools like chat, videoconferencing or SMS texting.

Beyond that, email creates a de facto audit trail with a record of who sent what to whom when. And, barring space limitations, that trail is readily available on one’s computer.

The result of this success? “Nobody can live without it for more than two minutes,” says Sara Radicati, president and CEO of The Radicati Group.

From Unix mail (b. 1972), IBM PROFS (b. 1981) and DEC All-In-1 (b. 1982) to email clients, integrated email (think Lotus Notes) and Web-based mail to today’s cloud-based options, email has evolved because we have needed it.

Bertolini is a big fan of email — since the public sector is still heavily paper-based, email still counts as a big technological step forward. “We can chase new technologies, but I need something that’s trusted and used by the masses. Even though there are people clamoring for newer ways to communicate, email is our main form of communication,” he says.


Why We Hate Email

Unfortunately, email’s positives — its utility and ubiquity — have become its negatives as well.

Consider this complaint: “It doesn’t matter if the message comes from a spammer hawking Viagra, your wife asking you to pick up some wine, your boss telling the company that Monday is a holiday, or a client asking for a meeting at his office at 11 a.m. In today’s inboxes, all email messages are equal,” journalist Om Malik wrote six years ago, in 2007. If anything, the situation has only gotten worse.

The problem, says Koplowitz, is that “we use email for things it wasn’t designed to do.” Hooked on email, users default to it for scheduling, workflow planning, resource management, archiving, document management, project management and even knowledge management. Often, ideas that should be shared widely are locked up in an email chain among a narrow list of recipients. “The things it does poorly have become problematic,” Koplowitz sums up.

Over the years, developers have tried to break through users’ dependence on email with software that’s more sophisticated and better suited to certain enterprise tasks — often with only limited success.

Knowledge management systems, touted in the 1990s as the next big thing, failed to catch on, while collaboration systems such as Lotus Notes and Microsoft SharePoint have met with mixed success; the inclusion of Chatter into Salesforce serves the specific needs of salespeople.

But typically these systems have failed to become as widespread as email because, while they offered a solution that may indeed have been superior to email, they did so only for a narrow population of users.

“There’s a high correlation in the success of these tools when they’re aligned with recognizable business value,” says Koplowitz. Unfortunately, he adds, there’s frequently an organizational mismatch. The tools that work for one department (e.g., sales) may not work for another (e.g., customer service).

And when a new communication tool like Yammer or Chatter does take hold throughout the enterprise, what happens? Users route their notifications to the one place they’re most likely to see them first — the omnipresent email inbox.


IT’s Email Burden

For IT, email is an ongoing headache. Niraj Jetly, CIO at Edenred USA, the Newton, Mass.-based U.S. division of a global developer of employee benefits and incentive solutions, cites a quartet of hassles: the sheer volume of messages; compliance and security concerns; the risks that arise when users access corporate email on their personal devices; and international routing problems.

“No one can support ever-increasing mailbox sizes,” he says. “At the same time, we have to ensure the safety and security of sensitive data being transmitted. We have to ensure the availability of emails archived by users on their laptops or desktops.”

As a divisional CIO within a multinational organization, Jetly also says getting email from continent to continent is a challenge. “It gets very tricky when different government [regulations] and private-sector contracts restrict email routing,” he explains. For instance, certain Payment Card Industry Data Security Standard mandates require that emails originating in the U.S. stay in the U.S.

The bring-your-own-device trend also worries him. “If an organization needs encrypted email but also supports BYOD, supporting access to corporate email on personal devices becomes a never-ending challenge,” Jetly says. “And if a user loses a personal device, who has liability for the loss of data?”

Pete Kardiasmenos, a systems architect at SBLI USA, manages the New York-based insurance company’s Exchange servers and gets involved with “anything relating to email.” His biggest issue: users turning to free external email systems, such as Yahoo Mail and Gmail, to circumvent corporate storage limits.

“They don’t have bad intentions. They want to know why they’re limited to 500 megabytes when Gmail is unlimited. It’s because the more space you have, the more time backup takes, the more complicated disaster recovery is. We have to constantly communicate our policies,” he says. Like a lot of big enterprises, SBLI USA has had to block access to public email systems from company-owned computers as a security measure, and it has had to limit space in Exchange for most users because of the cost of storage.

Even then, he says, email is still a headache. “People keep email in their inbox the same way they keep files on their desktop, to keep them handy. They send the same file back and forth as an attachment until you have 10 versions that you have to store.”

For Oakland County’s Bertolini, the challenge is management: managing passwords, and managing Outlook’s .pst backup files when they get too big. At least, he says, oversized files generate error messages. “We find out about it when [users] have a problem,” Bertolini says with a sigh.

“In one case, we discovered thousands of emails dating back to 2001,” Bertolini recalls. “And the real problem is that most of them dealt with trivia like meeting for lunch. There’s a cost to maintaining and managing email over time.”
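Catching oversized .pst files before users hit errors is straightforward to script. The Python sketch below is an illustrative example, not Oakland County's actual tooling; the 1.8 GB threshold is our assumption, chosen to stay under the 2 GB ceiling of Outlook's older ANSI .pst format:

```python
from pathlib import Path

# Assumed threshold: Outlook's older ANSI .pst format degrades near 2 GB,
# so flag anything above 1.8 GB for proactive archiving.
SIZE_LIMIT_BYTES = int(1.8 * 1024**3)

def find_oversized_psts(root):
    """Return (path, size) pairs for .pst files above the limit under root."""
    oversized = []
    for pst in Path(root).rglob("*.pst"):
        size = pst.stat().st_size
        if size > SIZE_LIMIT_BYTES:
            oversized.append((pst, size))
    # Largest offenders first, so admins can triage before users see errors
    return sorted(oversized, key=lambda item: item[1], reverse=True)
```

Run against users' home directories on a schedule, a report like this lets IT reach out before the "we find out when users have a problem" stage.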

IT’s biggest email-related burden is simply uptime, says Radicati. “The overriding concern for IT is making sure that it’s up and running and available,” she says.


Email's People Problem


Is the enterprise’s email addiction rooted in technology or in user behavior? Both, analysts say. “Email is only as good as the person who organizes it,” observes Sara Radicati, president and CEO of The Radicati Group, which tracks use of email and other messaging media.

Over the years, enterprise email systems have added an ever-increasing number of sophisticated organizational tools, but “users still have to train the system, which is where it breaks down,” Radicati explains. “Users forget how they set it up a certain way, and why. Somebody who is highly organized and structured will do well with these tools, and someone who is naturally chaotic will be chaotic.”

Adam Glick, Microsoft’s product manager for Exchange and Outlook, acknowledges that “you can change the tools, but you can’t change the people.” Citing one example of how the tools are changing, he notes that the current version of Office 2013 has an option that lets users ignore any email with a particular subject line if that thread has become irrelevant to the recipient. On a grander scale, Exchange and Outlook are becoming more of a communication hub, with greater integration of chat and unified communications, Glick says.

But all those advances will be meaningless if people don’t take advantage of the new functionality — and IT must help them do that.

“IT needs to explain how and when to use these features,” says Radicati, “and people need to learn to improve their efficiency.”


— Howard Baldwin


Email in the Cloud

So what’s IT supposed to do? Certainly, the cloud offers one of several ways to view email differently. Radicati is optimistic about email in the cloud. “It’s absolutely the way to go,” she says. “A lot of cloud-based email providers have archiving and compliance capabilities in place, and if you want more features, you can purchase them as an additional capability.”

In Oakland County, Bertolini is investigating using Microsoft Office 365 in the cloud. “There’s still a cost associated with storage, but part of our ROI analysis will be comparing the cost of storage in the cloud versus letting people keep more email,” he says, adding that he’s worried that if “you give them more storage, they will fill it up.”

But he also sees other advantages. “If I can host email externally and still have the safety and security the county government needs, I can save millions in the long term. We’d need two to three people to manage Microsoft Exchange, but if I go to the cloud, I don’t need those people. And in three or four years, I’m not replacing my mail servers.”

Still, questions remain. “A lot of IT departments are investigating moving email to the cloud,” Radicati says, “but there is still concern about whether it will be private enough, secure enough and reliable enough.”


Merging Communications Tools

Like many systems IT has to deal with, email’s boundaries are expanding, which means IT needs to begin thinking about email less as a silo and more as one component of a multimodal communications system.

Bertolini notes that the new generation of employees clamors for instant messaging — and he’s not against it. “They use it to collaborate. When they have chat, they can get things done in real time.” He’s also looking at more videoconferencing, first on a one-to-one basis from desktop to desktop, and then from conference room to conference room, and then into a multipoint video arraignment system for the public safety team, because it saves having to transport the county’s prisoners among facilities.

Fortunately, these communication mechanisms will start to merge, analysts predict. Two to five years from now, email won’t look tremendously different, but we won’t talk about it as a stand-alone tool as much as we do today, says Radicati. Instead, we’ll have a communications dashboard that includes email, instant messaging and social media.

These hubs will come about thanks to new open APIs, not only for social media applications like Facebook and LinkedIn, but also for unified communications protocols like Session Initiation Protocol (SIP) and Extensible Messaging and Presence Protocol (XMPP).
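As a taste of what "open" means here, an XMPP message is just a small, well-defined XML stanza that any client or dashboard can emit and parse. The Python helper below is a hypothetical illustration built with nothing but the standard library (the function name and addresses are ours):

```python
import xml.etree.ElementTree as ET

def make_message_stanza(sender, recipient, body_text):
    """Build a minimal XMPP <message/> stanza of type 'chat' (per RFC 6121)."""
    msg = ET.Element("message", {
        "from": sender,
        "to": recipient,
        "type": "chat",
    })
    body = ET.SubElement(msg, "body")
    body.text = body_text
    return ET.tostring(msg, encoding="unicode")
```

Because the wire format is this simple and standardized, a unified communications hub can treat chat from any XMPP-speaking system as one more feed alongside email.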

Forrester’s Koplowitz concurs. “Over the next few years, we’ll see greater integration across these tools. Think about how messaging is integrated into Gmail — you don’t have to switch back and forth because they’re all integrated together,” he says, citing similar functionality in systems from IBM (with Connections and Notes), Microsoft (with SharePoint and Yammer) and Facebook.

“We’ll have a new environment with new aspects of communication,” Koplowitz predicts. “Today they’re different tools, but in the next three to five years, they’ll be integrated.”

A Silicon Valley-based freelance writer, Baldwin is a frequent Computerworld contributor.

This version of this story was originally published in Computerworld’s print edition. It was adapted from an article that appeared earlier on



The tech that will make drones safe for civilian skies

By Patrick Marshall

Jul 12, 2013

Unmanned aircraft have proven their capabilities in Afghanistan and Pakistan, undertaking reconnaissance and combat missions without putting the lives of pilots at risk.  And now they’re coming home. “We are not darkening the skies yet,” said Richard Christiansen, vice-president of NASA contractor Sierra Lobo Inc., “but we are poised.”

Unmanned aerial vehicles (UAVs) are already used in domestic airspace. Police departments have tested them for surveillance operations, for example, and state environmental departments have used them to survey forests and wildlife. And the Department of Homeland Security has a fleet of eight drones tasked to monitor activity at borders.

So far, however, these deployments have only been permitted under carefully monitored exemptions to Federal Aviation Administration rules. But the FAA plans to integrate unmanned aircraft into civilian airspace by 2015, and it is currently in the process of selecting six locations around the country to explore the potential extent of such integration.

The major barrier to wider deployment is that current FAA rules require the pilot of a UAV to maintain line-of-sight contact with the aircraft. If that limitation is removed, and UAVs are integrated into civilian airspace, analysts expect the market to grow rapidly. An industry trade group, the Association for Unmanned Vehicle Systems International, projects sales of $90 billion over the next decade.

In the meantime, the rush is on to develop “sense-and-avoid” systems that will allow untethered flights.

Researchers agree that the basic technologies are already available to deliver effective collision warning and avoidance systems.  The challenges, they say, are primarily in engineering and systems integration.

“We know we can technically do it,” said Sanjiv Singh, research professor at Carnegie Mellon University’s Robotics Institute. “The question is whether we can do it within all of the other constraints.” And when it comes to UAVs, especially small UAVs, the primary constraints are size and weight.

Whether a sense-and-avoid system uses electro-optical cameras, laser radar (LIDAR) devices or transponders, the challenge is to make the devices small and light enough to be deployed on small UAVs. “It’s getting close,” said Ian Glenn, CEO of ING Robotic Aviation, a Canadian manufacturer. “We’re making them smaller and smaller. Absolutely the technology will get there.”

ADS-B taking off

According to Glenn, the simplest way to protect against mid-air collisions — whether the aircraft has a pilot or not — is to require the use of ADS-B transponders on all aircraft.  “These transponders can turn an uncooperative environment into a cooperative environment,” he said.

ADS-B (automatic dependent surveillance-broadcast) transponders not only broadcast aircraft location in real time, they also deliver information on altitude and velocity. What’s more, they can deliver data on other aircraft, weather and terrain to the vehicle that is carrying them. In fact, ADS-B transponders will replace radar as the primary technology for tracking air traffic,  and the FAA will require the majority of aircraft operating in U.S. airspace to be equipped with ADS-B by Jan. 1, 2020.
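To make the geometry concrete: given two ADS-B state vectors (position plus velocity), a receiver can compute when the aircraft will pass closest to each other and how far apart they will be at that moment. The sketch below is a deliberately simplified, flat-plane Python illustration of that closest-point-of-approach calculation, not any vendor's actual algorithm:

```python
import math

def time_of_closest_approach(p1, v1, p2, v2):
    """Seconds until two constant-velocity aircraft, modeled as points on a
    flat 2-D plane, are closest. Positions in meters, velocities in m/s."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]
    rel_speed_sq = dvx**2 + dvy**2
    if rel_speed_sq == 0:          # identical velocities: separation never changes
        return 0.0
    t = -(dx * dvx + dy * dvy) / rel_speed_sq
    return max(t, 0.0)             # closest approach cannot be in the past

def separation_at(p1, v1, p2, v2, t):
    """Distance in meters between the two aircraft t seconds from now."""
    x1, y1 = p1[0] + v1[0] * t, p1[1] + v1[1] * t
    x2, y2 = p2[0] + v2[0] * t, p2[1] + v2[1] * t
    return math.hypot(x2 - x1, y2 - y1)
```

If the predicted separation at closest approach falls below a protection threshold, a sense-and-avoid system would flag the conflict; real systems must additionally handle altitude, turns, and sensor uncertainty.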

Using ADS-B transponders on many UAVs was infeasible until recently, when Sagetech Corp., an avionics company based in White Salmon, Wash., developed one that weighed only 3.5 ounces.

Last November, ING Robotic Aviation equipped one of its 22-pound Serenity UAVs and a manned Harvard Mark IV, a fixed-wing, single-engine aircraft operated by Canada’s National Research Council, with Sagetech ADS-B transponders and Clarity receivers. The two aircraft were then flown for 90 minutes above an Ottawa air park.
“We were able to prove that we could fly, and they could see us a long way away,” said Glenn. “We were able to pick up [data on] traffic coming to Ottawa even as it was crossing New York’s border.” 

Even better, he said, the Clarity receiver is independently powered, which means that even older aircraft can adopt the technology. “One of the arguments by regulating agencies is, ‘Well, that’s great, but we have people flying Sopwith Camels. They don’t even have power supplies, and they don’t have a radio,'” said Glenn. “Well, here’s a little box that you can stick on the dashboard, with a battery that will last longer than your bladder. And it has its own little Wi-Fi and it’ll display on an iPad.”

One system for all sizes

Until all aircraft are equipped with ADS-B transponders and receivers, of course, UAV sense-and-avoid systems must rely on other technologies to detect other aircraft and to undertake evasive maneuvers.

One of the most ambitious civilian efforts under way to develop an integrated system for UAVs is the Mid-Air Collision Avoidance System (MIDCAS), which is being developed by five European countries — Sweden, Germany, France, Italy and Spain — and 11 industrial partners.

The four-year, $65 million project is expected by 2014 to deliver an automated sense-and-avoid system that will not depend on transponders. While it is being designed to integrate ADS-B, MIDCAS also includes two visible-band electro-optical cameras and one infrared camera for aircraft to use in identifying other aircraft. In addition, the team’s developers are designing image-processing algorithms, processing units and integration with on-board avionics.

Key to the project, said Saab Aerosystems’ Johan Pellebergs, MIDCAS project manager, is developing a generic set of sensors and processing modules. “By generic, we mean that it should be able to work on any type of UAS,” Pellebergs said. “It should be adaptable. So we try to keep all of the vehicle-specific parts well contained so that they can easily be adapted to all the different types. The variety in UAS is very big, ranging from the Global Hawk, which is very big, all the way down to small ones that you can hold in your hand.”

Pellebergs said the international team has developed a prototype system and is ready to test it on a manned aircraft. “The collision avoidance part is fully automatic,” he said. “The remote pilot does not need to do anything. If the system detects something, it calculates when it needs to activate. And when the aircraft gets to that point, it triggers and executes the moves automatically.”

It is the system’s control over evasive maneuvers that requires adaptability to each model of UAV. “That’s where the vehicle specifics come in,” Pellebergs said. “You need to be able to model the performance and limitations of each of the vehicles. There are large differences between air speed and maneuverability in these vehicles.”

That’s one reason MIDCAS is working closely with manufacturers of UAVs and sensors.
Another challenge has been designing the software to process the various sensor data.  According to Pellebergs, “The data fusion module takes the information from different sensors and makes one picture. Then it is sent over to the avoid part, where you calculate the maneuvers and execute them. It also sends information down to the ground control station.”
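As an illustration of the fusion step Pellebergs describes, the toy Python sketch below uses inverse-variance weighting, a standard way to combine independent sensor estimates into a single picture. The one-dimensional setup and function name are ours, not MIDCAS's:

```python
def fuse_estimates(estimates):
    """Fuse independent 1-D position estimates, each given as
    (value, variance), into one combined estimate. Less noisy sensors
    (smaller variance) get proportionally more weight."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused_value = sum(w * val for w, (val, _) in zip(weights, estimates)) / total
    fused_variance = 1.0 / total   # fused estimate is tighter than any input
    return fused_value, fused_variance
```

A real data fusion module works in three dimensions, tracks many targets over time, and must associate each reading with the right target, but the principle is the same: many sensor readings in, one picture out.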

Low-altitude hazards

Of course, the hazards for aircraft — manned and robotic — aren’t limited to other aircraft. 

“There are very few things above 500 feet to run into,” Singh said. But landing, taking off or operating under 500 feet — which is where many UAVs are designed to spend most of their flight time — there are many hazards, including trees, buildings and wires. “You have to go up and come down,” noted Singh. “I think that last hundred feet is pretty important.”

Accordingly, Singh has set his sights, in a sense, lower. “I work on the aspect of UAVs flying intelligently so that they can fly in what we call ‘near-earth environments.’ They are aware of their environment, they are aware of what they can do, they are aware of environmental conditions like wind, and then they plan their actions in such a way that they can stay safe.”

And being closer to the ground introduces other challenges. “Maybe you need to fly close to things, so the GPS is blocked by trees and buildings,” he said. “Maybe you need to operate in dusty conditions or at night. The problem is complex.”

As a result, Singh is working to integrate a variety of sensors and to develop the software to make them usable in UAVs. In addition to the visual sensors (cameras and infrared imagers), Singh is working to incorporate far infrared (effective for detecting features through fog or rain), radar (which can penetrate obstacles) and LIDAR (which is effective in detecting contours of objects). 

Singh was part of a team that recently enabled a full-size, autonomous helicopter to fly at low altitude, avoid obstacles, choose a landing site in unmapped terrain and successfully land. In June 2010, the team tested the sensor and navigation system at a Boeing test facility in Mesa, Ariz. Employing a laser scanner and 3D mapping software, the unmanned helicopter was able to avoid a 60-foot crane and high-tension wires, as well as other smaller obstructions, such as four-inch-high pallets, chain-link fences, vegetation and even people.

Government in the way?

While there is a long way to go before completely autonomous UAVs can safely operate in all environments and conditions, researchers say the basic technologies are already in place to allow widespread deployment right now, if government would move to set standards.

“The real challenge is not technological,” Glenn said. “The real challenge is regulatory acceptance. I think we’re close enough. The key is that we are able to be as good as manned aviation. So the issue is how to get federal aviation authorities around the world to get their minds around it.”

According to Glenn, if government regulatory agencies would specify the performance standards UAVs need to meet, he can design appropriate equipment. “You tell me what the requirement is, and I will build it,” he said. 

Pellebergs agrees. “No one really knows what the requirements are for sense-and-avoid for UASes in civilian airspace, so we need to get a set of standards in place,” he said. “I think that’s what’s holding up a lot of the progress in this area.”

Unfortunately, the Federal Aviation Administration declined our requests for comment.


Colorado town to consider drone hunting license, bounty

Dylan Stableford, Yahoo! News

July 17, 2013

A small town in Colorado is considering an ordinance that would create a license and bounty for hunters to shoot down drones.

“We do not want drones in town,” Phillip Steel, the Deer Trail, Colo., resident who drafted the ordinance, told Denver’s ABC7 affiliate. “They fly in town, they get shot down.”

Steel’s proposal, recently submitted to the town board, calls for a $25 drone hunting license and outlines “rules of engagement” for hunters looking to shoot down the unmanned aerial devices:

The Town of Deer Trail shall issue a reward of $100 to any shooter who presents a valid hunting license and the following identifiable parts of an unmanned aerial vehicle whose markings and configuration are consistent with those used on any similar craft known to be owned or operated by the United States federal government.

Steel said that while he’s never seen a drone flying in Deer Trail, the ordinance is a “symbolic” one.

“I do not believe in the idea of a surveillance society, and I believe we are headed that way,” he said.

According to the U.S. Census Bureau, Deer Trail’s population was 559 in 2011.

“They’ll sell like hotcakes,” Steel said of the proposed drone license. “It could be a huge moneymaker for the town.”

David Boyd, one of Deer Trail’s seven board members, supports the drone ordinance.

“Even if a tiny percentage of people get online (for a) drone license, that’s cool,” Boyd said. “That’s a lot of money to a small town like us. Could be known for it as well, which probably might be a mixed blessing, but what the heck.”

There’s even talk of the town—which claims to be home to “the world’s first rodeo”—hosting the world’s first drone hunt. “A skeet, fun-filled festival,” town clerk Kim Oldfield said.

The board will consider the drone hunting ordinance on Aug. 6.


Air Force 2027: Fewer pilots, more drones, more challenges

AF Times

Jul. 12, 2013

By Stephen Losey

The Air Force of the future is likely to be slightly smaller and more reliant on remotely piloted aircraft, face growing challenges from the rise of Asia and rapidly increasing space traffic, and struggle to maintain its technological superiority as the United States produces fewer scientists, engineers and other highly skilled graduates.

And to survive, former Chief Scientist Mark Maybury said in a June 21 report, the Air Force is going to need to adopt several “game-changing” strategies to keep up with emerging challenges and global threats between now and 2027. That could include adopting speedy acquisition strategies from commercial space companies such as SpaceX, making greater use of lasers and other directed energy weapons, and adopting advanced manufacturing techniques such as 3-D printing.

“If we presume the future is going to look like today, I think we are going to be sorely mistaken,” Mica Endsley, the Air Force’s new chief scientist, said at a July 11 breakfast hosted by the Air Force Association. “The future environment, even in the next decade, has some really significant potential threats that we need to be aware of and we need to be thinking about carefully.”

In the report, called “Global Horizons: United States Air Force Global Science and Technology Vision,” Maybury said that the Air Force’s manned air fleet is likely to shrink slightly by 2027. But the Air Force’s fleets of remotely piloted aircraft and their missions are likely to grow significantly.

The United States also needs to pay attention to the worldwide proliferation of RPAs as their cost plunges, Endsley said.

“It’s not just us, and that’s the big change that we need to take a look at,” Endsley said.

The report said that most RPAs today are designed for intelligence, surveillance and reconnaissance, but nations are investing significantly in combat drones. Some may even be capable of delivering weapons of mass destruction such as chemical, biological or nuclear weapons, and the U.S. needs to develop ways to detect and defeat them, the report said.

Advanced fighter aircraft are also likely to spread throughout the world, the report said, which could shrink the U.S. air superiority advantage. By 2025, the report said, 70 percent of foreign combat air forces will be made up of modern fourth- or fifth-generation aircraft, and adversaries are also likely to gain advanced systems, such as advanced missiles and other munitions that could challenge the Air Force.

One possible way the Air Force could counter other nations’ improving air capabilities is to develop and deploy high-powered lasers, the report said.

“Nothing moves faster than light, and advances in efficiencies, power levels, thermal management and optics made directed energy weaponry a game-changing contender,” the report said.

Adopting a more adaptive architecture for aircraft, including “plug-and-play” modular components like sensors and seekers, will allow the Air Force to quickly upgrade weapons systems, Maybury said. However, this approach also carries its own threats unless the Air Force builds in strong cybersecurity.

“Agility has to extend into our technologies,” Endsley said. “We can’t just build a system that is going to be stationary, that maybe we’ll upgrade someday in the future. We have to make sure the systems we’re building are built … in a modular way, in such a way that they can be modernized and upgraded very efficiently.”

And to make its acquisition process more nimble, Maybury said the Air Force of the future needs to learn a few lessons from SpaceX, Scaled Composites and other small, private space exploration companies.

The Air Force’s current acquisition process is incapable of producing innovative systems quickly and affordably, the report said. And the increasing complexity of integrating advanced technology into aircraft such as the F-35 will likely further slow the development process in the future. This “threaten[s] to erode the current decisive advantage” the Air Force now enjoys over its adversaries, Maybury said.

Maybury said the Air Force needs to emulate the rapid prototyping processes used by SpaceX and Scaled Composites, which he said produce aerospace vehicles 50 percent faster than under traditional acquisitions. SpaceX produced the Dragon capsule, which last year became the first commercial vehicle to dock with and deliver supplies to the International Space Station, and Scaled Composites won the Ansari X Prize for its SpaceShipOne, the first private manned spacecraft.

The Air Force needs to refocus its prototype efforts to provide early proofs of concepts and reduce technical uncertainty, the report said. And emphasizing technology demonstrations and creating open challenges could lead to more innovative technological breakthroughs, fill gaps in the Air Force’s capabilities, reduce risk aversion and energize the workforce, the report said.

“The key to doing science and technology well is making sure we build it on a platform that involves rapid innovation and effective testing,” Endsley said. “A lot of what we’ll be looking at is how can we enable that, both within our labs and in industry, to make sure that we are building these systems in effective ways, instead of investing a lot of money and then finding out way down the line that something’s not working like you thought it would.”

Maybury also said the Air Force could cut its development cycle time by 25 percent by using advanced, physics-based modeling and simulation tools. Those tools could help the Air Force assess how feasible and expensive it would be to integrate technologies into a system, identify technology that isn’t ready to be incorporated into systems, quantify risk at critical decision points, and avoid discovering defects late in the development process.

The report also said that using small, low-cost launch capability being developed by commercial industries such as SpaceX would allow the Air Force to more easily access space. This will become more important as space becomes increasingly competitive and congested, and more nations launch satellites and other spacecraft.

The nation’s eroding manufacturing base endangers the Air Force’s ability to design, develop, manufacture and deploy reliable and advanced technologies, the report said. But the rise of 3-D printing could help counter that decline. For example, 3-D printing could allow researchers to rapidly print parts needed for prototypes. And airmen in the field could 3-D print parts quickly to repair battle-damaged systems.

And the Air Force should also expand its use of a flexible hiring authority program, called the Laboratory Personnel Demonstration Project, to the entire acquisition workforce, the report said. This would allow the Air Force to hire scientists, engineers and other technical workers 70 percent faster than under the normal hiring process, which sometimes takes five months or more.

But Maybury is also concerned about the increasing competition worldwide for top scientists and engineers, and the United States’ decline in science, technology, engineering and mathematics, or STEM, graduates. This could erode the nation’s advantage in producing new technologies, he said.


Sequester, Part II, Gives Obama Control of the Budget

By Eric Pianin and Josh Boak, The Fiscal Times, July 16, 2013

Congress appears to be sliding toward a second year of sequestration—meaning the slashed budgets of this past year will soon receive an equally unpopular sequel. That might just mean more power for President Obama, as the executive branch must juggle budgetary choices that Congress refuses to make. The same political gridlock that set off the decade-long sequestration earlier this year has only worsened in recent months. House Republicans and Senate Democrats have proven incapable of agreeing to the basic contours for a 2014 budget. Nor can they work together on immigration reform. And with the Farm Bill, the fate of the food stamp program remains up in the air.

What was once unthinkable—broad cuts in defense and domestic discretionary spending—has become standard operating procedure. The administration still bemoans the sequester cuts that were introduced in a failed attempt to force a compromise on deficit reduction in the 2011 Budget Control Act. Sequestration has had serious impact on some government employees and recipients of federal aid, but it never kicked the legs out from under the recovery. In fact, the economy is expected to improve as the spending reductions mount. And if Congress cannot provide an alternative to the sequester, it may unwittingly give the White House more control than it already has over how federal dollars are spent.

President Obama has not engaged House Republicans on replacing the sequester, and the issue has been on the back burner until Congress must confront the need to increase the government’s borrowing authority this fall. By October, Congress must also finalize a budget for Obama, or continue to lean on continuing resolutions. “I don’t see a viable majority in the two houses of Congress in favor of getting out of it in a way that the president would accept,” said William Galston, a budget expert with the Brookings Institution. “And as far as I can tell, there hasn’t been a lot of serious discussion for months.”

“I’d have to say at the moment the chances are in favor of a second round because it is the status quo and it’s been very difficult for Congress to change the status quo,” said Robert Bixby, executive director of the Concord Coalition, a budget watchdog group. “They would have to change the law, and agreement has been awfully hard to come by.”

Congressional leaders and the administration could still surprise the nation and scratch out a deal. But any compromise would likely have to preserve more than $1 trillion worth of deficit savings over 10 years. The government can still survive on ad hoc measures that raise the debt ceiling on a short-term basis, while enabling the White House to blunt the impact of sequestration.

For sure, neither side relishes having another crisis like the near default on the U.S. debt in the summer of 2011. But Obama has pretty much abandoned hopes for a “Grand Bargain” of spending cuts, tax increases and entitlement reforms to keep the government on a long-term path to deficit reduction.

At the same time, many on Capitol Hill have concluded that the nearly $90 billion worth of sequester cuts that began to take hold this year weren’t nearly as harmful to the economy or government programs as Obama and others claimed they would be.

Outside analysis by the Federal Reserve backs up this sentiment. When the spending cuts and government employee furloughs began, other parts of the economy—such as a rebound in home prices—kept growth going. Fed officials noted last month that gross domestic product should increase by more than 2 percent this year, despite the 1.5 percent hit caused by sequestration.

And while congressional budget leaders and appropriators would much prefer to make decisions on future savings, there is a growing tendency to either kick the can down the road, or essentially leave it up to department and agency heads to figure out how to absorb the mandated cuts that will once again total about $90 billion for the fiscal year starting on Oct. 1.



Steve Bell, a former GOP Senate budget adviser and now a senior director at the Bipartisan Policy Center, contends that Congress has been gradually and inadvertently “handing control of the purse over to the executive branch” by allowing sequestration to continue. Relatively anonymous bureaucrats are now making budgetary choices that once belonged to committee chairmen.

“I wonder when people realize that a series of continuing resolutions, plus the sequester together, has taken most of the influence Congress would have over the agencies and has really damaged that relationship,” Bell told the Times.

The arrangement might not increase the level of discretionary spending, but it creates a scenario in which the White House would have more discretion over where the money goes. Bell said that conservative House Republicans – especially the newest members, who have blocked efforts to reach bipartisan agreements – are in effect handing over power to the executive branch to make many of those decisions.

“It’s just the opposite of what they think,” Bell said. At the same time, Obama would be able to capitalize on both a shrinking deficit and continued economic growth. Fed officials expect the economy to strengthen in 2014 and 2015, estimating that GDP will increase by more than 3 percent next year despite the second round of sequestration coming. Macroeconomic Advisers, a private forecaster, estimated that GDP would climb by 3 percent next year and 3.4 percent in 2015 as the economy accelerates “past the epicenter of fiscal restraint,” according to a June 10 commentary.

Secondly, the sequestration cuts as a share of the economy will shrink over time because much of the harshest chopping has already occurred. While the absolute size of discretionary spending will decline, the rate of this drop-off will slow after this year, according to May estimates by the Congressional Budget Office.

House Appropriations Committee Chairman Hal Rogers (R-KY) has called the sequester situation “idiotic” but said his hands are “tied” by the Budget Control Act and the House-passed budget, which insists on the lower spending level. Unable to pass a budget, the government has operated under a continuing resolution that sets discretionary spending at $1.043 trillion. In order to comply with the Budget Control Act that established the sequester, $76 billion would have to be cut from that sum for fiscal 2014.
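The arithmetic here is simple but worth making explicit; a quick illustrative check, using only the figures quoted in the paragraph above:

```python
# Figures quoted in the article, in billions of dollars.
cr_level = 1043      # discretionary spending under the continuing resolution
required_cut = 76    # reduction needed to comply with the Budget Control Act

# The implied post-sequester cap for fiscal 2014.
bca_cap = cr_level - required_cut
print(bca_cap)  # → 967, i.e. roughly $0.967 trillion
```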

That cut would need to be made 15 days after Congress adjourns at the end of the year. Implementing it without furloughs or layoffs could be very tough for agency heads, who have already struggled to find one-time savings this year, aides said. The Senate-passed Democratic budget would offset the sequester with additional tax hikes on wealthier Americans, but that particular solution has guaranteed that House Republicans will not come to the negotiating table. This makes sequestration—the policy lawmakers saw as a cudgel to force compromise because of its crude approach to deficit reduction—all but inevitable for the next year.



While Senate Budget Committee Chairwoman Patty Murray (D-WA) has warned of painful reductions to expenditures for national security programs and parts of the social safety net such as Head Start, the predictions have yet to reverberate nationally through the economy.

Other than congressional intervention to prevent the furloughs of air traffic controllers and federal meat inspectors, lawmakers essentially stepped back and allowed the sequester to take full hold. As a result, many on Capitol Hill concluded they could easily live with the sequester this year—and presumably the next one as well.

In a sign that the Obama administration is bracing for a second year of mandated cuts, Defense Secretary Chuck Hagel informed Congress last week that further reductions would cause a sharp decline in the effectiveness of jet fighter wings, ground combat units and even Special Operations forces.

Hagel warned that the Pentagon must identify an additional $52 billion in budget reductions for the fiscal year that begins Oct. 1. What matters is that Congress may have no choice but to give Hagel more flexibility in assessing which cuts to make.

“I strongly oppose cuts of that magnitude,” Mr. Hagel wrote. “The size, readiness and technological superiority of our military will be reduced, placing at much greater risk the country’s ability to meet our current national security commitments.”


National commission to visit 13 bases

Posted 7/15/2013

by Col. Bob Thompson

Air Force Reserve Public Affairs


7/15/2013 – WASHINGTON (AFNS) — The “National Commission on the Structure of the Air Force” is planning to visit 13 Air Force locations, beginning with Joint Base McGuire-Dix-Lakehurst, N.J., on July 16.

Created by the FY13 National Defense Authorization Act, the commission is reviewing the Air Force’s structure to determine if and how it should be changed to meet future missions and funding challenges. The commission’s report and recommendations are due to the president by Feb. 1, 2014.

“This is one of the biggest issues for the future of the Air Force – to develop the right force mix of Regular and Reserve Component Airmen,” said Lt. Gen. James F. Jackson, chief of Air Force Reserve and commander of Air Force Reserve Command. “Getting this mix right directly affects our Air Force’s capability, capacity, efficiency and cost-effectiveness.”


The commission’s visits include:

July 16:

• Joint Base McGuire-Dix-Lakehurst, N.J.

July 29:

• Wright Patterson Air Force Base, Ohio

• Springfield Air National Guard Base, Ohio

July 30:

• Mansfield ANGB, Ohio

• Rickenbacker ANGB, Ohio

Aug. 5:

• Joint Base Langley-Eustis, Va.

Aug. 13:

• Barksdale AFB, La.

Aug. 20:

• Tinker AFB, Okla.

Sept. 5:

• Pease ANGB, N.H.

• Westover AFB, Mass.

Sept. 6:

• Burlington ANGB, Vt.

Sept. 22-23:

• Camp Smith/Hickam Field, Hawaii

Sept. 25:

• Beale AFB, Calif.


The commission is led by the Honorable Dennis M. McCarthy, a retired Marine Corps lieutenant general and the previous assistant secretary of Defense for Reserve Affairs at the Pentagon. The vice chair is the Honorable Erin Conaton, a former under secretary of Defense for Personnel and Readiness and previous under secretary of the Air Force.

Other members of the commission include: F. Whitten Peters, former secretary of the Air Force; Les Brownlee, former acting secretary of the Army; retired Air Force Gen. Raymond Johns Jr., previous commander of Air Mobility Command; retired Air Force Lt. Gen. Harry M. “Bud” Wyatt III, previous director of the Air National Guard; Dr. Janine Davidson, former deputy assistant secretary of Defense for Plans; and Dr. Margaret C. Harrell, director of the Army Health Program at the RAND Corporation.

During their visits at each location, the commission will tour facilities, hear mission briefings and meet Airmen at wing, group and squadron levels. Also, the commission will hold off-base hearings with state and local community leaders as well as members of the general public.

Senior leaders in Congress and the military are looking at ways to balance the ratio of Regular and Reserve Component Airmen to create the most effective and efficient combat capability in austere budget times.

In January, the secretary and chief of staff of the Air Force established a “Total Force Task Force” to create a process to determine the appropriate force mix. This task force is led by generals from each component: Maj. Gen. Mark Bartman, Air National Guard; Maj. Gen. Brian Meenan, Air Force Reserve; and Maj. Gen. John Posner, Regular Air Force. The task force’s data and findings are a ready resource for the National Commission on the Structure of the Air Force.

“Today’s Air Force Reserve is a combat-ready force with operational capability, strategic depth and surge capacity, to be used by the nation in a variety of ways, either abroad or at home,” said Jackson during his testimony to the National Commission on Jun. 3. “With a shrinking defense budget, increasingly consumed by manpower-associated costs, there is little doubt the cost-effective Reserve Component will continue to provide a valuable role.”

Regular Air Force, Air National Guard and Air Force Reserve Airmen work together around the world as a team in air, space and cyberspace. Today’s “Total Force” consists of about 327,600 Regular Air Force Airmen, 105,400 Air National Guardsmen, and 70,880 Air Force Reserve Airmen actively serving in the Selected Reserve as designated by the FY13 National Defense Authorization Act. The Air Force Reserve also maintains a strategic depth of more than 790,000 stand-by or non-participating Reservists and retirees that can be called up for national emergencies.

After more than two decades of continuous combat operations, the Reserve Components are relied upon now for daily operations as well as strategic surges. By leveraging the resources and talents of all the Air Components, planners are developing better solutions to capitalize on the strengths of each component.

“I believe that working together we can combine the personnel, equipment, and readiness necessary to build a Total Air Force equal to all the challenges our nation faces,” said Lt. Gen. Stanley E. Clarke III, director of the Air National Guard, during his testimony to the House Armed Services Committee Subcommittee on Defense, on March 20.


Hagel: Budget cuts mean 20 percent trim of top staff in Army, Navy, Air Force, Marines

Washington Post

By Associated Press, Published: July 16

JACKSONVILLE, Fla. — Defense Secretary Chuck Hagel said Tuesday he has ordered 20 percent “across the top” budget cuts for his Pentagon staff and that of his top brass.


The reductions, which he did not spell out in detail, are for the 2015-19 period. They will apply to his office, that of the chairman of the Joint Chiefs, and also the Pentagon headquarters offices of the Army, Navy, Air Force and Marine Corps.

It is one element of a broader effort by the Pentagon to adjust to forced spending reductions that already have resulted in the furloughing of civilian workers. Hagel said he believed Pentagon headquarters staff must share in the sacrifices.

“That isn’t going to fix the problem,” he told about 100 Defense Department civilian employees in a question-and-answer session at Jacksonville Naval Air Station on the second day of a tour of military bases. “But, yes, everybody’s got to do their part.”

Hagel spokesman George Little later said the top brass cuts could save between $1.5 billion and $2 billion over the five years and will target personnel, including civilians and contractors. He said the cuts will happen even if Congress eases budget caps that have created sharp limits on defense spending.

Military spending was slashed by $37 billion this year, forcing job furloughs that began last week for an estimated 650,000 Defense Department civilian employees. The furloughs do not apply to military members, but they, too, are feeling the effects of a budget squeeze that is reducing some training.

The Pentagon faces the prospect of an additional $52 billion budget cut in 2014 unless Congress and the White House come up with a deficit-cutting plan. Hagel told Congress last week that such a large additional cut would have “severe and unacceptable” effects.


Air Force ‘carving out’ more AF members for cyber


Jul. 16, 2013 – 05:19PM

By Oriana Pawlyk

Demand for airmen with cyber skills will grow to 1,500 in the coming years, up from earlier projections of 1,000, the head of Air Force Space Command said Tuesday.

“I’m being a little fuzzy on the numbers because the numbers are really being finalized, but … when I say 1,500, we’re not sure where exactly it’s going to settle, but it’s gone up from where we initially thought it was going to be,” said Gen. William Shelton, at a Capitol Hill breakfast.

Space Command announced earlier this year that it would stand up additional cyber mission teams between 2014 and 2016 in support of U.S. Cyber Command.

“I think the future is very bright in space and cyber … there are challenges in budgets, there are challenges in threats, but in every strategic review that I’ve seen, space and cyber are mentioned prominently as things we have to fund and things we have to protect,” Shelton said.

He explained that regardless of budget setbacks, “[carving] those people out somewhere from within the Air Force structure, and [funding] those civilian billets” remains a high priority within the department.

Another priority Shelton focused on was the Air Force Network, or AFNet. AFNet — whose original December 2012 deadline was relaxed — is the consolidation of hundreds of local base IT networks into one enterprise network. The goal of this project is to collapse all individual or stand-alone Air Force, Air Force Reserve and Air National Guard networks into a central Air Force Network.

“That’s our number one cyber priority, is getting collapsed down to that single network,” Shelton said.

“The reason we’re doing that is, there are now just 16 touch points to the ‘external world’ — the Internet — within the Air Force. That’s [going to be] much easier to defend, much more consolidated from a command and control perspective, it also allows us more flexibility … much more ‘defendability’ of our networks, and it’s gonna be great once we get it completed this next year.”

Shelton said the Air Force is still “going through the dialogue” with senior leadership in the Air Force on current and future cyber projects.

“There’s no question that we’re going to support what USCYBERCOM is looking for in terms of these national missions,” he said.

“In terms of what the Air Force does and how we manage career fields, how we organize ourselves, we’re not quite there yet.”

STEM Labor Shortage: Myth or Reality?

Professors, trade organizations and technology companies weigh in on this question as the U.S. continues to push for more STEM graduates.

BY TANYA ROSCORLA / JULY 16, 2013

Indiana University Bloomington’s Bobby B. Schnabel doesn’t like the spring as much as the rest of the year. Though that may initially seem a bit strange, he has good reason.

As the informatics and computing dean, Schnabel flies to Silicon Valley often to build relationships with tech leaders. And toward the end of the school year, they start asking him his least favorite question: Do you have any students we can hire?

“Part of my job as a dean of a big school is to know a bunch of the IT CEOs,” Schnabel said, “and sometimes they’ll contact me directly in May saying, ‘We’re looking for people.’ And we don’t have people in May — they’re all hired.”

By the time the university holds its main career fair in early September, the best students are already taken. Of this year’s graduating class, 245 undergraduate and master’s students answered a university survey, which found that just over half of them had lined up a job before graduation, while less than a quarter of them decided to continue their education at a higher level.

But while Schnabel identifies a labor shortage in technology rather than STEM as a whole, not everyone believes we have an undersupply of college graduates and other laborers in this field. In fact, two schools of thought compete on this issue, and both cite data to back up their claims. That raises the question: Is the STEM labor shortage a reality, or simply a myth?



At the University of California at Davis, Norman S. Matloff, a computer science professor, says that the STEM shortage is really a web of deceit designed to trick the whole country.

“There’s a giant deception, a huge concerted public relations effort being engaged in by the various parties with a vested interest, to implant in the American consciousness this idea that we have a STEM labor shortage,” Matloff said.

Many positions in these fields don’t even require a college degree, much less a doctorate. In fact, less than five percent of jobs in the technology field call for high levels of knowledge.

And the technology field has seen success stories at a high level from people who never earned a college degree.

Take Bill Gates of Microsoft, Steve Jobs of Apple and Larry Ellison of Oracle. All of them started college, but dropped out. Now Gates and Ellison are two of the top five richest people in the world.

That said, bachelor’s degree production in computer science saw double-digit growth between 2009 and 2012. And last year, colleges awarded nearly 2,000 doctorate degrees, the highest number ever reported in the Computing Research Association’s Taulbee Survey.

Yet the demand isn’t there, some argue.

“There is still benefit to inspiring kids to be passionate and excited about STEM disciplines,” said Bob Sun, CEO of Suntex International Inc., which helps students build a solid foundation in math. “But the idea that we need more STEM graduates to meet this unfulfilled demand — I don’t think it’s quite exactly what it’s cracked up to be.”



A professional society called the Programmers Guild argues that the U.S. is producing plenty of computer science graduates and has enough older workers to fill positions — they’re just not getting hired. It says companies won’t hire workers over the age of 35, and that they hire foreign laborers through temporary work visas to bring wages down.

In specialty occupations, these H-1B visas allow employers to hire highly skilled foreign workers when they can’t find U.S. residents or citizens. The employer files a visa application for these workers, which allows them to work for that employer for three years, or up to six years if they receive an extension.

When these visa holders enter the country, they could stay permanently depending on whether their employer files a petition on their behalf.

“If you’re a foreign worker being sponsored for a green card, you are trapped, you cannot go to another employer, because you’d have to start all of that all over again, and it’s just unthinkable,” Matloff said. “So the employers like that. This immobility is huge.”

U.S. Citizenship and Immigration Services issues 65,000 H-1B visas each year to fill positions in specialty occupations such as computer programming. Plus, it provides 20,000 exemptions for students who earned advanced degrees in the U.S.

“Over the past 10 years, we’ve been bringing more H-1Bs than the job market has been increasing by,” said Kim Berry, president of the Programmers Guild. “And so I think it’s no question that Americans are being displaced by this.”

The companies that hire these workers beg to differ. They’re calling on Congress to increase the cap so they can bring in more foreign workers. But in addition to increasing the cap, companies such as Microsoft also want to invest in education to build the U.S. pipeline of computer scientists, said Jack Chen, senior attorney of compliance and advocacy for Microsoft Corp.

At the end of February, the company had more than 6,400 open jobs in the U.S., half of which were for research and development positions.

“There really is a hunger and a thirst for talent,” Chen said, “that has not been satiated by the workers we have available in the U.S. market.”

Both schools of thought hold such opposing views that they may not agree anytime soon. And that leaves us to decide for ourselves where we stand on the STEM labor issue.


Kremlin’s Anti-Hacking Tech: Typewriters

150-Year-Old Technology Isn’t a Good Idea to Protect Secrets

By Eric Chabrow, July 16, 2013. Follow Eric @GovInfoSecurity

The apparatchiks at the Kremlin think they’re clever sorts with plans to replace computers with typewriters to prevent the American e-spies at the National Security Agency from hacking into Russian intelligence systems.

The successor to the KGB – Russia’s Federal Protective Service – has placed an order for 20 typewriters at $750 a pop (that’s 24,480 rubles each, or 489,603 rubles for the bunch; it sounds much more expensive in Russian currency), according to a number of published reports that cite the Izvestia news service.
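Those figures imply an exchange rate, and they very nearly reconcile; a quick back-of-the-envelope check (an illustrative sketch, with the three-ruble gap on the total presumably rounding in the original report):

```python
# Figures as reported: 20 typewriters at $750 (24,480 rubles) apiece.
units = 20
price_usd = 750
price_rub = 24_480

implied_rate = price_rub / price_usd   # rubles per dollar
total_usd = units * price_usd
total_rub = units * price_rub

print(implied_rate)  # → 32.64 rubles per dollar
print(total_usd)     # → 15000
print(total_rub)     # → 489600 (the article reports 489,603)
```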

Of course, the safest way to secure a computer is to keep it off the Internet. (But Stuxnet proved that even high-tech devices not connected to the Internet can be hacked, using infected removable drives to spread the worm to cripple Iranian nuclear centrifuges.) Typewriters go one better.

But the idea of replacing computers with typewriters to mitigate the risk of the United States – or anyone else – stealing secrets is misplaced.

On the practical side, as Sunday Telegraph Chief Foreign Correspondent Colin Freeman points out, correcting mistakes on important documents without the “delete” button would prove difficult:

“FOr the chnaces are that ecven if u are reasonabley goood typrer like me, most of what you iwll write will end up litter d with misteks. Sometimes ot the point of bing complely eillegiebe (illegible).”

Even when tongue is removed from cheek, other reasons exist that show replacing a computer with a typewriter is a terrible idea.

Governments have been stealing secrets from one another for years – heck, for centuries – before the invention of the computer and the Internet. Removing documents from a computer and putting them in a vault or locked filing cabinet doesn’t guarantee protection, especially given the prevalence of the insider threat. And, safeguarding documents in transit can be as daunting in the real world as it is in the virtual one.


7,000 Typewritten Pages ‘Hacked’

Though not nation-to-nation intelligence theft, one of the biggest disclosures of classified information in American history was the leaking of the Pentagon Papers, a top secret history that showed how the Lyndon Johnson administration systematically lied about the United States’ involvement in Vietnam from 1945 to 1967.


Unlike former NSA contractor Edward Snowden and Army Pvt. Bradley Manning of WikiLeaks fame, who copied hundreds of thousands of documents onto electronic drives, Pentagon Papers’ whistleblower Daniel Ellsberg and his friend Anthony Russo photocopied the 3,000 pages of historical analysis and 4,000 pages of original government documents – all typewritten at one point – that they leaked to The New York Times and other newspapers.

The risk mitigation gained by abandoning the computer for the typewriter might not be worth the nearly half-million-ruble investment. Computers, networks and the Internet have improved productivity and collaboration, and it’s hard to conceive in today’s environment that the work needed to create such crucial documents can be done effectively using 150-year-old technology. Simply put, contemporary technology is required to protect contemporary content.

Adn thats teh fax; JAck!


Data demand soars as consumers drop landline phones

Dayton Daily News

July 18, 2013

By Dave Larsen

Staff Writer

Demand for mobile and broadband data in Ohio is skyrocketing as more consumers switch from traditional landline phones to wireless communications.

Ohio’s local phone companies have lost 64 percent of their lines since peaking in 2000, and continue to lose lines at a rate of 6 to 10 percent annually, according to a new Ohio Telecom Association report. Those companies range from large service providers such as AT&T Ohio and Cincinnati Bell, with 50,000 or more phone lines, to small locally owned and operated companies with 300 to 25,000 lines.

Experts said the migration to smartphones is forcing phone line companies to diversify into new technologies such as wireless, video and broadband. It also is requiring companies that include large wireless providers to invest billions of dollars to update their networks to meet rising demand for data.

The study found that Ohio’s 11.5 million residents are using an estimated 11.8 million wireless phones, with an increasing number using different phones for home and work.

“As much as the landline market is decreasing, the broadband market is exponentially increasing,” said Stu Johnson, executive director of Connect Ohio, a nonprofit working to expand access and use of broadband Internet statewide. The challenge for telecommunications companies is to manage the transition financially while coordinating the delivery of reliable service to consumers, he said.

The average household consumes 52 gigabytes of data per month — the equivalent of more than 5 million emails or surfing the Internet for about 100 hours monthly — up from 28 GB in 2012. Part of that increase is driven by smartphones, which now account for half of all cell phones in Ohio.
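Those equivalences can be sanity-checked with a little unit arithmetic (an illustrative sketch; the decimal-gigabyte convention and the derived per-email size are assumptions, not figures from the article):

```python
# Reported: 52 GB per household per month, ~5 million emails, ~100 hours online.
gb_per_month = 52
emails = 5_000_000
hours = 100

# Implied average email size, in kilobytes (assuming 1 GB = 1e6 KB).
kb_per_email = gb_per_month * 1e6 / emails

# Implied sustained throughput while browsing, in megabits per second.
mbps = gb_per_month * 1e3 * 8 / (hours * 3600)

print(round(kb_per_email, 1))  # → 10.4
print(round(mbps, 2))          # → 1.16
```

Roughly 10 KB per email and a bit over 1 Mbit/s of sustained browsing, both plausible magnitudes for 2013, so the article’s equivalences hold together.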

“When you almost double the amount of data over your network in a year, that is a significant issue to engineer,” said Charles R. Moses, Ohio Telecom Association president. The trade group represents the state’s $20 billion telecommunications industry, which includes 42 local phone companies, three wireless carriers and more than 100 associated industry product and service providers.

Officials said the association’s member companies employ more than 20,000 Ohio residents and invest an average of $1 billion annually in the state’s telecommunications infrastructure.

Copper-wire landlines don’t offer multiple uses like broadband — an optic fiber, coaxial cable or wireless medium that has wide bandwidth characteristics and can deliver voice, video and Internet services. However, copper landlines are reliable and can serve customers in rural areas that are costly or difficult to reach with broadband technologies, Johnson said.

Ohio’s wireless-only residents are 81 percent urban or suburban; 84 percent are under age 44; 58 percent have a college education; and 63 percent earn more than $25,000 annually, according to Connect Ohio.

“The least likely to convert to a wireless-only solution would be an older, rural, less educated, lower income individual,” Johnson said. “Those are probably also the most expensive copper customers.”

Federal regulations require local phone companies to supply and maintain a phone line to every residence, business or organization that requests one.

Copper landlines also deliver services that include home health care monitoring, ATM networks and elevator emergency telephones that won’t easily transition to wireless, Johnson said.

The Department of Defense, in a July 8 letter to the Federal Communications Commission, warned that a rush to transition from copper landlines to wireless and Internet protocol (IP) broadband communications could disrupt the functions of important military and federal agencies, including Federal Aviation Administration air traffic communications.

These agencies continue to rely on wireline networks and services that are “critical to public safety and security — and will do so for the foreseeable future,” wrote Terrance A. Spann, general attorney for the U.S. Army Legal Services Agency.

Wireless is now a $6.8 billion industry in Ohio, according to the Ohio Telecom Association. An estimated 41.3 percent of homes in the state have eliminated local phone service and rely exclusively on wireless communications. Data plans account for 50 percent of wireless revenues.

Last week, AT&T expanded its 4G LTE (fourth generation long-term evolution) wireless network to the Dayton region to provide area customers with greater mobile Internet speeds. Verizon Wireless launched a 4G LTE network in Dayton in June 2011.

Verizon Wireless has seen “geometric growth” in wireless data demand, said Don Caretta, executive director for Verizon’s network in Ohio, West Virginia and Pennsylvania. The company has invested more than $2.5 billion in its Ohio wireless network from 2001 through 2012 to increase capacity to meet demand, he said.

New wireless applications from online movie services to so-called “smart” electric and gas meters will continue to grow data demand, Caretta said.

Similarly, AT&T has invested $1.5 billion from 2010 to 2012 into its Ohio wireline and wireless networks, said Mark Romito, the company’s director of external affairs for southwest Ohio.

In November, AT&T announced a $14 billion project to upgrade much of its wireline network to high-speed IP broadband and convert some of its harder-to-reach copper customers to 4G LTE wireless services by the end of 2015. “We are trying to address customers’ demand for data and the transition to an IP environment,” Romito said.

Ohio wireless by the numbers

90 – Percentage of Ohio residents with a wireless phone

80 – Average monthly wireless bill, in dollars

50 – Percentage of Ohio residents who own a smartphone that connects wirelessly to the Internet

41.3 – Percentage of Ohio homes that are wireless-only

40 – Additional monthly cost per phone for Internet access, in dollars

Source: Ohio Telecom Association


Hacker Magnet or Sophisticated Tool? Obamacare’s Database Debated


Government Executive

By Charles S. Clark

July 17, 2013

The centralized data hub that will link agency records on people who sign up for Obamacare is either solidly on schedule and impenetrable to hackers, or it is floundering and in danger of swelling into “the biggest data system of personal information in the history of the U.S.”

Both views were displayed Wednesday at a joint subcommittee hearing of the House Oversight and Government Reform Committee that roped in issues from cybersecurity to government mismanagement to the fate of the Affordable Care Act to the scandal over political targeting of taxpayer groups by parts of the Internal Revenue Service.

Taken together, representatives from the Health and Human Services Department, the IRS and auditing agencies appeared confident that the Obama administration can meet the health care law’s Oct. 1 deadline for opening state insurance exchanges with sufficient safeguards on personal information to protect Americans’ privacy.

The Federal Data Services Hub, a $394 million contractor operation run by the Centers for Medicare and Medicaid Services in Columbia, Md., is a tool that will allow citizens applying for the new health insurance plans to enter their income and personal identification online and get a determination of eligibility for tax credits, in many cases within seconds. The hub is designed to link databases at HHS and IRS with the Homeland Security and Veterans Affairs departments, the Social Security Administration, the Office of Personnel Management and the Peace Corps.

Hearing chairman Rep. James Lankford, R-Okla., warned that the “potential for fraud and violations of privacy are multiplied by this Rube Goldberg construction.” He cited a June Government Accountability Office report showing that only 20 percent of the privacy controls and other preparations are complete. “The government will waste billions of dollars in subsidizing taxpayers who are not eligible,” Lankford said, and proceeded to blast the IRS as being “highly politicized under this administration,” citing 100 visits to the White House by previous Commissioner Doug Shulman.

But the “GAO ultimately concluded that the implementation was workable and on track,” countered Rep. Jackie Speier, D-Calif. “No major program has launched without a few hiccups,” she said, adding that CMS has contingency safeguards and “long experience with complicated health systems.” Speier praised the “dedicated federal and state government employees who are implementing the law of the land” and said she wanted to “debunk the notion that to expand health care we have to sacrifice privacy.”

Speier did, however, express worry that the “federal data hub has a bull’s eye on it, and the potential for being hacked is great.”

CMS Administrator Marilyn Tavenner said her agency is “hard at work designing and testing a secure system, and I can assure you that by Oct. 1, the health insurance markets will be open for business. I can also assure all Americans that the information they supply will be protected to the highest standards.”

Tavenner sought to dispel “confusion,” declaring that no one implementing Obamacare will ask exchange applicants for their personal health information, and that no such information will be stored on the hub. “The hub is not a database that stores, it is a routing tool to put government databases on the same system” while eliminating the need for multiple agencies to design their own, she said. CMS has designed similar systems with privacy protections in implementing the Medicare Advantage program and state Medicaid programs.
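Tavenner’s distinction between a database and a routing tool can be illustrated with a minimal sketch. The pattern below is hypothetical: the agency names are real, but the verifier functions and the hub’s actual interfaces are stand-ins, not CMS’s implementation.

```python
# Hypothetical sketch of a "routing hub": the hub fans an applicant's
# identifiers out to each agency's verification service and combines
# the answers, without persisting any record itself.

def ssa_verify(applicant):
    # Stand-in for a call to the Social Security Administration's service.
    return applicant.get("ssn") is not None

def irs_verify(applicant):
    # Stand-in for a call to the IRS's income-verification service.
    return applicant.get("income") is not None

VERIFIERS = {"SSA": ssa_verify, "IRS": irs_verify}

def route_eligibility_check(applicant):
    """Route the request to each agency and return a combined result.
    Nothing is written to disk or a database -- routing, not storage."""
    results = {agency: check(applicant) for agency, check in VERIFIERS.items()}
    return {"eligible": all(results.values()), "checks": results}
```

The design choice Tavenner describes is that the hub holds no authoritative copy of any record; each query is answered from the agencies’ own systems and then discarded.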

Acting IRS Commissioner Danny Werfel said the tax agency’s design and testing of systems to share information on income eligibility for premium assistance tax credits “is on target to be ready by Oct. 1.” He cited interagency agreements on computer matching, training and the fact that IRS had decades of experience enforcing privacy guarantees under Section 6103 of the tax code. “We have a robust set of requirements that have been battle-tested over the years,” he added, promising strong monitoring and oversight.

Henry Chao, CMS’ deputy chief information officer, cited progress made since the GAO report, which was based on April data, was prepared. He assured a skeptical Lankford that his team has been testing with states in waves since February and that the amount of time an applicant’s personal information would be stored or “cached” in the hub would be measured in minutes. The information, he said, includes the names of people in a household, addresses, email addresses, phone numbers, Social Security numbers, race and ethnicity, veteran status and, where applicable, pregnancy status, but no information on disabilities. Protections against hacking are being tested by professional experts, he added.
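The idea of holding applicant data only “for minutes” is, in caching terms, a short time-to-live. The sketch below shows the general pattern only; it is an illustration under assumed behavior, not a description of CMS’s system.

```python
import time

class ShortLivedCache:
    """Minimal sketch of a cache whose entries expire after ttl_seconds --
    the general pattern behind retaining data only for minutes.
    Illustrative only; not CMS's actual implementation."""

    def __init__(self, ttl_seconds, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock          # injectable clock, handy for testing
        self._store = {}

    def put(self, key, value):
        # Record the value alongside its expiry time.
        self._store[key] = (value, self.clock() + self.ttl)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if self.clock() >= expires_at:
            del self._store[key]    # purge expired data on access
            return None
        return value
```

With a TTL of a few minutes, any applicant record left in the hub becomes unreadable shortly after the eligibility check completes.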

John Dicken, director of health care at GAO, cautioned that the Oct. 1 deadline wasn’t assured. “Much progress had been made, but much remains to be done” in testing the hubs, he said.

Left unsatisfied was Rep. Pat Meehan, R-Pa., who warned that what might be the “biggest data system of personal information in the history of the U.S.” does not appear to have been vetted by the top specialists at the FBI and DHS or at private banks. “Are you ready? Who has access? Are they competent?” he asked the CMS witnesses, raising the specter of cyber theft of intellectual property from U.S. “innovators.” “The personal information of 20 million Americans is just as important” as trade secrets, he said, and “every sector says they are only as strong as their weakest link.” This hub “is an overwhelming task that at best carries an unacceptable price tag.”

Rep. Darrell Issa, R-Calif., the committee chairman, criticized CMS for hiring Serco, a British-owned company, to help set up the exchanges, noting that the company recently was faulted for allowing exposure of thousands of internal Thrift Savings Plan records. “Where are the pilots for a company with no internal controls?” he asked.

Tavenner said Serco is a “highly skilled company with a proven track record, and won the contract in full and open competition.”

In a related drama, Rep. Jim Jordan, R-Ohio, castigated Werfel, who has been on the job for a month and a half, for substituting himself as a witness when the committee had invited Sarah Hall Ingram, who plays a coordinating role in implementing the health care law but who also helped preside over the tax-exempt unit in Cincinnati responsible for the controversial singling out of mainly tea-party groups.

“Who is the project manager for the ACA?” asked Jordan. Ingram is “in D.C. just a few blocks from here,” he noted, brandishing a document showing that Ingram recently briefed an IRS oversight board on the technical implementation of the federal data hub.

Werfel disagreed with the characterizations. “At IRS we balance out a lot of factors, two of which are accountability and technical expertise,” he said. “We received an invitation to the hearing, so I suggested a combination of me and the chief technology officer. There are multiple people in the IRS with the expertise.”

Rep. Michelle Lujan Grisham, D-N.M., asked Werfel if he was concerned about the proposed 24 percent budget cut for IRS being considered Wednesday by the Appropriations Committee. “It’s extremely challenging generally, and all mission-critical activities will be severely impacted, from collecting revenue, to helping taxpayers navigate the code, to enforcement, to going after bad actors, to legislative mandates such as offshore tax evasion,” he said. Such cuts would be “extremely relevant across IRS and extremely relevant across the Affordable Care Act. The tax code doesn’t go away” with a budget cut, he said, noting cutbacks in customer service that are already underway. “In the end, it affects the taxpayer.”

DOD systems block feds from reading about NSA leaks

By Frank Konkel

Jul 18, 2013

Leaked National Security Agency documents on classified programs have garnered a lot of attention lately, but feds and contractors are not reading about them on any of the Department of Defense’s unclassified networks.

As it did in 2010 during the Wikileaks saga, DOD is using automated software to block web pages that could contain classified information, like the PowerPoint slides first published by The Guardian and Washington Post that depict the NSA’s PRISM program.

That DOD can filter and block content for its employees, service members and contractors has been public knowledge since at least 2010, when Wikileaks began revealing classified cables to the world. But media reports in late June initially claimed that some agencies within DOD were blocking access to specific websites of certain news outlets. Those were followed by more reports – including a story from The Guardian, which first broke the NSA story – that claimed only content from the Guardian website was being blocked.

DOD spokesman Damien Pickart, however, told FCW that such reports are inaccurate, and emphasized that all of DOD’s unclassified networks – millions of computers – are under the same preventive measures to mitigate unauthorized disclosures of classified information.

“The Department of Defense is not blocking any websites,” Pickart said. “We make every effort to balance the need to preserve information access with operational security; however, there are strict policies and directives in place regarding protecting and handling classified information. Until declassified by appropriate officials, classified information – including material released through an unauthorized disclosure — must be treated accordingly by DOD personnel.”

DOD networks make use of software that scans sites for potential malware threats or content concerns through keyword and other filters. Any website flagged by the system that could contain classified information is blocked to users “so long as it remains classified,” Pickart said.
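Keyword filtering of the kind Pickart describes can be sketched in a few lines. This is a hedged illustration of the general technique only: the block terms below are invented for the example, and the actual DOD filtering rules and tooling are not public.

```python
# Sketch of keyword-based content filtering. The terms are illustrative
# stand-ins, not actual filter rules.
BLOCK_TERMS = ("top secret//si", "prism collection", "classified slide")

def should_block(page_text):
    """Flag a page if it contains any blocked keyword (case-insensitive)."""
    text = page_text.lower()
    return any(term in text for term in BLOCK_TERMS)

def fetch_policy(page_text):
    """Decide whether a proxy should serve or block the page."""
    return "BLOCKED" if should_block(page_text) else "ALLOWED"
```

Real filters layer such keyword checks with URL reputation lists and malware scanning, which is why they can sweep in entire news pages that merely quote a classified document.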

The measure is a department-wide directive. While the automated filters are not entirely failsafe, they mostly do what they are designed to do: Keep feds from viewing classified information before it is officially declassified, even if widely reported in the media. It is not clear, though, whether employees found intentionally accessing classified information on DOD’s unclassified networks would face consequences, or what those consequences would be.

Pickart said it is costly when classified information is accessed on unclassified DOD networks because policy dictates unclassified computers must be scrubbed of unauthorized or classified material. A single viewing of the PRISM images, for instance, could require an IT team response — and might even merit a computer’s entire hard drive being wiped.


New domain names bound for collisions: ‘Things are going to break’

Posted by William Jackson on Jul 12, 2013 at 12:38 PM

The Internet is on the brink of the largest expansion of generic Top Level Domains in its history, with as many as 1,000 new strings expected to be added over the next year, more than quadrupling the current gTLD space.

Some observers, including the operator of two of the Internet’s root zone servers, worry that this expansion of public domains could result in naming collisions with private internal network domains, disrupting those networks.

“We know things are going to break,” said Danny McPherson, chief security officer of Verisign, the company that runs the A and J root servers. Networks in the .gov domain could be affected, as well as those supporting emergency services such as public safety answering points for the nation’s 911 system. “It makes us uneasy,” McPherson said.

At risk is any enterprise with a network naming scheme using domain names for non-public resources that are the same as new domain name strings now being considered for approval on the Internet. There are 1,833 such names now being considered by the Internet Corporation for Assigned Names and Numbers, and the approved new gTLDs could begin being delegated in the root system later this year.

The resulting collisions could cause some networks to become about as useless as the Washington Beltway on Friday afternoon.

The solution is to change those internal domain names to avoid naming collisions. But this can be a complex job for a large enterprise, and McPherson worries that many administrators are not aware of the issue. He believes the 12 root zone operators have a responsibility to monitor the global systems to identify potential collision situations and warn network operators in advance. But there is no zone-wide system to provide that visibility.

Top Level Domains are the suffixes on URLs that appear to the right of the final dot in the address, such as .gov and .com. There now are 317 of these, including country names such as .us and .uk. Name servers in the Domain Name System use authoritative lists maintained in the 13 root servers to associate domain names with IP addresses to direct queries. The potential problem with the domain expansion is that requests for a network’s internal domains are routinely checked against the global DNS database as well as the local enterprise name database. If the domain name is not in the global database, the resolver looks for it in the local database, and the query is directed to the proper server within the network.

But if that internal name is added to the Internet’s collection of domains, the internal request will be sent out to the Internet and the user will not be able to access resources on his own network.
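The failure mode can be shown with a toy model of that resolution order. The hostnames and addresses below are made up for illustration; real resolvers are far more involved, but the fallback logic is the crux of the collision.

```python
# Toy model of a gTLD naming collision: resolution checks the public
# root first and falls back to the enterprise's internal zone.

INTERNAL_ZONE = {"fileserver.home": "10.0.0.5"}  # hypothetical internal name

def resolve(name, public_tlds):
    tld = name.rsplit(".", 1)[-1]
    if tld in public_tlds:
        # The TLD now exists on the Internet, so the query leaves the
        # network -- and the internal resource becomes unreachable.
        return "routed to public Internet"
    return INTERNAL_ZONE.get(name, "NXDOMAIN")

# Before .home is delegated: the internal name resolves locally.
print(resolve("fileserver.home", public_tlds={"com", "gov", "us"}))
# -> 10.0.0.5

# After .home is delegated as a new gTLD: collision.
print(resolve("fileserver.home", public_tlds={"com", "gov", "us", "home"}))
# -> routed to public Internet
```

Nothing on the enterprise side changes when the collision occurs; the delegation of one string in the public root silently reroutes every query under that suffix.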

How likely is this to happen? Take .home for instance. This is a default internal domain name used on millions of pieces of home networking equipment. McPherson said .home is one of the top five queries received by Verisign’s root servers. It also is one of the most coveted new gTLDs being considered, with 11 applicants. Other commonly used internal domain names being considered for the Internet include .inc, .corp, .cloud and .mail.

McPherson also is concerned that less commonly used names such as .med that might be used by hospitals and clinics for connecting with health care equipment could suddenly become unavailable internally if .med goes onto the Internet.

Ideally, if you are managing a network you would be warned by the root zone operators when they notice local domain queries from your network that would be likely to result in collisions. With no system in place for monitoring for this, however, the responsibility falls on network administrators to know their naming schemes, pay attention to ICANN’s new gTLD program, and make sure they are not using new Internet domains internally.

Ohio pursuing drone test center with Indiana, as FAA tackles unmanned flight

The Plain Dealer

By Tom Breckenridge

on July 13, 2013 at 8:00 PM, updated July 13, 2013 at 8:01 PM

DAYTON, Ohio — Ohio and Indiana have teamed up to pursue a federal site for research into unmanned aircraft, popularly known as drones.

The Ohio Department of Transportation, which has its own squirrel-sized drone, is leading Ohio’s share of the effort.

The two-state team is competing with applicants from several dozen states. The Federal Aviation Administration will oversee six test sites, where experts will help the FAA figure out how unmanned aircraft — ranging in size from jetliners to model airplanes — will one day mix with the nation’s air traffic.

Ohio and Indiana have pitched a research site in Springfield, near Dayton and the Wright-Patterson Air Force Base. Indiana has a bounty of restricted air space to test unmanned vehicles, officials said.

Ohio officials recently approved $70,000 for the proposed test site. The money, from an Ohio Department of Transportation aviation fund, allows the state to lease office space the next two years at the nonprofit Advanced Virtual Test Cell, Inc., known as AVETEC.

The FAA expects to name the six sites later this year. The agency wants a mix of geographies, populations, weather and air traffic.

Winning states could gain a foothold in unmanned aircraft, a fast-growing, multibillion dollar industry that will generate thousands of high-tech jobs.

“We think it would have a great economic benefit,” said Rob Nichols, spokesman for Gov. John Kasich. “It builds on existing aerospace strengths in the Dayton region. We think leveraging this (site) could be a real boon to the area and the state.”

A recent economic-impact report from advocates of unmanned aircraft estimates the industry will generate 100,000 jobs nationwide by 2025.

Ohio will see 1,400 jobs in the industry and another 1,300 in related employment by 2025, the report said. Those jobs will have a $265 million impact on the state’s economy, according to the Association for Unmanned Vehicle Systems International, based in Arlington, Va.

Ohio was not among the top 10 states that could see the most benefit from the fledgling industry.

Right now, unmanned aircraft are mostly known for their war-time uses. The U.S. military frequently directs bomb-laden drones to al-Qaida targets in Afghanistan.

Drones are flying in U.S. airspace but under tight control.

Uses include disaster relief, fire fighting, search-and-rescue, law enforcement and border patrols, the FAA says.

ODOT uses a 2-foot-wide drone that carries a specialized camera for surveying land, said spokesman Steve Faulkner. ODOT has used the $15,000 device about 25 times for smaller projects, like bridge repairs and road resurfacing.

It replaces airplanes, Faulkner said.

“We can use this for much less cost, as opposed to fueling up an airplane,” Faulkner said.

ODOT selected the Springfield site from 15 around the Dayton area.

The AVETEC center affords about 2,000 square feet and room for 15 workers. The site meets federal and state specifications, including broadband connections with others in the aerospace industry; computers for modeling and simulations; and the ability to meet U.S. Defense Department requirements for top security.

Nichols, the Kasich spokesman, said AVETEC is already in use as part of a national challenge grant for unmanned vehicles, offered by the National Aeronautics and Space Administration.

A Dayton-area nonprofit, Development Projects, Inc., is overseeing competition for the NASA grant, which features a $500,000 prize for the development of technology that enables drones to sense and avoid other aircraft.

NASA’s Glenn Research Center in Brook Park is part of the aerospace portfolio that Ohio hopes will convince the FAA to set up a test center in the state, said Mike Heil, president and chief executive officer of the Ohio Aerospace Institute, a trade group based near NASA Glenn.

NASA Glenn’s expertise in space communications is a boon to unmanned aircraft, Heil said. And Northeast Ohio has “lots of companies” that will benefit from the growing industry, Heil said.

With the development of drones comes concern about privacy and government surveillance. Along with crafting new regulations for unmanned flight, the FAA is developing a privacy policy that “emphasizes transparency, public engagement and compliance with existing law,” according to an FAA news release.

Nichols said the state is working with the law school at the University of Dayton to craft “very strict and cutting-edge privacy rules and requirements” for drone use in the state.


What They Told Us: Reviewing Last Week’s Key Polls

Rasmussen Reports

Saturday, July 20, 2013

Reactions to the George Zimmerman verdict highlight how wide the racial divide remains in America.

At week’s end, 44% of Americans agreed with the Florida jury’s finding Zimmerman not guilty in the shooting death of black teenager Trayvon Martin, down from 48% earlier in the week. Thirty-five percent (35%) disagreed, but that included 80% of black Americans.

The verdict was reached by an all-white jury of six women. Sixty-five percent (65%) of blacks do not think it is possible for an all-white jury to fairly decide a case involving the shooting death of a black man. Seventy percent (70%) of whites think it is possible.

Among those who agree with the jury verdict, most believe Zimmerman was innocent and acting in self-defense. However, nearly a third believe neither side was able to prove its case and that the jury verdict merely reflected the fact that Americans are innocent until proven guilty.

Overall, that means 35% think Zimmerman should have been found guilty, 29% believe he was innocent, 21% are not sure, and 13% agree with the jury verdict because neither side proved its case beyond a reasonable doubt.

Just 32% of Americans now have a favorable opinion of Zimmerman, while 48% view him unfavorably.

Still, only 24% believe Zimmerman’s actions that led to Martin’s death were motivated primarily by racism. Just 21% believe the Hispanic neighborhood watch volunteer should now be charged with a hate crime by the U.S. Department of Justice.

But on virtually every question related to this case, there is a wide racial difference of opinion.

Currently, 32% of voters believe American society is generally unfair and discriminatory. That’s near the highest level ever recorded.

Just 30% say the country is heading in the right direction, and voters remain pessimistic about the nation’s future. Thirty-six percent (36%) think America’s best days are still to come, but 49% think those days have come and gone. This ties the highest level of pessimism in nearly a year.

Confidence in the nation’s safety against terrorism is at its lowest level in several years. Only 39% of voters believe the United States today is safer than it was before the terrorist attacks on September 11, 2001. Confidence jumped to 51% after Osama bin Laden’s death in May 2011 and hadn’t fallen below 40% until now.

Fifty-nine percent (59%) of Americans continue to feel that Boston Marathon bombing suspect Dzhokhar Tsarnaev should receive the death penalty if convicted. Fifty-three percent (53%) say it’s at least somewhat likely that he will receive the death penalty.

President Obama’s job approval ratings remain at levels seen throughout most of his first term, down from the bounce they experienced just after his reelection. Positive assessments of his leadership have fallen to their lowest level in over a year of regular tracking. Forty-three percent (43%) of voters now consider the president a good or excellent leader. Thirty-eight percent (38%) give him poor marks for leadership.

Just 45% hold a favorable opinion of the president’s national health care law. Voters continue to believe costs will go up and quality will suffer under the new law. Eighty percent (80%) think the law is likely to cost more than official estimates.

By a two-to-one margin, voters agree with the House’s decision this week to delay for one year the law’s requirement that every American buy or obtain health insurance.

Voters strongly disagree with the Obama administration’s decision to make it easier for low-income Americans to qualify for health insurance subsidies under the new health care law. Eighty-six percent (86%) think these individuals should be forced to prove they are eligible by documenting their income and their lack of access to insurance.

Forty-four percent (44%) of voters now rate the president’s handling of health care issues as good or excellent. That’s up from June’s low of 38% but more in line with regular surveying earlier this year. Forty-three percent (43%) rate the president poorly on health care.

Twenty-one percent (21%) believe the president’s new regulations on the coal industry will help the economy, but twice as many (41%) think those regulations will hurt the economy instead. Interestingly, voters now view the U.S. coal industry more favorably than the Environmental Protection Agency and are closely divided when asked if the Obama administration’s ultimate goal is to kill that industry.

Most voters believe that all new EPA regulations should require congressional approval before going into effect. And 60% think it is more important to preserve our constitutional system of checks and balances than it is for government to operate efficiently.

“Many in politics act as if the end result is creating a government that works,” Scott Rasmussen notes. “However, the real goal should be to create a society that works. A system of careful checks and balances may frustrate political activists from both parties, but it protects the American people from over-zealous politicians and the demagoguery of passing political fads.”

Senate Democrats threatened Republicans this week with changing long-standing Senate rules and effectively eliminating the minority’s filibuster power to delay certain nominations and votes, but the two sides reached a last-minute compromise. Forty-four percent (44%) favor the proposed elimination of the filibuster; 38% oppose it. The filibuster is not a constitutional protection.

For the third week in a row, Republicans hold a one-point lead over Democrats on the Generic Congressional Ballot. 

Consumer and investor confidence remains down from a month ago but is still near its highest levels in several years. 

In other surveys this week:

— Forty-three percent (43%) of Americans now think the spike in food stamp recipients is chiefly because government rules have made it easier to get food stamps. Fifty percent (50%) believe it is too easy to get food stamps in this country, a 10-point increase from December 2010.

— Most voters think it’s possible for the United States to achieve energy independence through shale oil development and government-funded programs to promote alternative energy sources. 

— Forty-four percent (44%) of voters favor a ban on abortion after 20 weeks. Forty-one percent (41%) oppose such a ban. Forty-eight percent (48%) favor a law that would require doctors to have admitting privileges at nearby hospitals before they can perform abortions.

— Fifty-nine percent (59%) of Americans believe government subsidies should be used to keep student loan interest rates low. But 81% think lowering tuition costs would do more to help college students than giving them easier access to student loans. Fifty-three percent (53%), in fact, believe the availability of student loans has actually helped increase the cost of college tuition.

— Three out of four Americans still prefer a traditional book over an electronic book-reading device and continue to read books that way.

— Fifty-six percent (56%) say they rarely or never eat a meal from a fast food restaurant during a typical week. 


