
September 7, 2013





Unmanned aircraft bought online being deployed to monitor private and public property in Australia

by Press • 1 September 2013





UNMANNED aircraft used to spy on enemies in war zones are now being deployed to monitor private and public property and to perv on unsuspecting Queenslanders.

Unregistered and unlicensed drones costing as little as $800 on the internet are being used to beam back spy video of women on beaches and areas ordinarily hidden from prying eyes.

Legitimate operators want the authorities to ground the rogue drones but the privacy commissioner says there are no rules in his armoury to take them down.

Queensland’s acting Privacy Commissioner Lemm Ex, who told The Courier-Mail he expected more drones to be used in the state for legitimate crime detection, urban planning and search and rescue, said he could not stop private rogue operators.

“There is no privacy law in Australia that will deal with one neighbour using a drone to survey another,” Mr Ex said.

“There may be non-privacy laws which apply but it remains that the use of drones by individuals is not currently regulated by privacy law.

“And anybody with $1000 can deploy a surveillance system.”

Legitimate companies and agencies, such as police and mining and power companies, are also using the technology.

“Anecdotally there is strong consideration for the use of drones in Queensland for everything from surf lifesavers patrolling Queensland beaches to specific crime-fighting activities,” Mr Ex said.

“One of the reasons why drones could be an attractive option for agencies is that the technology is both now eminently affordable and relatively sophisticated.

“Where drones differ slightly from the now familiar fixed camera surveillance systems is their potential for short one-off surveillance and the fact that geography is no boundary.

“There is an easier capacity for drones to survey what has traditionally been considered as personal spaces such as private property.”

He said if government agencies used drones they would have to comply with the Information Privacy Act’s privacy principles.


One of Queensland’s 11 licensed drone operators has written to the Civil Aviation Safety Authority asking for intervention to stop the unregistered spy planes.

Eric de Saint Quentin of Aerial Pix said drones were being abused by unlicensed operators who were buying the aircraft from the internet.

“People are flying UAS (unmanned aircraft systems) … and are not qualified. Some are using them on women sunbaking at the beach,” he said.

He said his company was doing surveillance for mining companies and power companies in southeast Queensland but for operational reasons would not name the businesses. He said some surveillance was done in built-up areas.

He said the companies requested the drones for security reasons or for maintenance.

He said a live feed went straight to the companies and the footage was not accessed by his business.

A spokesman for CASA said action could be taken against unlicensed drone operators but evidence was needed.

CASA requires a drone operator to hold a controller’s certification and an operator’s certificate. Drone pilots require general aviation knowledge in line with a private pilot’s licence, plus specific unmanned aircraft skills.

Drones cannot be flown for recreational use.


Publications say ND leads race to land UAS site

by Press • 3 September 2013

By: Kevin Bonham, Forum News Service


GRAND FORKS, N.D. — North Dakota appears to be one of the leading contenders in the race to land one of six national test sites to integrate unmanned aircraft into the national airspace.

An article, “How North Dakota plans to become the drone capital of America,” published this week on the online version of Popular Science magazine, lists the state as one of seven top contenders.

“North Dakota will probably get the bid, and it should,” wrote the author, Kelsey D. Atherton. “Testing drones in extreme winter conditions is important, and it’s best to do it where it’s very unlikely a mishap can actually hurt someone.”

The Federal Aviation Administration is expected to name the six national test sites in December.

Atherton, a science writer and blogger, said he based his research on the same criteria listed by the FAA site selection committee: geographic diversity, climatic diversity, location of ground infrastructure and research needs, population density and air traffic density.

The Association of Unmanned Vehicle Systems International, which last week hosted a convention in Washington, D.C., estimates that integration of unmanned aircraft systems (UAS) into the national airspace will result in “100,000 jobs created and economic impact of $82 billion” nationwide by 2025.



Other top contenders, according to Atherton, are: San Diego; Hancock Field, N.Y.; Sierra Vista, Ariz.; Huntsville, Ala.; Creech Air Force Base, Nev.; and Dayton, Ohio.

North Dakota, one of seven states that had booths at last week’s AUVSI convention, also was mentioned in stories by various media covering the event.

“(University of North Dakota) is one of the nation’s UAS academic hubs. It recently announced it developed sense-and-avoid software that will be tested aboard a NASA unmanned aircraft,” National Defense Magazine reported.

The Washington Post mentioned North Dakota, Oklahoma and Ohio in its article on the event, noting North Dakota’s claim to have the nation’s first four-year UAS degree, Oklahoma’s UAS engineering graduate program and Ohio’s Air Force research lab.

The newspaper quoted Lt. Gov. Drew Wrigley, one of the booth’s presenters, who said the state “‘already is on the leading edge’ of evaluating drones for agriculture, search and rescue, and inspecting infrastructure like pipelines for oil, gas, water and power lines.”


Al Palmer, director of UND’s Center for UAS Research, Education and Training, said other states are among the contenders, adding Alaska, Oklahoma and Florida to the seven listed in the Popular Science article.

Palmer is confident North Dakota will be one of the winners.

“We want to become the drone capital of America,” he said. “The state of North Dakota is very friendly to UAS operations. A lot of people think we are going to be designated, but you can’t rest on your laurels. We’re still working hard.”



The state already is making plans to be ready when the announcement is made.

Gov. Jack Dalrymple in May appointed Col. Robert Becklund, former commander of the North Dakota National Guard’s 119th Fighter Wing, to lead the North Dakota test site project. Becklund recently served on a UAS task force at the Pentagon.

“We want to hit the ground running,” he said.

Becklund already is working on UAS projects in the state. He and Palmer spent time working this week on a collaborative UND-North Dakota State University UAS precision agricultural research project.

But Palmer said even if North Dakota ultimately is not one of the UAS national airspace integration test sites, the state will continue to be one of the nation’s industry leaders.

Atherton agrees.

“There are states less confident than North Dakota. I think it’s justified in the roughrider state (Do people say that there? I’m new at the North Dakota beat), but it also looks like North Dakota will develop unmanned aviation fine without selection,” he said in an email this week.

“FAA-approved drone use is happening in North Dakota right now,” he wrote, “and states with established programs like that clearly have a head start. States that get approval will benefit, but by no means is that the only way to develop a drone economy.”



Obama Issues Plan for 1 Percent Pay Raise Next Year

By Tom Shoop

August 30, 2013

President Obama issued an alternative federal pay plan late Friday, setting an across-the-board increase for civilian federal employees of 1 percent in 2014.

The figure matches the amount the president requested in his fiscal 2014 budget proposal. Obama issued a separate plan providing a 1 percent boost in monthly basic pay rates for military service members.

In a letter to House and Senate leaders, Obama wrote, “Civilian federal employees have already made significant sacrifices as a result of a three-year pay freeze. As our country continues to recover from serious economic conditions affecting the general welfare, however, we must maintain efforts to keep our nation on a sustainable fiscal course.”

Under Obama’s plan, locality pay levels would remain at their 2013 levels.

Obama declared that his pay proposal “will not materially affect the federal government’s ability to attract and retain a well-qualified federal workforce.”

The leader of the largest federal labor union praised the proposal, though without much enthusiasm.

“Although the 1 percent is a pitiful amount that doesn’t begin to compensate for the furloughs and three years of frozen pay, it is a welcome development,” said J. David Cox, president of the American Federation of Government Employees.

“To call this raise inadequate is an understatement, but it is good news all the same,” he added.

If the president had not informed Congress of his alternative pay plan for feds by the end of August, then the increase mandated by the 1990 Federal Employees Pay Comparability Act would have kicked in. Under FEPCA, the raise would be determined by the change in the Employment Cost Index minus 0.5 percent. For 2014, that equals 1.3 percent.
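As a sketch of that formula, the FEPCA default raise can be computed directly; note the 1.8 percent ECI change below is an assumption, back-solved from the article’s 1.3 percent result rather than quoted from the source.

```python
# FEPCA default raise: the change in the Employment Cost Index minus 0.5 points.
# The ECI figure is assumed so that the formula reproduces the article's
# 1.3 percent result for 2014; it does not appear in the article itself.
eci_change = 1.8                 # percent (assumed)
fepca_raise = eci_change - 0.5   # percent
print(fepca_raise)               # 1.3
```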

Presidents largely have ignored the FEPCA formula in their federal pay raise proposals, preferring to offer their own figure. Congress created FEPCA, which provides an annual across-the-board salary boost and a locality pay adjustment for General Schedule employees, to close the public and private sector pay gap. The latest Federal Salary Council report concluded that federal employees are underpaid relative to private sector workers by approximately 34.6 percent.

The reality, however, is that Congress will end up determining whether federal employees receive a pay raise next year.

So far, lawmakers have not shown much enthusiasm for ending the three-year freeze on federal employee pay, despite Obama’s repeated calls to do so. None of the House fiscal 2014 spending bills to date contain funds for a civilian pay raise. The House has passed four of the 12 spending bills for fiscal 2014: Defense; Energy and Water Development; Homeland Security; and Military Construction, Veterans Affairs and Related Agencies. The Homeland Security and Military Construction-VA bills did not endorse a civilian pay raise, but didn’t forbid it either.


GSA Unfreezes Federal Per Diems


By Eric Katz

August 30, 2013

Federal employees will receive a small increase in their per diems starting in October, the General Services Administration announced Friday.

Federal agencies will reimburse their employees traveling in “standard” areas of the continental United States up to $129 per day for expenses — $83 for lodging and $46 for meals and incidentals — in fiscal 2014. The lodging per diem increased from $77 in fiscal 2013, while the meals portion held steady. GSA did not increase the federal per diem last year.
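As a sketch of that arithmetic (the three-day trip below is a made-up example, not from the GSA announcement):

```python
# FY2014 standard CONUS per diem components from the article.
lodging = 83             # lodging rate, up from $77 in FY2013
meals_incidentals = 46   # M&IE rate, unchanged from FY2013

daily_cap = lodging + meals_incidentals
print(daily_cap)         # 129

# Maximum reimbursement for a hypothetical three-day trip in a standard area.
days = 3
print(daily_cap * days)  # 387
```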

GSA’s per diem is about 5 percent lower than the average daily rate of the market, the agency said. While the rate applies to about 2,600 counties, 400 additional “non-standard areas” — or NSAs — receive individual calculations. Feds traveling in cities with high costs of living, such as New York City, receive higher reimbursements, for example.

Some of the NSAs, such as Washington, D.C., will actually see a reduction in their per diem rates, while others will receive an increase.

GSA also announced the elimination of the Conference Lodging Allowance, which let federal travelers spend 25 percent more than per diem rates for conferences. The move will save the government $10 million in fiscal 2014, GSA said.


The agency also recently announced new airfare rates for federal employees through its City Pair Program, estimating they will save $2.2 billion annually. The Obama administration has targeted federal travel as a means to cut spending, directing agencies to reduce travel costs by 30 percent.

The directive has led to $2 billion in savings across government, GSA said. Still, the agency made clear federal employees must occasionally travel to accomplish their core missions.

“For many public servants, travel is a necessary part of their jobs,” GSA wrote in a blog post announcing the new per diem rates. “As each agency reviews its travel and conference-related activities, each agency must ensure that any spending serves the American people as efficiently and effectively as possible.”



Air Force Developed Bombs Capable of Destroying Syria’s Chemical Weapons


by KRIS OSBORN on AUGUST 30, 2013


The U.S. Air Force has spent years developing so-called “Agent Defeat Weapons” designed to target and destroy stockpiles of chemical or biological weapons without dispersing or releasing them to surrounding areas, service officials said.

“The U.S. Air Force has Agent Defeat Weapons designed to limit collateral damage and effects,” Air Force spokeswoman Jennifer Cassidy told Military.com. “The munitions are PAW (Passive Attack Weapon) and Crash Pad.”

Both of these weapons would be carried by aircraft such as the F-15 or F-22 fighter jets and B-2 or B-1 bombers. It’s likely the Air Force would deploy the weapons from a B-2 or F-22 to take advantage of their radar-evading stealth technology considering the advanced air defense systems in Syria.

Could these weapons be used if a strike on Syria is ordered? Air Force officials would not comment upon whether the Agent Defeat Weapons were part of the discussion or strategic calculus regarding Syria.

An official with the office of the Chairman of the Joint Chiefs of Staff, Gen. Martin Dempsey, did not offer any specifics regarding planning details or ongoing considerations, but did tell Military.com that planning sessions, meetings and considerations were underway.

“The Joint Staff continues to meet and plan in order to provide the best possible military advice and options to the President. It would be inappropriate to speculate on what decision the President might make and what military options might be used in support of that decision. The U.S. military remains postured to provide a range of military capabilities as directed by the President,” said Cmdr. Scott McIlnay, spokesman with Office of the Joint Chiefs of Staff, Pentagon.

The CrashPad, or BLU-119/B, is a high-heat explosive bomb designed to incinerate chemical agents before they can be harmful, according to defense officials and DoD documents.

The weapon is a 420-pound, high-heat incendiary weapon with a “blast-fragmentation” warhead. The CrashPad is built from an existing standard MK 84 bomb body. The “PAD” in CrashPad stands for “Prompt Agent Defeat,” referring to the weapon’s ability to destroy chemical and biological agents without causing contamination, according to official documents.

The Passive Attack Weapon, or PAW, involves firing a host of steel and tungsten penetrator rods to create a “kinetic energy” battlefield effect without using an explosive. The weapon, first used to knock out antennas in Iraq during Operation Iraqi Freedom in 2003, is among the weapons used to advance what strategists call “effects based warfare.”

The idea with effects-based warfare is to achieve a strategically valuable battlefield “effect” without necessarily having to damage or attack large portions of the infrastructure of the attacked country or area. The PAW penetrator rods, which range from several inches to more than one foot, can disable an enemy fuel tank, antenna or helicopter without necessarily harming people.

One analyst said that if the PAW were fired from a high enough altitude and reached sufficient terminal velocity, it could destroy chemical weapons stockpiles without releasing contaminants.

“When you hit something at high velocity, what you get is a flash of incredible heat in a confined area extremely fast. That can vaporize everything in a small area,” said Daniel Goure, vice president of the Lexington Institute, a Virginia-based think tank.

Goure likened the effect to the impact of the so-called “sabot” kinetic energy 120mm tank rounds fired by the U.S. Army’s M1 Abrams tank.

“A Sabo round is essentially the same thing, a combination of spalling and heat effects. The round melts its way into the tank,” he said.

Generating heat sufficient to incinerate or neutralize the harmful agents is an essential ingredient in the success of Agent Defeat Weapons, according to military officials and scientists.

“Most agent defeat options—including nuclear weapons and high-explosives—neutralize chemical or biological agents by raising the target’s temperature. Thus, to assess any weapon’s effectiveness, we must first determine the threshold temperature for rapid agent neutralization,” Brookings Institution scientist Michael A. Levi said in written testimony to the National Academy of Sciences in 2004.


Wearable Electronics Market Worth $8.36B By 2018


According to a new market research report “Wearable Electronics Market and Technology Analysis (2013 – 2018): By Components (Sensors, Battery, Display, Networking); Applications (Consumer, Healthcare, Enterprise); Products (Smart -Textiles, Glasses, Watches); e-Materials and Geography”, the global wearable electronics market revenue is expected to cross $8B, and the total unit shipment is expected to cross 130 million units globally, by 2018.

Wearable electronics, often also referred to as wearable computing or wearable technology, is currently the most talked-about next-generation segment of the consumer electronics and computing industry. It is expected to be the next big thing in computing, with a new generation of electronic gadgets creating a wave similar to that created by the global smartphone revolution. As such, it comes as no surprise that an increasing number of technology giants are betting big on wearable products and devices.

The field of wearable technology includes several products such as smart glasses/goggles, ring/finger-worn scanners, footwear such as athletic, fitness and sports shoes, wrist-wear such as advanced electronic watches and wrist-bands, and others (such as head-bands and neck-wear), along with the upcoming “smart textiles” and the yet-to-be-commercialized “e-textiles” (expected in 2016).

The global wearable electronics market was worth more than $2.5B in revenue in 2012 and is expected to cross $8B in 2018, growing at a healthy CAGR of 17.7% from 2013 to 2018. In terms of products, wrist-wear accounted for the largest market revenue in 2012, with total revenue of the most established wearable electronic products – wrist-watches and wrist-bands combined, crossing $850M.

Among application sectors, consumer applications accounted for the largest market share, with revenue crossing $2B, as of 2012. However, that of enterprise and industrial applications is expected to grow at the highest CAGR (more than 21%), during the forecast period of 2013 to 2018.

The global wearable technology ecosystem was estimated at more than $4B as of 2012 and is expected to cross $14B by 2018, growing at a CAGR of more than 18% from 2013 to 2018. The total addressable market (TAM) for wearable technology was estimated at more than $14B as of 2012, with the current level of penetration estimated at roughly 18%. This penetration rate is expected to accelerate (increasing every year) over the next five years, reaching roughly 46% of the TAM by 2018.

The global wearable electronic textiles market is expected to grow faster than that of the overall wearable electronics market, with increasing demand and growing consumer adoption for wearable electronic smart-textiles, along with expected commercialization of more advanced – wearable electronic e-textiles by 2016.

The market of electrical and electronic components for wearable electronic products is also expanding rapidly, as the market value of components amounts to roughly 66% of that of products. With dynamically changing landscape for components in wearable electronics, the global wearable electronic components market is expected to cross $6B by 2018, offering huge revenue potential for key electronic component manufacturers focusing on this field.

The research report, based on an extensive study of the wearable electronics market and its entire ecosystem, describes the market’s trends, drivers, restraints and opportunities, and forecasts the market and all its sub-segments to 2018 in terms of both revenue and shipments. The overall market statistics are segmented on the basis of technology, components, form factor, products, applications, industry verticals and geography.

This global report gives a bird’s eye-view of the market across geographies – North America, Europe, Asia-Pacific, and ROW (Rest of the World). North America is the largest geography in terms of wearable electronics unit shipments and market revenue. APAC has been identified as the fastest growing region, with China leading the way. Europe and ROW are also promising markets, with Germany and the U.K. in Europe, and the Middle-East and Latin America in ROW providing impetus to the growth.

The report profiles 23 promising players in the wearable technology ecosystem and includes market share analysis (with estimated market shares and rankings) for each product category. The detailed competitive landscape presents an interesting picture, in which a large number of small players have become a force to reckon with. The market is witnessing a series of new product launches, announcements and partnerships across the value chain, and some big announcements by small and large players alike are expected in the coming months.

Some of the key players in the wearable electronics market include Adidas AG (Germany), Fitbit, Inc. (U.S.), Fibretronic Ltd. (U.K.), Google, Inc. (U.S.), Jawbone, Inc. (U.S.), Nike, Inc. (U.S.), Olympus Corporation (U.S.), Recon Instruments, Inc. (Canada), Vuzix Corporation (U.S.), and Weartech s.l (Spain) among others.



UAVs And Electronic Payloads: Navigating Turbulent Times, Bright Skies Ahead

August 31, 2012

As seen in Electronic Military & Defense magazine.

By Paul Kruczkowski, Editor


The unmanned aerial vehicle (UAV) market is alive and well, despite worldwide economic worries and concerns about deep U.S. defense budget cuts. Although the tremendous growth the industry has experienced in recent years may slow temporarily, analysts still predict great things for UAVs over the next decade and see the sector as one of the best opportunities for electronic design and manufacturing in the defense industry.


In this article, we will first discuss the overall prospects for the UAV market. We will then explore some of the important UAV electronic payload categories, the projections for the payload market, and a recent design trend that has the potential to transform the UAV electronic payload industry. Next, we will take a look at the major UAV programs — and the electronic payloads they carry — that are shaping this rapidly evolving market. Finally, we will take a glimpse at several emerging UAV programs that may drive the electronic payload market going forward.


The UAV Market

Market analysts remain bullish on the potential of the UAV market. For instance, the Teal Group projects worldwide UAV spending to nearly double over the next decade, from $6.6 billion in 2012 to $11.4 billion in 2021, totaling just over $89 billion in the next 10 years. It is abundantly clear that intelligence, surveillance, and reconnaissance (ISR) is the cornerstone of the global war on terror, and that the UAV has proven itself to be very adept at ISR. Military and civilian authorities around the world have observed the capabilities and versatility of the UAV and will continue to expand their use of unmanned vehicles.
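As a rough check on that forecast, the implied compound annual growth rate can be computed directly; the nine-year compounding horizon from 2012 to 2021 is my reading of the quoted figures, not a number from the Teal Group.

```python
# Implied CAGR of the Teal Group forecast quoted above:
# $6.6B (2012) growing to $11.4B (2021), i.e. nine annual compounding steps.
start, end, years = 6.6, 11.4, 9
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # 6.3%
```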

The Teal Group also predicts the U.S. will account for approximately 62% of the worldwide RDT&E (research, development, test, and evaluation) for UAVs and 55% of the procurement of these systems. The Asia-Pacific region will be the second largest market, followed by Europe as a close third. The U.S. defense budget still has $3.8 billion earmarked for UAV spending in FY2013, even with a 12.2% reduction in overall aircraft funding.


Electronic Payloads

Today’s UAVs carry a wide range of electronic payloads on ISR missions. Sometimes these payloads are mounted within the vehicle, but in other cases multiple sensors are combined into payload suites that can be mounted within stabilized multi-axis gimbal systems to provide wide angles of coverage. Major categories of UAV electronic payloads include:

Electro-optic (EO) cameras provide near real-time high-definition video surveillance in the visible light spectrum.

Infrared (IR) sensors operate in a wide range of frequencies, from long-wave (LWIR) to short-wave (SWIR), and are used for thermal imaging of people and vehicles, providing photograph-like images of objects in the field of view and detecting targets by their chemical signature in addition to their visual appearance.

Lasers provide luminance for night vision and designators for munitions targeting.

Synthetic aperture radar (SAR) uses a single antenna and time-multiplexing of radar returns to create accurate radar images of the target area.

Signal intelligence (SIGINT) sensor payloads detect, locate, and classify RF transmissions, allowing forces to monitor or interrupt enemy communications.

Electronic support measures (ESM) are electronic warfare (EW) payloads that detect, locate, and classify radar emitters, providing early warning so that radar sources can be disabled.

Command, control, communications, computers, and intelligence (C4I) systems aboard UAVs or in ground control units allow collected video, data, or targeting information to be coordinated and disseminated throughout the ISR network.

The market for UAV payloads was worth $2.7 billion in FY2012 and is forecast by the Teal Group to increase to $6 billion by FY2021. While the overall UAV electronic payload market is projected to grow steadily over the next decade, the SIGINT/EW payload market is expected to see a massive 20.2% compound annual growth rate (CAGR) from FY2012 to FY2017. UAV system upgrades involving smaller, lighter, and higher-performance payloads, as well as the stocking of extra payloads for field changes and repairs, should allow growth of the payload market to outpace the vehicle market.


Another trend that could have a major impact on the growth of the UAV payload market is open architecture design. An emerging design philosophy in the defense industry, open architecture recently demonstrated its applicability to UAV programs. General Atomics Aeronautical Systems, Inc. (GA-ASI), manufacturer of the MQ-9 Reaper, collaborated with SELEX Galileo and Cobham Aviation Services to integrate SELEX Galileo’s active electronically scanned array (AESA) radar and other sensors onto an MQ-9 Reaper, the first step in proving the concept of the new platform- and sensor-agnostic skyISTAR mission management system, which gives customers the flexibility to integrate future payload capabilities into the system. If this open payload architecture approach develops into a widespread UAV design trend, it would further enhance opportunities for payload manufacturers whose products were not specified in the initial UAV designs.


Major UAV Systems And Their Electronic Payloads

The UAV systems operated by the U.S. Department of Defense (DoD) are diverse in configuration and mission. High-altitude, long-endurance (HALE) UAVs like the RQ-4 Global Hawk are at the top of the line in terms of size and cost (at approximately $220 million each) and were designed to replace the high-altitude U-2 Dragon Lady spy plane, which has been in service since 1955. Medium-altitude, long-endurance (MALE) UAVs like the MQ-9 Reaper fly lower and slower than the HALE UAVs but also carry payloads with precision hunt and kill capabilities. These two classes of UAVs will receive the majority of DoD spending on unmanned aerial systems. However, small tactical unmanned aerial vehicles (STUAVs) — used by ground troops to gain advantage in battlefield awareness — are less expensive platforms purchased in much larger quantities, and so provide another substantial market opportunity for payload manufacturers.


HALE UAVs: Global Hawk

The Global Hawk hit some turbulence when the FY2013 budget canceled the Block 30 program and provided no funds to operate the 18 UAVs in service, which would effectively eliminate the U-2’s planned replacement in 2015. The House Armed Services Committee tried to preserve this ISR capability by designating $260 million of its markup in the National Defense Authorization Act for FY2013; however, their counterparts in the Senate disagreed, so the future of the program is in question.


Good news for the Global Hawk platform came in the form of a $1.7 billion contract for five Block 40 Global Hawks for the NATO Alliance Ground Surveillance (AGS) aircraft program and $657.5 million in the FY2013 budget for the U.S. Navy Broad Area Maritime Surveillance Demonstration (BAMS-D) program, which consists of five Global Hawks that the U.S. Navy acquired from the U.S. Air Force as a test platform for the development of the MQ-4C UAV. Initial operational capability (IOC) for the NATO AGS is projected for 2014, and the first MQ-4C BAMS was scheduled for June 2012 delivery with IOC expected in 2015.


In terms of payload, the RQ-4 Block 40 incorporates the AN/ZPY-2 MP-RTIP (Multi-Platform Radar Technology Insertion Program), which uses AESA technology to deliver long-range, very high-resolution SAR images, ground moving target indicator (GMTI), and new air target tracking capabilities. The RQ-4 will communicate through wideband Ku SATCOM and line-of-sight connectivity. A future battle management command and control (BMC2) system will direct weapons to attack stationary and moving targets using the Global Hawk’s radar cues from its SAR and GMTI radars.


The MQ-4C BAMS UAV sensor payload provides ISR and tactical information over a mission radius of 2,000 nautical miles (nm) and will leverage several advanced technologies to accomplish this mission. The Multifunction Active Sensor Active Electronically Scanned Array (MFAS AESA) radar provides long-range detection and identification of targets. The MQ-4C will also have the Multi-Spectral Targeting System (MTS-B), which integrates an EO camera, IR sensors, a laser illuminator, and a laser range finder/designator to provide auto-target tracking and high-resolution, full-motion video. In addition, the BAMS will have AN/ZLQ-1 electronic support measures (ESMs), giving it the capability for specific emitter identification (SEI), and an automatic identification system (AIS) that provides maritime vessel movement information from VHF broadcasts. Finally, the MQ-4C is specified to have radar capabilities for maintaining safe separation from other airborne assets.


MALE UAVs: Predator/Reaper/Grey Eagle

The MQ-9 Reaper MALE UAV continues to build on the reputation of its predecessor, the MQ-1 Predator, and is the clear choice of the U.S. Air Force for long-endurance UAVs with strike capability. The Air Force spent most of the money earmarked for UAV spending in FY2012 to procure 48 MQ-9 Reapers, and although the quantity of Reapers is reduced to 24 for FY2013, the budget still has $885.4 million for the procurement of the MQ-9. The FY2013 budget also provides $749.6 million to continue deployment of the U.S. Army MQ-1C Grey Eagle, a variant in the Predator family. The strategic goal is to procure enough UAVs and ground stations to support 65 combat action patrols (CAPs) by FY2017.


In addition to its SIGINT payload, data links, and beyond-line-of-sight communications equipment, the Reaper boasts the Raytheon AN/DAS-1 MTS-B Multi-Spectral Targeting System, which integrates an EO camera and IR sensors, a laser illuminator, and a laser range finder/designator. MTS-B is capable of full-motion video, day or night, and provides the tracking and targeting required for precision laser-guided munitions.


The Reaper also carries the AN/APY-8 Lynx II radar, an all-weather SAR with GMTI technology. The Ku-band radar has a “search” mode that covers 47 nm at a resolution of 10 feet, a “stripmap” mode that covers 29 nm at 1-foot resolution, and a “spotlight” mode that produces photograph-like radar images covering 21 nm at a resolution of 4 inches.
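The coverage-versus-resolution trade among the three modes can be sketched as a simple lookup. This is illustrative only: the mode names and figures come from the article, but the selection function is hypothetical, not part of any actual Lynx II interface.

```python
# Coverage vs. resolution figures for the AN/APY-8 Lynx II modes,
# as quoted in the article (resolution converted to inches).
LYNX_II_MODES = {
    "search":    {"range_nm": 47, "resolution_in": 120},  # 10 ft
    "stripmap":  {"range_nm": 29, "resolution_in": 12},   # 1 ft
    "spotlight": {"range_nm": 21, "resolution_in": 4},
}

def best_mode(target_range_nm):
    """Pick the finest-resolution mode whose coverage reaches the target.

    Hypothetical selection logic for illustration only.
    """
    usable = {name: params for name, params in LYNX_II_MODES.items()
              if params["range_nm"] >= target_range_nm}
    if not usable:
        return None
    return min(usable, key=lambda name: usable[name]["resolution_in"])

print(best_mode(25))  # stripmap: spotlight's 21-nm coverage falls short
print(best_mode(40))  # search: the only mode reaching 40 nm
```

The point of the sketch is simply that finer imagery costs coverage: spotlight’s 4-inch resolution is available only inside 21 nm, while search trades resolution for nearly twice the reach.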


MQ-1C Grey Eagle has communication relay payloads and EO/IR video sensors. It also uses an AN/ZPY-1 STARLite small tactical SAR that has GMTI and dismount moving target indicator (DMTI) functionality to provide all-weather, wide-area surveillance and detection of stationary, dismounted, and moving targets.



Small Tactical UAVs (STUAVs)

This class of UAVs is on the other end of the spectrum from the Global Hawk and Reaper. STUAVs are small enough for a soldier to carry in a backpack along with a handheld control unit. They typically sacrifice endurance, payload capacity, altitude, range, and strike capability in favor of light weight, portability, quick deployment, and real-time ISR and targeting information. These UAVs essentially become an extension of the soldier, hand-launched to provide a bird’s eye view of the battlefield and tactical ISR support for operations at the combat battalion level and below.


There are several STUAV manufacturers, but one company in particular — AeroVironment — has been very successful at getting qualified by the various branches of the U.S. armed forces. In fact, the U.S. Marine Corps has adopted four different AeroVironment UAV systems. The company also has received a $2.5 million order from the U.S. Air Force for its Wasp AE, and orders totaling $28.3 million from the Army, Marines, and Air Force for its new RQ-20A Puma AE. The DOD FY2013 budget also has $25.8 million designated for the procurement of 234 AeroVironment RQ-11 Raven systems. The Raven, Wasp AE, and Puma AE are interoperable, using the same ground control system (GCS). Additional details on these STUAVs and their payloads are as follows:



Pictured: X-47B, the demonstration version of the U.S. Navy MQ-47B


The 4.2-lb. RQ-11 Raven has forward- and side-looking EO/IR cameras for day and night video capture. The Raven is remotely operated or can be programmed for autonomous flight.

The Wasp AE weighs only 2.8 lbs., is capable of encrypted beyond-line-of-sight communication, and can provide video, voice, data, and text relay. It carries the Mantis i22 AE gimbaled payload, which weighs only 9.7 ounces and delivers front- and side-looking color and IR video. Designed for both land and water landings, it can operate in harsh environments.

An RQ-20A Puma AE system consists of three UAVs and two GCS. The vehicles weigh 13 lbs. each and are battery powered, with 2-hour endurance. The UAV has a stabilized gimbal equipped with an EO/IR video camera and IR illuminator, capable of panning and tilting plus or minus 180 degrees for 360-degree coverage. It is designed for enhanced survivability in both maritime and land environments and operates in foul weather and over rough terrain.

Looking To The Future

The U.S. Navy is working diligently to get the MQ-47B into service as soon as possible. Its 2,500-km range provides a greater standoff distance between the target and the aircraft carrier from which it is launched, and it eliminates the issue of pilot endurance. It can carry 4,500 lbs. of smart bombs, and its projected cost is about half that of a new $100 million (manned) F-35 Lightning II JSF. Its first carrier launch and recovery is scheduled for 2013, followed by an autonomous aerial refueling demonstration in 2014, with the goal of bringing the platform into service by 2018. It is likely to carry EO/IR sensor payloads, SAR and ISAR radar with GMTI and MMTI, as well as electronic support measures (ESMs).


There are other emerging UAV programs that could provide great opportunities for the electronic payload market, including:

MQ-8C Fire Scout: Northrop Grumman received a $263 million contract for eight new MQ-8C Fire Scout unmanned helicopters, based on the Bell 407 airframe. This new UAV is twice the size of the current MQ-8B, which will allow it to carry 1,000 lbs. of payload (weapons and sensors) — five times the capacity of the current Fire Scout.

Solar Eagle: The Solar Eagle is a HALE UAV being built by Boeing Defense, Space & Security under a Defense Advanced Research Projects Agency (DARPA) program. This solar-powered UAV will be able to carry 1,000 lbs. of ISR payload at 60,000 feet for five years without landing.

ScanEagle: The ScanEagle UAV has been tested with a hydrogen fuel cell, which would reduce the UAV’s power plant weight, allowing for greater ISR payloads.

Argus One: This lighter-than-air (LTA) UAV has a flexible, non-rigid body that can be easily stored, transported, and deployed. It is designed to carry about 30 lbs. of sensors for ISR and communications and supports GPS-guided autonomous flight.



In this article, we have reviewed only a portion of the current and future UAV systems that will provide opportunities to the electronic payload market, both domestically and abroad. The growing need for ISR capabilities will only increase the market opportunities for UAV systems and electronic payloads, despite shrinking defense budgets and austerity measures worldwide. In today’s budget-conscious world, better-equipped soldiers provide a sensible path to reducing the size of the military without decreasing its effectiveness, and UAVs and their payloads will play a significant role in achieving this goal. (Although it was not the focus of this article, there is also a growing law enforcement and civilian market for UAVs that will provide additional opportunities and support industry growth in the next decade.)


The design trends for future UAV systems will focus on increasing range, endurance, complexity, and autonomy. Some of these will be achieved through the capabilities of the vehicles themselves, but likely more will be accomplished through more sophisticated payloads, such as wide-angle EO/IR sensors and advanced radar systems that can penetrate foliage. The STUAV payloads market will continue to focus on size, weight, and power (SWaP) and will look to add large UAV capabilities including SIGINT, SAR, and hunt and strike capabilities. Technical issues like sensor drift will present challenges as complex non-EO sensors are developed for increasingly smaller UAVs. The electronic payload industry will have ample opportunity to innovate and further improve the performance of UAV systems. As creative ways are developed to expand the roles of these platforms, the blue skies ahead should become increasingly crowded with UAVs.



Double Furlough Trouble for Feds


By Kellie Lunney

Sep 5 2013 12:00 AM ET

Labor Day marks the traditional, if not official, end of summer. It also means a new fiscal year is less than a month away, and so is the threat of more furloughs for federal employees.


Congress for the last few years has lurched from one government funding deadline to another, and the federal workforce has lived under the shutdown shadow since 2011. Then in March 2013, automatic, across-the-board budget cuts took effect, forcing some agencies to place employees on unpaid leave.

This fall the circumstances are a little different because feds are facing the possibility of two types of furloughs simultaneously starting Oct. 1: furloughs related to sequestration, and those caused by a government shutdown because Congress has yet to pass any appropriations bills or a continuing resolution. And then there’s the mid-October deadline for raising the debt ceiling, which is a whole other story.

Yep, summer’s over alright. Don’t make any Columbus Day plans.

It’s not likely that the government will shutter in the next few weeks. Republican and Democratic leaders in both chambers are dead set against it, and lawmakers have to vote right after recess on whether the military should intervene in Syria. On the one hand, they have a lot to do in the nine legislative days between their return and the start of fiscal 2014, creating an anxiety-ridden situation for feds. On the other, shutting down the government and authorizing military strikes on a foreign government at the same time seems particularly unwise, even for this Congress.


Having said that, it’s worth looking at the differences between shutdown, or emergency furloughs due to a lapse in appropriations, and the administrative furloughs many feds already are familiar with through sequestration. The bottom line is, your pay is at risk but your benefits are well protected under both scenarios. For simplicity’s sake, this column will refer to administrative furloughs as sequestration furloughs.



Sequestration Furlough: This is a little tricky. Federal pay under a statutory pay system — the General Schedule, for example — is subject to spending cuts as are other administrative expenses within budgets. But the rates of pay for individual civilian and military employees cannot be reduced under sequestration. In other words, if an agency has to find more savings within its administrative accounts, which include employees’ salaries, then it can resort to furloughs or layoffs. A furloughed employee isn’t paid and isn’t guaranteed back pay. But the agency can’t indiscriminately slash an employee’s rate of pay to save money. So if you aren’t furloughed, your pay is protected, but if you are furloughed, all bets are off. Layoffs are more expensive for agencies, so the odds are furloughs will be much more popular than reductions-in-force in the event of continued sequestration, though the Defense Department has said it likely will need to resort to RIFs for civilians in fiscal 2014 in that case. And don’t forget, a furlough of more than 30 calendar days, or of more than 22 discontinuous work days, is considered a RIF, according to the Office of Personnel Management.
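The OPM threshold mentioned above is a simple bright-line rule, sketched below. The function name and interface are illustrative only, not from any official OPM tool.

```python
# OPM rule as described in the article: a furlough of more than
# 30 calendar days, or of more than 22 discontinuous workdays,
# is considered a reduction in force (RIF).
def must_use_rif_procedures(continuous_calendar_days=0,
                            discontinuous_workdays=0):
    """Return True if the furlough crosses either RIF threshold."""
    return continuous_calendar_days > 30 or discontinuous_workdays > 22

# Exactly at the limits, a furlough is still just a furlough.
print(must_use_rif_procedures(continuous_calendar_days=30))  # False
print(must_use_rif_procedures(discontinuous_workdays=22))    # False
# One day past either limit and RIF procedures apply.
print(must_use_rif_procedures(discontinuous_workdays=23))    # True
```

The practical consequence is that agencies planning extended sequestration furloughs must keep them at or under these limits, or switch to the more expensive RIF process.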


The Obama administration has announced that it plans to exempt military pay from sequestration in fiscal 2014, as it did in fiscal 2013.


Shutdown Furlough: Employees furloughed during a government shutdown are not paid during their leave, but can be paid retroactively. Congress agreed to pay back employees affected by the partial government shutdown (Federal Aviation Administration) in 2011, as well as the full government shutdowns in 1995 and 1996. Lawmakers might not be inclined to reimburse employees for the time lost during the current fiscal climate if the government is forced to close on Oct. 1.


Lawmakers still are on the federal payroll during a shutdown; other employees not subject to furlough include the president, presidential appointees, certain legislative branch staff and “essential” or “excepted” federal workers. Most essential employees perform jobs in defense, health care or other areas of national security and/or emergency-related fields, and while they are paid, their paychecks could be delayed during a hiatus.


Performance awards and within-grade step increases for federal employees can be delayed under both sequestration and shutdown furloughs.


Health Benefits

Sequestration Furlough: The health care coverage of furloughed federal employees will continue if the employee earns enough to pay the premiums. If the worker’s salary can’t cover the cost of the premiums, and the agency has applied all other deductions in order of precedence, then the agency has to give the employee the choice to continue or end coverage under the Federal Employees Health Benefits Program.


Shutdown Furlough: The employee’s health coverage continues even if the agency doesn’t pay for its share of the premiums on time. “Since the employee will be in a non-pay status, the enrollee share of the FEHB premium will accumulate and be withheld from pay upon return to pay status,” stated 2011 guidance from OPM.



Sequestration Furlough: Employees cannot substitute paid leave or other forms of paid time off for unpaid leave during an administrative furlough.


Employees can take leave without pay under the Family and Medical Leave Act when other employees are on administrative furlough. But if an employee is placed on furlough during the time she was supposed to be on FMLA leave without pay, those hours or days will not be considered LWOP under FMLA. “Furlough hours will not count toward the employee’s 12-week FMLA leave entitlement,” stated June 2013 OPM guidance.


Shutdown Furlough: Employees cannot take paid leave or other forms of paid time off, even if it is previously scheduled, instead of going on unpaid leave during a shutdown-related furlough. As for FMLA, “no days associated with a shutdown furlough period will be counted against an employee’s 12-week FMLA leave entitlement,” according to OPM guidance.


Retirement Benefits

Sequestration Furlough: Annuity benefits under the Civil Service Retirement System and Federal Employees Retirement System generally are not affected under administrative furloughs.


Shutdown Furlough: Feds’ retirement benefits are protected under this scenario as well.


The policies related to Thrift Savings Plan contributions are more nuanced, so check the TSP Board’s guidance on nonpay status and TSP accounts, as well as its specific guidance related to the fiscal 2013 sequester.


Please remember that this is not an exhaustive list, just the highlights. See OPM’s guidance on administrative furloughs and shutdown furloughs for more detailed information.


Pentagon to Ask Congress for Spending Flexibility in 2014

Sep. 4, 2013 – 08:55PM | By MARCUS WEISGERBER |

Defense News


WASHINGTON — The Pentagon will ask the US Congress for funding flexibility in a government-wide spending measure that is expected to be enacted at the end of September in lieu of a dedicated defense appropriation, a senior DoD official said Wednesday.

Congress is not expected to pass a defense budget by the conclusion of the fiscal year on Sept. 30. Instead — as has become the custom — lawmakers are expected to enact a temporary continuing resolution (CR), which would fund the entire government at spending levels comparable to 2013. Starting new programs and changing program investment levels are typically prohibited under this type of spending measure.

“We don’t know what the CR will say. We hope we’ll have more flexibility under a CR than we would normally have because we need it,” Frank Kendall, DoD undersecretary for acquisition, technology and logistics, said during a speech Wednesday at the IDEEA-sponsored COMDEF conference in Washington.

“We’re going to have to make some adjustments,” Kendall said. “Part of the reason is … to move some money around. We need much more flexibility than we had in ’13. So, we’ll ask for that. I don’t know if we’ll get it.”

Kendall said he is already cutting back on “some commitments of resources because of the uncertainty I’m facing.”

The Pentagon had to cut $37 billion from across its $614 billion 2013 budget due to sequestration spending caps. Those cuts were required across all spending accounts, except military pay. DoD’s 2014 budget proposal is $52 billion above the sequestration spending cap.
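For scale, the figures quoted above work out as follows. This is just a back-of-the-envelope check of the article’s numbers, not budget analysis.

```python
# Figures from the article, in billions of dollars.
budget_2013 = 614            # total 2013 DoD budget
sequester_cut_2013 = 37      # cut required by sequestration caps
proposal_2014_over_cap = 52  # amount the 2014 request exceeds the cap

cut_share = sequester_cut_2013 / budget_2013
print(f"2013 sequester cut: {cut_share:.1%} of the budget")  # 6.0%

# The 2014 shortfall, if the cap holds, would be even larger
# than the 2013 cut.
print(proposal_2014_over_cap > sequester_cut_2013)  # True
```

In other words, the 2013 cut was roughly 6 percent of the budget, and the 2014 gap between the request and the cap is some 40 percent larger still, which is consistent with Kendall’s warning that the damage will be more visible in 2014.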

“We don’t know where we’re going to end up in ’14; we don’t know where we’re going to end up long term,” Kendall said. “It’s not about getting rid of fat or getting rid of things we don’t really need. It’s about cutting the things we need the least of all the things that we do need.”


To meet 2013 spending caps, the Pentagon furloughed most civilian employees six days, cut back on training and delayed deployments. Kendall said Wednesday that future furloughs might be necessary if the 2014 spending cap is not removed.

While Kendall said he is “hopeful that we will not have to furlough next year,” he said he “can’t guarantee at this point that it won’t happen.”

“We will probably have to reduce the civilian work force,” Kendall said. “Whether that happens under a RIF [reduction in force] or under attrition or if we just don’t hire people as much as we’d like to, or at all virtually, I don’t know yet.

“At least, not hiring people, possibly a RIF and it gets bad enough — unfortunately — possibly furloughs again,” he said. “We really don’t want to do furloughs.”

Kendall said he expects sequestration to have an even greater impact on DoD in 2014 than it has this year.

“Right now we are essentially finding ways to get by [and] a lot of that is by deferring work,” he said.

“I think the damage will be much more visible as we go into ’14,” he added.


Extension Needed

The Pentagon will also ask Congress for the OK to continue an initiative to design exportability into new weapon systems, Kendall said.

“I think it’s so much in everybody’s interest to do this,” Kendall said. “[H]opefully we’ll be able to persuade the Congress to do that for us.”

Congress gave DoD approval to conduct a pilot program, which is called the Defense Exportability Features Initiative. The program is authorized through 2015.

The pilot looks at several programs, including the Air Force Three Dimensional Expeditionary Long-Range Radar (3DELRR) and Navy Next-Generation Jammer.

This initiative is part of a hefty overhaul of DoD weapon buying guidance — dubbed “Better Buying Power 2.0” — which is designed to improve efficiency and give the military more bang for its buck.

The exportability program — which is conducted in partnership with industry — will produce cheaper products, faster, Kendall said. The initiative will also create more interoperability between the US and partner militaries, he said.


Ohio touted as open for UAS business

Dayton-Springfield leaders work to attract test site, related jobs.

Dayton Daily News

Posted: 5:30 p.m. Wednesday, Sept. 4, 2013

By Andrew McGinn

Staff Writer

XENIA — Regional economic leaders on Wednesday stressed that little will change if Ohio is passed over later this year when the Federal Aviation Administration picks the six sites where testing of unmanned aerial systems will be done.

“The test center we’re building in Ohio will be there operating regardless of what happens with the FAA,” said Maurice McDonald, executive vice president of aerospace and defense for the Dayton Development Coalition.

McDonald was among the speakers assembled at the Greene County Career Center by U.S. Rep. Mike Turner, R-Dayton, to provide an update on efforts to lure the UAS industry to the region and to answer questions for representatives of business, government and academia.

That test center — the Ohio/Indiana UAS Center and Test Complex — opened on July 22 in Springfield’s Nextedge Applied Research and Technology Park and gained its first director last month when retired Air Force Reserve Col. Dick Honneywell was appointed by Gov. John Kasich.

The center hopes to win Ohio and Indiana a joint FAA test site designation this year, but it also will serve as the state’s hub for economic development in the nascent drone sector and will manage designated airspace in the region.

As Turner said, the FAA-designated test sites are to prove a concept — that unmanned aircraft can safely share the skies with manned aircraft. The explosion in business, he said, will occur after the two-year test period.

He compared it to the advent of the automobile industry.

For some already in the business, Ohio can’t move fast enough to open its skies to commercial UAS.

“We’re ready to test now,” Robbie Robinson, vice president of business development for Textron Systems, told the panelists.

Textron is the maker of the Army and Marines’ RQ-7B Shadow UAS and has been paying to use government airspace in Arizona that’s “hit and miss to get in” to test its unmanned aircraft, which are manufactured in Maryland.

“I would love to be able to help Ohio and the region out if we had this infrastructure,” Robinson said. “If the price was competitive, we’re prepared to come here.”

Honneywell, who most recently was vice president of aerospace at the Dayton Development Coalition, assured Robinson that Ohio’s pricing will be “very competitive” when flying begins in May 2014.

For his part, Turner championed the work of Congress, which, despite the political gridlock, has aggressively pushed for the UAS industry.

It took an act of Congress, he said, for the FAA to move toward integrating commercial and civil UAS into national airspace.

Earlier, he said, the Pentagon had been “adamantly against” the arming of drones — like the MQ-1 Predator, which is remotely operated from the Springfield Air National Guard Base — until congressional pressure ultimately changed the way wars are fought.

“Innovation has brought us to this point,” Turner said.

Others came to Wednesday’s panel discussion to explore how they can get in on the ground floor of the predicted boom in commercial UAS.

Urbana University is updating its strategic plan and wants to include UAS curriculum, said David M. Ormsbee, the university’s vice president of enrollment management.

“If Ohio is going to be UAS focused, we want to see how we can be UAS focused as well,” Ormsbee said, adding that 80 percent of Urbana alumni reside in Ohio.

Panelists encouraged Ormsbee to explore the field of human-machine relationships, along with IT and software development.

“Ohio,” Ormsbee said, “has a chance to be a leader or a lagger.”

Sinclair Community College in Dayton has approved $1.4 million to support an expansion of its UAS programs, which focus in part on first responders and precision agriculture, said Adam Murka, the school’s director of public affairs.

Sinclair, which has special FAA approval to train students using three types of UAS at the Springfield and Wilmington airports, has 22 students enrolled in a new class this fall called UAS Standards, Regulations and Law.

Murka cited the coming availability of good-paying jobs in the UAS industry.

“We don’t know where it’s going,” he said, “but we know it’s worth going there.”


Net Neutrality Faces Uncertain Court Ruling in US

Monday’s hearing is a major step in a long argument over FCC authority over net neutrality

By Grant Gross

Thu, September 05, 2013


IDG News Service (Washington, D.C., Bureau) — It’s difficult to predict how an appeals court will rule after it hears arguments Monday in Verizon Communication’s challenge of the U.S. Federal Communications Commission’s net neutrality rules.

Groups on both sides of the debate over the FCC’s rules prohibiting broadband providers from selectively blocking or slowing traffic say they believe they have a good case at the U.S. Court of Appeals for the District of Columbia Circuit. Reading the court’s tea leaves has become as much of a case of wishful thinking as a predictive science.

On one hand, the same appeals court ruled against the FCC in April 2010, when the agency tried to force Comcast to comply with an Internet policy statement after the cable broadband provider was caught slowing BitTorrent and other bandwidth-hogging applications. The court said then that the FCC lacked “any statutorily mandated responsibility” to enforce network neutrality rules.

The legal situation has changed since then, however. Last December, the same appeals court ruled in favor of the FCC after Verizon Wireless had challenged the agency’s authority to impose data roaming rate rules on mobile carriers. The question over the FCC’s authority to impose data roaming rules is similar to the one raised by Verizon in the net neutrality case, some telecom experts said.

Then, in May, the U.S. Supreme Court ruled, in a case called City of Arlington v. FCC, that a regulatory agency generally should be given broad deference when interpreting its own authority where statutory ambiguity exists. That decision could influence the upcoming appeals court decision, some experts said, although others cautioned that the cases have significant differences.

Adding to the difficulty in predicting an outcome: The court has a number of options it could take. It could strike down the FCC’s net neutrality order, it could uphold it, or it could take some type of middle ground. For example, the court could kick back the rules to the FCC by saying the agency may have the authority but hasn’t made its case.

Verizon argues that the FCC doesn’t have authority to regulate an information service, a class of communications that the agency has previously exempted from most regulation. The net neutrality rules are a violation of Verizon’s First Amendment free speech rights and its Fifth Amendment property rights, the company has argued.

The agency has claimed broad authority over broadband using twisted regulatory logic, Verizon’s lawyers wrote in their brief to the appeals court. As with the earlier Comcast case, “the FCC has acted without statutory authority to insert itself into this crucial segment of the American economy, while failing to show any factual need to do so,” Verizon said in the court brief.

That earlier Comcast decision from the same court presents a major “hurdle” for the FCC, said Randolph May, president of the Free State Foundation, a free market think tank that has joined a brief calling for the court to overturn the rules. Although the FCC, in its 2011 net neutrality order, “made an effort to beef up its argument that it possesses authority under the Communications Act to regulate Internet access service, I think the overall impression is that the agency is reaching too far,” May said by email.

May would lean toward the FCC losing the case, he said. The court will look at whether the FCC’s net neutrality rules were reasonable, May said, and many critics have argued the regulations were unnecessary because there have been few examples of violations.

“Even if the court finds that the FCC possesses authority under the statute, there is a pretty good chance the court will find, in light of the lack of persuasive findings concerning market failure, consumer harm, or impact on investment and innovation, that the agency’s decision is arbitrary and capricious,” May said.

The Free State Foundation, free market think tank TechFreedom and other critics of the net neutrality rules argue in their brief that the U.S. government could police major violations of net neutrality principles under existing antitrust law.

If the appeals court strikes down the rules, “net neutrality will be dealt with the same way concerns about competition are dealt with throughout the rest of the economy,” Berin Szoka, president of TechFreedom, said by email.

The TechFreedom/Free State Foundation brief also repeats concerns that the rules violate broadband providers’ free speech rights. “By denying Internet service providers their editorial discretion and by compelling them to convey content providers’ messages with which they may disagree, the Order violates broadband providers’ First Amendment rights,” the brief says.

The First Amendment and Fifth Amendment concerns are “silly,” countered Matt Wood, policy director at Free Press, a digital rights group that has pushed for strong net neutrality rules. The FCC hasn’t taken away Verizon’s ability to communicate on its website or its blogs, and the agency hasn’t taken away the carrier’s network, he said.

Verizon’s argument that its free speech is impacted when it provides the pipes for other people’s messages is “contrary to the notion of what a carrier does and how the Internet works,” he said.

Verizon, during other debates, has argued it should not be held responsible for the communications of its broadband customers, says the Center for Democracy and Technology and a group of legal scholars in their brief to the appeals court.

The FCC’s order does not violate Verizon’s free speech rights, but “instead protects the First Amendment interests of Internet users,” CDT says in the brief. “Certainly, Verizon often does speak via the Internet, using websites, blogs, email, social media, and the like. But its separate conduct in transmitting the speech of others should not be confused with Verizon’s own speech.”

Still, the FCC’s argument that it has so-called ancillary authority to regulate broadband because it has authority over other communications services may be a tough sell, Wood said. The appeals court rejected the ancillary authority in the 2010 Comcast case, he noted.

The Supreme Court’s City of Arlington case and the data roaming case give the FCC a “mini winning streak,” however, Wood said. He gives the FCC a “close to 50 percent chance” of winning the Verizon case.

The FCC has a good chance of winning, countered Michael Weinberg, a vice president at digital rights group Public Knowledge. The agency is “basically right” in arguing it has the authority to regulate broadband under the Communications Act, he said.

The agency had potential court challenges in mind when it drafted the net neutrality order, Weinberg said. “The FCC was thoughtful about this,” he said.


Rasmussen Reports

What They Told Us: Reviewing Last Week’s Key Polls


Saturday, September 07, 2013

Congress is back in session this coming week with three critical issues demanding its attention – Syria, the looming deadlines to fund the government and raise the federal debt ceiling, and immigration reform. Voters have pretty strong opinions on all three.

Despite President Obama’s insistence that the United States has proof of the Syrian government’s use of chemical weapons against its opponents, just 37% of Likely U.S. Voters think the United States should provide increased military assistance to protect the citizens of Syria. In part that’s because only 35% consider Syria a vital national security interest of the United States these days. 

Seventy-three percent (73%) think it’s at least somewhat likely that the debt ceiling will be raised before the federal government begins defaulting on its debts. But just 25% think the debt ceiling should be raised without any significant spending cuts.

Congressional Republicans are insisting on long-term entitlement cuts before they will agree to raise the debt ceiling, and a sizable number of GOP congressmen are willing to shut down the government through the debt ceiling process to prevent funding of the president’s national health care law.

Just 41% of voters have at least a somewhat favorable opinion of the health care law, while 52% view it unfavorably. This includes 18% with a Very Favorable opinion of it and 39% with a Very Unfavorable one.

While the Senate has passed an immigration reform plan with bipartisan support, fewer voters than ever (28%) think the reform measure will make it through the House of Representatives and become law this year. A continuing problem for the reformers is that most voters still think tougher border control is the most important element of any immigration reform plan, but just 32% believe it is even somewhat likely that the federal government would actually secure the border to prevent future illegal immigration if the plan passed.

The economy, health care and job creation remain at the top of the list of 15 important voting issues regularly tracked by Rasmussen Reports.

Consumer and investor confidence are down from recent highs but are still generally running at higher levels than have been measured since 2008.

The federal government announced on Friday that employers added 169,000 jobs in August, with the unemployment rate falling to 7.3 percent. This marginal gain was predicted in the Rasmussen Employment Index for August, released on Tuesday. The index, which measures worker confidence, rebounded last month, rising more than three points over July and up nearly 15 points from a year ago.

Still, Americans are more pessimistic about the U.S. job market than they have been all year. Only 25% now believe the job market is better than it was a year ago. Thirty-one percent (31%) think the job market is worse today than it was a year ago, the highest finding since December.

Just 19% think the government should help out the long-term unemployed in a big way, with nine percent (9%) who say their unemployment benefits should be extended indefinitely and 10% who say the government should just hire them. Forty percent (40%) think the federal government should do nothing at all for the long-term unemployed, the highest finding in three years of regular surveying.

Americans continue to believe that government workers have it better than those employed in the private sector when it comes to pay and job security. They also don’t think government employees work as hard.

Sixty-seven percent (67%) of Democrats have a favorable view of unions. Sixty-eight percent (68%) of Republicans – and 52% of adults not affiliated with either of the two major parties – view them unfavorably.

Fifty-four percent (54%) of all Americans express a Very Favorable opinion of people who start their own businesses, and nearly as many (52%) say the same of small business owners. Just two percent (2%) feel that way about Congress.

Only seven percent (7%) of Likely Voters now think Congress is doing a good or excellent job. Just 23% think their representative in Congress is the best possible person for the job.

Democrats lead Republicans by two points on the latest Generic Congressional Ballot.

The president’s total job approval remained unchanged at 47% in August, still at its lowest level since December 2011. Following Obama’s reelection, his full-month approval rating jumped to 56% in December but has steadily decreased since then.

As they have since mid-April, the president’s daily approval ratings now match those found for much of his first term in office.

In other surveys last week:

– Thirty percent (30%) of Likely U.S. Voters now say the country is heading in the right direction.

– Democrat Terry McAuliffe has opened up a seven-point lead over Republican Attorney General Ken Cuccinelli – 45% to 38% – as the Virginia gubernatorial race enters the post-Labor Day stretch.

– Seventy percent (70%) of Americans rate their own overall health as good or excellent, but 36% think they are overweight.

– The Massachusetts Supreme Court heard arguments this past week challenging the legality of saying the Pledge of Allegiance in schools. Sixty-eight percent (68%) believe students should be required to say the Pledge of Allegiance every morning, and 83% think the Pledge should include the words “under God.”

– With school back in session, first lady Michelle Obama’s “Let’s Move!” campaign is back in the spotlight, but only 18% of Americans think the federal government should have a say in nutritional standards at schools.

– Eighty-seven percent (87%) of adults with children at home attending elementary or secondary school say their children have received all the vaccinations they are supposed to have. Seventy-six percent (76%) believe children who do not receive all their vaccinations will cause health problems for other children.

– Just 14% of Americans rarely or never watch movies at home.

– Thirty-five percent (35%) say that when they celebrate Labor Day, they honor the contribution of workers in society, which was the original intent of the holiday. As has been the case for years, however, many more (50%) view Labor Day as the unofficial end of summer instead.

August 31 2013




NSA Paid Tech Companies Millions For Prism

Leaked documents show taxpayer cost of involving Google, Microsoft and other tech companies in Prism digital dragnet.

By Mathew J. Schwartz, InformationWeek

August 23, 2013



Who paid the cost of giving the National Security Agency direct access to the systems of nine technology companies, including Facebook, Google, Microsoft and Yahoo?

The answer arrived Friday: U.S. taxpayers.

Furthermore, the bill didn’t come cheap. The U.S. Foreign Intelligence Surveillance Act (FISA) Court, which is charged with monitoring the NSA’s surveillance programs, ruled in 2011 that the agency violated section 702 of FISA as well as the Fourth Amendment. Accordingly, the court ordered the programs to cease within 30 days unless specific “upstream collection” practice problems were fixed.

“Upstream collection is when the NSA gets a copy of Internet traffic as it flows through major telecommunications hubs and searches through it for ‘selectors,’ like an email address or a keyword,” Parker Higgins, an activist at the Electronic Frontier Foundation, said in a blog post.
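The filtering step Higgins describes can be illustrated with a toy sketch. Everything here is hypothetical (the selector values, the message data, the matching logic); real upstream systems operate on raw packet streams at vastly larger scale, and this merely shows the idea of retaining only traffic that matches a selector list:

```python
# Toy illustration of "selector" matching over intercepted messages.
# All names and data are invented for illustration.
SELECTORS = {"target@example.com", "codeword-alpha"}

def matches_selector(message: str) -> bool:
    """Return True if any selector string appears in the message text."""
    text = message.lower()
    return any(sel in text for sel in SELECTORS)

traffic = [
    "lunch at noon?",
    "forward this to target@example.com please",
    "status report: codeword-alpha is active",
]

# Only messages containing a selector would be retained for analysis.
retained = [m for m in traffic if matches_selector(m)]
```

The FISA Court's "multi-communication transaction" problem, discussed below, arises precisely because one retained item can bundle unrelated messages along with the one that matched.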

That FISA Court ruling triggered a period of successive 30-day extensions, each of which required corresponding changes from the technology companies that were legally compelled to give the NSA access to their systems. Those extensions and the surveillance program certifications they included came at quite a cost, according to a December 2012 NSA newsletter marked “top secret,” which was published Friday by the Guardian and presumably provided by former NSA employee-turned-whistleblower Edward Snowden.

“Last year’s problems resulted in multiple extensions to the certifications’ expiration dates which cost millions of dollars for Prism providers to implement each successive extension — costs covered by Special Source Operations,” read the NSA newsletter.

Yahoo confirmed to the Guardian that it had been reimbursed for costs related to responding to data requests from the U.S. government. “Federal law requires the U.S. government to reimburse providers for costs incurred to respond to compulsory legal process imposed by the government,” said a Yahoo official. “We have requested reimbursement consistent with this law.”

Special Source Operations — described by Snowden as the NSA’s “crown jewel” — administers the agency’s surveillance programs that involve service providers, telecommunications companies and corporate partnership arrangements with technology firms that give the agency direct access to the data they handle.

But according to three rulings declassified this week by Director of National Intelligence James Clapper — as ordered by President Obama — the FISA Court ruled in 2011 that the agency had broken FISA law and violated the Fourth Amendment thousands of times due to its data interception practices. That document disclosure was made in response to a Freedom of Information Act request from the EFF.


In one of those declassified documents, FISA Court Judge John Bates wrote in an 86-page opinion that the “volume and nature of the information [NSA] has been collecting is fundamentally different from what the court had been led to believe.” Furthermore, he said that the NSA’s so-called minimization procedures for intercepting multi-communication transaction (MCT) data “tend to maximize, rather than minimize, the retention of non-target information, including information of or concerning United States persons,” thus violating the Fourth Amendment.

Accordingly, rather than renewing the requested annual legal certifications the agency is required to obtain from the FISA Court for its FISA surveillance programs, he instructed the NSA to fix specific problems or cease its related surveillance efforts.

In a cover letter published with the declassified court rulings, Clapper characterized those problems as involving “highly technical reasons concerning the manner in which the collection occurred” rather than involving questions of civil liberties. In particular, the problem appeared to center on the capture of MCT data, which might bundle multiple messages in a single communication.

“In large-scale enterprises as technologically sophisticated and operationally complex as the 702 program, mistakes and errors can and will happen,” said Clapper. He said that after the court ruling, the agency proactively deleted all upstream communications it had intercepted in violation of FISA.

Clapper emphasized, however, that the agency reports all such errors both to the FISA Court and Congress. That included reporting earlier “unintended misrepresentations in the way the collections were described to the FISA Court,” which resulted in part from “gaps in technical understanding” between different groups at the NSA. In the wake of those discoveries, Clapper said, part of the solution entailed making not just technical changes but also related structural, managerial and training changes at the NSA.



DHS kicks off $6B cyber program

Federal Times

Aug. 25, 2013 – 06:00AM | By NICOLE BLAKE JOHNSON


There’s a lot riding on the Homeland Security Department’s new $6 billion cybersecurity contract.

DHS has committed $185 million this year to fund the initial rollout of monitoring tools capable of firing billions of automatic security inspections across civilian networks every 24 to 72 hours. DHS has additional funding budgeted for at least the next two years, pending congressional approval.

“We need tools to automate security testing and specialized experts to make those tools effectively operate and interpret the results,” said John Streufert, who leads DHS’ Continuous Diagnostics and Mitigation (CDM) program. “It’s how we make the repairs,” Streufert said. “In tightening budgets under sequestration, we are trying to make that repair labor more efficiently applied.”

While many agencies today use network scanning tools to detect rogue devices, flawed software applications and other security risks, their technical capabilities vary. So does each department’s in-house expertise to diagnose and fix those flaws.

DHS’ goal: Standardize security protections across the government over the next three years.


“Agencies are trying to manage threats in real time,” said Andy Maner, a managing partner with IBM. “That is the goal of this vehicle.”

Key to that will be equipping agencies with both tools and experts through a $6 billion, five-year contract awarded to 17 vendors earlier this month. The General Services Administration awarded the blanket purchase agreement on behalf of DHS, and GSA will charge agencies a 2 percent fee to use the contract.


Funding for monitoring

DHS is now developing task orders on behalf of civilian agencies as part of the first phase of the CDM program. Agencies can also place task orders using their own funding and contracting officers, according to DHS. State and local governments can also work through GSA to use the contract.

Under the contract, DHS will fund operations of the scanning tools or sensors at the basic network level, while agencies will have to fund monitoring tools for mission and custom applications. Funds for remediating security problems will also come out of the agencies’ budgets, Streufert said.

Until now, the administration’s push for agencies to bolster continuous monitoring has been an unfunded mandate, said Niels Jensen, regional vice president of federal sales at ForeScout. “Now, not only is there a mandate but Streufert has done a good job working with the administration and making sure there is funding.”

ForeScout offers a software solution that can detect devices on an agency’s network and determine if the device is properly configured in line with agency policy. Nearly a dozen of the 17 vendors DHS selected have included ForeScout’s offering as part of their wider suite of tools.

While use of the contract is not mandated, Jensen said the DHS program is “very much a top-down” initiative that the administration expects will assist agencies in meeting security goals.

“There are many observers of the federal government that expected substantial resistance to adopting the CDM program and, without the facts available to them, made some characterizations that things were going slower than they actually were,” Streufert said.

So far, most large civilian agencies have agreed to use the contract, with the exception of the GSA, he said. Because of internal issues, including some technical challenges, GSA has not made a formal agreement to use the contract, but Streufert expects the agency will do so next fiscal year.

“We have a substantial portion of the entire government covered now,” said Streufert, noting that the 21 largest civilian agencies represent more than 90 percent of the federal workforce.

The program will roll out in three phases. The first focuses on managing all hardware and software with access to agency networks, managing known vulnerabilities and preventing unauthorized programs from operating on the network. The second phase will roll out tools to determine who uses the systems, when, and in what role. Phase three will offer capabilities for responding to cyber incidents.

A separate contract will be awarded for a dashboard, which will give agencies a more comprehensive view of their security risks, Streufert said. Based on his past experience running continuous monitoring programs at the State Department, he said it takes time to learn how to use the new tools and to address any false positives from security scans.

“Our strongest objective is to diagnose what those cyber flaws are and leave the data at the department and agency on the detailed level and not move that sensitive information of the content of information that is being protected to the Department of Homeland Security,” Streufert said. “Instead, what we’re worried about on a macro level is how many of the doors are unlocked of the 2.2 million personal computers of the civilian government.”

Embedded into the dashboard will be a method for calculating security risks to help agencies track risks numerically, weight their severity and interpret actionable reports so they can better prioritize which problems to tackle first.
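A weighted scoring method of the kind described might look like the following minimal sketch. The severity categories, weights, and findings are all invented for illustration; the actual CDM dashboard's scoring formula is not public in this article:

```python
# Hypothetical weighted risk score: each open finding contributes its
# severity weight, so higher totals indicate a worse security posture
# and help prioritize which problems to tackle first.
SEVERITY_WEIGHTS = {"critical": 10.0, "high": 6.0, "medium": 3.0, "low": 1.0}

def risk_score(findings):
    """Sum severity weights over a list of (host, severity) findings."""
    return sum(SEVERITY_WEIGHTS[severity] for _host, severity in findings)

findings = [
    ("pc-001", "critical"),  # e.g., unpatched operating system
    ("pc-002", "medium"),    # e.g., weak configuration
    ("pc-002", "low"),       # e.g., missing inventory tag
]

score = risk_score(findings)  # 10.0 + 3.0 + 1.0 = 14.0
```

Tracking such a number over time, per host or per agency, is one simple way to turn raw scan results into the kind of actionable, prioritized report the program describes.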

“The actual practices of how [agencies] measure risk will be worked out over time,” Streufert said. “We’re not grading our security in terms of pass/fail but looking at our precise results.”



Bare Bones Health Plans Expected To Survive Health Law

By Jay Hancock and Julie Appleby    

KHN Staff Writers

AUG 25, 2013


Consumer Reports calls it “junk health insurance.” A California regulator described them as “skeleton policies.” To an expert from the American Cancer Society, they “are a perfect example of why health care reform is so crucial.”

They are bare-bones health plans, and critics say they could leave consumers who become seriously ill on the hook for tens of thousands of dollars in medical costs. The Affordable Care Act was supposed to do away with them.

“The good news is that these plans will be a thing of the past in 2014,” Steve Larsen, then a high-ranking Department of Health and Human Services official, told reporters two years ago.

The law did outlaw so-called “mini-med” plans, which cap annual benefits at, say, $2,000 even though the average hospital stay costs $14,000. But now a new type of bare-bones policy may take their place.

Consumer advocates, employers and insurers say that unless regulators move to block them at the last minute, plans with limited benefits may continue to be offered by some large businesses, especially those with low-paid workers such as restaurant chains and retailers.

Proposed and final rules issued this spring surprised many by failing to bar large employers from offering insurance policies that could exclude benefits such as hospitalization.

Offering bare-bones policies may result in some fines, but that expense could be less than the cost of offering traditional medical coverage.

For large employers, “the feds imposed no minimum standard on how skimpy that coverage can be other than to say, in essence, it’s got to be more robust than a dental plan or a vision plan,” said Ed Fensholt, a senior vice president at insurance broker Lockton Companies. “We had customers looking at offering some relatively inexpensive and skimpy plan designs to satisfy the individual mandate at modest cost.”


Employers Showing Interest

“There is a lot of interest” from retailers and others that have offered limited-benefit plans in the past, said Joan Smyth, a partner with benefits consultant Mercer. She’s gotten so many inquiries since the Wall Street Journal reported on the issue in late May that limited benefit plans are “my favorite topic,” she joked.

Such plans were typically offered because some insurance was seen as better than none — and the premium costs for both employers and workers were far lower than for traditional coverage.

This summer, the Obama administration gave businesses with 50 or more employees another year, until 2015, to comply with the requirement that they offer insurance or pay a fine.

“Some of the pressure was taken off because of the announcement” to delay the employer mandate, said Neil Trautwein, employee benefits policy counsel at the National Retail Federation, a trade group. “But I think you will continue to see employers in many industries … carefully calculate their strategy for compliance,” in part by considering skinny plans. “As always, the interest is to limit cost increases.”

Officials for McDonald’s, Ruby Tuesday, Darden Restaurants and other large employers that have offered mini-med coverage in the past declined to comment or did not respond to questions about their plans.


Small Businesses Barred

The bare-bones plans cannot be offered to small businesses with fewer than 50 workers, or to individuals buying coverage through new online marketplaces that open for enrollment Oct. 1. But benefit experts expect some larger firms that buy outside the marketplaces or that self-insure to consider them.

The Obama administration says that workers offered such coverage may qualify to shop in the marketplaces and to buy subsidized plans.



“Individuals who are not already offered quality, affordable health care can enter into the marketplaces and choose a health insurance option that works for them,” said Sabrina Siddiqui, spokeswoman for the Treasury Department.

About 2 million Americans are covered by limited benefit mini-med insurance policies, many of which were issued by Aetna and Cigna.

Asked whether Cigna will offer new versions next year, a company spokesman said, “We are currently evaluating the types of plan designs that will meet the needs of employers and employees.” Aetna spokesman Matt Wiggin said the insurer is “still assessing” customer needs.

Skimpy insurance under the Affordable Care Act won’t be quite the same as it is now. Under the new rules, capping the dollar value of annual benefits isn’t allowed, but excluding entire categories from coverage – such as hospital stays – is permitted, say benefit consultants. That’s another way of keeping costs down.


‘Mini-Meds Have Morphed’

The law says only that large-employer policies must cover preventive care such as blood pressure tests or vaccines with no co-pays for consumers. So the plan could cover dental, vision and preventive cancer screenings, but possibly not the treatment or hospital care a patient could need if diagnosed with an illness.

True, the health act requires policies to include coverage for 10 broad categories of “essential health benefits,” such as hospitalization and mental health services, but that provision applies only to plans sold to small businesses and individuals. Larger firms and self-insured employers are exempt.

Benefit advisers say some retailers and restaurant chains are considering limited-benefit plans for 2014 even though the deadline was pushed back for offering coverage or facing fines.

“It seems like mini-meds have morphed,” said Lydia Mitts, a health policy analyst for Families USA, a consumer advocacy group. The new limited benefit policies “are not the same animal but are still substandard coverage.”

Employers offering these sorts of plans do face some risks, experts said. If a large employer doesn’t offer “minimum essential coverage,” it’s potentially liable for fines of $2,000 per full-time worker after the first 30 workers. Under the abstruse wording of the health law, however, skinny plans appear to qualify as minimum essential coverage.

But if employers don’t offer “comprehensive” policies — defined as covering at least 60 percent of health expenses — they must pay $3,000 for each worker who receives subsidies to buy coverage. Opinions differ on whether skinny plans will be able to pass the comprehensive test; some regulations are still pending. But employers see that potential expense as far lower than the cost of offering all their workers more robust coverage, experts said.

Some businesses are also betting that few workers will go to the government-run marketplaces to seek subsidized coverage, opting instead for the skinny plan “which costs less than the penalty,” said Dania Palanker, senior counsel for the National Women’s Law Center in Washington, D.C.

Signing up for a company skinny plan would fulfill a consumer’s obligation to be covered under the health act and protect her from the law’s fines.

Advocates are still pressing employers to offer more comprehensive policies.

“People need to be covered for hospitalizations,” said Mitts of Families USA. “It’s important for employers to do the right thing and they should not just look at the minimum requirements of the law.”



Colleges Set to Offer Exit Tests

Employers Say They Don’t Trust Grade-Point Averages


August 25, 2013, 7:48 p.m. ET


Next spring, seniors at about 200 U.S. colleges will take a new test that could prove more important to their future than final exams: an SAT-like assessment that aims to cut through grade-point averages and judge students’ real value to employers.

The test, called the Collegiate Learning Assessment, “provides an objective, benchmarked report card for critical thinking skills,” said David Pate, dean of the School of Arts and Sciences at St. John Fisher College, a small liberal-arts school near Rochester, N.Y. “The students will be able to use it to go out and market themselves.”

The test is part of a movement to find new ways to assess the skills of graduates. Employers say grades can be misleading and that they have grown skeptical of college credentials.



“For too long, colleges and universities have said to the American public, to students and their parents, ‘Trust us, we’re professional. If we say that you’re learning and we give you a diploma it means you’re prepared,’ ” said Michael Poliakoff, vice president of policy for the American Council of Trustees and Alumni. “But that’s not true.”


The new voluntary test, which the nonprofit behind it calls CLA +, represents the latest threat to the fraying monopoly that traditional four-year colleges have enjoyed in defining what it means to be well educated.


Even as students spend more on tuition—and take on increasing debt to pay for it—they are earning diplomas whose value is harder to calculate. Studies show that grade-point averages, or GPAs, have been rising steadily for decades, but employers feel many new graduates aren’t prepared for the workforce.


Meanwhile, more students are taking inexpensive classes such as Massive Open Online Courses, or MOOCs, but have no way to earn a meaningful academic credential from them.


HNTB Corp., a national architectural firm with 3,600 employees, sees value in new tools such as the CLA +, said Michael Sweeney, a senior vice president. Even students with top grades from good schools may not “be able to write well or make an argument,” he said. “I think at some point everybody has been fooled by good grades or a good resume.”


The new test “has the potential to be a very powerful tool for employers,” said Ronald Gidwitz, a board member of the Council for Aid to Education, the group behind the test, and a retired chief executive of Helene Curtis, a Chicago-based hair-care company that was bought by Unilever in 1996.

Only one in four employers think that two- and four-year colleges are doing a good job preparing students for the global economy, according to a 2010 survey conducted for the Association of American Colleges and Universities.

Meanwhile, GPAs have been on the rise. A 2012 study looking at the grades of 1.5 million students from 200 four-year U.S. colleges and universities found that the percentage of A’s given by teachers nearly tripled between 1940 and 2008. A college diploma is now more a mark “of social class than an indicator of academic accomplishment,” said Stuart Rojstaczer, a former Duke University geophysics professor and co-author of the study.

Employers such as General Mills Inc. and Procter & Gamble Co. long have used their own job-applicant assessments. At some companies such as Google Inc., GPAs carry less weight than they once did because they have been shown to have little correlation with job success, said a Google spokeswoman.

At Teach for America, which recruits college students to teach in rural and urban school districts, the GPA is just one of dozens of things used to winnow nearly 60,000 applicants for 5,900 positions. Candidates who make it to the second step of the process are given an in-house exam that assesses higher-order thinking, said Sean Waldheim, vice president of admissions at the group. “We’ve found that our own problem-solving activities work best to measure the skills we’re looking for,” he said.

The Council for Aid to Education, the CLA + test’s creator, is a New York-based nonprofit that once was part of Rand Corp. The 90-minute exam is based on a test that has been used by 700 schools to grade themselves and improve how well their students are learning.

The CLA + will be open to anyone—whether they are graduating from a four-year university or have taken just a series of MOOCs—and students will be allowed to show their scores to prospective employers. The test costs $35, but most schools are picking up the fee. Among schools that will use the CLA + are the University of Texas system, Flagler College in Florida and Marshall University in West Virginia.

The CLA + is scored on the 1600-point scale once used by the SAT “because everyone is familiar with that,” said Chris Jackson, director of partner development at the Council for Aid to Education. Instead of measuring subject-area knowledge, it assesses things like critical thinking, analytical reasoning, document literacy, writing and communication.

Cory LaDuke, a 21-year-old senior at St. John Fisher, said he had mixed feelings about taking the CLA + but understood why employers might be skeptical of some graduates, because “some people don’t work that hard and fake their way through it.”

“It kind of sucks that an employer can’t trust your GPA, but that’s the way it is right now, so this is also an opportunity,” said Mr. LaDuke. “It’s another way to prove yourself.”

Other groups also have been seeking ways to better judge graduates’ skills. The Lumina Foundation, which aims to boost the number of college graduates, is offering a way to standardize what students should know once they earn a degree. The MacArthur Foundation has helped fund a system of “badges” for online learning to show mastery of certain skills. Last Thursday, President Barack Obama said he wants the federal government to devise a ratings system to gauge colleges’ performance based on student outcomes.

Meanwhile, established testing companies are introducing new tools. Earlier this year, Educational Testing Service, which developed the Graduate Record Exam, announced two certificates to reward high marks on its Proficiency Profile, which assesses critical thinking, reading, writing and math.

And ACT, the nonprofit that administers the college-admission exam of the same name, has a National Career Readiness Certificate, which measures skills such as synthesizing and applying information presented graphically.

Educational Testing Service was surprised to learn through a survey last spring that more than a quarter of businesses were using the GRE to evaluate job applicants, said David Payne, an ETS vice president.

Sean Keegan, a 2011 graduate of Tufts University, has posted his GRE score on his resume because he landed in the 97th percentile, even though he isn’t applying to graduate school. “I think it shows I’m relatively smart,” said Mr. Keegan, who is looking for work in finance. “So far, I’ve gotten a lot of positive feedback from employers.”


Pentagon Planning to Lay Off Thousands of Civilians After September


By Eric Katz

August 23, 2013


The Defense Department is preparing to lay off more than 6,000 civilian employees starting in October, should sequestration cuts continue, as expected.

The Pentagon will begin the paperwork necessary to implement the reductions in force by mid-September if Congress fails to enact an alternative to the reduced budget levels, according to a new department planning document reported by Bloomberg News. In addition to the effects on the civilian workforce, Defense is planning for significant cuts to procurement and research spending.

The Obama administration has stated its intention to once again exempt military pay from sequestration cuts, thereby accentuating the impact of the deficit reduction program on Defense civilians. The Army is planning to cut its workforce by 2,100 employees and the Navy by more than 2,600. Departmentwide agencies, including the Defense Contract Management Agency, would have to lay off about 1,500 workers.

In fiscal 2015, the Pentagon would have to ask Congress for authority to offer early retirement incentives, according to Bloomberg.

The American Federation of Government Employees, which represents 250,000 Defense civilians, said the department is targeting the wrong people.

“Why is the administration threatening to fire 6,300 civilian defense workers and leave its much larger and costlier contractor workforce almost untouched?” AFGE National President J. David Cox said in a statement to Government Executive. “Have they learned nothing from the furlough fiasco when fat cat contractors sat around and did nothing while the people who actually repair the weapons and train the troops were forced out on the street? They need to stop coddling contractors at the expense of military readiness.”



The Atlantic

What Snowden and Manning Don’t Understand About Secrecy

Government often finds bad reasons to keep information hidden, but the recent indiscriminate leaks are foolish.

Mark Bowden

Aug 23 2013, 7:00 AM ET


As an old reporter who has from time to time outed classified information, I have watched the cases of Bradley Manning and Edward Snowden with professional interest.

What troubles me about them is not that they broke the oaths they swore when they took their classified government jobs, the thing that makes them liable to prosecution. Government finds all kinds of dubious reasons to keep secrets, sometimes nefarious reasons, and conscience can force one to break a promise. My problem is with the indiscriminate nature of their leaks.

These are young people at war with the concept of secrecy itself, which is just foolish. There are many legitimate reasons for governments to keep secrets, among them the need to preserve the element of surprise in military operations or criminal investigations, to permit leaders and diplomats to bargain candidly, and to protect the identities of those we ask to perform dangerous and difficult missions.

The most famous leakers in American history were motivated not by a general opposition to secrecy but by a desire to expose specific wrongdoing. Mark Felt, the “Deep Throat” who helped steer Bob Woodward and Carl Bernstein’s Watergate reporting, understood that the Nixon Administration was energetically abusing the powers of the presidency. Daniel Ellsberg copied and leaked the Pentagon Papers because they showed that the White House and Pentagon had never really believed the lies they were telling about the Vietnam War.

In other words, they had good reasons. The reporters and editors who published their leaks weighed taking that step seriously, ultimately deciding that the public’s need to know trumped the principle of secrecy. They concluded that the government in these instances was abusing its power.

Manning and Snowden are wholesale leakers. I can’t know this for a fact, but I suspect they were not completely aware of all they carried off. It isn’t just that they didn’t completely understand what they were leaking; they literally did not know what all of it was. Computers enable individual operators to open floodgates. Out spills everything, the legitimate along with the illegitimate. It’s easy, and it’s irresponsible. It proceeds from a Julian Assange-influenced, comic-book vision of the world where all governments are a part of an evil plot against humanity.

In my experience, government does routinely abuse its power to classify information, sometimes for ridiculous reasons. Sometimes it seems that officials declare something secret just because they can. As a transportation reporter for The Philadelphia Inquirer, I remember battling state transportation officials to release accident information — I wanted to write a story about which intersections were the most dangerous. Never mind that knowing where it was most treacherous to drive would be useful for public safety, and that the agencies involved in collecting this data were public agencies; the numbers were, I was told, a state secret. When I walked through the old U.S. Embassy Chancery Building in Tehran in 2005, now an anti-American museum, there was an exhibit of documents seized during the 1979 takeover. The papers looked damning. They were stamped impressively, “Top Secret” and “Eyes Only.” Few of the Iranian students who were marched through read English, and I’m sure few doubted that the documents on display revealed details of the Great Satan’s “plot” to derail the glorious Islamic Revolution. Close inspection revealed that the framed papers were orders from the embassy motor pool for spare parts.

There have been a few things in the Manning and Snowden leaks that might have warranted taking a principled stand, but the great bulk of what they delivered shows our nation’s military, intelligence agencies, and foreign service working hard at their jobs — doing the things we the people, through our elected representatives, have ordered them to do. It came as no surprise to me that America has been aggressively collecting massive pools of data in order to discover and derail terrorist attacks in advance, an enormously difficult thing to do, and yet the very thing Americans demanded after 9/11.

I think Manning’s 35-year prison sentence is excessive, and expect it will eventually be reduced. Whatever danger Manning (who has now asked to live as a woman named Chelsea) poses to American society can be avoided by denying her access to Pentagon computers. Snowden may have found a way to punish himself worse. He has turned himself into an enduring symbol of idiocy by fleeing the oppressive grip of Barack Obama for the open arms of that great civil libertarian, Vladimir Putin.

Both Manning and Snowden strike me not as heroes, but as naifs. Neither appears to have understood what they were getting themselves into, and, more importantly, what they were doing.


How Saddam Hussein Made the Middle East Stable

By DAVID FRANCIS, The Fiscal Times

August 26, 2013

The U.S. military was ready to intervene in Syria in the wake of President Bashar al-Assad’s use of chemical weapons against his own people in the country’s ongoing civil war, Defense Secretary Chuck Hagel said in Kuala Lumpur on Sunday.

“President Obama has asked the Defense Department to prepare options for all contingencies. We have done that,” Hagel told reporters.   “We are prepared to exercise whatever option, if he decides to employ one of those options.”

But the only options President Obama has now are bad ones. He can’t bomb suspected chemical weapons depots without releasing dangerous chemicals. A ground invasion is unlikely, given that American troops would have two enemies – both the Syrian army and the rebels, who have been infiltrated by al Qaeda. Even if a ground operation were feasible, the American public would never support it.

Even if there were feasible options for the United States, former Secretary of State Gen. Colin Powell urged the country not to take them. Powell did not urge restraint because he supported Assad. He said the United States should not act to depose him because the rebels he is fighting might be just as bad.

“I have no affection for Mr. Assad. But at the same time, I am less sure of the resistance. What do they represent? And is it becoming even more radicalized with more al Qaeda coming in? And what would it look like if they prevailed and Assad went? I don’t know,” Powell said.

He added: “We can influence things and we can be ready to help people when problems have been resolved or one side has prevailed over the other. That’s when I think we can play a role.”

Powell knows about dealing with Middle East dictators better than most. He fought Saddam Hussein in the Gulf War, and was the head of the State Department during the second Iraq war.



He is also the person who uttered one of the most iconic phrases of the second Bush administration. In warning the president about the dangers of invading Iraq unprovoked, he said, “if you break it, you own it,” meaning that the United States would be responsible for the future of Iraq if it took down a treacherous yet stable Hussein government.

Powell’s comments also recognize a sad fact of international diplomacy. There are lots of bad guys out there, and sometimes bad leaders are needed to keep peace and stability in countries where both peace and stability are rare.


No one would argue that Saddam Hussein was a good guy. He committed dozens of war crimes, used chemical weapons against his own people, tortured prisoners, and tried to exterminate the Kurds, among other atrocities.

But at the same time, his heavy hand was able to keep the country under control. He kept peace between the Sunnis and Shias, all while providing a counterbalance to Iran. He also served as a consistent leader in a region of the world where stability is rare. Under his rule, Iraq was relatively peaceful and safe.

Now, a full decade after he was removed, Iraq is an absolute mess; the country’s religious groups are fighting among themselves; its fledgling political system is failing – a dozen candidates for political office have been assassinated in the last ten years. The Islamic State of Iraq, the al Qaeda front there, is still capable of pulling off large-scale attacks.

On Sunday, coordinated attacks killed 42 people. Britain-based NGO Iraq Body Count estimates that some 112,000 Iraqis have died in the last ten years.

One of the main reasons for the problems in Iraq is that the United States did not abide by Powell’s advice: The United States broke Iraq, but never owned it. There is a laundry list of things the Bush administration did wrong, and the end result has been disastrous.

U.S. involvement in Syria could mean making the same mistake twice. Neither the American public nor the Obama administration has the appetite for another prolonged engagement in the Middle East.

The same can be said of the Egyptian crisis. Hosni Mubarak was another dictator who provided stability: This is why the United States supported him with some $83 billion in aid during his time as Egyptian president. Many in the West thought the Arab Spring protests that removed Mubarak from office were the first step toward a democratic Egypt. As recent violence shows, that’s simply not the case.

Powell hinted at this in his “Face the Nation” appearance. He realizes that the price of stability is often dictatorship and that there are limits to U.S. power.

“But to think that we can change things immediately just because we’re American – that’s not necessarily the case,” he said. “These are internal struggles and the parties inside those countries are going to have to sort it out amongst themselves.”



Government eyes regulation of ‘Bitcoins’

Kavya Sukumar, Medill News Service 5:18 p.m. EDT August 26, 2013


WASHINGTON — A Senate committee is investigating whether to establish regulations for online “virtual currencies” such as Bitcoins.

Bitcoins, a widely used virtual currency, are an alternative to money online. Unlike regular money, Bitcoins are not backed by any government or company. The currency is circulated without intermediaries such as banks. This online currency, sometimes called a libertarian’s dream, is not regulated or taxed. This may soon change.

The Senate Homeland Security and Government Affairs Committee sent letters last week to the Departments of Treasury, Homeland Security and other government agencies seeking details on how they oversee the use of virtual currencies, part of an investigation begun several months ago. The letters came on the heels of 22 subpoenas issued Aug. 12 by the New York Department of Financial Services to Bitcoin businesses asking questions about their policies to prevent money laundering and to provide consumer protection.

Digital currencies demand “a holistic and whole-government approach in order to understand and provide a sensible regulatory framework for their existence,” committee Chairman Tom Carper, D-Del., and Sen. Tom Coburn, R-Okla., the top Republican on the committee, wrote in the letters.

The agencies have been asked to provide information to the Senate committee by the end of August.

Patrick Murck, Bitcoin Foundation’s general counsel, praised the committee for “proactively seeking out a productive dialog with the Bitcoin community and authorities.”

Murck said, “New York is trying to set the policy for the entire country. … It is highly questionable if they have any jurisdiction on the issues they are trying to address.”

Bitcoins can be created or “mined” on a user’s computer. Without banks to validate transactions, the task of weeding out fraudulent transactions falls to the users. Some users, called miners, solve complex mathematical problems to verify transactions. They, in turn, are paid in Bitcoins for their work.
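The “complex mathematical problems” miners solve are hash puzzles: find a number (a nonce) that, combined with the transaction data, yields a hash meeting a difficulty target. A minimal sketch, with a toy difficulty and made-up transaction data rather than Bitcoin’s real block format:

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Toy proof of work: find a nonce whose SHA-256 digest,
    combined with the block data, starts with `difficulty`
    zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Finding a valid nonce takes many attempts; checking one takes a single hash.
nonce = mine("alice pays bob 1 BTC", difficulty=4)
digest = hashlib.sha256(f"alice pays bob 1 BTC:{nonce}".encode()).hexdigest()
print(digest[:4])  # prints "0000"
```

Raising the difficulty by one hex digit multiplies the expected work by 16, which is roughly how the real network keeps block times steady as hardware improves while keeping verification cheap.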

Bitcoins do not have any inherent value, but they can be exchanged for other currencies. The exchange rate for a Bitcoin, which fluctuates wildly, is nearly $120 at Mt. Gox, a Bitcoin exchange.

Multiple attempts have been made at creating rules to oversee virtual currency operations.


In March, the Treasury Department’s Financial Crimes Enforcement Network released guidelines that brought Bitcoin businesses under the same umbrella of laws as other money services businesses.

“FinCEN guidance was a starting-gun shot for the industry,” said Marco Santori, a business attorney and chairman of the Bitcoin Foundation’s Regulatory Affairs Committee. “It signaled that bitcoins were not contraband, but a legitimate form of value transfer.”

Alan Reiner, developer of the Bitcoin software Armory, endorsed regulating Bitcoins to avoid “an unregulated system that is used mainly in black markets.”

Bitcoin works by harnessing the computing power of its users. It cannot be shut down because there is no single owner or authority, Reiner said.

“Bitcoins put power in the hands of people who use it,” Reiner said. “It is going to do to money what e-mail did to written communication.”

This makes it a difficult system to control. This nearly anonymous currency fuels more than $1.2 million in sales of contraband items, including guns and drugs online, according to a study by Nicolas Christin, a researcher at Carnegie Mellon University. Transactions cannot be easily traced back to users, making it a law enforcement nightmare.

Some of the means of fighting financial crimes such as money laundering do not work on digital currencies because of the lack of a regulating authority. If the Senate investigation were to lead to regulations, public-private partnerships would be needed to detect financial crimes, Murck said.

Bradley Jansen, director of the Washington-based Center for Financial Privacy and Human Rights, said the Treasury Department’s March guidance was poorly written and served only to stifle innovation. “This guidance has raised more questions than it has answered,” he said. “Applying our failed banking policies on Bitcoins is a bad idea and may be the definition of insanity.”

Regulations are a “necessary evil” that the Bitcoin community is willing to accept, Reiner said.

“Bitcoin cannot survive as a mainstream concept unless it has governments’ approval,” he said.

The economy as a whole stands to gain if the Senate committee investigation leads to clearer rules, Santori said. “Jobs will be created, tax money will be collected, customer funds will be safeguarded, and the public will benefit from a highly sophisticated and efficient value transfer system.”


Who Hacked China’s Internet Yesterday?


By Rebecca Greenfield

August 26, 2013


On Sunday morning, China’s Internet was hit with the largest denial-of-service attack it has ever seen, according to the China Internet Network Information Center. The assault, which took down sites like Weibo (the Twitter of China) and the Bank of China, resulted in a 32 percent drop in Internet traffic — and nobody knows who did it.

The attacks came in two waves, starting at 2 a.m. and then again at 4 a.m. Distributed denial-of-service, or DDoS, attacks use malware-infected computers to flood servers with more traffic than they can handle, rendering websites inaccessible. Reports say the outages across China lasted somewhere between 2 and 13 hours. It’s unclear if the attacks are ongoing, but much of the Internet under the .cn domain is working now, according to The China Real Time Report.
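Why flooding denies service can be seen in a toy capacity model; the numbers below are illustrative, not measurements from the Chinese attack:

```python
def service_rate(requests_per_sec: int, capacity: int) -> float:
    """Fraction of requests a server can actually answer when it
    handles at most `capacity` requests per second."""
    if requests_per_sec <= capacity:
        return 1.0  # normal load: everything is served
    return capacity / requests_per_sec  # flooded: most requests time out

# Normal traffic on a server rated for 1,000 requests/second:
print(service_rate(800, 1000))       # prints 1.0
# A botnet pushes total traffic to 100,000 requests/second:
print(service_rate(100_000, 1000))   # prints 0.01
```

Because legitimate users and attack bots share the same pipe, at 100x overload only about 1 percent of any traffic, legitimate or not, gets through, which is why a DDoS needs no break-in at all, just volume.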

The CNNIC says it will release more information shortly, but so far hasn’t confirmed the origin of the attacks. Despite what sounds like a complex take-down of part of China’s domain, a single person with little hacking experience could have performed the hack, according to Matthew Prince, the CEO of CloudFlare, which provides Web performance and security services for more than a million websites. “I don’t know how big the ‘pipes’ of .cn are, but it is not necessarily correct to infer that the attacker in this case had a significant amount of technical sophistication or resources,” he told The Wall Street Journal‘s Paul Mozur Monday afternoon China time. “It may have well have been a single individual.”

Read more at The Atlantic Wire



Missile strikes on Syria likely response to chemical attack

By Chris Lawrence, Elise Labott and Tom Cohen, CNN

updated 7:00 AM EDT, Tue August 27, 2013


Washington (CNN) — Few question that there was a major chemical attack in Syria last week, and the United States has made clear that it blames the government of President Bashar al-Assad.

Now, the question is how President Barack Obama will respond.

For almost two years, Obama has avoided direct military involvement in Syria’s civil war, only escalating aid to rebel fighters in June after suspected smaller-scale chemical weapons attacks by Syrian government forces.

However, last week’s attack on a Damascus suburb that reportedly killed and wounded more than 3,000 people obliterated the “red line” Obama set just over a year ago against the use of Syria’s chemical weapons stocks.

At the White House, spokesman Jay Carney told reporters Monday that Obama was evaluating “a response to the clear use on a mass scale with repugnant results of chemical weapons,” adding that “there is very little doubt that the Syrian regime … used those weapons.”


Meanwhile, U.S. Secretary of State John Kerry called the attack “inexcusable” and “undeniable,” and said there was “a clear reason that the world has banned entirely chemical weapons.”

He said that evidence “strongly indicates” chemical weapons were used in Syria and that “we know the Syrian regime maintains custody” of such weapons and has the rockets to use them.


Obama “will be making an informed decision about how to respond to this indiscriminate use” of chemical weapons, Kerry added, saying the president “believes there must be accountability” for those who use them.

Options available to Obama range from ordering limited missile strikes to continued diplomatic efforts labeled by critics as a “do-nothing” approach.

Obama will be presented with final options regarding actions against Syria in the next few days, a senior administration official said Monday. Assuming the president decides to go ahead with a military response, any action could come as early as mid-week, though it could be later, the official cautioned.

Factors weighing into the timing of any action include a desire to get it done before the president leaves for Russia next week and before the administration has to make a decision on whether to suspend aid to Egypt because of the ongoing political turmoil there, the official explained. The administration also wants it to be a quick response to the use of chemical weapons, the senior administration official said.

American officials are consulting with allies to ensure they are supportive of any U.S. action, which the senior administration official said would be very limited in scope and a direct reaction to the use of chemical weapons. And three representatives of allied governments involved in those top-level consultations said the goal is to reach a consensus as soon as possible.

“No one is talking about a long process,” one European diplomat told CNN.

Marie Harf, a State Department spokeswoman, said any U.S. response would be “a determination on how to respond to a blatant use of chemical weapons, and it’s not necessarily to change the entire situation on the ground in Syria.”

That might be a mistake, said Michael Doran, an analyst at the Brookings Institution’s Saban Center for Middle East Policy. A U.S. strike “can’t just be one and done,” but should be part of a plan to remove al-Assad, he told CNN’s “Anderson Cooper 360.”

“The president has been very reluctant to get involved. Public opinion has been against it. There’s not a lot of support on the Hill,” Doran said. “And yet, here we are again. Time and time again, we get dragged further and further in.” The result could be “a Vietnam-type problem, where we kind of back our way into this, if we don’t come up with a plan about how to win.”

Kerry spoke with his British, Jordanian, Qatari and Saudi counterparts Monday and with the secretary-general of the Arab League, Harf said.

“Obviously, the intelligence assessment is ongoing,” she said. “But he reiterated that the president is studying the facts and will be making an informed decision about how to respond going forward.”

The Obama administration is expected to declassify the intelligence assessment backing up its assertion that the Syrian regime was responsible for last week’s chemical weapons attack, another senior administration official said. The declassification would happen before any U.S. military action would take place.

A senior administration official familiar with the intelligence told CNN that the evidence “includes but is not limited to” satellite images of activity at Syrian military installations identified as including chemical weapons depots.

Earlier Monday, a White House official ruled out sending ground troops to Syria or implementing a no-fly zone to blunt al-Assad’s aerial superiority over rebels fighting to oust his regime. The official insisted that all other options were under consideration by Obama but put no time frame on a decision.

Meanwhile, a senior Defense Department official told CNN’s Chris Lawrence Monday that four U.S. Navy destroyers “maintain readiness and, if required, could execute a mission within hours” of being ordered to do so.

But the official added that the U.S. military remained “in a holding pattern” as Obama considers both military and nonmilitary options.


Also, Defense Secretary Chuck Hagel said while visiting Indonesia that any U.S. action “will be in concert with the international community and within the framework of legal justification.”

While U.N. Secretary-General Ban Ki-moon said Monday that the use of chemical weapons was a crime against humanity and must be punished, certain opposition by Syrian ally Russia and possibly China undermined the possibility that the Security Council would support a military mission.

Instead, a limited coalition of NATO partners such as Germany, France and Britain — all of which have called for action against Syria — and some Arab League members appeared more likely to provide the political backing needed by Obama to order U.S. missile strikes.

A senior administration official told CNN on Monday that the goals of any coalition military action would be to punish al-Assad and show him that there was a cost for using chemical weapons while preventing him from doing so again.

In addition, a military strike would seek to degrade the Syrian regime’s capabilities enough to weaken it without causing it to fall to an opposition considered unprepared to assume power, the official said.

Possible coalition partners include NATO allies Britain, France, Germany and Canada, as well as regional powers Qatar, Turkey, Saudi Arabia and the United Arab Emirates.

Last month, Joint Chiefs Chairman Gen. Martin Dempsey provided Congress with a list of declassified U.S. military options for Syria that emphasized the high costs and risks of what he said would amount to “an act of war” at a time of deep budget cuts.


Dempsey’s letter, dated July 19, listed U.S. assets in the region including Patriot missile defense batteries in Turkey and Jordan, as well as F-16 jet fighters positioned to defend Jordan from possible cross-border trouble. In addition, the Pentagon has sent four warships armed with cruise missiles to the region.

According to U.S. officials, updated options offered the president in recent days included:

• Cruise missiles fired from one of four Navy destroyers deployed in the Mediterranean Sea. The missiles would be used to strike “command and control” facilities such as command bunkers, or the Syrian regime’s means of delivering chemical weapons: artillery batteries and launchers. There is no indication that the missiles would strike at actual chemical weapons stockpiles.

• Military jets firing weapons from outside Syrian airspace. This option carries additional risks and is considered less likely.

“They have to be careful to do this in concert with our allies,” Democratic Rep. Adam Schiff of California, a member of the House Intelligence Committee, told CNN on Sunday, adding, “I don’t think the White House is going to want to risk American lives by sending pilots over Syria, so that really limits our options to cruise strikes, and I think that’s probably where the White House is going to go.”


Cruise missile strikes could be “very punishing” on al-Assad’s missile supplies and aircraft without going after the chemical weapons stockpiles and risking dispersing them, Schiff said.

To Aaron David Miller, a vice president at the Woodrow Wilson International Center, the situation is forcing Obama to shift from being an “avoider-in-chief” regarding military involvement in Syria.

“It’s almost inevitable that the president will authorize some form of military action,” Miller told National Public Radio in an interview broadcast Monday.

He said he expected a significant response that amounts to “a warning that lays down this time a red line that the president intends to enforce, not one that turns pink.”

“It cannot simply be a couple of cruise missiles into a storage shed somewhere,” Miller said, adding that the goal was to deter al-Assad rather than topple him or radically shift the balance in Syria at this time. “The president’s not on the verge of becoming the cavalry to rescue the country.”

Schiff agreed that Obama has little choice but to respond strongly.

“In terms of the credibility of the White House,” he said, “the cost of not acting now, I think, exceeds the cost of acting.”

CNN’s John King, Frederik Pleitgen, Hamdi Alkhshali and Ben Brumfield contributed to this report.



The Military Connects Microgrids for a ‘Secure Cluster’ of Power Networks

A new project will link three U.S. Navy microgrids into a mutually reinforcing, power-sharing unit.



Nobody is more interested in microgrids than the U.S. military. The idea of self-sustaining energy islands that can stay on even during grid-wide blackouts is of obvious value to military bases, which can’t let power outages keep them from performing their missions.

But what if those military microgrids could also serve a broader set of purposes, by linking themselves to one another, or even to the grid at large? That’s a question that the U.S. Navy is now striving to answer, via a first-ever project meant to tie three separate microgrids in San Diego, Calif. into a functioning whole.

Power Analytics (formerly EDSA) is the San Diego-based company that won the contract for the project, under the Department of Defense’s Environmental Security Technology Certification Program (ESTCP). While we’ve seen other military microgrids start to connect themselves to the grid, this will be the “first centrally managed cluster of multiple cyber-secure military microgrids” in the country, according to last week’s press release. Financial terms of the three-year contract weren’t disclosed, but Karen Cronin, Power Analytics vice president, told the San Diego Union-Tribune that it was more than $2 million.

Power Analytics is a long-time Navy partner; in fact, U.S. naval ships, which are in essence miniature cities running on self-contained nuclear power plants or diesel-powered generators, are designed using its Paladin software product. It’s also the designer of the three Navy microgrids it’s now seeking to combine into a “secure cluster” of microgrids, to help the Navy both improve energy supply security and reduce its overall energy consumption.

Kevin Meagher, Power Analytics president and CTO, said in an interview last week that the three microgrids, at the hospital at Naval Base San Diego, a data center at Naval Base Coronado and at Naval Base Point Loma, are now equipped with the on-site generation, solar power, energy storage and grid controls they need.

“The circuits are there; for the most part, all of the hardware is there,” Meagher said. “The trick to doing this is to take the existing circuits, the existing equipment, and figure out how to make it all work to meet the requirements” of the project, which are to provide the “first comprehensive, real-time view of the status of its critical power systems across multiple bases.”

Meagher broke down the project’s imperatives into three broad categories. First, “it clearly presents the opportunity for a cluster of microgrids from an economic perspective,” to do things like “wheel” power from base to base, or to optimize the way the combination of microgrids draws from the grid at certain times, versus relying on their own generation and energy storage capabilities.

Second, “Because of the synchronization of the data, it allows you to talk about enhancing or modifying the structure, either to enhance stability at a specific microgrid, or not, depending on what happens on each of the bases,” he said. In other words, it’s a way to study not just what the cluster can do as presently configured, but how it can be changed to maximize that potential.

“The third thing is the same thing that everyone talks about, but it’s still very near and dear to the Department of Defense — that’s situational awareness,” he said. In other words, this microgrid cluster is meant to be an “early warning system,” to allow the Navy to predict and prepare for power disruptions, whether they stem from the grid they’re connected to, or from internal changes like routine maintenance of various systems. 
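The economics of the first category, “wheeling” power between bases, can be sketched in a few lines. The base names and kilowatt figures below are illustrative assumptions, not data from the Navy project:

```python
def cluster_grid_draw(net_output_kw):
    """Each value is a base's generation minus its load (kW).
    Surpluses at one base offset deficits at another; only the
    remaining cluster-wide deficit must be bought from the utility."""
    net = sum(net_output_kw.values())
    return max(0.0, -net)

bases = {
    "hospital": -300.0,     # drawing 300 kW more than it generates
    "data_center": 120.0,   # 120 kW surplus from solar and storage
    "point_loma": 100.0,    # 100 kW surplus
}
print(cluster_grid_draw(bases))  # prints 80.0 -- kW still needed from the grid
```

Run independently, the three microgrids would buy 300 kW from the utility; pooled, they buy 80 kW, which is the economic case for central management that Meagher describes.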

The work being done by the U.S. military on microgrids will doubtless help lay the groundwork for the spread of the technologies involved to the private sector. GTM Research has collected some data from DOD’s microgrid programs, which include R&D into the hardware involved, as well as the system integration and economic analysis software that makes them run.

Military microgrid developers include SAIC, Lockheed Martin, Raytheon, Boeing and General Electric, which is already in a big microgrid project with the U.S. Marine Corps. DOD and the Department of Energy are also working on standardizing the technologies that go into microgrids, via the Smart Power Infrastructure Demonstration for Energy Reliability and Security (SPIDERS) projects underway at Fort Carson, Colorado, and at Pearl Harbor-Hickam Air Force Base and Camp Smith in Hawaii.

Building economically feasible, grid-interactive microgrids is the next step in this process. We’re seeing projects around the world testing the ability of locally controlled energy systems to balance the grid at large, with grid giants like GE, ABB, Siemens, Alstom, Schneider Electric, Toshiba and Hitachi, startups like Blue Pillar, Viridity Energy, Powerit Solutions and Enbala, and technology firms such as Spirae, Integral Analytics and Power Analytics all taking different tacks on the challenge. Power Analytics’ software, for its part, is used by financial data centers, air traffic control sites, military installations, deep-sea oil platforms, and power generation and distribution facilities worth a collective $120 billion in asset value.

San Diego is emerging as a key test bed for combining government, institutional and utility microgrids into a working whole. Beyond the Navy projects, the University of California at San Diego has built a cutting-edge microgrid that supplies 90 percent of the campus’ power needs. UCSD’s microgrid, in turn, is being integrated into a larger microgrid project with utility San Diego Gas & Electric, which has already tapped UCSD’s capabilities to help it avoid blackouts during a major fire that threatened power lines.

While the Navy project isn’t yet looking at how its microgrids could play a role in that kind of utility stability assistance on an ongoing basis, Meagher said it’s definitely part of what Power Analytics is trying to figure out.

“The Department of Defense has recognized for a long time that it has an opportunity to advance technology, and the allocation of that, for the entire community,” he said. “This is very much in that vein, where they see a huge upside potential to this.”


How the Snowden Effect Is Paralyzing CIOs

– Tom Kaneshige, CIO


August 21, 2013


In the aftermath of the great data heist by Edward Snowden, the now-infamous computer specialist who stole top secret information from the National Security Agency and leaked it to The Guardian earlier this summer, CIOs are feeling a little helpless.

“People are saying that if it happens to the NSA, which must have incredible tools to prevent people from leaking data yet still leaks on a grand scale, we better be really careful,” says Jeff Rubin, vice president of strategy and business development at Beachhead, a mobile security company.

There’s little doubt CIOs are reeling from the Snowden effect.


A New Breed of Rogue Employee Roams the Network

Snowden represents a new kind of rogue employee or contractor: a tech-savvy millennial armed with personal computers who can spirit away highly sensitive data. CIOs will have to deal with this threat sooner rather than later. The old thinking of relying on encryption to safeguard data just won’t suffice in today’s corporate computing environment.

The 29-year-old Snowden hatched a plan to swipe data from arguably one of the safest organizations on the planet. His age is significant because he’s symbolic of today’s millennial, a 20-something tech worker flooding corporations across the country. Millennials will make up the largest segment of the workforce by 2015, according to the U.S. Bureau of Labor Statistics.

Two-thirds of millennials assess their technology acumen as “cutting edge” or “upper tier,” according to CompTIA. Snowden, who once described himself as a “computer wizard,” not only gained access to sensitive data, he communicated with the media using encrypted email under the codename Verax.

For CIOs, the warning is clear: Your next rogue employee may be good at finding ways around your best-laid security plans.

Social Engineering and Tech Savvy a Dangerous Combo

While there’s no questioning Snowden’s technical chops—after all, he worked at contractor Booz Allen Hamilton as a computer specialist—Rubin doubts Snowden relied on technical skills alone to do what he did. Rather, Rubin believes Snowden employed social engineering tactics to gain access to computers and download data to thumb drives and, eventually, his personally owned computers.

“My guess is he went to NSA employees, said [he was there] to work on their computers and needed access to them, and gained their trust,” Rubin says. “He may have even gone as far as telling them, ‘You may get a notice on your screen that there’s some sort of intrusion, but that’s just me so don’t be alarmed.'”

The idea that Snowden probably used his personal computers and thumb drives should also be alarming to CIOs, especially in the age of BYOD, says Rubin. With BYOD, mobility and cloud storage services such as Dropbox now common, the chances of corporate data leaking out are higher than ever.

In fact, one of Beachhead’s customers recently reversed its BYOD policy because of the security risks. If an employee now wants an iPad, for instance, the company will buy and manage it instead of allowing the iPad to be a part of a BYOD program. “They’re saying, ‘We don’t feel we have our act together to really allow this,’” Rubin says.

Encryption Is Not Enough

Another lesson CIOs can learn from Snowden is the need for multi-layer security, or automatic triggers for wiping data. Many companies rely on encryption to keep their data safe, yet once a rogue employee gains the password, encryption is worthless.

Rubin says the Snowden case highlights the need for triggers that eliminate data beyond a geo-fence or after a certain number of incorrect logins or amount of time.
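Triggers like those can be expressed as a simple policy check. The sketch below is a hypothetical illustration of the idea, not any vendor's product: a device is flagged for a remote wipe when it exceeds a failed-login threshold, goes too long without checking in, or reports a position outside a geo-fence. All names and thresholds here are assumptions.

```python
import math
from datetime import datetime, timedelta

# Hypothetical thresholds -- a real product would make these configurable.
MAX_FAILED_LOGINS = 5
MAX_OFFLINE = timedelta(days=7)
GEOFENCE_CENTER = (40.7128, -74.0060)  # lat/lon of an assumed office
GEOFENCE_RADIUS_KM = 50.0

def distance_km(a, b):
    """Great-circle distance between two (lat, lon) points (haversine)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371.0 * 2 * math.asin(math.sqrt(h))

def should_wipe(device, now):
    """Return the first triggered wipe condition, or None if the device is fine."""
    if device["failed_logins"] >= MAX_FAILED_LOGINS:
        return "too many failed logins"
    if now - device["last_checkin"] > MAX_OFFLINE:
        return "device offline too long"
    if distance_km(device["position"], GEOFENCE_CENTER) > GEOFENCE_RADIUS_KM:
        return "device left geo-fence"
    return None
```

A device reporting from Boston when its geo-fence is centered on Manhattan, for example, is roughly 300 km outside the 50 km radius and would be flagged on the geo-fence rule alone.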

Also, companies might want to look into multi-factor authentication and data access controls to prevent rogue workers like Snowden from seeing data in the first place, Rubin says.

Given Snowden’s ability to steal from the NSA, coupled with the rise of both the tech-savvy millennial and BYOD, CIOs are sensing a loss of control over corporate data.

“It’s happening too fast,” says Rubin. “I think companies are a little paralyzed.”

Tom Kaneshige covers Apple, BYOD and the consumerization of IT for CIO.


U.S. Gov’t Warned Staff: Android Malware Widespread, Use Protection


By Michelle Maisto | Posted 2013-08-27

Google’s Android operating system continues to be the predominant target of malware threats, and we mean predominant. In 2012, Android was the target of 79 percent of malware threats, compared with 0.7 percent for iOS, 0.3 percent for BlackBerry and 0.3 percent for Windows Mobile, the U.S. Departments of Justice and Homeland Security said in an unclassified July 23 release intended for police, fire, EMS and security personnel.

The Next Web discovered the release and reported on it Aug. 26.

The government report pointed to industry reports stating that 44 percent of Android users are still using Gingerbread versions of Android (versions 2.3.3 through 2.3.7), which were released in 2011 and “have a number of security vulnerabilities that were fixed in later versions.”
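To see why that statistic matters for patch triage, a fleet-inventory script might simply flag devices still reporting a build in the vulnerable range. The version range (2.3.3 through 2.3.7) comes from the report; the helper itself is a hypothetical sketch, not anything the DoJ/DHS notice prescribes.

```python
def is_vulnerable_gingerbread(version: str) -> bool:
    """True if an Android version string falls in the Gingerbread range
    (2.3.3 through 2.3.7) flagged as having unfixed vulnerabilities."""
    try:
        parts = tuple(int(p) for p in version.split("."))
    except ValueError:
        return False  # malformed version strings are not matched
    return (2, 3, 3) <= parts <= (2, 3, 7)

def flag_fleet(devices):
    """Given {device_id: version}, return IDs still on vulnerable builds."""
    return sorted(d for d, v in devices.items() if is_vulnerable_gingerbread(v))
```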

The report points out three particular security threats. The first, Short Message Service (SMS) Trojans, represent nearly half of the malicious applications currently circulating, it says. Text messages are sent to premium-rate numbers, “potentially resulting in exorbitant charges for the user.”

In the second, rootkits, malware is hidden from normal types of detection and logs the user’s keystrokes, passwords and location without the user’s knowledge.

Finally, fake Google Play domains allow users to browse and download apps, movies, books, music and other content, while stealing sensitive information, such as financial data and log-in credentials.

The report says Android is singled out for being the world’s most widely used mobile OS, and that given the growing dependence on mobile devices by federal, state and local authorities, it’s “more important than ever to keep mobile OS patched and up-to-date.”

Malware Incidents on the Rise

Earlier this month, Trend Micro warned that vulnerabilities in Android are among its top security concerns.

“Due to the fractured nature of the Android network, it is very difficult for patches to reach all users in an effective timeframe,” JD Sherry, vice president of technology and solutions at Trend Micro, said in an Aug. 8 statement. “Until we have the same urgency to protect mobile devices as we do for protecting PCs, this very real threat will continue to grow rapidly.”

According to Trend Micro, it took three years to reach 350,000 high-risk apps—but only six months for that figure to double.

During the second quarter, premium service abusers remained consistent, but the firm saw an increase in “data stealer volume,” which “may indicate the continued sophistication of this threat type,” Research Director Linda Barrabee said.

On Aug. 12, the Bitcoin Foundation announced that Android had also opened up Bitcoin users to vulnerabilities. The organization warned users about the issue on its Web site and recommended that anyone with an Android wallet upgrade the software and perform several reparative steps.

“Because the problem lies with Android itself, this problem will affect you if you have a wallet generated by any Android app,” the company warned users.


Hagel Taps Donley to Lead OSD Downsizing Effort

Defense News

Aug. 27, 2013 – 03:45AM | By MARCUS WEISGERBER

WASHINGTON — US Defense Secretary Chuck Hagel has tapped former Air Force Secretary Michael Donley to oversee a Pentagon downsizing effort to cut the Office of the Secretary of Defense (OSD) headquarters staff by 20 percent over the next five years.

Donley — who stepped down as Air Force secretary in June after five years in the post — will also oversee the reduction of senior-level officials who report directly to Hagel through “consolidating functions” and eliminating positions, a Defense Department spokesman wrote in an email.

Deputy Defense Secretary Ashton Carter announced Hagel’s selection of Donley to Pentagon officials on Monday.

Donley — who will report to Carter — will develop recommendations for meeting Hagel’s staff reduction targets and an implementation plan, the spokesman said. The downsizing effort is officially called the 2013 OSD Organizational Review.

In July, Hagel said he wanted someone “from outside DoD who is deeply knowledgeable about the defense enterprise and eminently qualified to direct implementation of the OSD reductions.”

OSD grew from 2,433 positions in 2010 to 2,665 positions in 2012 — a 9.5 percent increase.

Donley has worked on a number of major DoD reorganization efforts throughout his career. As a Senate Armed Services Committee staffer, he worked on the Goldwater-Nichols DoD reorganization act. He also was a member of the National Security Council staff at the White House during the Reagan administration.

Prior to being named Air Force secretary, Donley was DoD’s director of administration and management, the so-called mayor of the Pentagon.

“Mike has been personally involved in just about every major reorganization effort — particularly as it relates to the Office of the Secretary of Defense,” said Arnold Punaro, a retired Marine Corps major general and former Senate staffer who now chairs the Reserve Forces Policy Board.

Punaro has been one of the loudest critics of the growth within the Pentagon’s headquarters ranks.

“There’s probably no one on the planet that knows more about these organizational issues than Mike Donley,” Punaro said. “He understands how to [make the cuts] in a very deliberate and thoughtful way while still meeting the secretary’s goals.”


We Almost Had a Giant Robot Spy Blimp

David Axe

July 12, 2013

Fancy Army airship doomed by the usual incompetence, infighting

The Army’s dream was a fantastical one. Build a 300-foot-long, helium-filled, pilotless airship, pack it with sophisticated sensors and other spy gear and park it over the remotest, most dangerous region of Afghanistan, where it would hover for three weeks at a time beyond the range of enemy gunfire, unblinkingly watching for enemy activity.

But the Army’s plan for building this so-called Long Endurance Multi Intelligence Vehicle, or LEMV, turned out to be just as fantastical as the vision for the giant robot airship. Badly managed and repeatedly oversold by its advocates, the LEMV’s prospects gradually deflated even as rising expectations across the military added pressure to the airship’s development.

Originally meant to cost as little as $150 million and go from blueprint to working prototype in just a year and a half, the giant airship drifted out of control. Between 2010 when the program began and its termination in early 2013, the cost of just one LEMV ballooned to $270 million. And the schedule for completing the airship stretched from 18 months to 36. The first LEMV managed just one brief flight over New Jersey last August before an embarrassed Army pulled the plug.

The subsequent sucking sound could be heard throughout the Pentagon, so to speak. While program mismanagement and budgetary overspends are nothing new to the Army, rarely have they had such devastating effect on an entire promising class of technology.

That’s because LEMV was the military’s last, best chance to revolutionize its aerial fleet with high-tech airships able to fly far longer, far cheaper, than existing warplanes. It’s no exaggeration to say that as the LEMV program sank to the ground, it dragged with it the Pentagon’s whole ambitious scheme to acquire futuristic war blimps.

Unblinking eyes

Airships fought on the front lines for nearly a century. Hundreds were built for use in World Wars I and II. The U.S. Navy, one of the last major military airship users, finally retired its fleet of patrol blimps in the 1960s and replaced them with airplanes and helicopters. For nearly 50 years the idea of lighter-than-air weaponry lay dormant, giant abandoned hangars in California, New Jersey and North Carolina the only evidence of its glorious past.

Then the U.S. invaded Afghanistan and Iraq and found itself hunting elusive insurgents in crowded Iraqi cities and the vast Afghan countryside. Bad guys could hide out for days or weeks before striking. Planes, copters and even unmanned drones lacked the endurance to wait out these patient attackers.

So the Army installed video cameras on simple, tethered balloons and sent them thousands of feet into the air to watch over combat outposts. It was a small conceptual leap to untether the airship, add motors and remote controls and use them to patrol vast swaths of hostile territory for potentially weeks at a time — far longer than any manned aircraft or winged drone can manage.

And cheaper, too. Because of their buoyancy and relative simplicity, airships are highly fuel efficient and easy to maintain and thus cheaper than heavier-than-air craft, in many cases. A jet fighter like an F-16 can cost $20,000 or more per flight hour for fuel and repairs. Large airships generally cost as little as a third as much per hour.
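Back-of-the-envelope, those figures make the appeal obvious. The sketch below just runs the article's numbers ($20,000 per F-16 flight hour, airships at roughly a third of that) over a hypothetical three-week surveillance mission of the kind LEMV promised; the mission length and the comparison are illustrative assumptions.

```python
F16_COST_PER_HOUR = 20_000  # article's figure for fuel and repairs
AIRSHIP_COST_PER_HOUR = F16_COST_PER_HOUR / 3  # "as little as a third"

MISSION_HOURS = 21 * 24  # three weeks aloft, as LEMV was meant to manage

airship_total = AIRSHIP_COST_PER_HOUR * MISSION_HOURS
jet_total = F16_COST_PER_HOUR * MISSION_HOURS  # ignoring that no jet flies this long

print(f"airship: ${airship_total:,.0f}  vs  jet sortie-hours: ${jet_total:,.0f}")
# prints: airship: $3,360,000  vs  jet sortie-hours: $10,080,000
```

On those assumptions, three weeks of continuous coverage costs about a third as much by airship, before counting the crews and relief aircraft that manned coverage would also require.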

The added flying time and potential cash savings of spy blimps compared to planes and copters intrigued planners in offices all over the Defense Department. The secretive Joint IED Defeat Organization, tasked with developing bomb-hunting technologies, wanted a cheaper way to watch for insurgents planting roadside bombs.

With a budget of more than $200 million, JIEDDO teamed up with the Air Force and Mav6, a Virginia-based aerospace start-up, to develop the Blue Devil II unmanned airship starting in 2010. Blue Devil would be a traditional blimp, its lift provided entirely by light, expansive helium gas. But on the inside, Blue Devil would pack some of the most sophisticated — and expensive — sensors and communications hardware ever developed.

By contrast, the Army wanted a somewhat more complex airship with less complex gear. The LEMV would be a so-called “hybrid airship,” which gets its lift from a combination of helium and a flattened body that acts somewhat like a wing. Starting out, the LEMV’s cameras and radios would be roughly the same as those already used by Army drones.

LEMV and Blue Devil had similar technology and aims and began at around the same time; they couldn’t help but compete for funding. Moreover both new airships were supposed to be ready for combat trials in Afghanistan in 2011. The frontline testing would be expensive: $190 million for a year’s flying for just a single airship, according to one estimate. It wasn’t at all clear that Congress and the Pentagon would be willing to fund both.

“We are doing this to protect the soldiers on the ground,” Marty Sargent, the Army’s airship project manager, said of LEMV. But the giant blimp was also vying with the Air Force’s Blue Devil for another important role: clearing a flight path for a new generation of lighter-than-air war machine.

Faster, faster!

Cracks appeared in the program even before it went out for bids. Eyeing LEMV like a choice cut of technological steak, the Army’s top intelligence staff, then headed by Lt. Gen. Richard Zahner, wanted it for itself, according to one program insider who asked to remain anonymous.

Normally major weapons development programs for the Army are overseen by a dedicated organization with an unwieldy name: the Office of the United States Assistant Secretary of the Army for Acquisition, Logistics and Technology — a.k.a., ASA-ALT — whose sole job is to formulate specs, solicit bids from industry, draw up contracts and oversee the contractors’ work.

But the intel staff was determined to handle much of that work itself with bureaucratic reinforcements from the Army’s missile command, despite the intel staff and the missileers lacking experience managing new technology. “We took it on for ourselves, because it is our soldiers that are going into these regional conflicts where we may not get the apportionment of strategic [intelligence, surveillance and reconnaissance],” explained Lt. Gen. Mary Legere, who succeeded Zahner in early 2012.

To keep the unusual development scheme afloat, the intel staff sought money directly from Congress instead of asking ASA-ALT to arrange for funding, which was standard procedure. After all, the staff had undercut ASA-ALT and could not expect favorable treatment for its giant spy blimp. To convince a skeptical Congress, the intel staff promised LEMV would be ready fast — just 18 months from the signing of the development contract.

The year-and-a-half deadline proved to be a fatal flaw. Allocating just 18 months for such a complex technology development was ambitious, to say the least, and forced LEMV’s builders to cut a lot of corners. “A development timeline of twice as long would still be counted as aggressive,” Mav6’s Jay Harrison commented.

LEMV. Northrop photo

Lowest bidder

At first there was competition. Aerospace giants Lockheed Martin and Northrop Grumman both wanted the LEMV contract, worth up to $517 million for several years of work designing and building as many as three huge airships plus all their on-board gear.

Lockheed was the clear frontrunner. The Maryland-based company already had a suitable airship in the air, the P-791. All the firm would have needed to do was add cameras, radios and other internal gear — admittedly a complex task. By contrast, Northrop in Virginia had no hardware at all and would have to assemble the LEMV blimp and its sensors and comms from scratch.

Everyone in Army intel assumed Lockheed would win. The intel staff even put the Lockheed airship on all the program’s flyers, posters and other promotional material.

But when it became clear how quickly the Army expected LEMV to be ready, Lockheed got cold feet. The Maryland company insisted it would take three years, not a year and a half, to complete the airship — even with the basic airframe already flying. “Lockheed basically gave the contract away,” Harrison explained. Northrop insisted it could meet the 18-month deadline and won the contract by default in June 2010.

Program manager Sargent defended the close deadline. “We are on a tight schedule but we want this to be successful for the Army and all services,” he said.

Concept for a hybrid air vehicle. U.S. patent


Problems piled on. Lacking direct airship experience, Northrop subcontracted with British blimp-maker Hybrid Air Vehicles for the basic LEMV airframe.

HAV struggled to build the airship on the Army’s truncated timeline. “It’s not as though components were ready and you could just buy them,” Hardy Giesler, HAV’s business development director, tells War is Boring. The British firm had to acquire custom-made LEMV components.

“They were told to move faster,” the anonymous program insider tells War is Boring. “In doing so, [HAV] didn’t focus on weight of the parts, but rather the speed of getting them to the States.” Torn between building the airship well and building it fast, the Army chose fast — and paid the price.

Parts began arriving at a massive, World War II-era government airship hangar in Lakehurst, New Jersey, for final assembly. The components were “massively overweight,” the insider says — and as a result the airship would be capable of staying aloft for just four days instead of three weeks, as the Army had promised.

In November 2011, panicked managers from the Army, Northrop and HAV met in the U.K. It was clear that the original 18-month schedule would have to be revised, as would LEMV’s ambitious performance specs. LEMV’s first flight, originally slated for no later than December 2011, was bumped back to an unspecified date in mid-2012. The much-hyped combat trial in Afghanistan was deferred indefinitely.

Blue Devil’s demise

The collapse of the Pentagon’s other airship effort in early 2012 increased the pressure on LEMV at precisely the moment the latter program was struggling the most. After two years of work costing more than $200 million, Blue Devil was 95 percent complete, inflated with $350,000 worth of helium, gently bobbing in Mav6’s North Carolina hangar awaiting the installation of cameras and radios.

That March the Air Force abruptly pulled the plug on Blue Devil, citing weight growth, schedule delays and cost overruns. “It doesn’t make sense,” one Mav6 employee mourned. The tiny company would later divest all its aerospace activities.

Blue Devil’s demise left LEMV as the military’s only major airship program. But the Army airship was suffering all the same problems that had plagued the Air Force model, albeit in near-total secrecy. The Air Force had publicly criticized Blue Devil’s troubled development. By contrast, the Army and Northrop cheerily reported only steady progress on LEMV despite repeated delays. “We’re about to fly the thing!” Northrop spokesman K.C. Brown, Jr., crowed in May 2012.

Six tons overweight, tens of millions over-budget and months late, the first LEMV took off for its debut flight that August. For 90 minutes the football-field-length airship motored at low altitude over the forests and fields of central New Jersey, returning as the sun was setting. Although meant to be robotic, for the initial flight LEMV had a pilot aboard.

“LEMV was designed, built and flown in a short 24 months, a considerable accomplishment for a vehicle of this scale and complexity,” Northrop boasted in a statement — as though a mere six-month delay (it was actually nine months) weren’t a total disaster for a program sold on the promise of an 18-month development.

The end

Word within the Army was that it would take another year and an extra $60 million to shave off weight, install more equipment and prep the LEMV for a second test flight in New Jersey—never mind operational missions over Afghanistan.

The additional delay could not have come at a worse time. After 11 years of fighting, the war in Afghanistan was winding down. Budget cuts were forcing the Army to cancel all but the most critical weapons programs. “I kind of knew … this thing wasn’t going back up again,” the LEMV insider says.

Among junior program staff, conversations turned to what-ifs. What if the airship had been developed earlier—say, 2005 or 2006—instead of nearly a decade into the war? What if the Army had been realistic about the time and cost of assembling the airship? What if experienced program managers had been in charge?

What if the Pentagon had been able to get a new airship—any new airship—off the ground, for real? Under his desk in Washington, D.C., the program insider kept a box containing miniature foam replicas of the LEMV, toys for handing out at trade shows. The tiny scale LEMVs would soon be among the only evidence the Army had even wanted a giant, robotic spy blimp.

For nearly another six months after LEMV’s first flight, the program was pretty much in limbo, its fate obvious but never officially stated. The intel staff made half-hearted overtures to ASA-ALT asking if the managers there could maybe find more money for LEMV, but ASA-ALT, previously scorned by the intel staff, blew off the requests.

And in February Legere, the intel chief, called 900 of her staffers to a meeting in the Pentagon to talk about budget cuts, including the possibility of furloughs. Toward the end of the discussion Legere surprised everyone by bringing up LEMV, the insider recalls.

“Some of you may have heard we are going to cancel this project,” Legere said, according to the insider. “You would be correct. Let me tell you all something. I’d rather pay you all of your money than allocate funds for this ridiculous, stupid project again.”

LEMV was dead. And with it, any chance the Pentagon had to acquire a next-gen airship. The prototype was deflated in late May, its pricey helium venting into the air, impossible to recover. HAV, the British airframe-maker, began negotiating with the Army to buy back the blimp components for the company’s own use.

Since then the Army has been trying to distance itself from the program’s failure, portraying LEMV mostly as a victim of circumstance. “With the reduced U.S. presence in Afghanistan coupled with the technical challenges and limitations of constrained resources, the Army made the determination to discontinue the LEMV development,” service spokesman John Cummings tells War is Boring.

But the insider has a different view. “Army management at the highest levels failed LEMV.” And failed the entire concept of a future war blimp.

NIST seeks feedback on draft cybersecurity framework
Aug. 28, 2013 – 07:17PM


The government is one step closer to finalizing what will become a framework of best practices and voluntary standards for securing critical infrastructure systems.

The National Institute of Standards and Technology on Wednesday released a preliminary draft of the framework, which includes a host of standards and guidelines for companies to measure how well they know what systems to protect, based on priority and impact to the company’s mission, and how well they can detect, prevent, respond to and recover from a cyber attack. The framework advises critical infrastructure companies to:

■ Inventory and track physical devices, systems and software applications and platforms within the organization.

■ Protect remote access to organizational networks to include telework guidance, mobile devices access restrictions and cloud computing policies and procedures.

■ Reduce the potential for abuse of authorized privileges by eliminating unnecessary assets and implementing separation-of-duties procedures and least-privilege requirements.

■ Integrate cybersecurity practices and procedures with human resources management, such as personnel screenings, departures and transfers.

■ Perform personnel and system monitoring activities over external service providers.
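The first practice on that list — knowing exactly what is on the network — underpins all the others. A minimal sketch of the idea, with purely illustrative names and data (nothing here is taken from the NIST draft): compare an authorized-asset inventory against what is actually observed on the network, and surface anything unknown or missing.

```python
# Authorized inventory: what the organization believes it owns.
AUTHORIZED = {
    "plc-01":  {"type": "controller",  "owner": "ops"},
    "hmi-02":  {"type": "workstation", "owner": "ops"},
    "hist-01": {"type": "historian",   "owner": "it"},
}

def audit(observed_ids):
    """Split observed device IDs into known, unknown (possible rogue
    devices), and missing (inventoried but not seen on the network)."""
    observed = set(observed_ids)
    known = sorted(observed & AUTHORIZED.keys())
    unknown = sorted(observed - AUTHORIZED.keys())
    missing = sorted(AUTHORIZED.keys() - observed)
    return known, unknown, missing
```

An unrecognized device showing up in the `unknown` bucket is exactly the kind of signal the framework's detect-and-respond guidance is built around.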

“The Framework complements, and does not replace, an organization’s existing business or cybersecurity risk management process and cybersecurity program,” according to the document. The goal is to ensure the framework can be adapted to meet the unique threats facing a company, is cost-effective to implement, focuses on outcomes and complements rather than conflicts with current regulatory authorities.

NIST has been working with industry to construct the framework. Under the president’s cybersecurity executive order released in February, NIST has until October to publish a draft framework that includes those standards. A final version of the framework is due in February.

Companies that adhere to the voluntary standards could be rewarded with preferences in obtaining federal grants, lower insurance rates or public recognition, Michael Daniel, the White House cybersecurity coordinator, said in a blog post this month. The Department of Homeland Security and other agencies have been directed to suggest ways to encourage companies to adopt the standards.

The document released Wednesday is a discussion draft NIST is using to solicit feedback from the public before finalizing the draft framework. Specifically, NIST wants to know if the preliminary draft, as presented, is inclusive of, and not disruptive to, effective cybersecurity practices; adequately defines outcomes that strengthen cybersecurity and supports business objectives; and provides sufficient guidance and resources to aid businesses of all sizes.

NIST expects implementation of the framework will vary by company because each uses information technology and operational technology differently.

The document also includes a methodology for protecting privacy and civil liberties, such as identifying all personally identifiable information a company collects or retains that may be accessible and auditing access to databases that contain PII.


Surveillance drone helps firefighters battle Calif. blaze


Doug Stanglin, USA TODAY 11:16 a.m. EDT August 29, 2013

The 12-day-old fire has grown to 301 square miles and is 30% contained.

A National Guard Predator drone is flying over the vast Rim Fire near Yosemite park to send firefighters real-time data on the size and direction of the blaze, in the longest such mission in California.

The unmanned drone launched Wednesday will be airborne for 22 hours.

“It will identify where fire activity is located and how it is moving, including locating and identifying spot fires which will improve the ability to protect life, property, and natural resources,” the U.S. Forest Service said in a statement.

The 12-day-old Rim Fire has grown to 301 square miles, and officials said the fire was 30% contained as of Thursday morning.

“We continue to get line around this fire,” said California fire spokesman Daniel Berlant. “It’s not nearly as active as it was last week.”

Fire officials estimate that it can be contained no sooner than Sept. 10.

The remotely piloted drone, which is the size of a small Cessna, has helped firefighters by shouldering the burden normally carried out by helicopters, which must be refueled every two hours.

The MQ-1 unmanned aircraft is from the California Air National Guard’s 163rd Reconnaissance Wing, based in Riverside, and is operating from Victorville Airport. It flew over mostly unpopulated areas on its 300-mile flight to the Rim Fire.

“The drone is providing data directly back to the incident commander, allowing him to make quick decisions about which resources to deploy and where,” Berlant said.

Officials were careful to point out the images are being used only to aid in the effort to contain the fire. Outside the fire area, it will be escorted by a manned aircraft.

In 2009 a NASA Predator equipped with an infrared imaging sensor helped the U.S. Forest Service assess damage from a fire in Angeles National Forest. In 2008, a drone capable of detecting hot spots helped firefighters assess movement of a series of wildfires stretching from Southern California’s Lake Arrowhead to San Diego.


More than 3,000 Civilian Medical Personnel Quit Amid Furloughs, Budget Cuts

Eric Katz

Government Executive

August 28, 2013


More than 3,000 Defense Department medical civilians have left the agency in the face of furloughs and continued budget uncertainty, according to a report in USA Today.

A majority of the departures have come from the Army, where 5 percent of the civilian medical workforce has quit or retired this year. Vacated positions include “highly skilled clinicians, scientists, researchers and other health workers,” said Lt. Gen. Patricia Horoho, the Army’s surgeon general.


Horoho pointed directly to the nearly departmentwide, mandatory days of unpaid leave as a possible impetus for the separations. Many of the medical staff left DoD to work for the Veterans Affairs Department, which is exempt from sequestration.

While some DoD medical staff were also exempted from furloughs, about 60 percent of the Army’s doctors and nurses had to take six days of unpaid leave.

Defense Secretary Chuck Hagel has repeatedly warned of the effects of sequestration on the morale of the workforce. While Hagel has said he hopes to avoid furloughs starting in October — should sequestration continue in the new fiscal year as scheduled — the department recently announced plans to lay off more than 6,000 workers in 2014.

In addition to the Army’s departures, the Air Force lost 6 percent of its medical staff between late February and mid-August, USA Today reported, while the Navy lost about 1 percent. All told, nearly 3,400 civilian medical personnel quit in a six-month period.

A Pentagon spokesman said the Defense Department made the decision to furlough only after careful consideration of the effects they — as well as other cuts — would have.

Hagel “recognized the significant hardship this placed on DoD civilians and their families,” Lt. Cmdr. Nate Christensen said. “We looked at all options to meet these cuts and unfortunately, furloughs became a reality.” 

J. David Cox, national president of the American Federation of Government Employees, said furloughs are only the latest in a series of challenges causing defense civilians to leave.

“Don’t think that 3,400 professionals left DoD medical facilities in the past quarter only because of furloughs,” Cox said. “It is furloughs on top of pay freezes, on top of hiring freezes, on top of retirement cuts, on top of threats of continued abuse and nine more years of sequestration. It’s the accumulation of cuts, disrespect and declining living standards that pushed them out the door.”


Syria Strike Wouldn’t Be Cheap

Defense News

Aug. 28, 2013 – 03:45AM | By MARCUS WEISGERBER |

WASHINGTON — A cruise missile strike against Syria could cost the Pentagon hundreds of millions of dollars in weapons, according to experts and government documents.

Since any US military action is expected to last just a few days, the price tag would be similar to the costs accrued during the early days of the five-month NATO operation in 2011 to overthrow Libyan leader Moammar Gadhafi, defense analysts say.

The first few weeks of the Libyan operation cost the US about $600 million. About $340 million of that went directly to replenishing munitions, specifically sea-launched Raytheon Tomahawk cruise missiles and air-launched Boeing Joint Direct Attack Munitions, according to a Congressional Research Service report.

Tomahawk Land Attack Missiles (T-LAMs) cost about $1.4 million each, according to government budget documents.
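Those two figures imply a rough replacement count. A back-of-the-envelope check using only the article’s numbers (the resulting missile count is an inference, not a reported figure):

```python
# Rough check: how many Tomahawks would $340 million replace?
# Both inputs come from the article; the count is inferred, not reported.
munitions_cost = 340_000_000   # Libya munitions replenishment, USD
tlam_unit_cost = 1_400_000     # per-missile T-LAM cost, USD

missiles = munitions_cost / tlam_unit_cost
print(f"~{missiles:.0f} missiles")  # ~243 missiles
```

In other words, the Libya replenishment bill alone covered on the order of a couple hundred cruise missiles, which is why analysts expect even a short Syria strike to run into the hundreds of millions of dollars.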

But unlike the Libyan mission, there has been little talk of establishing a costly no-fly zone over Syria.

The US and its allies appear to be planning for a limited strike against the Syrian government, which Western nations claim has used chemical weapons against civilians.

The US Navy has four destroyers in the eastern Mediterranean Sea, each with up to 96 missile cells — although experts say these ships are rarely loaded with a full complement of these types of weapons.

Missile-armed submarines are also likely to be in the area, although the US would not confirm their presence. Several submarines, including one SSGN missile sub with a capacity of 154 missiles, took part in the Libyan campaign. That submarine, the Florida, reportedly launched as many as 99 Tomahawks at targets in Libya in March 2011 alone.

At least one British submarine also launched missiles against Libyan targets.

Command-and-control and intelligence aircraft — such as the E-3 Airborne Warning and Control System (AWACS) and the E-8 Joint Surveillance Target Attack Radar System (JSTARS) — which would likely support any strike on Syria, could also add to the price tag. During the first 10 days of the Libya operation, the US spent about $1.6 million on these types of missions.

Support from aerial refueling tankers would also add to the cost. During the first weeks of the Libya operation, US Air Force tankers flew for more than 800 hours, costing $9.3 million.


Rasmussen Reports

What They Told Us: Reviewing Last Week’s Key Polls

Saturday, August 31, 2013

Week’s end finds the Obama administration out on a limb with its military threats against Syria, and despite President Obama’s enthusiastic embrace this week of the 50th anniversary of the civil rights March on Washington, Americans have a pretty sour view of race relations in this country.

U.S. voters continue to show little interest in getting involved in the civil war in Syria despite Secretary of State John Kerry’s announcement yesterday that the United States has definite proof of the Syrian government’s use of chemical weapons against its opponents. Just 37% think the United States should provide increased military assistance to protect the citizens of Syria if it is confirmed that the Syrian government used chemical weapons. Forty percent (40%) are opposed.

Support and opposition are both up slightly from early in the week.
The latest findings come from the night before and the night after Kerry’s announcement.

Seventy-three percent (73%) already think it is at least somewhat likely that the Syrian government has used chemical weapons against its citizens, including 53% who say it is Very Likely. But voters have consistently opposed U.S. involvement in any of the so-called Arab Spring protests. Fifty percent (50%) believe U.S. involvement in Middle East politics is bad for America. 

Forty-two percent (42%) of voters now think the president is doing a good or excellent job on national security. Thirty-six percent (36%) rate him poorly for his handling of national security issues. Obama’s positives in the area of national security hit a high of 54% in mid-January but by June had fallen to the low 40s, levels not seen since prior to the killing of Osama bin Laden in May 2011.

Obama’s job approval ratings in the daily Presidential Tracking Poll remain at levels seen for much of his first term in office. Voters have consistently felt that the president is friendlier toward big businesses than small businesses, and they now believe so more than ever.

Voters think America’s a better place since Martin Luther King, Jr. gave his famous “I have a dream” speech 50 years ago this week, but nearly nine out of 10 say race relations have gotten worse or remained about the same since the election of the nation’s first black president. Sixty-nine percent (69%) think race relations in this country are better today than they were 50 years ago, but just 10% think they are better since Obama’s election. Only four percent (4%) of blacks believe those relations have gotten better since November 2008.

King preached about a day when men and women of all races would have equal opportunity in America. Only 27% of all voters think America has reached a day of equal opportunity for all. But 34% believe the federal government has done too much to promote equal opportunity in this country. Just as many (35%) say the government has done too little.

Raising the federal debt ceiling and funding the president’s national health care law will be front-burner issues when Congress returns to Washington, DC next week. Republicans tend to think a Tea Party-inspired GOP congressional threat to shut down the federal government to halt funding for the health care law will be good for their party, but other voters disagree.

Though most voters believe the Tea Party has less influence these days, 78% of Republicans believe it’s at least somewhat important for their leaders in Congress to work with the Tea Party, with 45% who think it’s Very Important.

While many voters are critical of the Tea Party itself, most continue to share its small government principles. Voters are still in agreement that tax and spending cuts help the economy, and 42% now would be more likely to vote for a candidate who promised to oppose all tax increases over one who would only raise taxes on the rich. That’s the highest level of support in over a year. Forty-one percent (41%) would vote for the one who would only raise taxes on the wealthy.

Most voters still have an unfavorable opinion of the health care law and believe it will increase the nation’s deficit and drive up health care costs.  While voters give the U.S. health care system lackluster reviews, 57% expect it to get even worse over the next couple of years as the new law is implemented. 

As of today, only 16% of adult consumers and 20% of investors believe the U.S. economy is in good or excellent shape.

However, Americans continue to be a bit more optimistic as far as the housing market is concerned. Forty-nine percent (49%) of Americans think buying a home is the best investment most families can make, down from a high of 67% in May 2009. But one in three (34%) say now is a good time for someone in their area to be selling a house. That’s up from 15% this time last year.

Fifty-six percent (56%), though, expect interest rates to be higher next year at this time, the highest level of pessimism in over two years. 

Voters rate the damage done to U.S. national security by recently convicted Army intelligence analyst Bradley Manning and former government contractor Edward Snowden as about even, but a sizable number remain undecided.

Republicans lead Democrats by one point on the latest Generic Congressional Ballot. The two parties have been running neck-and-neck since mid-April.

Mid-term elections are coming next year, with control of Congress again up for grabs.

In other surveys last week:

– For the third straight week, 29% of Likely U.S. Voters say the country is heading in the right direction.

– Thirty-six percent (36%) took a summer vacation this year, but 40% of those Americans say economic conditions caused them to cut back on how much they spent on it.

– While many schools are already back in session, 65% of Americans with school-age children don’t think school should start until after Labor Day.

– Forty-one percent (41%) of all adults think students should be required to wear uniforms to school. Forty-seven percent (47%) are opposed, down from 53% two years ago.

– Eighteen states currently offer a “sales tax holiday” – a brief suspension of sales taxes – at this time of year to encourage back-to-school shopping, and 57% like the idea. 

– Fifty percent (50%) of Americans believe movies and the movie industry have a negative impact on American society. Fifty-nine percent (59%) think violent movies lead to more violence in society. 

– Fifty-two percent (52%) of Americans rarely or never go to the movies, but among those who do, 59% say a film’s storyline is the strongest draw. Just 19% say the cast typically determines their movie choices.

– Fifty-six percent (56%) say their family regularly flies the U.S. flag on important holidays, and 73% don’t think anyone should be able to stop them unless it’s a public safety issue. Americans also strongly believe school children should be required to honor the flag every morning.

August 24 2013



Federal News of Interest

Blog version at


Are Washington’s policies causing a retirement ‘brain drain’ at federal agencies?

By Josh Hicks, Updated: August 19, 2013

A new online ticker from the National Active and Retired Federal Employees association claims to show the hours of institutional knowledge lost through federal-worker retirements this year, illustrating a supposed “brain drain” caused by increased retirements at a time when the government is cutting back on hiring and freezing pay.

The ticker showed nearly 920 million hours of lost experience in 2013 as of Sunday evening. NARFE claims retirements cost the government an average of 10,000 years of knowledge every day.

Data from the Office of Personnel Management shows that more than 82,000 federal workers have filed retirement claims since January, representing a jump of 30 percent. The agency has said a dramatic increase in retirements this year for the financially struggling Postal Service has contributed to the rise.

NARFE said in a statement that policy decisions to furlough workers, freeze pay rates and increase contributions toward retirement benefits may be encouraging workers to retire prematurely. It said the losses are “threatening the services they dedicated their careers to building.”

The numbers on the ticker may oversimplify matters a bit.

NARFE said in its statement that the 82,000-plus employees who retired this year have taken 2 million years of experience with them, meaning each would have worked an average of 24 years for the government. But OPM rules allow workers to retire with less service. For example, a 62-year-old employee can retire with benefits from the Federal Employees Retirement System after just five years with the government, while a 60-year-old can retire after 20 years.
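The two thresholds described above can be sketched as a simple eligibility check. This is a minimal illustration of just those two FERS tiers; the real rules include additional combinations, such as retiring at the minimum retirement age with 30 years of service:

```python
# Illustrative sketch of two FERS immediate-retirement thresholds
# cited in the article: age 62 with 5 years, or age 60 with 20 years.
# Real FERS rules include more tiers; this is not a complete model.
def fers_eligible(age: int, years_of_service: int) -> bool:
    return (age >= 62 and years_of_service >= 5) or \
           (age >= 60 and years_of_service >= 20)

assert fers_eligible(62, 5)        # the five-year minimum case
assert fers_eligible(60, 20)       # the 20-year case
assert not fers_eligible(60, 5)    # below both thresholds
```

The point is that a retiree can legitimately leave with far fewer than 24 years of service, which is why NARFE’s implied average deserves scrutiny.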

It is unclear how NARFE came up with its average of 24 years of service for each retiree — was it an assumption or was it based on actual OPM data? The Federal Eye has requested an explanation from the group and will update this blog to clarify the methodology if possible.
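Whatever its provenance, the per-retiree average NARFE implies is easy to reproduce from the two figures in its statement:

```python
# Reproducing the implied average from NARFE's own figures.
retirees = 82_000                   # retirement claims filed since January
experience_years_lost = 2_000_000   # NARFE's claimed total years of experience

avg_service = experience_years_lost / retirees
print(f"{avg_service:.1f} years per retiree")  # 24.4 years per retiree
```

The arithmetic checks out, but it says nothing about whether the 2-million-year total was measured or assumed.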


Most of U.S. Is Wired, but Millions Aren’t Plugged In



August 18, 2013


The Obama administration has poured billions of dollars into expanding the reach of the Internet, and nearly 98 percent of American homes now have access to some form of high-speed broadband. But tens of millions of people are still on the sidelines of the digital revolution.

“The job I’m trying to get now requires me to know how to operate a computer,” said Elmer Griffin, 70, a retired truck driver from Bessemer, Ala., who was recently rejected for a job at an auto-parts store because he was unable to use the computer to check the inventory. “I wish I knew how, I really do. People don’t even want to talk to you if you don’t know how to use the Internet.”

Mr. Griffin is among the roughly 20 percent of American adults who do not use the Internet at home, work or school, or by mobile device, a figure essentially unchanged since Barack Obama took office as president in 2009 and initiated a $7 billion effort to expand access, chiefly through grants to build wired and wireless systems in neglected areas of the country.

Administration officials and policy experts say they are increasingly concerned that a significant portion of the population, around 60 million people, is shut off from jobs, government services, health care and education, and that the social and economic effects of that gap are looming larger. Persistent digital inequality — caused by the inability to afford Internet service, lack of interest or a lack of computer literacy — is also deepening racial and economic disparities in the United States, experts say.

“As more tasks move online, it hollows out the offline options,” said John B. Horrigan, a senior research fellow at the Joint Center for Political and Economic Studies. “A lot of employers don’t accept offline job applications. It means if you don’t have the Internet, you could be really isolated.”

Seventy-six percent of white American households use the Internet, compared with 57 percent of African-American households, according to “Exploring the Digital Nation,” a Commerce Department report released this summer and based on 2011 data.

The figures also show that Internet use over all is much higher among those with at least some college experience and household income of more than $50,000.

Low adoption rates among older people remain a major hurdle. Slightly more than half of Americans 65 and older use the Internet, compared with well over three-quarters of those under 65.

In addition, Internet use is lowest in the South, particularly in Mississippi, Alabama and Arkansas.

Willa Ohnoutka, 78, who has lived in the same house in suburban Houston for 40 years, said she did not use the Internet at all. “I use my telephone,” Ms. Ohnoutka said. “I get news on the TV. I’m just not comfortable involving myself with that Internet.”

Others cite expense as the reason they do not use the Internet.

“I am cheap,” said Craig Morgan, 23, a self-employed carpenter from Oxford, Miss. So far, he has made do without the Internet at home, but while he has used a smartphone to connect, that has limitations, he said.

“When we came home from the hospital with our new baby two months ago,” the hospital “took pictures and put them online,” he said. “We had to go to my in-laws to order them.”

Gloria Bean, 41, an elementary school teaching assistant from Calhoun City, Miss., said cost was also a reason she had not had Internet access at home for three years.

“I just couldn’t afford it,” she said. Being cut off, she said, “has affected me and my children.”

“They have to have it for school to do research for a paper or something they need for class,” Ms. Bean said.

As a result, she added, she often rushes from her job at school to pick up her children and take them to the library, where there are 10 computers.

The Obama administration allocated $7 billion to broadband expansion as part of the 2009 economic stimulus package. Most of it went to build physical networks. About half of those infrastructure programs have been completed, with Internet availability growing to 98 percent of homes from fewer than 90 percent.

About $500 million from the package went toward helping people learn to use the Internet. Those programs were highly successful, though on a small scale, producing more than half a million new household subscribers to Internet service, Commerce Department statistics show.

“We recognize more work needs to be done to ensure that no Americans are left behind,” said John B. Morris Jr., director of Internet policy at the National Telecommunications and Information Administration, part of the Commerce Department. “Increasing the level of broadband adoption is a complex, multifaceted challenge with no simple, one-size-fits-all solution.”

The percentage of people 18 years and older in the United States who have adopted the Internet over the past two decades has grown at a rate not seen since the popularization of the telephone, soaring nearly fivefold, from 14 percent in 1995. Although that growth slowed in more recent years, it had still moved close to 80 percent of the population by the beginning of the Obama administration in 2009, according to several academic and government studies.

Since then, however, the number has not budged, shifting between 74 percent and 79 percent through 2011, according to one study by the Pew Internet and American Life Project. Pew’s most recent research shows the figure fluttering this year between 81 percent and 85 percent, a slight uptick that experts attribute to the still-growing popularity of smartphones. Most smartphone users also have home connections, however, and do not face the affordability or digital literacy problems that have caused Internet adoption to remain stagnant.

Even at that level of Internet adoption, however, the United States, with the world’s largest economy by far, ranked seventh among 20 major global economies in 2012, down from fourth in 2000, according to the International Telecommunication Union, a United Nations agency. Ranking ahead of the United States were Britain, Canada, South Korea, Germany, France and Australia, as well as nearly every other smaller country in Western Europe.

Aaron Smith, a senior researcher at the Pew Project, said that when the center asked nonusers if they believed they were missing out or were disadvantaged by not using the Internet, most of the older Americans said no, it was not relevant to them. “But when you excluded the seniors,” he added, “most people said, ‘Yeah, I feel like I’m not getting the access to all the things that I need.’”

Researchers say the recent recession probably contributed to some of the flattening in Internet adoption, just as the Great Depression stalled the arrival of home telephone service. But a significant portion of nonusers cite their lack of digital literacy skills as a discouraging factor.

Some programs, like the federally financed Smart Communities, have shown promising results. Smart Communities, a $7 million effort in Chicago that was part of the administration’s $7 billion investment, provided basic Internet training in English and Spanish for individuals and small businesses. Between 2008 and 2011, the Smart Communities participants registered a statistically significant 15 percentage-point increase in Internet use compared with that in other Chicago community areas.

The Federal Communications Commission and some Internet providers have started programs to make Internet service more affordable for low-income households. Comcast’s two-year-old Internet Essentials program, which offers broadband service for $10 a month to low-income families, has signed up 220,000 households out of 2.6 million eligible homes in Comcast service areas.

Those types of programs hold promise, administration officials say, but they remain unsatisfied. “I’ve seen enough to know that we’re making good progress,” said Thomas C. Power, the administration’s deputy chief technology officer for telecommunications. “But I also know we need to make more progress.”

Cynthia Howle, Glenny Brock and Alan Blinder contributed reporting.


NSA broke privacy rules thousands of times per year, audit finds

Washington Post

By Barton Gellman, Published: August 15


The National Security Agency has broken privacy rules or overstepped its legal authority thousands of times each year since Congress granted the agency broad new powers in 2008, according to an internal audit and other top-secret documents.

Most of the infractions involve unauthorized surveillance of Americans or foreign intelligence targets in the United States, both of which are restricted by statute and executive order. They range from significant violations of law to typographical errors that resulted in unintended interception of U.S. e-mails and telephone calls.


The documents, provided earlier this summer to The Washington Post by former NSA contractor Edward Snowden, include a level of detail and analysis that is not routinely shared with Congress or the special court that oversees surveillance. In one of the documents, agency personnel are instructed to remove details and substitute more generic language in reports to the Justice Department and the Office of the Director of National Intelligence.

In one instance, the NSA decided that it need not report the unintended surveillance of Americans. A notable example in 2008 was the interception of a “large number” of calls placed from Washington when a programming error confused the U.S. area code 202 for 20, the international dialing code for Egypt, according to a “quality assurance” review that was not distributed to the NSA’s oversight staff.
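The Post does not describe the code behind that error, but a naive prefix match shows how digits meant to select Egypt’s country code (20) could also capture Washington’s area code (202). The functions below are a hypothetical sketch of that failure mode, not the NSA’s actual implementation:

```python
# Hypothetical illustration of the "202 vs. 20" confusion described above.
# Not the agency's code; any real selector logic is unknown.

def looks_egyptian_buggy(number: str) -> bool:
    # BUGGY: strips any leading "+" and prefix-matches the bare digits "20",
    # so a Washington, D.C. number like "2025551234" also matches.
    return number.lstrip("+").startswith("20")

def looks_egyptian_safer(number: str) -> bool:
    # Safer: only treat numbers in explicit international form ("+20...")
    # as Egyptian, so domestically formatted U.S. numbers are excluded.
    return number.startswith("+20")

assert looks_egyptian_buggy("2025551234")       # D.C. call, wrongly flagged
assert not looks_egyptian_safer("2025551234")   # domestic form, excluded
assert looks_egyptian_safer("+20221234567")     # Cairo number, still matched
```

The sketch shows why such a bug would intercept a “large number” of calls: every domestically formatted number in the 202 area code satisfies the bare-digit prefix test.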

In another case, the Foreign Intelligence Surveillance Court, which has authority over some NSA operations, did not learn about a new collection method until it had been in operation for many months. The court ruled it unconstitutional.

The Obama administration has provided almost no public information about the NSA’s compliance record. In June, after promising to explain the NSA’s record in “as transparent a way as we possibly can,” Deputy Attorney General James Cole described extensive safeguards and oversight that keep the agency in check. “Every now and then, there may be a mistake,” Cole said in congressional testimony.

The NSA audit obtained by The Post, dated May 2012, counted 2,776 incidents in the preceding 12 months of unauthorized collection, storage, access to or distribution of legally protected communications. Most were unintended. Many involved failures of due diligence or violations of standard operating procedure. The most serious incidents included a violation of a court order and unauthorized use of data about more than 3,000 Americans and green-card holders.

In a statement in response to questions for this article, the NSA said it attempts to identify problems “at the earliest possible moment, implement mitigation measures wherever possible, and drive the numbers down.” The government was made aware of The Post’s intention to publish the documents that accompany this article online.

“We’re a human-run agency operating in a complex environment with a number of different regulatory regimes, so at times we find ourselves on the wrong side of the line,” a senior NSA official said in an interview, speaking with White House permission on the condition of anonymity.

“You can look at it as a percentage of our total activity that occurs each day,” he said. “You look at a number in absolute terms that looks big, and when you look at it in relative terms, it looks a little different.”

There is no reliable way to calculate from the number of recorded compliance issues how many Americans have had their communications improperly collected, stored or distributed by the NSA.

The causes and severity of NSA infractions vary widely. One in 10 incidents is attributed to a typographical error in which an analyst enters an incorrect query and retrieves data about U.S. phone calls or e-mails.

But the more serious lapses include unauthorized access to intercepted communications, the distribution of protected content and the use of automated systems without built-in safeguards to prevent unlawful surveillance.

The May 2012 audit, intended for the agency’s top leaders, counts only incidents at the NSA’s Fort Meade headquarters and other facilities in the Washington area. Three government officials, speaking on the condition of anonymity to discuss classified matters, said the number would be substantially higher if it included other NSA operating units and regional collection centers.

Senate Intelligence Committee Chairman Dianne Feinstein (D-Calif.), who did not receive a copy of the 2012 audit until The Post asked her staff about it, said in a statement late Thursday that the committee “can and should do more to independently verify that NSA’s operations are appropriate, and its reports of compliance incidents are accurate.”

Despite the quadrupling of the NSA’s oversight staff after a series of significant violations in 2009, the rate of infractions increased throughout 2011 and early 2012. An NSA spokesman declined to disclose whether the trend has continued since last year.

One major problem is largely unpreventable, the audit says, because current operations rely on technology that cannot quickly determine whether a foreign mobile phone has entered the United States.

In what appears to be one of the most serious violations, the NSA diverted large volumes of international data passing through fiber-optic cables in the United States into a repository where the material could be stored temporarily for processing and selection.

The operation to obtain what the agency called “multiple communications transactions” collected and commingled U.S. and foreign e-mails, according to an article in SSO News, a top-secret internal newsletter of the NSA’s Special Source Operations unit. NSA lawyers told the court that the agency could not practicably filter out the communications of Americans.

In October 2011, months after the program got underway, the Foreign Intelligence Surveillance Court ruled that the collection effort was unconstitutional. The court said that the methods used were “deficient on statutory and constitutional grounds,” according to a top-secret summary of the opinion, and it ordered the NSA to comply with standard privacy protections or stop the program.

James R. Clapper Jr., the director of national intelligence, has acknowledged that the court found the NSA in breach of the Fourth Amendment, which prohibits unreasonable searches and seizures, but the Obama administration has fought a Freedom of Information lawsuit that seeks the opinion.


Generally, the NSA reveals nothing in public about its errors and infractions. The unclassified versions of the administration’s semiannual reports to Congress feature blacked-out pages under the headline “Statistical Data Relating to Compliance Incidents.”

Members of Congress may read the unredacted documents, but only in a special secure room, and they are not allowed to take notes. Fewer than 10 percent of lawmakers employ a staff member who has the security clearance to read the reports and provide advice about their meaning and significance.

The limited portions of the reports that can be read by the public acknowledge “a small number of compliance incidents.”

Under NSA auditing guidelines, the incident count does not usually disclose the number of Americans affected.

“What you really want to know, I would think, is how many innocent U.S. person communications are, one, collected at all, and two, subject to scrutiny,” said Julian Sanchez, a research scholar and close student of the NSA at the Cato Institute.

The documents provided by Snowden offer only glimpses of those questions. Some reports make clear that an unauthorized search produced no records. But a single “incident” in February 2012 involved the unlawful retention of 3,032 files that the surveillance court had ordered the NSA to destroy, according to the May 2012 audit. Each file contained an undisclosed number of telephone call records.

One of the documents sheds new light on a statement by NSA Director Keith B. Alexander last year that “we don’t hold data on U.S. citizens.”

Some Obama administration officials, speaking on the condition of anonymity, have defended Alexander with assertions that the agency’s internal definition of “data” does not cover “metadata” such as the trillions of American call records that the NSA is now known to have collected and stored since 2006. Those records include the telephone numbers of the parties and the times and durations of conversations, among other details, but not their content or the names of callers.

The NSA’s authoritative definition of data includes those call records. “Signals Intelligence Management Directive 421,” which is quoted in secret oversight and auditing guidelines, states that “raw SIGINT data . . . includes, but is not limited to, unevaluated and/or unminimized transcripts, gists, facsimiles, telex, voice, and some forms of computer-generated data, such as call event records and other Digital Network Intelligence (DNI) metadata as well as DNI message text.”

In the case of the collection effort that confused calls placed from Washington with those placed from Egypt, it is unclear what the NSA meant by a “large number” of intercepted calls. A spokesman declined to discuss the matter.

The NSA has different reporting requirements for each branch of government and each of its legal authorities. The “202” collection was deemed irrelevant to any of them. “The issue pertained to Metadata ONLY so there were no defects to report,” according to the author of the secret memo from March 2013.

The large number of database query incidents, which involve previously collected communications, confirms long-standing suspicions that the NSA’s vast data banks — with code names such as MARINA, PINWALE and XKEYSCORE — house a considerable volume of information about Americans. Ordinarily the identities of people in the United States are masked, but intelligence “customers” may request unmasking, either one case at a time or in standing orders.

In dozens of cases, NSA personnel made careless use of the agency’s extraordinary powers, according to individual auditing reports. One team of analysts in Hawaii, for example, asked a system called DISHFIRE to find any communications that mentioned both the Swedish manufacturer Ericsson and “radio” or “radar” — a query that could just as easily have collected on people in the United States as on their Pakistani military target.


The NSA uses the term “incidental” when it sweeps up the records of an American while targeting a foreigner or a U.S. person who is believed to be involved in terrorism. Official guidelines for NSA personnel say that kind of incident, pervasive under current practices, “does not constitute a . . . violation” and “does not have to be reported” to the NSA inspector general for inclusion in quarterly reports to Congress. Once added to its databases, absent other restrictions, the communications of Americans may be searched freely.

In one required tutorial, NSA collectors and analysts are taught to fill out oversight forms without giving “extraneous information” to “our FAA overseers.” FAA is a reference to the FISA Amendments Act of 2008, which granted broad new authorities to the NSA in exchange for regular audits from the Justice Department and the Office of the Director of National Intelligence and periodic reports to Congress and the surveillance court.

Using real-world examples, the “Target Analyst Rationale Instructions” explain how NSA employees should strip out details and substitute generic descriptions of the evidence and analysis behind their targeting choices.

“I realize you can read those words a certain way,” said the high-ranking NSA official who spoke with White House authority, but the instructions were not intended to withhold information from auditors. “Think of a book of individual recipes,” he said. Each target “has a short, concise description,” but that is “not a substitute for the full recipe that follows, which our overseers also have access to.”

Julie Tate and Carol D. Leonnig contributed to this report.


Is Common Core Too Hard-Core?

James Marshall Crotty, Contributor

8/16/2013 @ 3:03PM


For those long skeptical of the decade-long improvement in the standardized test scores of New York high school students, the August 7 release of the state’s Common Core test results was bittersweet redemption. New York is the second state in the country, after Kentucky, to test its students under stringent Common Core learning standards adopted across the nation to bring uniformity to student testing.

And the results are not pretty.

Only 31% of New York students in grades three through eight met or exceeded math and English competency standards on tests given over six days this past April. In 2012, under the older, far easier standards, 65% of New York students were proficient in math and 55% in English. Moreover, according to the Summary of Statewide 3-8 Exam Results, “only 16.1% of African-American students and 17.7% of Hispanic students met or exceeded” the English Language Arts (ELA) proficiency standard, far lower than in years past.

Though these results should come as no surprise to anyone who has worked inside the average New York public school, they are actually a sign of hope. By effectively tying Common Core adoption to a state’s receipt of Race to the Top funding grants — 45 of 50 states have adopted Common Core Standards — the Department of Education has indicated that it is no longer willing to let states game the achievement metrics mandated by No Child Left Behind. This is particularly refreshing news, given that the Obama administration’s earlier NCLB compliance waivers — albeit often tied to Common Core adoption — suggested that it was not serious about demanding excellence across school districts.

Rather than viewing New York’s results as an indictment of Common Core — or the allegedly speedy, haphazard way in which the standards were adopted — we should applaud the state’s courage in demanding that all New York students empirically demonstrate the writing, problem-solving, and critical thinking skills necessary for success in a global information economy.

Moreover, we should remember that the bipartisan standards-based movement is not of recent vintage. It dates back to the 1983 publication of A Nation at Risk and subsequent teacher-led initiatives, such as the National Council of Teachers of Mathematics’ 1989 Curriculum and Evaluation Standards for School Mathematics. These early documents were followed by President George H.W. Bush’s standards-focused 1989 national education summit, the Clinton administration’s 1996 National Education Summit, and the stepped-up call for common standards that came with both 2001’s No Child Left Behind and the 2004 report, Ready or Not: Creating a High School Diploma That Counts. These efforts collectively made clear that the standard American high school diploma had lost its brand value, as the real-world demands of colleges and employers had become more rigorous and exacting.

Today’s Common Core State Standards are simply a more evolved version of those original standards. A curriculum that is more transparent and work-relevant — with clear benchmarks for success — helps parents, teachers, and students gain clarity on what is expected for career and life success in an intensely competitive marketplace.

Opponents of uniform higher standards describe them as unfair, too rigorous, and with intrusive data tracking systems that violate privacy. Moreover, they castigate Common Core as a stalking horse for school privatization, and a paean to vested corporate “special interests,” such as Pearson, which would ostensibly benefit from an easily replicable set of common core textbooks. Weirdly, as opponents decry Common Core as a push for decentralized privatization — egged on by the likes of that evil no-good Bill and Melinda Gates Foundation – they simultaneously rail about the increased centralization of standards out of Washington, DC, forgetting that DC has for decades set benchmarks for academic competency, regardless of state and local wiggle room in meeting those benchmarks.

In addition, contrary to the heated claims of Common Core opponents, no set of standards emanating out of DC has ever precluded schools and teachers from making allowances for different student “learning styles, preferences, and paces.” Moreover, Common Core Standards do not preclude schools from adding back a cornucopia of important subjects — from art to music to great-books shared inquiry — as required courses or electives. And as noted by Christina Culver, an education consultant and former Assistant Secretary of Education in the G.W. Bush administration, “these standards are designed to raise the bar, but no one is stopping schools and teachers from exceeding the bar. In fact, the BASIS charter schools claim they complete the K-12 Common Core Standards by the 9th grade. I am sure many great schools with high bars could do this.”

While privacy concerns, in particular, should be taken seriously in light of the recent NSA and IRS scandals, a primary reason given for opposing “Common Core data mining” – that it might, God forbid, lead to mandatory early childhood education, which the evidence suggests is beneficial – seems specious. The problem is that extreme privacy activists see better tracking as invariably Orwellian when, in fact, better tracking of student attendance, participation, and academic progress has been a key component in finally tackling the nation’s debilitating dropout epidemic.

Taken as a whole, these concerns strike me as the same tired, regressive apologies for failure that have kept the U.S. mired in the middle of the pack on global tests of academic excellence. All of the countries leading the U.S. in academic achievement – e.g., Singapore, South Korea, and Finland – have a common set of academic standards. Moreover, Common Core’s opponents have yet to show empirically why lowering standards based on the arbitrary criteria of state, district, or locality is somehow better for students and the country. As report after report documents, if the U.S. is to continue to lead the world in economic growth – and the innovation that feeds such growth – we must start, at the very bottom, to demand more from our students, parents, and educators across all localities, regardless of race, income, or geography. With 48% of college graduates working in jobs that do not require a four-year college degree, and 75% of underemployed college graduates working in jobs that require no more than a high school diploma, clearly leaving the solution to the sole discretion of states and localities is not working.

By providing consistent and practical guideposts — and long-term data analysis that tracks whether educational programs actually help students learn — Common Core learning standards are a long-awaited first step toward closing the talent gap that bedevils college admissions officers and employers alike.


Ohio among states vying for feds’ unmanned-aircraft test centers

By Jessica Wehrman

The Columbus Dispatch

Monday August 19, 2013 3:54 AM


WASHINGTON — At a conference last week devoted to the unmanned aerial vehicle industry, states including Ohio, Oklahoma, North Dakota and Idaho set up sometimes-lavish exhibits touting their amenities.

The displays varied, but the message was the same: Pick us.

As many as 25 states are vying to be among the six sites the Federal Aviation Administration will select later this year for commercial testing of unmanned aerial vehicles, or UAVs. And as the U.S. unmanned aircraft industry evolves beyond military use into commercial use, it is increasingly viewed as a potential economic boon to states seeking economic development. The aircraft, they say, can be used for everything from precision agriculture to police work.

That made last week’s conference, held at a Washington, D.C., convention center, something of a beauty contest. States submitted their applications to be FAA test sites months ago, but as they laid out their free Frisbees or recyclable tote bags emblazoned with state logos, many hoped their exhibit would help “brand” them as UAV-friendly.

“Every good marketing effort requires repetition of the message,” said Michael O’Malley, director of the Utah Governor’s Office of Economic Development, standing in front of a giant inflatable Yeti holding up a UAV. Utah has been going to the AUVSI Unmanned Systems conference since 1996 and exhibiting there since 2006.

Ohio, meanwhile, has exhibited at the convention for the past three years, expanding its presence each time. This year, the state hosted a reception that featured giveaways including journals emblazoned with the state logo, a ringed binder advertising Ohio’s assets and cookies shaped like the state of Ohio. Staff at the exhibit wore matching red shirts one day, matching white shirts the next. Their name tags were shaped like the state of Ohio.

On Tuesday, hours after North Dakota held a reception at the convention featuring fruit, veggie and cheese platters as well as remarks by the state’s lieutenant governor, Ohio held its own reception, with light hors d’oeuvres and wine and beer. By 5 p.m., the Ohio booth was bustling. The next day, Rep. Mike Turner, R-Dayton, toured the exhibit — another chance to draw attention to the state’s booth.

Physically, Ohio’s exhibit was one of the biggest — only Oklahoma’s took up more space — and the booth alone cost $45,000 to design and build. That figure doesn’t count the other costs, including giveaways, literature and reception costs. The Dayton Development Coalition, Ohio’s Third Frontier Program and a coalition of Ohio companies and universities split the bill, according to Michael Gessel of the Dayton Development Coalition.

“We’ve got to have a presence here,” said Scott Koorndyk, chief operating officer of the Dayton Development Coalition, who said the booth is “about getting in front of companies and selling the benefits of Ohio.”

To bolster its bid, Ohio has partnered with Indiana. They highlighted that partnership with a massive map marking all the potential sites where UAVs could be tested, including in restricted air space in southern Indiana. The two states submitted their 6,000-page application to become an FAA test site in May, two years after a group of Ohio lawmakers pushed for language in an FAA reauthorization bill calling for the establishment of FAA test sites.

Ohio’s boosters say unmanned aerial vehicles are in their blood. Dayton businessman Charles Kettering developed the first unmanned aerial vehicle, the Kettering Bug, in 1918.

And AUVSI, the trade organization representing the unmanned aircraft industry, now headquartered in Washington, D.C., was first established in Dayton in 1972.

There’s an irony inherent in the states’ sales pitches: At a time when unmanned aerial vehicles are getting negative attention for infringing on privacy and for being used overseas as a weapon, these states are eager to demonstrate that they want a piece of the pie. They’ve bolstered their FAA applications by partnering with law professors who can advise about privacy concerns. Idaho passed a law requiring law enforcement to have a search warrant before using unmanned aerial vehicles to investigate criminal activities. Ohio has partnered with law professors at the University of Dayton, among others, for guidance on privacy issues.

But to them, unmanned aircraft are inevitable. Even as they wrestle with concerns about privacy infringement, they see benefits: farmers able to focus pesticide and water on specific areas rather than blanketing vast acreage; aircraft helping to monitor and fight destructive wildfires; cheaper and more efficient monitoring of utility and gas lines.

“There’s just so much opportunity,” said Richard Honneywell, the director of Ohio’s Unmanned Aerial Systems Center and Test Complex, which is overseeing development of unmanned aircraft in Ohio.

At Oklahoma’s booth, a centrally located behemoth featuring a giant unmanned aerial vehicle atop it, economic development officials handed out Frisbees and eyeglass cloths emblazoned with pictures of the state. It was their third booth in three years, and in one corner, Oklahoma Secretary of Science and Technology Stephen McKeever took meetings. Three years ago, he wandered the convention hall, taking it all in. This year, he didn’t have time to leave the booth.


His state is also vying to be one of the six. And while they’re excited, McKeever can’t help but worry that the process is moving too slowly.

“Other nations around the world are moving ahead,” McKeever said. “Other nations are using unmanned aerial systems in a way we are only hoping to use them.”


12 Weapons That Changed Everything

By JOSH HERR, The Fiscal Times

August 15, 2013


As the battle over gun rights in America reaches ever more baroque levels, it’s easy to lose track of how we got here.

Weaponry and war have been concerns for humanity from its very earliest days. Empires from Greece, Rome and Persia to Napoleonic France and colonial England have all been built on a backbone of superior firepower. From Roman peasants fearing Hannibal’s elephants to Irish farmers dreading the sight of Viking warships on the horizon, from Paul Revere’s “The British are coming!” to the 1950s “Duck and Cover” drills, weaponry has haunted our nightmares for as long as it has fueled the dreams of boys and conquerors.

The right of protection against tyranny was so important to our founding fathers that they included it as the Second Amendment to the Constitution. This debate still rages, as we weigh the costs of our gun culture against this principle. Technology, as always, drives innovation. And that is just as true with weapons as it is with the Internet.

Just as there is a through line that stretches from the Sears catalog to modern online retail, there is an equally clear line from the sword to the Predator drone.

History is shaped by these technological innovations. It is impossible to imagine a Roman Empire without its legionary sword, or a British one without muskets. The German blitzkrieg required fast engines and lots of gasoline. The atomic bomb, which has shaped every political action since its first use at Hiroshima, required nuclear physics (and, obviously, flight).

With those historical realities in mind, take a look back at the technology that changed weapons and the weapons that changed everything.

1. Bone

OK, so we don’t know exactly when and where this went down, but we all know it happened. Two cavemen had a dispute that led to violence… and in this melee, one caveman wrapped his shiny new opposable thumbs around a bone, stick or sharp rock and discovered that the damage he could inflict on his fellow man increased exponentially. No longer would combat be restricted to teeth, claws, fists and feet. The age of weapons had begun.


2. The Greek Phalanx (750 BCE)

Technically, the phalanx is the formation. The individual soldiers were armed with a long, heavy spear (the later Macedonian phalanx used the famous sarissa) and a shield, but a phalanx is most notable for being greater than the sum of its parts. The Greek soldiers would stand in rigid formation, their shields interlocked to protect not just themselves but, perhaps most importantly, the man to their left. The phalanx conquered the Mediterranean by moving as a single unstoppable armored unit. It was, in many ways, the tank of its day, and the fact that we still use the word in modern speech is a testament to its effectiveness.


3. The Gladius (400BCE – 300AD)

Roman Legionary’s Sword – If the Greeks mastered uniformity of movement, the Romans mastered standardization of equipment. The short-bladed gladius made up part of the arsenal of every legionary, along with a shield, a short throwing spear and a dagger. By ensuring that every soldier had well-made equipment, usable in a variety of situations, the Romans created an empire that, at its peak, stretched from England to North Africa and from the tip of the Iberian Peninsula to the Near East.


4. English Longbow (600AD – 1600AD)

Though records of the longbow are found as early as the 7th century, its status as a game-changer rests primarily on its use in several battles of the Hundred Years’ War, most famously (as immortalized by Shakespeare) at the Battle of Agincourt. Like a giant battlefield-wide version of the famous scene in Raiders of the Lost Ark, the English simply stood to the side and said, “Instead of charging over there and fighting… why don’t we just shoot them?” Ranged weaponry may have long predated the longbow, but the bow’s size gave it the power to punch through armor. Slowly but surely, the age of hand-to-hand combat was drawing to a close.


5. Gunpowder Weapons (900 AD)

Prior to the invention of gunpowder, warfare was almost entirely about upper-body strength. Whether swinging a sword, thrusting a spear, or drawing a bow, victory was usually decided by who was stronger. David and Goliath is a famous story simply because the Goliaths typically won. But gunpowder democratized warfare, making mechanical skill more important than physical strength. The invention of gunpowder ushered in the age of the cannon.


6. Rifled Barrel (Invented 1600, not widely used until 1848)

The Chinese might have invented gunpowder, and the British might have built an empire on the musket, but it was the invention of the rifled barrel and the “Minié ball” style bullet that made gunfights truly deadly. A pre-rifling musket was a highly inaccurate weapon, where the odds of hitting your target had about as much to do with luck as skill. By rifling the barrel and using bullets fitted to it, gun manufacturers were able to produce a product that was deadly accurate. The American Civil War was the proving ground for this invention, and the high body counts of battles such as Shiloh, Antietam, and Gettysburg proved its “success.”


7. Colt Revolver (1836 to Present)

Rolling out with the advertising slogan “God made all men, but Samuel Colt made them equal,” the Colt revolver has since gone on to be one of the primary symbols of the settlement of the American West. With its revolutionary revolving chamber, it could fire six shots without reloading (hence “six-shooter”), where most infantry rifles required reloading after each shot. Additionally, its relatively small size made it usable by women, children, and really anyone who could lift it. For better or worse, the days of high-capacity, rapid-fire, hand-held weapons were upon us.

8. Belt-Fed Machine Gun (20th Century)

Though Richard Gatling’s “Gatling Gun” is probably the early machine gun most people think of, it wasn’t until WWI that the full tactical and psychological impact of the machine gun was felt. In the brutal trench warfare that covered the European continent, weapons such as the British Vickers, German MG 08, and French Saint-Étienne put the power of an entire firing squad into one man’s trigger finger. Entire platoons could be wiped out with one squeeze. It was the longbow revolution for the industrial age, and the end of soldiers marching to battle in formation. One need only read the modernist poetry of the period to grasp the psychological impact this technology had on an entire generation.

9. Tanks (1916 – Present)

If the machine gun was the longbow, the tank was its knight in armor.  A big metal box on treads, the tank could go anywhere and shrug off gunfire like an annoying mosquito.  Constructed on an assembly line and driven by gasoline, it was warfare via the industrial revolution.  The German Panzer rose to infamy as its ability to move fast and strike hard fueled the blitzkrieg, while the durable M4 Sherman enabled the Allied forces to march to victory across North Africa and eventually into the European theater.


10. Atomic Bomb

Upon seeing the successful test of the A-bomb, J. Robert Oppenheimer, one of the key minds behind its development, recalled the line from the Bhagavad Gita: “I am become Death, the destroyer of worlds.” And so he had. Though easily the least-used weapon on this list, it was without a doubt the weapon that shaped all of post-WWII politics in the 20th century. Even before the bomb, WWII had made it clear that war was no longer something that happened to soldiers on a battlefield… it was something that happened to everyone. And that was never clearer than when two Japanese cities were nearly totally destroyed, simply by dropping a hunk of metal from the sky.


11. AK-47 (1947)

Mao Zedong famously said that “political power grows out of the barrel of a gun.” Without question, the gun that symbolized change for the 20th century was the AK-47. Technically called the “Avtomat Kalashnikova-47,” the AK was everything its American counterpart, the M-16, was not — cheap, easy to manufacture, and reliable — and it didn’t hurt that it just looked cooler. The AK became the standard weapon of the USSR, China, and all of the Warsaw Pact nations. The ease with which it could be manufactured and its reliability in all kinds of weather also made it a gun of choice for the various Cold War revolutions. Hollywood also did its part, preferring its curved magazine and natural wood stock to the boxier, matte-black M-16.


12. Drones

The Romans introduced manufacturing to warfare.  The longbow brought distance and gunpowder brought power.  The machine gun removed effort from the equation, but drones take the final step in the dehumanization of warfare.  You don’t even have to be there.  And it probably takes less engagement than a spirited round of Call of Duty…


One Stat About the U.S. Military That Will Surprise You

Daniel Harrison in Politics

August 15, 2013


There is a widespread perception in the U.S. that enlisted soldiers are poor, uneducated and underprivileged, that they choose to enlist and to serve because they have few other options, and that they risk their lives because they have very little to lose. In reality, however, data shows that American soldiers are relatively wealthy, well-educated, and do not choose to serve as any kind of a last resort.


After Vietnam, the U.S. suddenly possessed a military that was unnecessarily large, expensive, and made up of many who had been drafted. The American government acted swiftly to correct this, discharging a large number of new recruits and conscripted soldiers and switching to an all-volunteer army. There were concerns that an all-volunteer army would be disproportionately comprised of minorities, poor people, and others from relatively unfortunate circumstances.

Over 40 years later, these concerns persist. They have hardened into a false perception that the military is indeed made up mostly of the less fortunate, and that its recruiters prey upon the poor or the uneducated. This perception is only loosely based on reality.

Even a quick glance at the military’s demographic data should be enough to cast doubt on this perception. According to a 2008 study by the Heritage Foundation, enlisted recruits in 2006 and 2007 were actually more likely to come from middle- or upper-class neighborhoods than from lower-class ones. In fact, the numbers showed that Americans who came from a neighborhood where the median household income was below $40,000 were underrepresented among military recruits during those two years, while those from neighborhoods where the median household income was above $40,000 were overrepresented. The data also showed that representation among recruits increased as neighborhood median household income increased.
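The over- and under-representation claim boils down to a simple ratio of shares: a group’s share of recruits divided by its share of the population. A toy sketch with made-up numbers (the brackets and percentages below are hypothetical illustrations, not the Heritage Foundation’s actual figures):

```python
# Representation ratio: a group's share of recruits divided by its share
# of the overall population. Values above 1.0 mean over-representation;
# values below 1.0 mean under-representation.
def representation_ratio(recruit_share, population_share):
    return recruit_share / population_share

# Hypothetical (share of recruits, share of population) per income bracket.
brackets = {
    "under $40,000":  (0.18, 0.25),
    "$40,000 and up": (0.82, 0.75),
}

for bracket, (recruits, population) in brackets.items():
    ratio = representation_ratio(recruits, population)
    status = "over-represented" if ratio > 1 else "under-represented"
    print(f"{bracket}: ratio {ratio:.2f} ({status})")
```

With these invented shares, the lower bracket comes out under-represented (ratio below 1) and the higher bracket over-represented, which is the shape of the pattern the study describes.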

The conclusion is simple but surprising: the wealthier the average household in your neighborhood, the more likely you were to have joined the U.S. military in 2006 and 2007. Given the sample size, this is likely still true today.

Education is a trickier matter to assess, but the idea that the majority of soldiers are completely uneducated can be easily refuted. According to a 2011 report, 93% of soldiers possess a high school diploma and/or some college experience, which means that a soldier is even more likely than the average civilian to hold such a credential. Yet while 82.5% of officers in the military have a bachelor’s degree or higher, that number for enlisted members is only 5.6%. This is due in part to the fact that many young men and women serve in the military as a means of paying for college later in life. But it must be said that these young Americans (defined as 25 years old or younger) make up less than half of enlisted members.

Finally, the perception of the military as a body heavily made up of minorities is also flawed. In 2011, about 30% of active duty soldiers were minorities. Coupled with the 11% of members who identified as Hispanic — Hispanics are not labeled “minorities” by the military, for some reason — it seems that approximately 40% of the U.S. military identifies as non-white, meaning that about 60% of the military is white.

While this is certainly below the percentage of whites in the national population (the most recent census numbers put that number at 78%), it is not the case that the military is made up mostly of minorities. Furthermore, the Heritage Foundation study found that in 2006 and 2007, the percentage of white new enlisted male recruits aged 18 to 24 was actually higher than the percentage of white 18 to 24-year-old men in the population as a whole. Perhaps, then, the race problems in the military are being corrected, at least to a small degree.

The most important takeaway from these numbers is what they say about Americans in general. Many who choose to join the military do not necessarily do so because they are out of options, or because they have nothing to lose. Something more, and something bigger, is at work.

The fact that so many privileged and educated Americans choose to serve says that the importance of camaraderie and brotherhood sometimes outweighs the financial loss or the risks of war. It says that there are things out there greater than sitting in a classroom or making a lot of money. It says that for many of our men and women in uniform — poor or wealthy, black, white, or Hispanic — the greatest currency to possess in their lives is honor and respect. And it says that patriotism is alive and well in this country.


What I believe is that collectively, they are men and women who represent something greater than what we perceive.


There’s no free lunch when it comes to Google’s Gmail

Privacy hubbub sparks online, but users aren’t expected to ditch Gmail

By Sharon Gaudin

August 15, 2013 04:38 PM ET


Computerworld – There’s no such thing as a free email service, at least not when it comes to Google, according to industry analysts.

Google got slammed this week after longtime Google critic Consumer Watchdog lit up the Internet by pointing out a legal argument that Google attorneys made during a class-action lawsuit about the company’s practice of scanning Gmail messages for keywords to help target advertising.

“Just as a sender of a letter to a business colleague cannot be surprised that the recipient’s assistant opens the letter, people who use Web-based email today cannot be surprised if their communications are processed by the recipient’s ECS [electronic communications service] provider in the course of delivery,” Google’s attorneys wrote in a motion (download PDF).

And then quoting a 1979 case, Smith v. Maryland, they added, “Indeed, a person has no legitimate expectation of privacy in information he voluntarily turns over to third parties.”

When that comment became public, it unleashed an online firestorm about Google’s privacy policies and a debate about whether people should expect privacy in their personal or business emails.

“We take our users’ privacy and security very seriously; recent reports claiming otherwise are simply untrue,” wrote a Google spokesperson in an email to Computerworld. “We have built industry-leading security and privacy features into Gmail — and no matter who sends an email to a Gmail user, those protections apply.”

Google doesn’t have a roomful of employees sitting at desks reading everyone’s personal Gmail messages. What the company does have is an automated delivery process that scans incoming emails for spam, viruses and keywords that help it target advertising to users.

That filtering process is laid out in Gmail’s privacy policy.

“We also use this information to offer you tailored content — like giving you more relevant search results and ads,” the company states under the heading “How we use information we collect” in its privacy policy.

“I think the real issue here is naive users thinking that they can get something for nothing,” said Dan Olds, an analyst at Gabriel Consulting Group. “Providers don’t do anything for free. There’s always an angle they’re playing to either increase their revenue or profitability. And Google takes a back seat to no one when it comes to figuring out and exploiting all the angles. One of the best angles is using email contents to aim specific ads at users.”

He added that it’s a common practice for email service providers to scan messages for things like spam and advertising keywords.

“It’s true that these are automated filters, not human beings, reading the emails and matching up the ads,” said Olds. “I don’t think consumers see this as a huge invasion of privacy, not because they don’t have anything to hide, but mainly because they’ve never given it a thought. For most users, I think their main thought is ‘cool, free email’ rather than ‘I wonder why this is free? What are they getting out of it?'”


Rob Enderle, an analyst with the Enderle Group, said he doesn’t see this ruckus as something that will drive people to abandon their Gmail accounts.

“Free email is paid for in some way — Google mines it, others use it as a way to sell you on additional services they provide. You pick your poison,” he said. “The value of your profile is hidden from you and then you effectively exchange it for a set of services…. Google is incredibly profitable, suggesting that value is much higher than we likely realize.”

Patrick Moorhead, an analyst with Moor Insights & Strategy, said that since people actively click a link or check a box saying they accept Google’s terms of service, it’s hard to complain when the company does what it said it was going to do.

“I do not think very many users will leave Gmail over this,” he added. “As we have seen with other publicity over privacy [issues] with Facebook and other social media services, these flare-ups rarely result in lost users.”

Enderle noted that people need to be more aware of what they’re agreeing to in the first place.

“People should recognize that ‘free’ comes with a cost,” he added. “If they don’t know what the cost is, it would be wise to find out.”


Obama and Romney big data experts continue the battle as businesses

Big data is used to help Democrat Senate candidate Cory Booker in New Jersey

By Patrick Thibodeau

August 15, 2013 06:45 AM ET


Computerworld – WASHINGTON — The self-described nerds of President Obama’s presidential campaign last year were back using big data analytics, this time to help Newark Mayor Cory Booker achieve a landslide primary win Tuesday in the New Jersey Democratic primary for a vacant U.S. Senate seat.

But, notably, the Obama data scientists are doing this work as consultants, through their own recently formed firm, BlueLabs.

BlueLabs built a turnout model for the Booker campaign, predicting the likelihood of each Democratic voter in New Jersey to vote in the primary.

The primary results “proved that our model was spot on,” said BlueLabs co-founder Chris Wegrzyn, one of the former senior members of the 2012 Obama campaign’s analytical department.

The proof, one supposes, is in the victory. But the Republican data scientists aren’t ceding anything.

At about the same time BlueLabs was formed, the chief data scientist for Mitt Romney’s campaign, Alex Lundry, co-founded Deep Root Analytics.

Lundry gives credit to the Obama campaign data effort, and said “that campaign, without a doubt, in 2012, had data and analytics more fully integrated into their structure.”

But since last year’s election, “what you are seeing is a flurry of activity on the right to make sure that we not only catch them, but surpass them,” Lundry said.


Indeed, while the Democrats were counting votes Tuesday, Deep Root announced a partnership with FourthWall Media, a major source of cable set-top box viewing data.

That data, which is anonymized, records what people watch. Change a channel and a new row of data is created. The idea is to take this data, combine it with insights about the voters, and then place ads on TV shows most likely to reach certain voters, such as swing voters. Lundry said this will improve the efficiency of campaign advertising spending.
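The targeting idea can be sketched with toy data. The household IDs, shows, and segment labels below are invented for illustration; real set-top data is anonymized and vastly larger:

```python
from collections import Counter

# Hypothetical anonymized set-top rows: one row per viewing event.
viewing = [
    {"household": "h1", "show": "Evening News"},
    {"household": "h2", "show": "Evening News"},
    {"household": "h3", "show": "Late Movie"},
]
# Hypothetical modeled voter segments keyed by household.
segments = {"h1": "swing", "h2": "swing", "h3": "base"}

# Count which shows swing-voter households actually watch,
# to decide where ad dollars reach them most efficiently.
swing_views = Counter(
    row["show"] for row in viewing if segments[row["household"]] == "swing"
)
print(swing_views.most_common(1))  # [('Evening News', 2)]
```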

Political campaigns have been using data for years to develop a sophisticated understanding of voters. But the combination of relatively low-cost cloud computing and large quantities of data collected online, in public repositories, and from sensors gave rise to big data analysis, as researchers correlated these data sets in search of new insights.

“You are collecting everything you can, and essentially comparing it every way you can,” said James Hendler, a professor in the computer and cognitive science departments at Rensselaer Polytechnic Institute, and head of its Institute for Data Exploration and Applications.

“When you do a poll and you talk to 1,000 people who represent 100,000 people, you get a margin of error plus or minus 3%,” said Hendler. That’s helpful, but it’s not nearly as helpful as having 70,000 of those 100,000 people. “You get much more precise, and start identifying sub-communities that you can’t do in a poll.”
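Hendler’s numbers follow from standard sampling arithmetic; a quick sketch using the conservative p = 0.5 worst case:

```python
import math

def margin_of_error(sample_size, confidence_z=1.96):
    """Worst-case margin of error for a simple random sample at 95%
    confidence, using the conservative p = 0.5 assumption."""
    return confidence_z * math.sqrt(0.25 / sample_size)

# A 1,000-person poll: roughly the plus-or-minus 3% Hendler mentions.
print(round(margin_of_error(1000) * 100, 1))   # 3.1

# Observing 70,000 of 100,000 people shrinks sampling error dramatically
# (and the finite-population correction would shrink it further still).
print(round(margin_of_error(70000) * 100, 2))  # 0.37
```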

This field is new. The first graduate program in analytics was created in 2007, and universities are rushing to establish programs.

In the 2012 campaign, big data use came of age, Lundry said. This campaign “was definitely the first cycle in which the term ‘data scientists’ was part of the org chart in any campaign.”

Wegrzyn said BlueLabs assembled a creative team of problem solvers, engineers, statisticians, data scientists and domain experts, and said they want campaigns to see analytics as “an agile, team-driven, creative process.”

Wegrzyn was surprised by the attention the analytical effort received during the Obama campaign. He led the selection and deployment of the Hewlett-Packard Vertica platform that the campaign used.

“Usually the nerds in the back room don’t warrant a great deal of attention, especially in politics,” said Wegrzyn, “but the world is changing.”



Small Businesses Can’t Avoid ObamaCare by Switching to Part-Time Workers

by Dr. Susan Berry, 5 Aug 2013


Small businesses hoping to avoid the high costs of ObamaCare by switching to part-time employees got unwelcome news last Thursday. As Paul Bedard at the Washington Examiner reported, the Small Business Administration launched a website explaining that the federal government will add up the hours of part-time staff to determine whether enough hours have been worked to meet the “full-time equivalent” criterion.

Though President Obama has unilaterally suspended the employer mandate for one year, many companies that were anticipating it going into effect October 1st are moving ahead with their plans to lessen the impact on their businesses.

Some businesses believed that if they could get their number of full-time staff to under 50, they could avoid activating ObamaCare mandates by cutting full-time workers and hiring more part-timers. Confusion about the employer mandate calculation has been widespread.


Bedard wrote:

Said Matthew Haller of the International Franchise Association: “While it’s nice the administration has launched a new website, employers have been scrambling since the law was passed two-plus years ago for answers to the law’s complicated calculations for determining if they are ‘large’ employers and how many ‘full-time equivalent employees’ they have. The uncertainty created by the [health care act] continues to cause franchises and other small businesses to hit the pause button on job creation.”


Keeping in mind that, under ObamaCare, a “full-time” work week is only 30 hours, the SBA website provides an example:

Company X has 40 full-time employees working 40 hours per week, along with 20 part-time employees working 15 hours per week. The 20 part-time employees are counted as 10 full-time equivalent employees. Company X therefore has 50 full-time equivalent employees and is subject to the employer shared responsibility provisions.
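The arithmetic in the SBA example can be sketched as follows. This is a simplified weekly version: the actual ACA regulation counts monthly hours with a cap per part-time employee, but the 30-hour full-time week is as stated in the article:

```python
def full_time_equivalents(full_time_count, part_time_hours):
    """Simplified ACA full-time-equivalent (FTE) count: each full-time
    employee (30+ hrs/week under the ACA) counts as one, and part-time
    weekly hours are pooled and divided by 30.

    part_time_hours: list of weekly hours, one entry per part-timer.
    """
    return full_time_count + sum(part_time_hours) / 30

# The SBA's Company X: 40 full-timers plus 20 part-timers at
# 15 hours/week -> 50 FTEs, hitting the 50-FTE threshold.
fte = full_time_equivalents(40, [15] * 20)
print(fte)        # 50.0
print(fte >= 50)  # True
```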


The rules are a “problem for employers at the margin” of 50 full-time workers, said Edmund F. Haislmaier of the Heritage Foundation. In addition, the mandates include seasonal employees, so even if a business with seasonal workers has the equivalent of 50 full-time workers for only 121 days, ObamaCare requirements are activated for that business.



Analysts Question Descriptions of Weakened al-Qaida

Aug. 20, 2013 – 02:00PM



WASHINGTON — Reports of al-Qaida’s demise have been greatly exaggerated — and the organization’s strategic aims greatly misunderstood, security analysts said Tuesday.

The state of al-Qaida has been a central debate in Washington since the Obama administration temporarily closed US embassies across the Middle East in response to intelligence suggesting an attack on one or more was imminent.

Hawkish lawmakers and analysts say the alleged plot is proof that the Obama administration jumped the gun, beginning last year, when it claimed al-Qaida was on the decline.


These administration critics point to a May speech during which President Barack Obama said al-Qaida’s core group in Afghanistan and Pakistan “is on the path to defeat.”

“Their remaining operatives spend more time thinking about their own safety than plotting against us,” Obama said in a much-ballyhooed speech at National Defense University in Washington. “They did not direct the attacks in Benghazi or Boston. They’ve not carried out a successful attack on our homeland since 9/11.”

Thomas Joscelyn of the Foundation for the Defense of Democracies, during an event here Tuesday afternoon, said that is the wrong lens through which to judge al-Qaida.

“To say al-Qaida can’t do another 9/11-style attack, so they’re not a threat is wrong,” Joscelyn said, “and that ignores some of the threat streams we face today.”

An example of those alleged threats is the embassy plot, critics say. And, to them, that raises a question: How did the administration misread the strength of the Islamist group?

“So many got it wrong because we define [its strength] as terrorist threats against us or the West. That’s not their strategic goal,” Joscelyn said, calling al-Qaida’s work to attack US and Western targets a mere “tactic.”

“They define themselves as political revolutionaries who want power for themselves” and are pursuing “political power across the Middle East,” Joscelyn said. “That’s principally what they’re about and what they’re doing.”

The state and future strength of al-Qaida will influence myriad US defense and national security policies and budget decisions, from force size to what combat hardware to buy to which platforms and troops must stay in the Middle East-North Africa region — making them unavailable for the Obama administration’s strategic “pivot” to Asia.

FDD President Clifford May said the Obama administration has adopted an “oversimplified narrative” about al-Qaida that, by definition, means its counterterrorism policies “will be flawed.”

May warned that when US and NATO troops mostly leave Afghanistan next year, “there will be a threat there.”

Where Obama sees a weakened al-Qaida core in Afghanistan and Pakistan, Joscelyn contends al-Qaida leader Ayman al-Zawahiri communicates regularly not only with the organization’s most potent cell in Yemen, but also with “dozens” of al-Qaida groups and individuals “across the world.”

Joscelyn ticked off a list of al-Qaida affiliates that did not exist before 9/11, saying while “it’s not the most popular brand in the Muslim world … they’re still capable of coming forward” to plan and carry out attacks.

His list includes al-Qaida cells in Mali, Iraq, Yemen, Syria and Somalia.

“We can’t just say this group isn’t al-Qaida but this group is [because] they’re supporting al-Qaida and its strategic goals,” Joscelyn said.

To that end, senior Obama aides have publicly defended their boss, pointing out that Obama’s May speech made clear he views the affiliate groups as a threat.

“What we’ve seen is the emergence of various al-Qaida affiliates. From Yemen to Iraq, from Somalia to North Africa, the threat today is more diffuse, with al-Qaida’s affiliates in the Arabian Peninsula — AQAP — the most active in plotting against our homeland,” Obama said in May.

“And while none of AQAP’s efforts approach the scale of 9/11, they have continued to plot acts of terror, like the attempt to blow up an airplane on Christmas Day in 2009,” the president said. “Beyond Afghanistan, we must define our effort not as a boundless ‘global war on terror,’ but rather as a series of persistent, targeted efforts to dismantle specific networks of violent extremists that threaten America.”

US Army Leaders Give Subordinates Just Weeks to Cut Staffs, Budgets by 25 Percent

Defense News

Aug. 19, 2013 – 03:45AM

By Paul McLeary


The Secretary of the Army and the Army Chief of Staff have given their staffs just weeks — until Sept. 11 — to report back with “a comprehensive set of recommendations” as to where the service can make 25 percent cuts in funding and manning levels at all Army headquarters elements at the 2-star level and above.

The “2013 Army Focus Area Review Group” plan was spelled out in an August 14 Army document obtained by Defense News.

In some of the strongest language yet about how seriously Army leadership is taking the cuts, the memo bluntly says: “Let there be no mistake, aggregate reductions WILL TAKE PLACE. The money is gone; our mission now is to determine how best to allocate these cuts while maintaining readiness. We expect Army leaders, military and civilian, to seize this opportunity to re-shape our Army. This effort will take PRIORITY OVER ALL other Headquarters, Department of the Army activities.”


The Group is being led by Deputy Undersecretary of the Army Thomas Hawley and the head of the Army’s Office of Business Transformation, Lt. Gen. Thomas Spoehr. The memo states that the group will have seven “Focus Area” teams, each tasked with developing “bold executable recommendations which will be used to balance the already directed reductions” in the budget projections for 2015 through 2019. The initial focus areas are:


■ Institutional Headquarters Reductions


■ Operational Headquarters Reductions

■ Operational Force Structure and Ramps

■ Readiness

■ Acquisition Work Force

■ Installation Services and Investments

■ Army C3I and Cyber


When it comes to Institutional Headquarters reductions, service leadership warned subordinates that “movement of personnel outside of headquarters to subordinate units is not a legitimate means of achieving savings. Teams should consider consolidation, reductions, and closing organizations.”

When it comes to the Operational Force Structure, staffers are tasked with looking at “specific interest areas [such as] forces that provide support to other Services, task organization of Corps and Divisions, Logistics, Theater capabilities and other echelon above Brigade Combat Team (BCT) forces.”

Given that reductions in the defense budget outlined in the 2011 Budget Control Act now look like they’re going to remain the law of the land, the Army of the future is likely going to look very different from what recent projections assumed.

To that end, service chief Gen. Ray Odierno and Secretary John McHugh wrote that “We must focus on the Army’s core missions, sustaining the Army’s ability to provide a smaller, more capable Army able to provide ready land forces to meet combatant commanders’ global requirements; develop leaders for the 21st century, while maintaining the bonds of trust with Soldiers and Families. To ensure Army readiness at these reduced budget levels, we must make the best and maximum use of every single dollar provided to the Army.”

And all this has to be fleshed out in the next two weeks.


Obama Has Until Aug. 31 to Formally Propose a 2014 Fed Pay Raise

By Kellie Lunney

August 20, 2013

Another federal pay raise deadline is looming.

Congress is in recess and Washington is (relatively) quiet, so it’s easy to forget that Aug. 31 is the deadline by which President Obama has to announce his 2014 pay raise proposal for federal employees.

If the president doesn’t inform Congress of his alternative pay plan for feds by that date, then the increase mandated by the 1990 Federal Employees Pay Comparability Act kicks in. Under FEPCA, the raise would be determined by the change in the Employment Cost Index minus 0.5 percent. For 2014, that equals 1.3 percent, a slightly higher increase than Obama’s proposal of 1 percent in his fiscal 2014 budget.
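The FEPCA default described above is simple arithmetic; a minimal sketch (the 1.8 percent ECI change is inferred from the article’s 1.3 percent figure):

```python
def fepca_default_raise(eci_change_pct):
    """Default across-the-board raise under FEPCA when the president
    proposes no alternative: the Employment Cost Index change minus
    0.5 percentage points, floored at zero."""
    return max(eci_change_pct - 0.5, 0.0)

# The article's 2014 figure: a 1.8% ECI change implies a 1.3% raise,
# versus the 1.0% Obama proposed in his fiscal 2014 budget.
print(fepca_default_raise(1.8))  # 1.3
```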

Obama’s recommendation should come this week or next. He is expected to formally propose a 1 percent across-the-board pay hike for civilians and keep locality pay rates at their current levels.

In 2012, Obama announced his proposal for the 2013 federal pay raise to congressional leaders in an Aug. 21 letter. In March, as part of the continuing resolution to keep the government open, Congress rejected Obama’s proposed 0.5 percent raise for feds in 2013.

Presidents largely have ignored the FEPCA formula in their federal pay raise proposals, preferring to offer their own figure. Congress created FEPCA, which provides an annual across-the-board salary boost and a locality pay adjustment for General Schedule employees, to close the public and private sector pay gap. The latest Federal Salary Council report concluded that federal employees are underpaid relative to private sector workers by approximately 34.6 percent.

The reality, however, is that Congress will end up determining whether federal employees receive a pay raise next year; civilian workers have been under a pay freeze since 2010.

So far, lawmakers have not shown much enthusiasm for ending the three-year freeze on federal employees, despite Obama’s repeated calls to do so. None of the House fiscal 2014 spending bills to date contain funds for a civilian pay raise. The House has passed four of the 12 spending bills for fiscal 2014: Defense; Energy and Water Development; Homeland Security; and Military Construction, Veterans Affairs and Related Agencies. The Homeland Security and Military Construction-VA bills did not endorse a civilian pay raise, but didn’t forbid it either.

“The committee does not include requested funding for a civilian pay increase,” lawmakers wrote in both of those spending bills. “Should the president provide a civilian pay raise for fiscal year 2014, it is assumed that the cost of such a pay raise will be absorbed within existing appropriations for fiscal year 2014.”

The House Defense spending bill also does not include money for a civilian pay raise next year. The current House and Senate Financial Services and General Government spending bills, typically the vehicle for federal pay provisions, leave the decision to Obama by omitting language related to an across-the-board raise for government workers. The Senate Defense spending bill, which the Appropriations Committee approved before the congressional recess, contains a 1 percent raise in 2014 for Defense civilian employees.

FEPCA allows the president through an executive order to set a pay raise for government employees if Congress doesn’t specify one and doesn’t pass legislation prohibiting it. But there’s still plenty of time left in 2013 for lawmakers to extend the federal pay freeze for another year, one way or another.


Draft Rules and draft Team Agreement for NASA’s Unmanned Aircraft Systems Airspace Operations Challenge

by Press • 21 August 2013

The draft Rules and draft Team Agreement for NASA’s Unmanned Aircraft Systems (UAS) Airspace Operations Challenge (AOC) have been posted on the Unmanned Aircraft Systems Airspace Operations Challenge website.

Potential competitors, technical experts, interested parties, and the public are encouraged to submit comments via the website on the competition structure, requirements, and award metrics that should be added, modified, or deleted.

This Challenge is a competition among unmanned aircraft technology innovators to encourage further development of the technologies necessary for unmanned systems to begin safely using the National Airspace System along with piloted aircraft.

The flight competition will test innovative “sense and avoid” technologies created by various aerospace developers—technologies that are critical to the widespread use of unmanned systems because they help prevent incidents with other aircraft.

In the first phase of the competition, developers will demonstrate basic airmanship and air vehicle technologies through a series of ground and flight events intended to measure key performance capabilities ensuring that air vehicles are safe, sustainable, and practical.  Competitors will need to demonstrate a high level of operational robustness as well as the ability to “sense and avoid” other air traffic.

This phase of competition will be conducted in May 2014 at Camp Atterbury, Indiana, which is one of the test ranges comprising the Ohio/Indiana Unmanned Aircraft Systems (UAS) Center and Test Complex. Development Projects Inc. of Dayton, OH is managing the challenge for NASA under a Space Act Agreement.

The UAS AOC is part of NASA’s Centennial Challenges Program, in which NASA provides the prize purse and the competitions are managed by non-profit organizations that cover the cost of operations through commercial or private sponsorships. NASA’s Centennial Challenges seek unconventional solutions to problems of interest to NASA and the nation. Competitors have included private companies, student groups, and independent inventors working outside the traditional aerospace industry. Unlike contracts or grants, prizes are awarded only after solutions are successfully demonstrated.

There have been 24 Centennial Challenges competition events since 2005. NASA has awarded almost $6 million to 16 challenge-winning teams. For more information about the Centennial Challenges program and descriptions of each of the challenge competitions, visit:


A Gloomy Fall Forecast for Fed Pay, Benefits

By Kellie Lunney

August 21, 2013

Just a few weeks remain before fall returns, ushering in yet another season of stress and uncertainty for federal employees.

There’s the threat of a government shutdown, the impending political battle over raising the debt ceiling, the possibility of more furloughs or even layoffs because of sequestration, and the fear that Congress will freeze federal salaries for a fourth consecutive year. All of those scenarios adversely affect federal pay one way or another, not to mention morale, recruitment and retention.

So, what do federal retirees have to look forward to? In October, the government will announce the 2014 cost-of-living adjustments, and it’s looking like the percentage will be smaller than the 2013 figure.

The Congressional Budget Office recently estimated that the 2014 COLA will be around 1.5 percent. That’s slightly less than the 1.7 percent boost that federal retirees and Social Security beneficiaries received this year. It’s a lot less than the 3.6 percent increase they received in 2012. Still, it’s better than zero, which is what current federal employees are looking at if Congress decides to extend the freeze on their across-the-board annual pay adjustment.

CBO’s estimate was tucked into a cost estimate for a Senate bill that would increase the amount paid to veterans for disability compensation and to their survivors for dependency and indemnity compensation to the same COLA available to federal retirees and Social Security recipients.

Of course, the 1.5 percent COLA estimate is just that: an educated guess based on available information to date. The annual COLA is based on the percentage increase (if any) in the average Consumer Price Index for Urban Wage Earners and Clerical Workers (CPI-W) for the third quarter of the current year over the average for the third quarter of the last year in which a COLA became effective. The CPI-W measures price changes in food, housing, gas and other goods and services.
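The COLA formula above can be sketched as follows. The index values are hypothetical, chosen to land near CBO’s 1.5 percent estimate; rounding to the nearest 0.1 percent follows Social Security Administration practice:

```python
def cola_from_cpiw(q3_current_avg, q3_base_avg):
    """COLA as the percentage increase (if any) in the third-quarter
    average CPI-W over the base year's third-quarter average,
    rounded to the nearest 0.1 percent."""
    if q3_current_avg <= q3_base_avg:
        return 0.0  # no increase means no COLA
    pct = (q3_current_avg / q3_base_avg - 1) * 100
    return round(pct, 1)

# Hypothetical index averages chosen to illustrate a ~1.5% COLA.
print(cola_from_cpiw(230.3, 226.9))  # 1.5
```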

September’s inflation figure is the final data point needed to calculate the 2014 COLA. Stay tuned.

Salary Sleuth

Want to know how much a certain public affairs specialist at the Homeland Security Department earns? Check out an online searchable database that provides pay information on federal jobs, agencies and specific employees nationwide. It’s the latest tool aimed at making publicly available salary data easier to find.

The average annual salary for a public affairs specialist in the federal government is $98,849, according to the website. A brief search of employees with that title working in Washington, D.C., shows many salaries over $100,000.

Benefits Survey

The Office of Personnel Management will survey a random selection of federal employees this summer on the quality of their benefits, well-being and their efforts at healthy living.

OPM will administer the Federal Employee Benefits Survey via email to the random sample of participants, and employees can complete the 15-minute survey during work hours, said acting Director Elaine Kaplan in a memorandum to agencies.

The last such survey was in 2011.

The 2013 survey will be available online for a month with periodic reminders sent to respondents.



Secretly Tag 3-D-Printed Objects With InfraStructs

Microsoft Research combines terahertz scanning with 3-D printing to embed codes and shapes inside objects

IEEE Spectrum

By Davey Alba

Posted 20 Aug 2013 | 14:40 GMT


Ever wanted to send secret information just by passing along an ordinary-looking object? That’s the premise of Microsoft Research’s new project, InfraStructs, which pioneers techniques for concealing identifiers inside 3-D-printed objects. Anyone with a 3-D printer can fabricate an object with coded air pockets, or voids. And then anyone with a terahertz scanner can decode the tag hidden in the object—whether it’s a shape, a line of binary, or a Gray code. It’s the modern microdot.

Microsoft presented the scheme at Siggraph 2013, the 40th International Conference and Exhibition on Computer Graphics and Interactive Techniques, sponsored by the Association for Computing Machinery, on 25 July. IEEE Spectrum’s Davey Alba had Andy Wilson, a principal researcher at Microsoft Research who studies human-computer interaction, explain how InfraStructs works.




Davey Alba: Welcome to the Full Spectrum. I’m Davey Alba for IEEE Spectrum. We’re here today with Andy Wilson from Microsoft Research. He’s here to tell us about InfraStructs, a research project that pioneers techniques for reading hidden information in objects. InfraStructs lets you embed binary codes, 3-D watermarks, and geometric shapes into 3-D-printed objects, and then it reads it back with terahertz scanning. Andy, welcome.


Andy Wilson: Hi, how’s it going? Thanks for having me.


Davey Alba: Thanks so much for joining us. So, what exactly is InfraStructs and how does it work?


Andy Wilson: So there’re all different kinds of imaging technology we’re interested in. This is some work where we’ve investigated terahertz imaging. So we put the 3-D printing stuff together with the terahertz-imaging techniques that are out there and combined them in an interesting way to embed patterns within 3-D-printed objects. People are studying various techniques like object recognition in the field of computer vision. InfraStructs does something rather differently, where we actually make it relatively easy to identify an object by constructing a recognizable pattern underneath the surface.


So you’ve probably also seen techniques in augmented reality, for example, where you actually put a printed code on the surface of an object. So this is a little bit related to that, where we’re looking at different kinds of binary patterns on objects, except that we don’t have to make the object look different here. We actually can hide the pattern underneath the surface and make it part of the structure itself. Another sort of related area would be RFID [radio-frequency identification]. Typically with RFID you need some kind of little circuit embedded in the object, along with an antenna, you know, which requires its own fabrication considerations and manufacturing. And here we don’t need those kinds of special circuitry and antennas. We actually just put the structure in the object, and we rely on the terahertz scanning to recover that. Another application would be embedding what’s called a gray code in the object, and that’s a particular binary pattern, which actually varies over the surface of the object. And when you recover the binary pattern, you then know exactly where you are in the binary pattern.
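The “gray code” Wilson mentions is the standard reflected binary Gray code, in which adjacent code words differ by a single bit, so reading back any one code word tells you exactly where you are in the sequence. A minimal sketch (this is the textbook construction, not necessarily the exact encoding InfraStructs prints):

```python
def gray_encode(n):
    """Reflected binary Gray code: adjacent values differ by one bit."""
    return n ^ (n >> 1)

def gray_decode(g):
    """Invert the encoding by cascading XORs of shifted copies."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Positions 0..7 encoded as 3-bit Gray code words printed into the material:
codes = [gray_encode(i) for i in range(8)]
print([format(c, "03b") for c in codes])
# ['000', '001', '011', '010', '110', '111', '101', '100']

# Reading a code word back recovers the exact position:
assert all(gray_decode(gray_encode(i)) == i for i in range(8))
```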


There are limits to what you can do. We found that the approach works best when we had material and then interfacing with a void or an air pocket within the material, since chiefly what you have is the ability to sense changes in the index of refraction in the material. So the biggest and easiest way to create a change in the index of refraction is to go from the 3-D-printed material to air. And so that gives you a big jump in the index of refraction. That’s the part that—these interfaces reflect very nicely in the terahertz domain.
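The physics Wilson describes, a strong reflection where printed material meets an air void, follows the Fresnel formula for normal incidence. A sketch, assuming a printed-polymer refractive index of about 1.65 in the terahertz band (an illustrative value, not a measured one from the project):

```python
def normal_incidence_reflectance(n1, n2):
    """Fresnel power reflectance at normal incidence between two
    media with refractive indices n1 and n2."""
    return ((n1 - n2) / (n1 + n2)) ** 2

# Printed material (assumed n ~ 1.65) against an air void (n = 1.0)
# reflects a few percent of the beam -- enough to register in a scan.
print(round(normal_incidence_reflectance(1.65, 1.0), 3))  # 0.06

# Identical indices produce no reflection, which is why a void
# (a large index jump) makes the easiest tag to read.
print(normal_incidence_reflectance(1.65, 1.65))  # 0.0
```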


Davey Alba: So is it possible to combine the different tags that you talked about into one object?


Andy Wilson: Oh, absolutely. You could certainly construct an object where part of the object has a certain tag scheme and another part has a different tag scheme. One idea is to just print some kind of identifier into the object so that the camera can actually look at some part of the structure underneath the surface of the object and identify it very quickly.


Davey Alba: So what exactly inspired you to do this work? Why 3-D printing?


Andy Wilson: So this work was pursued by Karl Willis, who was an intern. In trying to figure out what to do, what kind of projects would interest both of us, we both came to this particular combination of his interests in 3-D printing and mine in sensing.


Davey Alba: What sorts of challenges did you run into while you were developing this project? And can you expand on some of those, give us examples?


Andy Wilson: Well, so there’s a couple of interesting challenges. One is just in understanding the limits of the technology. So we, Karl and I, put together a bunch of prototypes that just tried to test the limits of what you could sense with terahertz imaging. So that involved building very specialized little objects that looked at—varied the angle for example, that the beam would hit the objects. So we performed a number of different tests like this, which weren’t especially useful but then allowed us to perform some examination later, processing of the scans to determine that yes, you could get away with—you could see a surface that was up to, I think, about 15 degrees away from the camera. So that was the first thing, sort of establishing the performance of the device. And then that led us into designing the actual patterns themselves, starting to figure out what kinds of structures you can at first fabricate with a 3-D printer, or a laser cutter, or other kinds of digital fabrication techniques. And then in concert with that, to be able to write computer programs that can efficiently and reliably recover those patterns. So that was definitely one of the more challenging aspects of the work.


Davey Alba: So this technology kind of lends itself—actually this was one of our first thoughts—to sci-fi sort of spying applications, passing secret messages, using objects where you embed these messages in these seemingly innocent-looking objects. Can you say anything to that?


Andy Wilson: Well the thing of it is, it has to—it’s a little tricky because that message has to be there at the time the object is created. You can’t just sort of tuck it into an existing object. I’m really reminded, now that I think about it, of some of the stuff that Craig Venter’s company is doing where they, you know, assembling DNA strings and then putting a copyright message in there or encoding other kinds of data into DNA. I think that’s the kind of thing you can do kind of invisibly, I guess. Is that sci-fi? It seems to me like all this stuff is like, you know, not too far off.


Davey Alba: Great. Thanks so much for your time, Andy.


Andy Wilson: Thanks! Thank you so much.


Davey Alba: We’ve been talking to Andy Wilson from Microsoft Research about the InfraStructs project. For IEEE Spectrum, I’m Davey Alba.



Against the Dollar Coin


The Atlantic

Alexander Abad-Santos

Aug 21, 2013


In the halls of Congress, a measure is being pushed by four senators that could well ruin American lives. Called the COINS Act, it allegedly aims to save the government money by replacing dollar bills with dollar coins.

The bipartisan bill is actually called the Currency Optimization, Innovation and National Savings (COINS; get it?) Act. Introduced in June, it is sponsored by Senators Tom Harkin, John McCain, Michael Enzi, and Tom Coburn, who mostly represent “states with mining and metal-processing interests,” writes Johns Hopkins professor Steve H. Hanke in an Op-Ed arguing against this foolish idea in today’s Wall Street Journal.

Aside from their states’ own interests, the senators are pushing this misguided bill because they say it will save the government money in the long run: some $13.8 billion over the next 30 years, CNN reported. John McCain even says it will help strippers earn bigger tips. Granted, that’s an important consideration. Overall, however, Hanke explains that the idea of a dollar coin is terrible.


We agree. Here’s why:

No One Likes Dollar Coins

“Oh man, let me trade you my dollar for that cool Sacagawea dollar coin you have,” is a sentence that has never been uttered in human history. At my local post office, I once saw a woman wince, whisper something under her breath, and crumple into a slouch when the stamp machine spat out a dollar coin. When similar legislation was introduced in 2011, a poll from Lincoln Park Strategies found that 76 percent of Americans “strongly opposed” the idea of the dollar coin. Hanke points out that “The Federal Reserve already holds over a billion dollars’ worth of $1 coins in storage due to the fact that people simply don’t want to use them.” You hear that, Mr. McCain?


Coins Are Meant To Be Lost

Coins are hard to keep track of. Thin little disks of metal slip out of pockets with abandon. And the amount of change Americans lose in their couches alone is staggering. Time reported that in 2011, the good people of this nation forfeited some $15 billion to the cushions of their furniture. America, are you prepared to lose more? In these tough fiscal times, this is not a gamble we can afford.


More Money on the Floor

I am a strict believer that there is no better feeling in this world than coming home after work, taking off real pants and putting on a pair of sweats. There is, moreover, a direct connection between happiness and the speed at which real pants come off. A pocket full of dollar coins would make this process obnoxious and, perhaps, lead to a diminishment of happiness among people who suddenly find themselves scrambling around the floor for lost dollar coins instead of kicking back in their sweats, watching Seinfeld reruns. We can’t have that, obviously.


Coin Purses and Fanny Packs Are Awful

Think coins are cool? Then just recall the device needed to carry them. Your wallet is not built for coins, which is why you may have to invest in a coin purse or fanny pack. Are you ready to look like a Times Square tourist? Didn’t think so.


You Will Never Carry Cash Again

You know those people who pay for a tomato with a credit card? Well, with the advent of the dollar coin, we may all join their ranks. Mass aversion to the dollar coin may lead to a broader boycott of cash, which would leave us all making trifling purchases with our cards. Just imagine all the waiting around for the receipt to print out so you can sign it. Horrendous.



Why the Chinese Love Bitcoins

August 22, 2013

By Lulu Yilun Chen

The Chinese have found a new investment that sidesteps the scrutiny of the country’s authorities. And for now, it’s paying off big.

Bitcoin, a virtual currency invented in 2009, is attracting investment-hungry Chinese who often struggle to find good assets that generate high returns.

The investments have paid off for people like Sun Minjie, a 28-year-old Internet worker who lives in Beijing. He paid about 28 Bitcoins, worth roughly $3,000 at the time, for more than 400 shares in virtual stock exchange 796 Xchange. He’s seen a 46 percent return on his investment since its Aug. 1 debut. Compare that with the 2 percent gain on the Shanghai Composite Index for the same period.
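The figures quoted above can be checked with some quick back-of-the-envelope arithmetic. This is just an illustrative sketch using the article's approximate numbers (28 Bitcoins worth roughly $3,000, a 46 percent return, a 2 percent index gain); it is not market data.

```python
# Illustrative arithmetic for the figures quoted in the article
# (all values approximate, taken from the article; not market data).
btc_paid = 28
usd_value_at_purchase = 3_000

implied_btc_price = usd_value_at_purchase / btc_paid  # ~$107 per Bitcoin

return_796 = 0.46       # reported return since the Aug. 1 debut
return_shanghai = 0.02  # Shanghai Composite gain over the same period

value_now = usd_value_at_purchase * (1 + return_796)  # ~$4,380

print(f"Implied Bitcoin price: ${implied_btc_price:.2f}")
print(f"Stake value after 46% gain: ${value_now:,.0f}")
print(f"Outperformance vs. index: {return_796 - return_shanghai:.0%}")
```

By this rough reckoning, Sun's stake outgrew the Shanghai Composite by about 44 percentage points over the same month.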

It’s no wonder China has surged from seventh place in global Bitcoin collections last year to second, behind only the U.S. In the past month, a flurry of local IPOs priced in Bitcoins, including 796 Xchange, Myminer and Labcoin, has listed on virtual Bitcoin stock exchanges.

For Sun, Bitcoins have so far offered real returns. This is rare in a country where a rising middle class struggles to find valid investments to protect their assets against inflation.

“In China, the stock market, property and bond market are all not so good, so people get really excited when they hear of a new investment that generates high returns,” said Peter Pak, head of trading of BOCI Securities in Hong Kong. “What’s worrisome is that a lot of people could be just treating it as a speculative investment.”

Bitcoin isn’t controlled by any government. That’s part of why the virtual currency is so popular in China, a country where the government controls the flow of money overseas and keeps a tight rein on what it views as undesirable behavior at home. Chinese investors love having the option of buying Bitcoins in yuan and selling them in U.S. dollars or other hard currency.

“The advantage for Chinese users to use Bitcoin is freedom. People can do something without any official authority,” said Patrick Lin, a system administrator who owns about 1,500 Bitcoins. Lin said he’s sticking to the currency itself, rather than initial public offerings, in part because of weak regulation on Bitcoin.

“The Bitcoin world is just like the Wild West — no law, but opportunity and risk,” said Lin.



Drone Boom: Why Drones Aren’t Just for Dropping Bombs Anymore

Paul Detrick | August 20, 2013


When you hear the word drone you may immediately think of bombs being dropped in the Middle East or the surveillance of citizens here in the United States, but engineers and aviation geeks have wondered for decades if unmanned flight might solve a few of our world’s problems or just make our lives a little easier.

Over 30 years ago, Popular Science magazine wrote about a “Superdrone” that could “sniff out pollution,” and science magazines wondered whether drones would “make pilots obsolete.” Engineers say those ideas may now be possible.

“The technology has reached a point where it can be very inexpensive to buy [unmanned aerial system technology],” says John Villasenor, an engineer at UCLA and a senior fellow at the Brookings Institution. Villasenor says that advances in GPS, airframe design, and flight control methods have made unmanned flight available to pretty much anyone.

As part of the FAA’s reauthorization of funds in February 2012, Congress passed a bill that called for the integration of unmanned aircraft into U.S. airspace: first for public entities like law enforcement or firefighters, then for civilians like farmers or filmmakers, with full integration by 2015. In July, the FAA approved two drones for commercial use, which could fly as early as 2013.

The industry is growing so quickly worldwide that the intelligence research firm Teal Group said in June 2013 that annual unmanned aerial vehicle spending will more than double over the next ten years, from the current $5.2 billion to $11.6 billion, totaling just over $89 billion for the decade.
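The Teal Group figures hang together under a quick sanity check. The ~$89 billion decade total comes from the firm's own year-by-year forecast, which isn't public here; a smooth compound-growth path from $5.2 billion to $11.6 billion (an assumption of this sketch, not Teal's actual model) lands in the same ballpark.

```python
# Rough sanity check of the Teal Group projection quoted above:
# annual spending growing from $5.2B to $11.6B over ten years.
start, end, years = 5.2, 11.6, 10

# Implied compound annual growth rate if the doubling is smooth.
cagr = (end / start) ** (1 / years) - 1  # roughly 8% per year

# Total spending over the decade under that smooth path.
decade_total = sum(start * (1 + cagr) ** y for y in range(1, years + 1))

print(f"Implied growth rate: {cagr:.1%}")
print(f"Decade total under smooth growth: ${decade_total:.1f}B")
```

The smooth path sums to the mid-$80 billions, reasonably close to the just-over-$89 billion Teal reports from its detailed forecast.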

“The potential of UAVs benefiting mankind in firefighting, agriculture, pollution, stopping all sorts of loss of life because we were able to send a remote vehicle instead of a human life into that is amazing,” said Alan Tratnor of the California Space Enterprise Center at an unmanned aerial vehicle policy symposium put on by the American Institute of Aeronautics and Astronautics in March 2013.

The symposium is like a lot of public discussions going on around the world right now about drones.

Drones like this one are becoming cheaper and more available to civilians.

“It’s a way to have a dialogue across the whole community to make sure we are all thinking of the right things and moving in the right direction together,” said Sandra Magnus, executive director of AIAA.

Some companies have already hit the ground running with low-level aerial filmmaking. Drone Dudes is a two-year-old company of young filmmakers and engineers who shoot sporting events across the United States. Whether it’s biking, surfing, driving or skateboarding, Drone Dudes is able to capture aerial shots that are considerably cheaper and more dynamic than using a crane or a helicopter.

Magnus, who is also a former astronaut, says that she is aware of the concerns people have about the new technology.

“Human beings, our very nature, we’re a little scary about change because it’s the unknown, but we’re explorers too. And we are constantly balancing that tension between what’s the unknown like and part of us yearn to go into the unknown and all the debate you hear about the use of unmanned vehicles on both sides, you’re seeing that tension played out.”

Villasenor points out that in the late 1800s, when cameras became cheap enough for many Americans to buy, there was tension over that new technology too. Some of that tension grew over privacy fears, a topic the unmanned aerial system community can’t seem to escape.

“I think civil libertarians have a right to be concerned about privacy,” says Villasenor. “To deny that unmanned aircraft […] will in some cases be used in a manner that violates privacy would be overly naive. It will happen.”

Villasenor points out that when it comes to government drones with cameras, the Fourth Amendment should still apply; when it comes to civilians, there are invasion-of-privacy statutes people must abide by.

“I also think it’s important for people with an interest in civil liberties and everyone else to look at it on the other side […] We have, all of us, an affirmative first amendment right to gather information so unmanned aircraft in the hands of people who are gathering information which includes people in the news media and others can be very powerful tools just like cameras are today,” says Villasenor.

“Technology is a tool and you have to be mindful how you use it,” says Magnus. “But we can’t let our fear keep us from reaping the benefits of our brains, which is where the technology comes from.”


Calamityville site cleared by EPA

Dayton Daily News

Posted: 12:05 a.m. Saturday, Aug. 17, 2013

By Steven Matthews – Staff Writer



Fairborn and Wright State University cleared a major hurdle this week when the Environmental Protection Agency provided an official document that opens the door for the city to turn Calamityville over to WSU.

The Ohio EPA issued its “covenant not to sue” letter to Fairborn for the 52-acre site, which is home to the National Center for Medical Readiness, also known as Calamityville, at 506 E. Xenia Drive.

The property — the former CEMEX plant — is a brownfield site, and the letter states that no additional cleanup is required and releases all parties from legal responsibility for past contamination.

“We’re really happy we’re at the end here,” Fairborn City Manager Deborah McDonnell said. “The university can take control and really build the world-class facility they’ve been envisioning.”

WSU leases the property from the city for Calamityville, a collaborative training and research facility that opened in March 2011. Its purpose is to prepare civilian and military medical communities and first-responders for disasters.

It cost $4 million to clean up the Calamityville site — $3 million in state and federal grants as well as a $1 million match from WSU.

“We have received the news from the city that the EPA submitted the letter of no further action, and university leaders will meet soon to discuss next steps,” WSU spokeswoman Stephanie Gottschlich said in a statement.

McDonnell said the city’s and WSU’s legal teams are working on the final purchase agreement, and the university’s board of trustees is expected to consider it at its next scheduled meeting Oct. 4. Potentially coinciding with that will be Fairborn city council’s vote to turn the property over to WSU.

The documentation will then be sent to the state for approval, McDonnell said.

“I’d like to believe it will be done by the end of the year,” she said. “We’re closer than ever.”

The city has made it a point to focus on economic development along the Interstate 675 corridor. The average daily traffic count around Calamityville is 11,278, according to the latest figures available from 2010.

“We’re thrilled that this phase is concluded and the university can move forward with their plans to become more aggressive with that property,” said Chris Wimsatt, the city’s economic development director. “It creates another asset in that part of the community, which has a multiplier effect with regard to development.”

In the city of Fairborn’s 2010 comprehensive annual financial report, Calamityville — when fully operational — has a projected economic impact of $374 million over a five-year period for the Miami Valley region, according to an economic analysis prepared by WSU.

Gottschlich said last month the projected impact still stands today, noting that Calamityville is a “couple years out” from being fully operational.

Calamityville has 12 full-time employees and an annual operating budget of $1.2 million.



Boehner proposes ‘short-term’ bill to avert government shutdown

Washington Post

By Lori Montgomery, Published: August 22


House Speaker John A. Boehner said Thursday that he plans to avert a government shutdown at the end of September by passing a “short-term” budget bill that maintains sharp automatic spending cuts, known as the sequester.

“When we return, our intent is to move quickly on a short-term continuing resolution that keeps the government running and maintains current sequester spending levels,” Boehner (R-Ohio) said on a conference call with GOP lawmakers, according to a person on the call.

“Our message will remain clear,” Boehner said. “Until the president agrees to better cuts and reforms that help grow the economy and put us on path to a balanced budget, his sequester — the sequester he himself proposed, insisted on and signed into law — stays in place.”

On the call, Boehner did not address perhaps the most pressing issue facing Republican leaders: Whether to use the threat of a shutdown — or even a potential government default later this year — to try to force President Obama to delay implementation of his signature health insurance initiative. Conservatives in both the House and Senate, along with influential outside groups, are demanding that GOP leaders use the coming budget battles to undercut the Affordable Care Act, known as Obamacare.

Senior GOP aides confirmed that House leaders are considering making demands related to the health initiative as part of any agreement for raising the $16.7 trillion federal debt limit. “Obamacare is one of many things we can pursue on debt limit,” a leadership aide said Thursday, speaking on the condition of anonymity.

Among the possibilities: delaying the mandate that requires individuals to purchase insurance beginning in January and codifying Obama’s own decision to delay penalties for businesses that fail to offer insurance to their workers next year.

However, the aide said, “This is all in the discussion phase right now.” And by making clear his intention to keep the government open, Boehner signaled that he is not inclined to stage a white-knuckle showdown over the fate of Obamacare right when lawmakers return to Washington on Sept. 9 after the summer break.

Instead, Boehner urged his rank and file to follow the strategy he laid out earlier this summer that calls for “holding votes that chip away at the legislative coalition the president is using to force Obamacare on the nation.” Meanwhile, he urged them to focus on the victory of the sequester, which is scheduled to slice nearly $100 billion a year from the Pentagon and other agency budgets over the next decade.

Obama and other Democrats are eager to turn off the sequester and have offered a plan to replace the savings with a mix of tax increases and reforms to expensive health and retirement programs. Still, Democrats do not expect to resolve the dispute before the new fiscal year begins Oct. 1. And Boehner’s offer to continue funding the government at 2013 levels — rather than the lower levels slated to go into effect in 2014 — could form the basis for an agreement to get through the first of several deadlines facing Congress this fall.

The second deadline — the need to raise the debt limit — is more problematic, in part because no one knows exactly when it will hit.

During a speech Thursday to the Commonwealth Club of California, Treasury Secretary Jack Lew called on lawmakers to raise the debt limit as soon as they get back to town because “it is not possible for us to estimate with exact precision when Treasury will have to depend exclusively on cash on hand to meet our country’s commitments — or how long it will take before that cash runs out.”

“In just a few weeks, we will find ourselves once again perilously close to breaching the debt ceiling if Congress fails to act,” Lew said. “We cannot afford for Congress to wait until some unknowable last minute to resolve this matter on the eve of a deadline.”


FTC chair promises robust regulation of big data

By Anthony Brino, Associate Editor

Published on Government Health IT

Edith Ramirez, the chairwoman of the Federal Trade Commission, laid out the case for strong consumer protections governing private industry’s use of big data, as the agency asks Congress for the power to levy civil fines against businesses with weak consumer data security.

Speaking at the Aspen Forum, Ramirez offered “A view from the lifeguard’s chair,” as her keynote was titled, alluding to her roots in coastal southern California.

“The already intricate data-collection ecosystem is becoming even more complex,” said Ramirez, whose term as commissioner ends in 2015.  Ramirez pointed to the “Internet of Things” as a growing technology that will test the bounds of the law.

In addition to online and cell phone data, “households with smart appliances such as refrigerators, televisions, thermostats… will soon be widespread,” Ramirez said. “These devices will be connected to the Internet, collecting information that will end up in the hands of manufacturers, service providers and others. What are the privacy and security implications? These are questions we are thinking about at the FTC,” she said.

“The fact that big data may be transformative does not mean that the challenges it poses are, as some claim, novel or beyond the ability of our legal institutions to respond.”

The FTC, an independent federal agency that turns 100 years old next year, believes it has “an obligation” to protect consumer privacy, said Ramirez. Congress directed the FTC to prevent unfair commercial practices — “conduct that substantially harms consumers, or threatens to substantially harm consumers, which consumers cannot reasonably avoid, and where the harm outweighs the benefits,” said Ramirez, who prior to joining the FTC was a partner in the Los Angeles office of Quinn Emanuel Urquhart & Sullivan, a law firm specializing in intellectual property litigation with clients including Google, Shell Oil, Motorola, Samsung and Sony.

Many companies are using data technology “in ways that implicate individual privacy,” Ramirez said. “The FTC’s role isn’t to stand in the way of innovation; it is to ensure that these advances are accompanied by sufficiently rigorous privacy safeguards.”

Ramirez argued for a consumer data framework that limits the collection of consumer data, as opposed to “after-the-fact restriction” limiting certain uses of data that’s collected, as many tech firms would prefer. “The indiscriminate collection of data violates the First Commandment of data hygiene: Thou shall not collect and hold onto personal information unnecessary to an identified purpose. Keeping data on the off chance that it might prove useful is not consistent with privacy best practices,” Ramirez argued.  

There’s also the risk of what Ramirez called “data determinism” taking hold in institutional practices. “Individuals may be judged not because of what they’ve done, or what they will do in the future, but because inferences or correlations drawn by algorithms suggest they may behave in ways that make them poor credit or insurance risks, unsuitable candidates for employment or admission to schools or other institutions, or unlikely to carry out certain functions.”

Ramirez noted that she is a big fan of big data. “The fact that decision-by-algorithm may be less than perfect is not to condemn the enterprise. Far from it. Using data-driven analytics to improve decision-making may be an important step forward. After all, human decision-making is not error-free. People often make imperfect decisions for a variety of reasons, including incomplete information, poor decisional tools, or irrational bias. But the built-in imperfections in the decision-by-algorithm process demand transparency, meaningful oversight and procedures to remediate decisions that adversely affect individuals who have been wrongly categorized by correlation. At the very least, companies must ensure that by using big data algorithms they are not accidentally classifying people based on categories that society has decided — by law or ethics — not to use, such as race, ethnic background, gender, and sexual orientation.”

Last year the FTC called on data brokers — a relatively new occupation servicing mostly corporate clients with consumer data for business intelligence or targeted advertising — to give consumers access to their information through an easy-to-find, easy-to-use common portal, and the agency argued for legislation giving consumers the ability to access, dispute or suppress data held by brokers.

Further establishing its role as cop on the consumer data beat, the FTC has issued subpoenas to nine data brokers, investigating  “the nature and sources of the consumer information the data brokers collect; how they use, maintain, and disseminate the information; and the extent to which they allow consumers to access and correct their information or opt out of having their personal information sold,” Ramirez said.

The FTC has used its “unfairness authority” against companies for failing to provide reasonable data security — suing the Wyndham hotel chain last year over data security practices that led to three data breaches. The agency has brought more than 40 data security cases under its unfairness and deception authority against companies that failed to provide reasonable security safeguards.

Along with the landmark Fair Credit Reporting Act of 1970, the FTC enforces the Children’s Online Privacy Protection Act requiring companies to obtain parental consent before collecting personal information from kids under 13, and recently updated rules to include social media and mobile applications.

The FTC is pushing Congress for the power to secure civil penalties against businesses that “fail to maintain reasonable security,” Ramirez said. The agency also is urging Congress to pass “baseline privacy legislation” that would increase transparency about companies’ collection of user information, among other goals.


Army Makes Cloud Email Migration

Richard W. Walker


After years of stumbling along in fits and starts, the Army’s migration to an enterprise-wide email system is nearly complete. Most Army users can now access their email securely from anywhere in the world at any time.

Officials said Wednesday that more than 1.4 million Army users have migrated onto the unclassified NIPRNet and 115,000 users onto the classified SIPRNet, completing “the bulk” of the Army’s move to the system, called DOD Enterprise Email (DEE).

The Army’s adoption of the system is the first phase of a Defense Department-wide move to a private cloud hosted by the Defense Information Systems Agency’s Defense Enterprise Computing Centers. Under Department of Defense (DOD) goals, the system will eventually support 4.5 million users across the department.

Officials expect DEE to improve operational effectiveness, security and efficiency, saving the Army $76 million in fiscal year 2013 and $380 million through 2017. Before migration, the Army expended considerable resources managing and securing disparate legacy email systems, officials said.

Mike Krieger, the Army’s deputy CIO, said that the migration to DEE has been “a learning experience for all of us.”

Indeed, the program has faced a series of stumbling blocks since the first users were moved to the DISA cloud in January 2011, including a lack of uniformity in desktop configuration across the department. Technicians had to standardize desktop configurations before users could be moved to the new system. In addition, legacy networks at some military installations were not optimized to use cloud-based services, causing additional delays.

Between late December 2011 and March 2012, the rollout was suspended to address new rules under the 2012 National Defense Authorization Act, which required the Army to demonstrate that its acquisition approach to the system was technically and financially viable.

A year ago, however, the migration began picking up steam and moving at a steady clip, hitting a major milestone of 500,000 users. At that point, any skepticism that a project of such immense scale could be successful was assuaged.

The move to DEE represents a big change in the user experience. “When you move from installation to installation, your mail is still there,” said Lt. Col. Patrick Lee, an Army branch chief for programs and projects at Ft. Gordon in Georgia, in an interview a year ago, after the system had been rolled out to half a million users.

“When I travel, I’m able to pull up my mail wherever I go,” he said. “I don’t have to worry about authentication on individual networks or my email sitting on someone else’s server.”


Among major features, the system gives users a single email address that follows them everywhere, an increase in mailbox capacity to 4 gigabytes from 100 MB, and the ability to share calendars and collaborate across DOD’s three commands throughout the globe.

Having reached the 1.4 million-user goal, the Army is leveraging lessons learned as it implements other enterprise services, Krieger said.

“We’ve still got plenty of work left to institutionalize DEE and enterprise services in general,” he said.


Ballmer forced out after $900M Surface RT debacle

Major miscalculation and ensuing financial blow precipitated board’s decision to push out the Microsoft CEO, argues analyst

Gregg Keizer

August 23, 2013 (Computerworld)


Steve Ballmer was forced out of his CEO chair by Microsoft’s board of directors, who hit the roof when the company took a $900 million write-off to account for an oversupply of the firm’s struggling Surface RT tablet, an analyst argued today.

“He was definitely pushed out by the board,” said Patrick Moorhead, principal analyst with Moor Insights & Strategy, in an interview Friday. “They either drove him out, or put him in a situation where he felt he had to leave to save face.”

The biggest clue that Ballmer was pushed and didn’t leave of his own free will was the 12-month timetable Microsoft said it would use to find a CEO successor. “Typically, a board will be working behind the scenes for a replacement, but they’ve given themselves 12 months,” said Moorhead. “I think this went down very quickly.”

Microsoft announced Ballmer’s retirement earlier Friday.

Ballmer, who has been CEO since 2000 and at Microsoft since 1980, will remain CEO until his successor is selected. In a statement, the board indicated that could take as long as 12 months. It has drafted a committee to oversee the selection process; co-founder, former CEO and current chairman Bill Gates will serve on the committee.


In his email to Microsoft workers, Ballmer seemed to hint that the retirement was not his idea, but that he was falling on his sword. “This is an emotional and difficult thing for me to do. I take this step in the best interests of the company I love,” he wrote.

Later Friday morning, Ballmer told Mary Jo Foley of ZDNet that he had been thinking of retirement for some time — true, as he’d informally named 2018 as the likely date five years ago — but that his thinking had “intensified really over the last couple, two, two and a half months.”

Moorhead had a different timeline.

“What could have precipitated the quick move?” Moorhead asked, then answered his own question. “It was the $900 million write-down. That caught the attention of the board, and based on Ballmer’s over-enthusiastic public commentary on Windows RT and Surface RT, they lost a lot of credibility. So did Ballmer. How can you be that far off what consumers want? Was it that you’re not listening to your team? Was it because the team was afraid to give him advice? Was it because the team saw a different reality? Or was it that the team lacked the skill set to anticipate the failure?”

Whatever the reason, it ultimately led to Ballmer being blamed. “The buck stopped with Ballmer,” said Moorhead.

Microsoft announced the write-off in mid-July during its second-quarter earnings call with Wall Street. But the company would have known weeks before that it would have to declare the charge-against-earnings. And the board, too, would have known about the massive hit.

Other analysts saw a longer process, where Ballmer knew he was on his way out for months, and one in which he was not exactly ousted, but saw the logic of retirement, both for himself — and as Ballmer wrote in his email — for the good of the company.

“He has been looking for the right time to retire for a long time, the right person to hand the reins to,” said David Cearley of Gartner. “I think it’s very likely that Ballmer’s decision [to retire] is part of a broader strategy within Microsoft as expressed by the reorganization in July that is geared toward shifting the corporate culture.”

The reorganization Cearley referred to was one announced by Ballmer himself July 11 that eliminated the long-standing product-centric divisions and reshuffled executives and responsibilities along more horizontal lines and with more control shifted to the CEO.

To Cearley, Ballmer’s departure was a mutual decision, one based on the realization that the reorganization and the company’s earlier pronouncement that it would become a “devices-and-services” vendor required a new CEO near the beginning of the process, not in the middle of the transition.

“They came to the decision that this overall strategy required bringing in a new CEO who can execute from the beginning to put their own imprint on that strategy,” said Cearley.

Ballmer spoke of the timing in his email. “My original thoughts on timing would have had my retirement happen in the middle of our transformation to a devices and services company,” Ballmer wrote. “[But] we need a CEO who will be here longer term for this new direction.”


AF releases nuclear enterprise’s future plan

By Staff Sgt. David Salanitri, Secretary of the Air Force Public Affairs

Published August 23, 2013

Flight Plan for the Air Force Nuclear Enterprise


WASHINGTON (AFNS) — The Air Force recently announced a long-term vision for the service’s nuclear enterprise.


The plan, signed by the chief of staff and secretary of the Air Force and approved by the 4-star-level Nuclear Oversight Board, provides a framework for advancing and monitoring the overall health of the Air Force nuclear enterprise, supporting infrastructure and processes.

The plan is organized into three main sections. The first explains the Air Force’s perspective on 21st century deterrence and assurance, and how that differs from the Cold-War era.

The second section outlines five strategic vectors for the nuclear enterprise, and the final segment explains how the plan will be used to monitor and advance progress across the enterprise.

“All Airmen should understand the basics of the deterrence mission and its importance to our Air Force and the nation,” said Maj. Gen. Garrett Harencak, the Air Force’s assistant chief of staff for Strategic Deterrence and Nuclear Integration.

To promote understanding of the mission, the first part of the plan explains how Airmen across the Air Force contribute to national security by providing nuclear capabilities that deter potential adversaries, and assure our allies and partners.

The section concludes by describing the capabilities across the Air Force that contribute to effective deterrence and outlines the Air Force’s commitment to sustain and modernize capabilities to meet the changing demands of the 21st century.

Section two of the plan identifies the “five vectors designed to advance and monitor the overall health of the nuclear enterprise and further develop our Airmen, organizations, processes, capabilities and strategic thinking,” Harencak said.

By outlining a vector for each of these areas, the general said the Air Force will be able to implement a continuous improvement process to assess, develop action plans for improvements, and track the progress in each area.

Finally, the plan outlines how the Nuclear Oversight Board and Nuclear Issues Resolution and Integration Board will oversee efforts to meet plan objectives.

Though it is not intended to supplement any programming guidance, nor outline specific force structures, the plan may be used by planners, programmers and others to inform their efforts, Harencak said.

“We encourage commanders and Airmen at all levels to use the flight plan as a starting point for discussion and debate about deterrence in the changing 21st century environment, and the Air Force role in meeting those challenges,” Harencak added.


Rasmussen Reports

What They Told Us: Reviewing Last Week’s Key Polls

Saturday, August 24, 2013

The tragic events in the Middle East and the sentencing of a former soldier who leaked classified documents have put the spotlight on U.S. foreign policy this week.

Following deadly clashes in Egypt between the military and supporters of ousted President Mohamed Morsi, 54% of voters said the United States should not continue military and financial aid to the embattled nation.  Eighteen percent (18%) said U.S. aid to Egypt should continue.

Belief among U.S. voters that Egypt will become a peaceful and democratic nation in the near future has diminished since the Arab Spring protests in 2011.

The United States, France and several other countries are pushing for an investigation into allegations that the Syrian government used chemical weapons in an attack that killed hundreds of citizens. In June, 55% of voters disagreed with the president’s decision to provide weapons and military assistance to anti-government rebels. It will be interesting to see if opinions have changed when Rasmussen Reports releases new data on the Syrian crisis next week.

Chelsea Manning, formerly known as Bradley, was sentenced to 35 years in prison for releasing classified government documents to the website WikiLeaks. Forty-two percent (42%) of voters say that punishment is about right, but 33% say it doesn’t go far enough. Fifty-one percent (51%) believe Manning should have to serve the entire sentence without the possibility of parole, while 39% disagree. 

When it comes to policing at home, 52% of voters say on-duty police officers should be required to wear uniform cameras but say the practice wouldn’t have much of an impact on crime.  Voters are also more likely to view racial profiling by police as a civil rights violation.

Congress is currently on August recess, a time when many legislators hold town hall meetings to connect with their voters. Seventy-nine percent (79%) of voters believe that it’s more important for members of Congress to hear from their constituents rather than explain legislation to them. But 39% of voters now say the protests at town hall meetings are usually phony efforts drummed up by special interest groups and lobbyists. Thirty-seven percent (37%) believe the protesters are citizens reflecting the concerns of their neighbors. Another 24% are not sure. 

Congress’ top leaders have some work to do in winning back voters when they return from the recess. More than 50% of voters have an unfavorable opinion of Republican House Speaker John Boehner and his predecessor Nancy Pelosi. Pluralities of voters also view Senate leaders Harry Reid and Mitch McConnell unfavorably. 

Voters remain evenly divided on their views of Vice President Joe Biden, but most Democrats continue to see him in a favorable light.

On the economic front, just 44% of Americans are confident in the stability of the U.S. banking system. 

Just half of homeowners believe their home is worth more than what they still owe on it, but confidence in the short-term housing market remains higher than it has been in several years. 
Most homeowners are also confident that they know what their home is worth in today’s market, and more now say it’s worth more than when they bought it.

Eighty percent (80%) of Americans say they are paying more for groceries now compared to last year, and most expect to be paying higher grocery bills a year from now.

When it comes to grocery shopping, most Americans choose a major chain supermarket. But they still consider it at least somewhat important to find food that is locally grown and organic, even though they overwhelmingly agree it’s more expensive. Most adults say they eat dinner at home on a typical day, and most opt to cook their own meal rather than microwave a ready-made one. 

Consumer and investor confidence remain near their highest levels in several years.

Twenty-nine percent (29%) say the United States is heading in the right direction for the second straight week.

In other news last week:

– Americans continue to overwhelmingly believe the so-called war on drugs is failing, but they are more divided on how much the United States should be spending on it. 

– Adults still believe kids are spending too much time in front of computers and other electronic devices, and nearly half think their quality of life is worse than it was for children a generation ago. Most adults also continue to believe their fellow Americans watch too much TV.

–  Seventy-one percent (71%) of Likely U.S. Voters say the federal government helps businesses that are politically connected and hurts those that are not. 

– Most voters continue to have an unfavorable opinion of the health care law, and believe it will increase the nation’s deficit and drive up health care costs. 

– The number of voters who consider the bailouts of the auto industry a success is at an all-time low, while the fewest in a year think the bailouts were good for the country.

– Republicans and Democrats run even for the second straight week on the Generic Congressional Ballot.

– Most voters continue to think parents should be able to choose between schools based on such things as uniforms, prayer and how long the school year lasts.



August 17 2013




DoD weighs major COCOM realignment

Aug. 11, 2013 – 09:49AM

By Marcus Weisgerber

Staff writer


WASHINGTON — The Pentagon is considering a major overhaul of its geographical combatant commands, possibly realigning oversight within hot-button areas of the world and eliminating thousands of military and civilian positions, according to defense sources.

While the plans for combatant command (COCOM) realignment and consolidation are still notional, sources say some options include:

■ Combining Northern Command and Southern Command to form what some are calling “Americas Command” or “Western Command.”

■ Dissolving Africa Command and splitting it up among European Command and Central Command.

■ Expanding Pacific Command to include Afghanistan and Pakistan, which are part of Central Command.

In all, the realignments could shutter two COCOMs and eight service-supporting commands, totaling more than 5,000 positions, both uniformed and civilian.

Defense Secretary Chuck Hagel first hinted at consolidating the COCOMs during a July 31 press conference, when he announced significant budget-cutting options the Defense Department would have to consider should federal spending cuts remain in place across the decade.

Defense officials would not comment on specific consolidation plans being considered.

The sequester is forcing the Pentagon to look for ways to cut spending quickly. Shuttering a COCOM would impact U.S. relations abroad, and underscores the need to eliminate the budget caps, a defense official said.

“Combining combatant commands is certainly not something that we want to do, but something that we have to consider because all cuts have to be on the table,” the official said.

Members of the Joint Staff and other defense officials have been exploring options for COCOM realignment since last year, according to sources.

Regional experts agree the Pentagon could reorganize its combatant commands to better align the headquarters with long-term strategic goals.

Combining Northern and Southern commands could lead to greater resources for activities in South and Central America, which experts say has long been DoD’s most neglected region.


Combining the regions could better address cross border issues — particularly drug trafficking — between Mexico, South America and the United States, said Bob Killebrew, a retired Army colonel and senior fellow at the Center for a New American Security.

Mexico is part of Northern Command, which also includes the contiguous United States, Alaska and Canada.

“[I]t makes … sense not to have a kind of artificial DoD boundary, not only between Mexico and Central America, but between Mexico and the American border as well,” Killebrew said.

Organizing oversight of Africa has been a topic of debate — mostly in the academic community — ever since Africa Command split from European Command and became a stand-alone COCOM in 2008. Before that, European Command oversaw much of the continent, with Central Command overseeing the Horn of Africa.

“The [oversight] that was diffused over multiple commands really wasn’t something that was in our best interest nor in the best interest of our partners on the continent,” said Kip Ward, a retired Army general who was the first commander of Africa Command.

Major changes to the existing Africa Command construct are not likely during a COCOM reorganization, experts say. U.S. military operations in Africa, ranging from the 2011 overthrow of Moammar Gadhafi in Libya to the recent campaign against terrorists in Mali, underscore the need for a dedicated COCOM, defense officials say.

Since its establishment, Africa Command has added value and has been well received on the continent, Ward said.

“I think that the focus that AFRICOM is able to bring to that vital, important part of the world is still important,” he said.

Meanwhile, experts agree that Afghanistan, Pakistan and India should fall under the same COCOM, regardless of whether it’s Pacific or Central. India falls under Pacific Command while Pakistan and Afghanistan are part of Central Command.

Since security, foreign policy, economic and trade issues with India predominantly involve Pakistan and vice versa, placing them under the same COCOM could better streamline U.S. military ties with each country, some experts say.

The same is true for security and policy issues involving Afghanistan, since much of the violence in the nation is along the Pakistani border.

“It’s better that the people who are dealing with India are the ones that are at least fully aware of, or completely in the picture, of what was discussed between two top dignitaries between the U.S. and Pakistan,” said Hassan Abbas, a former Pakistani government official who served in the administrations of Prime Minister Benazir Bhutto and President Pervez Musharraf.

“If they are the same team, or the same group of people that are in the same institution who are dealing with India, it actually provides more leverage for the United States and more opportunities to go for a coherent policy rather than dealing through two different commands,” said Abbas, a senior adviser and fellow at the Asia Society and a professor at National Defense University’s College of International Security Affairs. He stressed that this is his personal view and he was not speaking for the university or the U.S. government. “I think this makes sense.”


Budget consolidation

Federal budget caps would cut about $500 billion from planned defense spending over the next decade. The caps have already cut $37 billion from the Pentagon’s 2013 budget.


With that in mind, Hagel on July 31 announced the findings of the four-month-long Strategic Choices and Management Review, an effort that examined options the Pentagon could take to meet those spending targets, while trying to achieve the goals of its Pacific-focused military strategy.

If the caps remain in place across the decade, “additional consolidations and mission reductions,” such as “consolidations of regional combatant commands, defense agency mission cuts, and further IT consolidation” might be necessary, Hagel said.

“These changes would be far-reaching and require further analysis and consideration,” he said. “Though defense bureaucracies are often derided, the fact is that these offices perform functions needed to manage, administer and support a military of our size, complexity and global reach.”

The actual COCOM realignments would be laid out in the Unified Command Plan, a document that “establishes the missions, responsibilities and geographic areas of responsibility” for the COCOMs.

The Unified Command Plan — which originated in the late 1940s — is usually reviewed every two years and was last updated in April 2011.

The Pentagon last shuttered a major command in 2010 when it closed U.S. Joint Forces Command, in an attempt to trim overhead. But many of the positions associated with that command remained as the Joint Staff absorbed nearly 3,000 uniformed and civilian workers.

The growth in headquarters staff sizes at the COCOMs and across the entire military has been a major issue of late.

Between 2010 and 2012, staff sizes at the six geographical COCOMs increased more than 7 percent, adding nearly 1,000 civilian and military positions.

In addition to those numbers, each service operates its own subordinate commands to the individual COCOMs. Many positions at the service commands are redundant to positions at the COCOMs, according to the Government Accountability Office.

“If they were to streamline or combine [COCOMs], you certainly won’t have as many components,” said Arnold Punaro, a retired Marine Corps major general and former Senate staffer who chairs the Reserve Forces Policy Board. “It depends on how they do it.”

Hagel announced this summer a plan to cut COCOM headquarters and Pentagon staffs by 20 percent and reduce duplication.

Pentagon officials overseeing the reorganization should also consider changing the title of the four-star generals and admirals who oversee these regions from combatant commanders to something less invasive, Killebrew said.

“It’s a horrible [title] because what most of these commanders do is military assistance and military cooperation with other countries,” Killebrew said. “When you say you’re a combatant commander, the first thing somebody in a small country says is, ‘But I don’t want to go to war. I just want to talk.’”

This is especially the case when dealing with some African and Central and South American nations.

A combatant commander used to hold the title commander in chief, or CINC, a title that former Defense Secretary Donald Rumsfeld abolished in 2002.

Killebrew said the title should be changed to “unified commander or something that implies working with other countries and not invading them.”

Paul McLeary contributed to this report.


Sequester May End Civilian Pensions for Military Retirees


By Eric Katz

August 8, 2013

Civilians at the Defense Department have had a rough go of it lately, thanks in large part to sequestration.

They recently received some good news, however, when Secretary Chuck Hagel shaved the number of furlough days each worker must take from 11 down to six. The celebrations may be short-lived.

The Pentagon is hoping to avoid furloughs in fiscal 2014, but may resort to mandatory layoffs if sequestration remains in effect. The department would have to cut $52 billion from its budget next year if the spending caps stay, an increasingly likely prospect. In the long term, however — current law defines sequestration as a 10-year program — Defense will have to cut $250 billion over the next decade, and simple reductions in force will not suffice.

Hagel recently laid out various budget scenarios in the department’s Strategic Choices and Management Review, including proposals for dealing with sequestration. One suggestion: eliminating civilian pensions for retired military personnel working in the civil service.

Currently, the approximately 134,000 military retirees working for the Pentagon may forfeit their military retirement pay if they wish to apply their years of military service toward their civilian pension. Alternatively, they can keep their military retirement pay and start their service credit fresh when beginning their civilian careers. The new proposal would strip these options from military retirees, who would instead have to rely on their military retirement payments alone.

The plan — which, for now, remains very much in the hypothetical stage — would save $100 billion over ten years, Hagel said, when combined with ending subsidies for defense commissaries and restricting the availability of unemployment benefits.

Although they would have a “significant impact” on the Pentagon’s workforce, Hagel said “a sequester-level scenario would compel us to consider these changes, because there would be no realistic alternative that did not pose unacceptable risks to national security.”

Cuts to TRICARE Prime Proceed, Despite Protests

Over the last few months, Government Executive has tracked a Defense Department plan to phase out TRICARE Prime — the cheaper health care alternative to TRICARE Standard — for certain military retirees and their dependents. 

The move will affect more than 170,000 Prime enrollees who live more than 40 miles from a military clinic or hospital. The Pentagon will automatically transfer those beneficiaries to TRICARE Standard beginning Oct. 1, increasing their fees by about $240 a year on average. “Prolonged protests” had delayed the plan’s implementation, the Defense Department said.

Enrollees who live outside the acceptable radius for Prime, but within 100 miles of a military facility, may have a one-time opportunity to stay enrolled in the more economical health care plan, depending on the availability of services.

Little Victories

While recent months have brought mostly disheartening headlines for former military personnel, at least one subset received positive news this week.

Both chambers of Congress have passed the Helping Heroes Fly Act, introduced in the House by Rep. Tulsi Gabbard, D-Hawaii. The bill will — once it receives President Obama’s signature — improve and make permanent the Wounded Warrior Screening Program, which requires the Transportation Security Administration to offer “sensitive screening of severely injured or disabled service members and veterans.”

“There is nothing more frustrating than to see these heroes returning home after defending our nation only to have to go through secondary screening in our airports. It’s offensive and insulting,” said Rep. Michael McCaul, R-Texas, House Homeland Security Committee chairman. “The Helping Heroes Fly Act will put an end to this and treat our wounded warriors with the dignity they deserve.”


MSPB Indefinitely Delays Processing of Defense Furlough Appeals


By Kellie Lunney

August 12, 2013

The Merit Systems Protection Board has delayed processing and adjudication of furlough appeals from Defense Department employees until it can get a better handle on them.

Approximately 30,400 furlough challenges have poured into the small federal agency to date this fiscal year; of those, about 96 percent so far are from Defense employees forced to take unpaid leave because of sequestration. MSPB, which has a little more than 200 employees in Washington and in eight regional offices nationwide, called the number of appeals “unprecedented” and “overwhelming.” The agency will continue to process as usual furlough appeals from employees of non-Defense agencies.

The delay will allow MSPB to glean “a better understanding of the claims being raised, the work locations of the appellants, the MSPB offices in which the appeals have been filed, and whether or not the appellants are represented,” the agency said in a statement on its website. The board encouraged employees to file appeals electronically if they can, and to check online periodically for updates.

Among its other responsibilities, MSPB adjudicates appeals of “adverse personnel actions” from federal employees who’ve been fired, suspended for more than 14 days, furloughed for 30 days or less, demoted or had their pay cut. Agencies must give furloughed employees 30 days’ advance notice; once on furlough, employees have 30 days to file an appeal with MSPB. The agency’s regional offices received about 6,000 appeals during fiscal 2012; as of Monday, they had received approximately 35,000 appeals this fiscal year, the bulk of them furlough appeals. And there’s still more than a month left in fiscal 2013.

“As you can imagine, our regional offices have been overwhelmed, but our employees are working hard and doing the best they can do under difficult circumstances,” said Bryan Polisuk, MSPB general counsel, in an email. MSPB administrative judges have issued 16 decisions so far in furlough appeal cases, all from employees of the Federal Aviation Administration and the Environmental Protection Agency. The judges agreed with the agency’s decision to furlough the employees in each of those cases.


MSPB is docketing Defense furlough appeals as they come in, and then notifying employees that it will contact them “at a later date” with information on the status of their appeals. “MSPB will also be unable to respond quickly to inquiries regarding these furlough appeals,” the website statement said. “We ask therefore that parties to DoD furlough appeals refrain from contacting MSPB’s regional and field offices until we inform you that processing of your appeal has begun.” The board also sent letters in July to the general counsels of the Navy, Air Force and Army informing them of the processing delay related to Defense furlough appeals.

The agency’s regional offices have docketed 10,700 furlough appeals to date, Polisuk said. “The docketing process itself takes some time,” he said. “We’re hoping to be in a position soon to make decisions on how to move forward and adjudicate these cases in the most efficient manner possible, but again, it will depend on the volume of appeals.”

Defense furloughed 650,000 civilian employees this fiscal year because of sequestration. The department originally told employees that they would have to take 22 days of unpaid leave through Sept. 30; that number now is down to six days.

MSPB officials have worried since before the mandatory budget cuts took effect in March that the tiny, quasi-judicial agency could be flooded with appeals from furloughed employees across government. The only other time something like this happened, according to officials, was in the 1980s when the agency had to process 12,000 personnel-related appeals during the air traffic controller strike. That took two years, and at the time, MSPB had about double the number of employees it has now.

The average appeal processing time in fiscal 2012 was 93 days for an initial decision from the agency. The losing party can then file a petition of review with the agency’s three-member board in Washington. It took an average of 245 days to process those petitions in fiscal 2012.



Dalai Lama’s Chinese website infecting visitors, expert warns

By Jim Finkle

BOSTON (Reuters) – A prominent computer security firm has warned that the Dalai Lama’s Chinese-language website has been compromised with malicious software that infects visitors’ computers with code that could be used to spy on them.

Kaspersky Lab researcher Kurt Baumgartner told Reuters that he is advising web surfers to stay away from the Central Tibetan Administration’s Chinese-language site until the organization fixes the bug.

He said he believes the group behind the campaign was also behind previous breaches on the site that have gone unreported as well as attacks on websites belonging to groups that focus on human rights in Asia.

Officials with the Office of Tibet in New York could not be reached for comment. That office houses the Dalai Lama’s official representative to the United States.

Baumgartner said that the Chinese-language site of the Central Tibetan Administration, which is the official organ of the Dalai Lama’s government in exile, has been under constant attack from one group of hackers since 2011, though breaches have been quietly identified and repaired before garnering public attention.


“They have been trying repeatedly to find vulnerabilities in the site,” he said.

He said that it is safe to visit the group’s English and Tibetan sites.

He said he believes the same group of attackers has repeatedly infected the site with malicious software that automatically drops viruses on computers running Microsoft Corp’s Windows and Apple Inc’s Mac operating systems. They infect machines by exploiting security bugs in Oracle Corp’s Java software.

That gives them “back doors” into those computers. “This is the initial foothold. From there they can download arbitrary files and execute them on the system,” Baumgartner said.

An Oracle spokeswoman had no immediate comment.

The Dalai Lama, Tibet’s 78-year-old exiled spiritual leader, fled China to India in 1959 after an abortive uprising against Chinese rule.

Beijing considers the globetrotting monk and author a violent separatist and Chinese state media routinely vilify him. The Dalai Lama, who is based in India, says he is merely seeking greater autonomy for his Himalayan homeland.



U.S. Air Force to shut down ‘Space Fence’ surveillance system

August 12, 2013

WASHINGTON (Reuters) – The U.S. Air Force will shut down its space surveillance system that tracks satellites and other orbiting objects by October 1 due to budget constraints caused by the automatic federal budget cuts known as sequestration, it announced Monday.

Deactivating the system by October 1 would save the Air Force Space Command $14 million annually starting in fiscal year 2014.

The surveillance system got the nickname “Space Fence” because it transmits a “fence” of radar energy vertically into space that can detect any object or debris that crosses it without being cued to do so.

Commander of the Air Force Space Command, General William Shelton, said the system – which has been in operation since 1961 – was outmoded and that newer technology will provide more accurate observations.

Shelton said a new Space Fence is being planned now, which will provide more precise positional data on orbiting objects and would become the most accurate radar in the Air Force’s space surveillance network.

The system that will be discontinued on October 1 is a series of three transmitters and six receivers located across different points in the southern United States. It is operated by Five Rivers Services in Colorado.

“When combined with the new Joint Space Operations Center’s high performance computing environment, the new fence will truly represent a quantum leap forward in space situational awareness for the nation,” Shelton said in a statement Monday.

(Reporting by Valerie Volcovici; Editing by Lisa Shumaker)


How Congressional staff can keep their healthcare coverage

The Hill

By David Farber

08/13/13 02:30 PM ET


On August 6, the federal Office of Personnel Management (OPM) released its proposed rule to fund 72-75 percent of the health care premium costs for Members of Congress and their staff. Recent media reports have suggested that the proposed rule averts the Congressional staff health care crisis. However, the proposed rule and its preamble address only half the issue. In fact, there are two major questions that OPM needed to address – only one of which the agency has signaled it understands. OPM and the White House have the opportunity to do right by Congressional staff in the upcoming rulemaking, but to do so, they will need to read the Patient Protection and Affordable Care Act (PPACA) – all of it.

In fact, while both the OPM rulemaking and media reports have focused on premium payments, there are really two questions facing Congressional staff: (1) will staff be able to remain in their existing Federal Employee Health Benefit Plan (FEHBP) coverage, or will they be forced out of that coverage into an exchange plan; and (2) if staff must get exchange coverage, can the federal government actually pay exchange premiums, or will those premiums have to be paid by staff out of pocket?

The rulemaking addresses the second question in a way that will allow the government to pay staff premiums. But the more important question is whether staff even need to switch out of existing coverage into an exchange. Indeed, as demonstrated by IRS official Danny Werfel’s candid testimony before the House Ways and Means Committee on August 1, federal employees, including Hill staff, should not want to leave the robust FEHBP coverage they enjoy today for uncertain exchange coverage in the future.

What is the argument for Hill staff keeping FEHBP coverage? While section 1312(d)(3)(D)(i) of the Affordable Care Act suggests that Congressional staff may only have access to coverage “created under this Act” or through an “Exchange,” there is another provision in the same law – the grandfathering clause – that may be more relevant.

That provision, found in section 1251(a), states: “nothing in this Act (or an amendment made by this Act) shall be construed to require that an individual terminate coverage under a group health plan or health insurance coverage in which such individual was enrolled on the date of enactment of [the Affordable Care] Act.” In other words, if you are in FEHBP today, nothing in the Act, including section 1312(d)(3)(D), should be allowed to terminate your coverage come January 1, 2014. While it might be argued that the two clauses conflict, that is where an OPM rulemaking could do the right thing, and apply Agency discretion to have section 1251 trump section 1312 – at least for all staff who were on the payroll on the date of enactment, and, in OPM discretion, also to staff employed on December 31, 2013. Things may be a bit more ambiguous for staff hired next year, but that can be fixed later.

The above analysis is not new – the Congressional Research Service laid out the arguments in an April 2, 2010 Congressional Distribution Memorandum, on page 11. While CRS does a good job explaining the pros and cons, the roadmap to maintain coverage is clearly there. OPM was aware of the CRS analysis and used it for other purposes. Yet, on the more significant question of whether Congressional staff can remain in their FEHBP coverage, OPM has failed to pick up the argument, much less run with it. It is not too late.

Every Congressional staffer who wants to stay in his or her FEHBP coverage should submit a comment to OPM and should urge the agency to apply the Affordable Care Act. Comments should urge OPM to read every section of the Act – not just section 1312, and to give meaning to section 1251 – the grandfather clause. The agency has the ability to fix this.

Where there is an agency will, there is a regulatory way. OPM has the chance to get this right, and not only answer who can pay premiums, but answer the more important question of whether Congressional staff can keep their FEHBP coverage. Every other American who has enjoyed employer-based coverage will get to keep their existing plan come January 1. If Congress intended to subject itself and staff to the same rules as every other American, then Congressional staff should be able to retain their existing FEHBP coverage as well. The agency has clear authority to make this happen. For all those staff who serve above and beyond, OPM should give them the right answer.



What if everyone plugs in their cars at once?

One day, when electric cars rule the road, owners might crash the power grid if they all were to plug their cars in at once. A smart charger developed by Northwest scientists would prevent that calamity.

Seattle Times

Originally published Saturday, August 10, 2013 at 8:07 PM

By Sandi Doughton

Seattle Times science reporter


RICHLAND — Electric cars account for fewer than 0.05 percent of passenger vehicles in the United States today, but Michael Kintner-Meyer envisions a future where plug-ins rule the roads.

The proliferation of electric cars will bring benefits — like lower tailpipe emissions — but could also create unique headaches, says Kintner-Meyer, who leads a project at Pacific Northwest National Laboratory to improve the vehicles and tackle the problems.

Now, he and his colleagues have crafted a solution to the scenario that gives power-grid operators nightmares: The prospect that millions of Americans will get home from work and plug in their cars at the same time.

“It would create havoc,” said Kintner-Meyer. “You could have the lights go out. You could have rolling brownouts.”

The way to avoid widespread overloads is to spread out the demand — which is what PNNL’s “grid-friendly” charger does. The device, which is about the size of a shoebox, monitors the status of the grid and adjusts accordingly, switching off when demand is high and switching on when power is plentiful.
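The switching behavior described above, pausing when demand is high and resuming when power is plentiful, can be sketched as a simple hysteresis loop. The article does not describe the PNNL device's internals, so this is only an illustration: it assumes grid frequency (a common proxy for the supply/demand balance) as the input signal, and the threshold values and `decide_charging` function are made up for the example.

```python
# Illustrative sketch of a "grid-friendly" charging decision.
# Assumption: the controller samples grid frequency; a sagging
# frequency means demand is outpacing supply.

NOMINAL_HZ = 60.0      # U.S. grid nominal frequency
UNDER_FREQ = 59.95     # below this while charging, pause
RESUME_FREQ = 59.98    # hysteresis: resume only after recovery

def decide_charging(freq_hz, currently_charging):
    """Return True to charge, False to pause, with hysteresis."""
    if currently_charging:
        # Keep charging unless the grid sags below the pause threshold.
        return freq_hz >= UNDER_FREQ
    # Once paused, resume only after the grid has clearly recovered.
    return freq_hz >= RESUME_FREQ

# Example decisions over a few samples:
state = True
state = decide_charging(59.90, state)   # grid sags -> pause
state = decide_charging(59.96, state)   # partial recovery -> stay paused
state = decide_charging(59.99, state)   # recovered -> resume
```

The gap between the two thresholds keeps the charger from rapidly toggling on and off when the frequency hovers near a single cutoff, which is one plausible way a device like this could avoid adding its own instability to the grid.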

Adaptive charging could lower car owners’ electricity bills by allowing them to draw power when rates are lowest. And if enough cars use the systems, they could also collectively provide a valuable service to the power grid by dampening swings in electrical generation from the growing number of wind farms and solar arrays.

Drivers could save up to $150 a year, the grid would be protected from crashing, and the overall power system would run more smoothly, Kintner-Meyer said.


California-based AeroVironment, Inc. licensed the technology from PNNL and is integrating it into beta versions of a charging station. Alec Brooks, the company’s chief technology officer for efficient energy systems, has been using one to charge his Nissan Leaf.

At PNNL, Kintner-Meyer runs the system on a 2009 Prius hybrid that he and his team converted to a plug-in.

Because the grid-friendly system switches off and on, it takes longer than a conventional charging system, Brooks said. But it doesn’t matter — as long as the car is fully charged in the morning.

“There’s usually plenty of slack time,” he said. “I can’t think of the last time I was waiting for my vehicle to charge.”

Owners in a hurry can simply bypass the grid-friendly feature.

With fewer than 150,000 electric cars in the nation, power-grid operators don’t have much to worry about yet. But plug-in cars are the fastest-growing sector of the automotive industry, according to the advocacy group Plug In America.

The Green Car Report estimates 2013 sales will approach 100,000, nearly double last year’s total.

The Northwest is helping lead the charge, with 5,400 electric cars in Washington and about 3,000 in Oregon. The nation’s highest-selling Leaf dealership is in Bellevue, said Redmond resident Chad Schwitters, an electric-car enthusiast and vice president of Plug In America.

AeroVironment operates what’s called the West Coast Electric Highway — a network of fast-charging stations that will soon number 55, situated along I-5 and other highways in Washington and Oregon.

The Northwest is also ahead of the rest of the nation in experiencing growing pains caused by the proliferation of wind farms. In 2011, the Bonneville Power Administration ordered some turbines to shut down because the region had more power than it knew what to do with.

Now, grid operators usually compensate for swings in power by adjusting the spill at hydropower dams or having power plants increase or decrease their power output, Brooks explained. “Power plants get paid to do that.”

But if large numbers of electric cars were hooked into grid-friendly chargers, they could soak up excess electricity, then power down when the grid is running low, he explained.

There’s no billing arrangement yet that would allow electric-car owners to be compensated for helping even out the grid. But Brooks predicts that could change, which would provide a powerful incentive to switch to grid-friendly charging stations.

“I think it can be a very large market once there’s a recognition that you’re providing a useful service to the grid,” he said.



Could Elon Musk’s Hyperloop Derail High-Speed Rail?

Two years of background work by the Tesla Motors and SpaceX founder reveals an affordable, open-source alternative to California’s proposed high-speed rail.


Getting from California to China could become a trip that takes less than two hours within Elon Musk’s lifetime. On August 12, Musk, founder of SpaceX and co-founder of PayPal and Tesla Motors, revealed his plan for a new transport system that is far cheaper and faster than high-speed rail, outlining tentative details for a possible San Francisco to Los Angeles route that reaches top speeds of 760 mph. Musk has previously described Hyperloop as a cross between a “Concorde, a railgun, and an air hockey table.”

Musk released a 57-page document outlining an alpha-stage plan for Hyperloop. The South African entrepreneur also fielded questions from reporters around the world on his vision. The document contains pictures of what the pneumatic-tube system and vehicles might look like, possible approaches to various engineering problems, an itemized budget for a California route totaling just $6 billion, maps of proposed routes and network expansions, and technical diagrams and explanations of various technologies that would be integrated into the system.

A Hyperloop connecting San Francisco and Los Angeles could be constructed within seven to 10 years for $6 billion, Musk said, adding that California’s proposed $70 billion high-speed rail system is a mistake. Musk’s financial model would price a one-way ticket from San Francisco to Los Angeles at about $20 per person. The trip would last just 35 minutes.

Musk explained that Hyperloop is a “low priority” for him now as he is busy with SpaceX and Tesla Motors, and that is partially why the project relies on an open-source model. The project is also open-source, Musk said, to attract interest and leverage ideas from as many people as possible. During the press conference, he repeatedly emphasized the importance of someone developing a prototype of the technology to work out engineering problems — a task he said he will probably take on himself.

While the released document shares many ideas developed by SpaceX and Tesla Motors engineers, nothing is set in stone yet, Musk said, and that’s why it’s an alpha. Government Technology participated in Musk’s telephone media question and answer session as Musk explained the project as it now stands. Here is an abridged transcript of the call:

What will the experience of being on Hyperloop feel like?

It would actually feel a lot like being in an aeroplane. There would be initial acceleration, and once you’re traveling at speed, you wouldn’t really notice the speed at all. It would be extremely smooth, like you’re riding on a cushion of air. You wouldn’t feel lateral acceleration because the pod would bank against the side of the tube, so the g-force would always point down. The maximum g-force would be around half a G, so that’s comparable to what you would see on an aeroplane and far less than what you would see on, say, a rollercoaster. It should just feel super smooth and quiet. And obviously there’d never be any turbulence or anything.
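Musk's half-g comfort budget pins down how gentle the route's curves must be: lateral acceleration in a banked turn is v²/r, so the cruise speed fixes a minimum curve radius. A rough back-of-the-envelope check (the numbers are approximations, not figures from the Hyperloop Alpha document):

```python
# Rough check of the "half a G" comfort budget at Hyperloop cruise speed.
# Values are approximate and illustrative only.

G = 9.81             # gravitational acceleration, m/s^2
MPH_TO_MS = 0.44704  # miles per hour -> meters per second

v = 760 * MPH_TO_MS  # quoted top speed, roughly 340 m/s
a_max = 0.5 * G      # allowed lateral acceleration, half a g

# Centripetal acceleration a = v^2 / r, so the minimum curve radius is:
r_min_km = v**2 / a_max / 1000
# Roughly 23-24 km: any curve flatter than this keeps the banked pod
# under half a g, which is why the route must be so nearly straight.
```

This is why the proposal hugs long, straight corridors like I-5: at 760 mph even a "gentle" bend needs a radius of tens of kilometers.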

I heard you say this could never crash. How could that be in earthquake country?

Well, obviously “never” is a very strong word. It’s extremely difficult, I suppose. Unlike an aeroplane, it’s not moving in three dimensions. It’s not going to fall out of the sky, nor can it really be derailed as a train can. The thought I had was to fit the pylons upon which the tube is mounted with earthquake dampeners, similar to those you have in buildings in California. They’re basically shock absorbers, with two mounted laterally and one mounted vertically in each post. Now, there will be some earthquakes that are gigantic enough to overcome the dampeners, but we have that same problem in buildings, so if LA falls down, I guess Hyperloop will, too. But relative to, say, a train, it should be quite a bit safer.

What is the likelihood of this actually being built?

I’ve been thinking about that, and I’m somewhat tempted to make at least a demonstration prototype. I’ve come around a bit in my thinking here: create a sub-scale version that’s operating, and then hand it over to somebody else. I think one of the more difficult things is just ironing out the details at a sub-scale level. I think I’ll probably end up doing that. It just won’t be immediate because in the short term I’m focused on SpaceX and Tesla.

If somebody else goes and does a demo, that would be really awesome. And I hope somebody does, but if it doesn’t look like that’s happening or it looks like that’s not happening in the right way, then I would. I don’t really care much one way or the other if I have any economic action here, but it would be cool to see a new form of transport happen.

While planning this project with SpaceX and Tesla engineers, did you talk a lot about power consumption?

Quite a fundamental question is, ‘Can you contain enough energy in a battery pack in a pod to pump the air from front to rear?’ And we can. In fact, if we just use some version of the [Tesla] Model S motor, maybe a few of them in series, and the Model S battery pack, assuming today’s current technology, we can make it work.

[A more technical outline of Hyperloop’s power system can be found in the Hyperloop Alpha pdf.]

There has been talk of using solar power for Hyperloop as sustainability is one of its core features. Is this a viable option?

There’s actually way more surface area on the top of the tube than you really need. If you did actually put solar panels on the whole thing, you would have to dump the power somewhere, because you would have more than you can consume.

Why can Hyperloop be supported by pylons whereas high-speed rail requires a much stronger foundation?

It’s a weight thing. This was designed to be super-light and trains are just amazingly heavy. They don’t try very hard to make trains light. Yeah, [laughing] trains are heavy. This is designed more like an aircraft.

How many people worked on this project, and for how long?

There were probably in total a little over a dozen people working on it, but it was very much a background task. This was not something that was anybody’s full-time job. I started thinking about it maybe two years ago and then started involving others about ten months ago. And we’ve just been batting it around in the background, and in the last basically few weeks we did allocate some full-time days to it.

What do you think of California’s proposed high-speed rail project?

Um, I don’t think we should do the high-speed rail thing, because it’s currently slated to be roughly $70 billion, but if one ratio is the cost at approval time versus the cost at completion time… you know most large projects escalate quickly… I think it’s going to be north of $100 billion. And then it seems it’s going to be less desirable to take that than to take a plane, so California taxpayers aren’t just going to have to pay $100 billion, they’re also going to have to maintain and subsidize the ongoing operation of this train for a super long time as kind of California’s Amtrak. That just doesn’t seem wise for a state that was facing bankruptcy not that long ago.



Microsoft’s Windows 8.1 update to launch in October

Nancy Blair, USA TODAY 1:16 p.m. EDT August 14, 2013


Microsoft today said it will start rolling out its much-anticipated Windows 8.1 update in October.

Windows 8.1 will start rolling out as a free update worldwide on Oct. 17 at 7 a.m. ET (or midnight on Oct. 18 in New Zealand, Microsoft notes in a blog post).

Consumers can get the update through the Windows Store. It will also be available at retailers and on new devices on Oct. 18.

Microsoft has taken some critical knocks for the Windows 8 operating system, which it set out to build to work equally well on traditional computers and touchscreen devices. Sales have been less than stellar for machines running Windows 8, which represented a dramatic overhaul of the Windows OS.

Windows 8.1 has been available as a preview since earlier this summer. Among other things, it brings the ability to boot up your PC in the traditional-looking desktop rather than the colorful tiles that are a hallmark of the new OS.


XP’s retirement will be hacker heaven

Hackers will bank bugs until after Microsoft retires Windows XP in April 2014; expect attacks, say security experts

Gregg Keizer

August 12, 2013 (Computerworld)


Cyber criminals will bank their Windows XP zero-day vulnerabilities until after Microsoft stops patching the aged operating system next April, a security expert argued today.

Jason Fossen, a trainer for SANS since 1998 and an expert on Microsoft security, said it’s simply economics at work.

“The average price on the black market for a Windows XP exploit is $50,000 to $150,000, a relatively low price that reflects Microsoft’s response,” said Fossen. When a new vulnerability — dubbed a “zero-day” — is spotted in the wild, Microsoft investigates, pulls together a patch and releases it to XP users.

If the bug is critical and being widely used by hackers, Microsoft will go “out-of-cycle,” meaning it will issue a security update outside its usual monthly Patch Tuesday schedule.

But after April 8, 2014, Microsoft has said it will retire Windows XP and stop serving security updates. The only exceptions: Companies and other organizations, such as government agencies, that pay exorbitant fees for custom support, which provides critical security updates for an operating system that’s officially been declared dead.

Because Microsoft will stop patching XP, hackers will hold zero-days they uncover between now and April, then sell them to criminals or turn them loose on unprotected PCs themselves after the deadline.

“When someone discovers a very reliable, remotely executable XP vulnerability, and publishes it today, Microsoft will patch it in a few weeks,” said Fossen. “But if they sit on a vulnerability, the price for it could very well double.”

Minus any official patching from Microsoft, XP zero-days and their associated exploits could remain effective for months, maybe even years, depending on how well security software detects and quarantines such attacks.

If Fossen’s thesis is correct, there should be signs of bug banking, most notably a sharp reduction in the number of publicly disclosed or used-in-the-wild XP vulnerabilities during the fourth quarter of 2013 and the first quarter of 2014.

“[Hackers] will be motivated to sit on them,” Fossen stressed.

There really aren’t precedents to back up Fossen’s speculation, he acknowledged, because the last time Microsoft pulled the plug on an edition was July 2010, when it retired Windows 2000. But according to metrics firm Net Applications, at the time Windows 2000 powered just four-tenths of one percent of all PCs.

Windows XP will have a much larger share when it’s retired next year: Based on XP’s current rate of decline, Computerworld has projected that the old OS will still run between 33% and 34% of the world’s personal computers at the end of April 2014.


That would be 80 times the share of Windows 2000 when it retired.
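The "80 times" figure follows directly from the two shares quoted above; a quick arithmetic check, using the midpoint of the projected range:

```python
# Sanity check on the share comparison: XP's projected share at
# retirement vs. Windows 2000's share when it left support.
xp_share = 0.335        # midpoint of the projected 33%-34%
win2000_share = 0.004   # "four-tenths of one percent"

ratio = xp_share / win2000_share   # roughly 84, i.e. "about 80 times"
```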

But even with Windows 2000’s minuscule share when it left support, there were reports that an edition-specific zero-day was created and sold.

“I heard rumors of a new zero-day being found and sold after the support period expired [for Windows 2000],” said HD Moore, creator of the popular Metasploit penetration testing toolkit and the chief security officer of security company Rapid7. “But there were few if any examples that ended up in the public eye.”


Moore agreed with Fossen that XP bugs would be more valuable after April 2014, but contended that all Windows vulnerabilities would jump in value.

“Something more common [three years ago] was backporting new security advisories into functional exploits on Windows 2000,” said Moore in an email. “Every time a server-side vulnerability was found in Windows XP or 2003 Server, quite a few folks looked at whether this would also work against Windows 2000. My guess is that the retirement of Windows XP will result in all Windows vulnerabilities being of slightly higher value, especially given the difference in exploit mitigations between XP and newer platforms.”

It’s far easier to exploit flaws in Windows XP than in newer editions, such as Windows 7 and Windows 8, noted Moore, because of the additional security measures that Microsoft has baked into the newer operating systems.

Microsoft has said the same. In the second half of 2012, XP’s infection rate was 11.3 machines per 1,000 scanned by the company’s security software, more than double the 4.5 per 1,000 for Windows 7 SP1 32-bit and triple the 3.3 per 1,000 for Windows 7 SP1 64-bit.

“Windows XP vulnerabilities will be valuable as long as enterprises utilize that version of the operating system,” said Brian Gorenc, manager of HP Security Research’s Zero Day Initiative, the preeminent bug bounty program. But Gorenc also argued that any XP zero-days would be outweighed by higher-priority hacker work.

“Researchers are primarily focused on the critical applications being deployed on top of the operating system,” said Gorenc in an email reply to questions today. “Attackers and exploit kit authors seem to rely on the fact that the update process and tempo for applications are not as well defined as those for operating systems.”

Fossen, convinced that XP would be a big fat target after April 8, wondered whether Microsoft might find itself in a tough spot, and back away from the line in the sand it’s drawn for XP’s retirement.

“If hackers sit on zero-days, then after April use several of them in a short time, that could create a pain threshold [so severe] that people organize and demand patches,” said Fossen.

The consensus among analysts and security experts is that Microsoft will not back down from its decision to retire XP, come hell or high water, because it would not only set an unwelcome precedent but also remove any leverage the company and its partners have in convincing laggards to upgrade to a newer edition of Windows.


But a few have held out hope.

“Suppose we get to a date post the end of Extended support, and a security problem with XP suddenly causes massive problems on the Internet, such as a massive [denial-of-service] problem?” asked Michael Cherry, an analyst with Directions on Microsoft, in an interview last December. “It is not just harming Windows XP users, it is bringing the entire Internet to its knees. At this time, there are still significant numbers of Windows XP in use, and the problem is definitely due to a problem in Windows XP. In this scenario, I believe Microsoft would have to do the right thing and issue a fix.”

Jason Miller, manager of research and development at VMware, had some of the same thoughts at the time. “What if XP turns out to be a huge virus hotbed after support ends? It would be a major blow to Microsoft’s security image,” Miller said.


Another option for Microsoft, said Fossen, would be to take advantage of a post-retirement disaster to do what it’s been doing for years: push customers to upgrade.

“They might also respond with a temporary deal on an upgrade to Windows 8,” said Fossen, by discounting the current $120 price for Windows 8 or the $200 for Windows 8 Pro. “Then they could say, ‘We’re aware of these vulnerabilities, but you should upgrade.'”


China’s Emerging C4ISR Revolution

August 13, 2013

By Shane Bilsborough, ATIC


China’s military modernization has given rise to an enormous Western literature dissecting its scope and progress. Despite this boom, many analysts have paid relatively little attention to recent advances in the People’s Liberation Army’s (PLA) command, control, communication, computer, intelligence, surveillance, and reconnaissance (C4ISR) capabilities.

The PLA’s growing complement of manned and unmanned aircraft, reconnaissance satellites, and sophisticated ground-based infrastructure comprises the operational foundation of China’s emerging network-centric military. It is also the means by which better-known systems, such as the DF-21D “carrier-killer” anti-ship ballistic missile or the J-20 stealth fighter, could actually fulfill their intended roles during a major regional contingency.

From recent developments in China’s C4ISR infrastructure, it is clear that the PLA is well on its way to becoming a sophisticated global military possessing many of the same C4ISR capabilities enjoyed by U.S. forces, although it remains to be seen whether organizational barriers will short-circuit this trend.


Airborne C4ISR

Much if not most Chinese thinking on C4ISR and military modernization stems from analysis of the United States’ military performance in recent conflicts. For example, learning from the United States’ successful employment of specialized flying C4ISR systems, such as the E-3 Sentry and the E-8 Joint STARS, the PLA has identified Airborne Early Warning Command and Control (AEWC&C) aircraft as central to waging war against intervening naval and air forces. According to multiple Chinese analyses, a single airborne AEWC&C aircraft is the operational equivalent of roughly ten ground-based systems of comparable sophistication. In addition to facilitating real-time intelligence gathering, border surveillance, and command and control, these systems are expected to make PLAAF and PLAN fighter aircraft less susceptible to detection by affording them enhanced situational awareness without using their own radar systems. Historically, this capability has afforded the U.S. Air Force significant advantages in beyond-visual-range engagements that may now be lost.

In keeping with the Chinese analyses of their significance, the PLAAF is already fielding advanced systems of this type. The PLAAF’s current top-of-the-line AEWC&C system, the KJ-2000, is believed to be one full generation ahead of U.S. E-3 AWACS and E-2 Hawkeye aircraft. Among other advancements, the KJ-2000 boasts an indigenously produced phased array radar capable of tracking sixty to one hundred aerial targets simultaneously at a distance of up to four hundred and seventy kilometers. Although somewhat less technologically sophisticated, the PLAN’s Y-8J AEW system affords China’s naval air forces a similar upgrade in situational awareness and is reportedly capable of detecting objects as miniscule as a submarine periscope within its effective range of up to one hundred eighty-five kilometers.

The United States’ unmanned C4ISR capabilities are also being replicated by the PLA. While information beyond mock-ups displayed at China’s annual Zhuhai airshow is sparse, recent disclosures by Chinese official sources suggest unmanned aerial vehicles (UAVs) will play a major role in China’s emerging C4ISR architecture. According to a PLA statement posted online in July 2011, a ground operator controlled a UAV called the Silver Eagle that participated in South China Sea naval exercises. The UAV reportedly disrupted communications and responded to red team countermeasures while acting as a node for a PLA communications network.

Other modern Chinese UAVs, such as the Guizhou Aircraft Industry Corporation’s Xianlong long-range UAV and Beijing University of Aeronautics and Astronautics’ BZK-005 UAV, are believed to be capable of loitering over a combat zone for roughly forty hours, much like the U.S. Global Hawk. The Chengdu Aircraft Design Institute also appears to be developing its own indigenous Global Hawk, the Long Haul Eagle, which was first revealed in 2008. These systems will greatly enhance the PLA’s Broad Area Maritime Surveillance (BAMS) while adding new capabilities.


Space-based C4ISR

China has made still greater strides in its space program and is emerging as a leading space power. Senior PLA and Chinese Communist Party (CCP) leaders have identified space technology as a national priority and allocated significant resources to improving China’s space-related research, development, and launch infrastructure. As part of the PLA’s integrated civil-military space program, counter-space technologies and systems have been a parallel area of focus following China’s landmark 2007 anti-satellite test.

Recent years have seen a number of major advancements in China’s C4ISR related space development programs. The Beidou-2 satellite series, China’s indigenous GPS alternative, has already achieved full regional coverage and is on schedule to achieve global coverage by 2020. With at least 13 successful launches since April 2006, the Yaogan series of electro-optical, synthetic aperture radar, and electronic intelligence satellites have also proven a major success. Apart from these and other satellite programs, such as the somewhat more mysterious Shijian series, China has also successfully modernized and expanded its space launch infrastructure. Under the executive authority of its General Armaments Department, the PLA and its civilian partners now operate three satellite launch centers at Taiyuan, Xichang, and Jiuquan, with a fourth large complex under construction at Wenchang on Hainan island.

China’s great leap forward in space and airborne C4ISR capabilities has already impacted the Asia-Pacific military balance. If current trends in technological development, procurement, and satellite launch capacity hold, the next 15-20 years will see the PLA benefit from vastly improved geolocation and precision strike capabilities, persistent global satellite surveillance, and a survivable military communications and data-link architecture. Concurrent improvements in counter-space capabilities will also put U.S. and allied space, air, and sea-based assets at risk, seriously complicating air and naval access to the region.


Organizational Issues

Despite its ongoing technological transformation, the PLA still faces serious obstacles before it can take full advantage of its modern C4ISR systems and capabilities. Of these challenges, cultural and organizational problems have proven particularly stubborn.


Many PLA units have proven reluctant to adopt cutting-edge communication and ISR systems due to endemic interoperability problems and lack of experience with modern military technology. Communication and information sharing problems continue to arise in part due to a growing technological mismatch between mainline PLA units, which still employ outdated equipment, and their far less numerous but more sophisticated counterparts. On a broader level, the PLA’s constituent services, and even operational units within the same service, use different and incompatible models and generations of equipment that severely diminish their overall military effectiveness.

In part due to deep-seated inter-service rivalry, PLA joint training still leaves much to be desired. Often, exercises are only joint for certain segments rather than their entirety. Worse, those joint training efforts that do take place are often rudimentary or unrealistic. Consider that one Chinese article praises a joint exercise in which Navy units practiced ship loading and unloading while ground forces practiced loading aircraft onto railcars for the PLAAF. Another document touts the “jointness” of an exercise in which top service leaders communicated via teleconferencing. In addition, as evidenced in these and other exercise reports, no standard metric exists for evaluating joint performance either in C4ISR or other military spheres.



Without breaking down the technological and organizational barriers between its constituent services, the PLA will not be able to implement the “system-of-systems” approach to anti-access operations and C4ISR that its leadership envisions. The PLA has taken some halting actions towards promoting joint operations and information sharing, such as the relatively decisive step of appointing Xu Qiliang, a PLAAF general, to command the deployed force during joint exercises in 2007. This was possibly the first time a PLAAF general was given such a high-profile command, although it hardly constitutes a breakthrough.

Assuming its Central Military Commission successfully limits the tendency of the ground forces to assert control over military doctrine and planning during the next ten to fifteen years, China’s leadership will find that the PLA already possesses most if not all of the C4ISR systems and integrative technologies necessary to complete the PLA’s transformation into a 21st century force.


Shane Bilsborough is an intelligence analyst at the Advanced Technical Intelligence Center (ATIC).



Springfield UAS center names director

Ohio and Indiana efforts will be coordinated by former Dayton Development Coalition official.

Dayton Daily News

Posted: 4:30 p.m. Wednesday, Aug. 14, 2013

By Andrew McGinn

Staff Writer



The new Springfield-based office that will act as the official hub of unmanned aerial systems testing in Ohio and Indiana has named its first director.


Dick Honneywell, a retired Air Force Reserve colonel, will lead the Ohio/Indiana UAS Center and Test Complex at Springfield’s Nextedge Applied Research and Technology Park, Ohio Gov. John Kasich announced this week.

Honneywell, who most recently served as vice president of aerospace at the Dayton Development Coalition, will oversee the center that’s vying to become one of six national test centers for unmanned aircraft that the Federal Aviation Administration is expected to name later this year.

The FAA will use those six centers to study how best to integrate so-called drones into manned airspace.

An industry report earlier this year predicted that Ohio stands to gain more than 2,700 new jobs by 2025 in the blossoming UAS industry. Drones also are expected to one day play a big role in precision agriculture.

The Ohio Department of Transportation on July 1 began leasing 2,060 square feet of office space for the center at Nextedge, located along U.S. 40 just east of Springfield. The two-year lease will cost a total of $70,000.

It’s hoped the center led by Honneywell will act as a magnet for new federal, military and commercial research and testing of UAS.

“Dick brings a tremendous combination of technical and management expertise, as well as a broad knowledge of aerospace systems from his work in the Air Force and private sector,” Kasich said in a statement.

Ohio and Indiana have pooled their efforts to become an FAA test site, with Ohio offering such research powerhouses as the Air Force Research Laboratory at Wright-Patterson Air Force Base and NASA’s Glenn Research Center in Cleveland.

Indiana, on the other hand, has valuable airspace, including the 55,265-acre Jefferson Proving Ground in the southeast part of the state, where from 1941 to 1995, the Army tested 25 million rounds of munitions.

Springfield was deemed a central location to manage the combined effort.


DHS Awards $6 Billion Cybersecurity Contract To 17 Vendors

By: Judi Hasson

08/14/2013 ( 8:00am)


The Department of Homeland Security on Monday awarded a $6 billion contract to 17 companies to protect the government against cybersecurity threats.

The contractors included big names such as IBM, Lockheed Martin Corporation, General Dynamics Corporation, Computer Sciences Corporation and Northrop Grumman Corporation, among others. DHS tapped the companies for the Continuous Diagnostics and Mitigation (CDM) Program.

The CDM Program will provide specialized information technology tools and services to combat cyber threats in the civilian “.gov” networks and various network endpoints, including mobile devices. The CDM program, also known as Continuous Monitoring as a Service (CMaaS), shifts the government’s cybersecurity posture from a compliance reporting model to a real-time approach to combating threats, according to the General Services Administration (GSA).

Deputy Under Secretary for the National Protection and Programs Directorate (NPPD) Suzanne Spaulding said in a statement released Aug. 13 that federal departments and agencies will be able to enhance their cybersecurity assessments by implementing automated network sensor capacity and prioritizing risk alerts.

“Results will feed into agency-level dashboards that produce customized reports that alert information technology managers to the most critical cyber risks, enabling them to readily identify which network security issues to address first, thus enhancing the overall security posture of agency networks,” Spaulding said. “Summary information from participating agencies will feed into a central Federal-level dashboard, managed by DHS’ National Cybersecurity Communication and Integration Center, to inform and prioritize cyber risk assessments across the Federal enterprise and support common operational pictures that provide cybersecurity situational awareness to our stakeholders,” she said.

“This significant contract award is designed to support federal civilian networks and the extensive number of cybersecurity requirements for any federal custom and cloud application over the life of the contract, and will be funded through each participating department and agency, not solely by DHS,” Spaulding said.

“The CDM program is a huge step forward for government security,” said Kenneth Kartsen, vice president and head of federal business at McAfee. “The necessary but limited and largely manual check-the-box approach of FISMA (Federal Information Security Management Act) was like looking through a rear-view mirror. By contrast, the CDM program illustrates the real progress DHS and the government are making in cyber security.”

DHS CDM Program Phases

The first phase of the $6 billion program focuses on four functional capabilities: management of hardware assets, management of software assets, configuration management and vulnerability management, “which are baseline capabilities to protect data,” according to a description of the program by the US Computer Emergency Readiness Team (US-CERT).

So far, DHS has committed $185 million to kick-start the program. GSA will manage the contract on behalf of DHS and will charge agencies a 2 percent fee for using the blanket purchase agreement.

“Our success with DHS demonstrates that IBM’s $4B annual investment in cyber security and security analytics research puts IBM in a unique position to help government agencies meet evolving cybersecurity threats,” said Anne Altman, General Manager of IBM’s US Federal business.

“IBM will draw from decades of experience working with federal agencies and worldwide clients and our own internal experience in securing the worldwide networks used by our 400,000 plus employees.”

A spokesman for Lockheed Martin said the company will draw upon its years of investment in continuous monitoring technology and partnerships with members of the Lockheed Martin Cyber Security Alliance for this service.

“Beginning in 2009, Lockheed Martin leveraged advances within its NexGen Cyber Innovation and Technology Center Lab and the Lockheed Martin Cyber Security Alliance to implement a comprehensive solution,” the company said in a statement.


The winning companies are:

Booz Allen Hamilton

Lockheed Martin

Northrop Grumman

Why Feds Are Embracing Agile

Roger Baker | August 14, 2013 09:06 AM

Numerous federal agencies are moving to Agile software development methods for some or all of their IT systems development projects. In an era of tightening federal budgets and increased demand on technology to help meet mission requirements, agencies are searching for ways to deliver critical mission functionality more quickly and with less risk. For a number of agencies, Agile has become the answer.

On its face, the case for Agile is straightforward: Break the software development process into a series of short “sprints,” each of which delivers on a small portion of the requirements of a system. This modular approach enables (and encourages) frequent delivery of new functionality to end users, and facilitates (even demands) user participation and feedback during system creation. In contrast, the “Waterfall” development approach used traditionally within government requires users to be able to fully describe what they want in a system up front and to wait years until the system is finished.
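The contrast drawn above can be made concrete with a toy model. This is not any agency's actual process, just a sketch of the delivery cadence: Agile hands users a growing slice of the system after every sprint, while Waterfall delivers everything only at the end.

```python
# Illustrative sketch (not any agency's real methodology) of the cadence
# the article describes: Agile releases cumulative functionality each
# sprint; Waterfall releases the whole system once, at the end.

def agile_releases(requirements, sprint_size):
    """Yield the cumulative functionality available after each sprint."""
    delivered = []
    for i in range(0, len(requirements), sprint_size):
        delivered.extend(requirements[i:i + sprint_size])
        yield list(delivered)

def waterfall_release(requirements):
    """Everything arrives at once, after the full development cycle."""
    return list(requirements)

reqs = ["login", "claims intake", "payments", "reporting"]
agile = list(agile_releases(reqs, sprint_size=2))
# After sprint 1, users already have working features to critique;
# under Waterfall they would still be waiting for all four.
```

The early, partial releases are what enable the user feedback and course corrections the article credits Agile with.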

Agencies typically adopt Agile to avoid large-scale failures in systems development programs. The Department of Veterans Affairs (VA), an early adopter of Agile in the federal government, moved to Agile in 2009 for a critical new system (the New GI Bill) when the department was failing on much of the rest of its development portfolio. As a result, VA successfully delivered its first new large-scale system in years, and decided to adopt Agile for the development of a number of other critical systems.

Agencies are also moving to Agile to better ensure that the system being developed actually meets the needs of the mission. Programs using Agile development provide customers with early production versions of the product to use and critique, ensuring customer involvement and buy-in. More importantly, because change happens, Agile’s frequent releases provide the ability to rapidly respond to changing mission priorities, customer preferences, or even requirements imposed by new laws.

Critical to today’s federal environment, Agile also cuts system development costs. Frankly, this can be the hardest to justify. The initial estimates for the cost to develop a system using either Waterfall or Agile are likely to be the same. Logically, if both processes work as well in practice as they do in theory, either process should result in the same system for much the same price. In reality, metrics show that incremental programs (including Agile) successfully meet their delivery commitments at a rate nearly three times that of Waterfall. In my experience, this equated to on-time delivery jumping from under 30% to over 80% for a $1 billion systems development portfolio.

Using Agile for systems development frequently has an immediate positive impact on mission results. By delivering and then improving production versions of a system early in the development cycle, Agile programs allow the agency to begin realizing the benefits of the new system to their missions much earlier. And with system users intimately and continually involved in its design and development, the end solution better addresses their real-world requirements, allowing them to work more productively.

Finally, using Agile can help improve the position of the CIO and the IT organization in the agency. With daily active engagement between users and IT, and frequent on-time delivery of new, mission-prioritized system functionality, customers start to see IT as a full, essential and productive partner in accomplishing the agency’s mission. And that has substantial implications during the budget process, during resource discussions, and on the agency’s willingness to give more authorities to the CIO.

After all, IT is an investment in improved mission effectiveness. If agency leaders see that investment returning frequent, reliable, positive results, they’re going to look for more ways to invest.

Roger Baker is chief strategy officer for Agilex, a leading provider of mission and technology solutions to the federal government. He was previously CIO for the Department of Veterans Affairs from 2009-13 and served as CIO for the Department of Commerce from 1998-2001.


What the NSA’s Massive Org Chart (Probably) Looks Like

By Marc Ambinder

August 14, 2013

Want to understand how an organism really works?  Take a look at its plumbing. Figure out where the pipes fit together. That’s the approach I take to national security and that’s the spirit behind this look at the structure of one of the most important institutions in U.S. intelligence: the National Security Agency.

Some intelligence organizations, such as the National Reconnaissance Office and the National Geospatial Intelligence Agency, have declassified most of their organizational charts. The NRO develops, launches and controls spy satellites; the NGA analyzes and distributes imagery. For these agencies, the plumbing matters less than what flows through the pipes, which is highly classified.

But the NSA, with its triple mission — break codes, secure data, collect signals intelligence — has not made its structure public.  Even by the standards of U.S. intelligence agencies whose existence was declassified much later, the NSA’s organization chart is largely impermeable to outsiders. The best of its chroniclers, like Jeff Richelson, James Bamford, Bill Arkin and Matthew Aid, have managed to collect bits and pieces of open source data, but many senior intelligence officials who don’t work for NSA still have only a vague idea of what signals intelligence collection entails, and even fewer understand the NSA bureaucracy. The map to the NSA’s inner sanctum is generally given only to a select few members of Congress and their staff.

In the interests of transparency and in an effort to establish a basis for continued public exploration of the world of intelligence, I’ve cobbled together a rough and incomplete but still rather comprehensive organizational chart of the agency’s operational, analytical, research and technology directorates. With only a few exceptions, the information does not come from inside sources. It builds on the work of the researchers mentioned above and represents the culmination of a lot of time spent cross-checking government documents, LinkedIn profiles, job postings and agency announcements.

The NSA prefers not to discuss how it works and what it does. Defense One offered NSA the opportunity to review the organization chart and address any national security concerns. “We won’t fact check a chart that appears to be based largely on speculation,” the agency replied through a spokesperson.

Undoubtedly, some of what follows is incomplete and I hope knowledgeable readers will set me straight. 

The NSA has five operational directorates, several administrative directorates and three large operational centers. Each is headed by an associate director, and each associate director has a technical director. They report to the executive director, who reports to the deputy director, who reports to the DIRNSA, which is NSA-speak for Director of the NSA, Gen. Keith Alexander. He’s also the commander of the Defense Department’s U.S. Cyber Command and chief of the Central Security Service, the military signals and cyber intelligence units that contribute personnel to the NSA. The CSS is essentially the NSA.

The NSA’s Foreign Affairs Directorate interacts with foreign intelligence services, counterintelligence centers and the UK/USA and FIVE EYES exchanges, an alliance of intelligence operations between the U.S., UK, Canada, Australia and New Zealand that dates back to 1946. It also includes the Office of Export Control Policy. 

The Information Assurance Directorate is the center of NSA’s cyber warfare and defense program offices.  It’s also responsible for generating the codes that the U.S. uses.

The Signals Intelligence Directorate is the largest functional directorate. It has three subdivisions. One helps determine the requirements of what the NSA calls its customers — other agencies, the president, the military.  The agency’s main analytical centers live here, too.  The super-secret work of SIGINT collecting and offensive cyber warfare is the responsibility of S3, with its many bland sounding and compartmentalized branches.

The Research Directorate figures out how to break codes and how to best penetrate the telecom infrastructure of tomorrow. The Technical Directorate puts everything together. It’s responsible for the infrastructure for everything NSA does.

Two other directorates are responsible for training and human resources and for acquisition and procurement.

The NSA’s three operational centers are its main watch facility, the National Security Operations Center, or NSOC; the National Threat Operations Center, which is the U.S. government’s primary worldwide cybersecurity warning office; and the NSA/CSS Commercial Solutions center, which interacts with private companies, uses commercial technologies for classified purposes and conducts research on cryptography that the government is willing to share with the public.

Another NSA office is the Special Collection Service, which is run jointly with the CIA and operates classified listening posts from embassies and other special facilities worldwide. The SCS is responsible for NSA listening posts that aren’t inside of U.S. or allied military facilities.

Inside the United States, the NSA has very large off-site campuses in Hawaii, Texas, Utah and Georgia. In Maryland, it owns and hosts offices in Linthicum, Finksburg, Bowie and College Park, alongside Ft. Meade, its home, and adjacent properties. There’s an NSA office inside the Cheyenne Mountain Operations Center, where NORAD and NORTHCOM have their backup command center. And NSA has a big presence at Site R, the site of the Alternate National Military Command Center, near Ft. Ritchie, Md.

[Related: The NSA’s New Spy Facilities are 7 Times Bigger Than the Pentagon]

All these sites are connected by an architecture called NSANet, which exists in parallel to the regular telephone switch system. Real-time feeds of SIGINT reports and time-sensitive cyber information can be sent to users anywhere in the world, such as those on Navy ships, using the NSA’s Integrated Broadcast Service.

The NSA uses a bewildering array of technical tools and databases. You’ve now heard of PRISM, which was revealed in the Edward Snowden leaks. This system collects digital network information from U.S. content providers.

Here are the other main NSA collection tools and databases:

ONEROOF: Main tactical SIGINT database  (Afghanistan), consisting of raw and unfiltered intercepts

NUCLEON:  Global telephone content database

XKEYSCORE: Collection tool for international metadata

AIRGAP: Priority missions tool used to determine SIGINT gaps

HOMEBASE: Tactical tasking tool for digital network identification

SNORT: Repository of computer network attack techniques/coding

WIRESHARK: Repository of malicious network signatures

TRAFFICTHIEF: Raw SIGINT viewer for data analysis

BANYAN: NSA tactical geospatial correlation database

OILSTOCK: Air Force/Navy tool to track ships in real time

MAINWAY: Telephony metadata collection database

ASSOCIATION: Tactical SIGINT social network database

MESSIAH/WHAMI: Electronic intelligence processing and analytical database

MARINA: Internet metadata collection database

PINWALE: Internet data content database

SURREY:  Main NSA requirements database, where targets and selectors are “validated” by NSA managers

PROTON: SIGINT database for time-sensitive targets/counterintelligence

OCTAVE/CONTRAOCTAVE: Collection mission tasking tool

WRANGLER: Electronic intelligence intercept raw database

ANCHORY: Main repository of finished NSA SIGINT reports going back three years

AQUADOR: Merchant ship tracking tool

So how do you get NSA to spy on someone? You send in an IN. An IN is an Information Need. The INs go into the collection requirements databases, like SURREY, and are evaluated. Are they time-sensitive? Are they critical, meaning intel is needed within three days? Do they fit in with the National Intelligence Priority Framework, which lays out in broad terms the targets the NSA is working on in any given year?
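The three triage questions above can be expressed as a simple decision function. This is purely a hypothetical illustration: the field names and scoring are invented, and the actual SURREY validation process is classified.

```python
# Hypothetical illustration of the triage questions described above for an
# "Information Need" (IN). The logic and names are invented for this
# sketch; the real validation criteria are not public.

def triage_in(time_sensitive, days_needed, fits_priority_framework):
    """Return a coarse priority bucket for an incoming IN."""
    if not fits_priority_framework:
        return "reject"      # outside the year's stated collection priorities
    if time_sensitive or days_needed <= 3:
        return "critical"    # the article's bar: intel needed within three days
    return "routine"
```

For example, a request that fits the National Intelligence Priority Framework and needs an answer in two days would land in the "critical" bucket, while one outside the framework would be rejected regardless of urgency.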

To invert a phrase from biology, in the intelligence community, function follows form. To begin to understand the NSA from the outside, you need to understand what it looks like from the inside. 


By Marc Ambinder // Marc Ambinder is a senior Defense One contributor. A Los Angeles-based writer who covers national security, Ambinder is also a contributing editor at The Atlantic and GQ, and writes The Compass blog for The Week. He is the author of “Deep State: Inside the Government Secrecy Industry,” and is working on a history of Cold War nuclear strategy.

August 14, 2013

The Atlantic

The Government Now Admits There’s an ‘Area 51’



Newly declassified documents, obtained by George Washington University’s National Security Archive, appear to acknowledge the existence of Area 51 for the first time. Hundreds of pages describe the genesis of the Nevada site that was home to the government’s spy plane program for decades. The documents do not, however, mention aliens.

The project started humbly. In the pre-drone era about a decade after the end of World War II, President Eisenhower signed off on a project aimed at building a high-altitude, long-range, manned aircraft that could photograph remote targets. Working together, the Air Force and Lockheed developed a craft that could hold the high-resolution cameras required for the images, a craft that became the U-2. Why “U-2”?

They decided that they could not call the project aircraft a bomber, fighter, or transport plane, and they did not want anyone to know that the new plane was for reconnaissance, so [Air Force officers] Geary and Culbertson decided that it should come under the utility aircraft category. At the time, there were only two utility aircraft on the books, a U-1 and a U-3, so the Lockheed CL-282 was going to be known officially as the U-2.

The next step was to find a place from which the top-secret aircraft could be flown.

On 12 April 1955 [CIA officer] Richard Bissell and Col. Osmund Ritland (the senior Air Force officer on the project staff) flew over Nevada with [Lockheed’s] Kelly Johnson in a small Beechcraft plane piloted by Lockheed’s chief test pilot, Tony LeVier. They spotted what appeared to be an airstrip by a salt flat known as Groom Lake, near the northeast corner of the Atomic Energy Commission’s (AEC) Nevada Proving Ground. After debating about landing on the old airstrip, LeVier set the plane down on the lakebed, and all four walked over to examine the strip. The facility had been used during World War II as an aerial gunnery range for Army Air Corps pilots. From the air the strip appeared to be paved, but on closer inspection it turned out to have originally been fashioned from compacted earth that had turned into ankle-deep dust after more than a decade of disuse. If LeVier had attempted to land on the airstrip, the plane would probably have nosed over when the wheels sank into the loose soil, killing or injuring all of the key figures in the U-2 project.

That’s the first acknowledged mention of the Groom Lake site, according to Chris Pocock, a British author who has written extensively about the program and provided his thoughts to the GWU archive. Nor, it seems, had the low-contrast image that accompanied that section been published before.


Rasmussen Reports

What They Told Us: Reviewing Last Week’s Key Polls

Saturday, August 17, 2013

President Obama may be on vacation this week, but he and his administration still made plenty of news.

Just before the president left for vacation, he announced at a press conference tighter restrictions on the National Security Agency’s domestic surveillance program. But most voters still don’t trust the government to protect their constitutional rights, and very few expect the program to cut back on monitoring the phone calls of innocent Americans.

While voters’ views of the president’s leadership have returned from post-election highs to levels seen for much of his first term in office, positive ratings for his handling of national security are at their lowest level in over three years of weekly tracking. 

More voters than ever believe that the United States is not spending enough on the military and national security, but they are also less aware of how much the country actually spends in this area. 

On Tuesday, Obama addressed the public about the crisis in Egypt.  A month ago, 73% said the United States should leave the situation alone. Rasmussen Reports will release new data on the public’s reaction to the situation early next week.

U.S. Attorney General Eric Holder announced a plan to scale back the number of strict minimum prison sentences for non-violent low-level drug offenders, and just over half of Americans are on board with this proposal. One reason for this support is the fact that 55% believe there are too many Americans in prison today.

Americans are still divided on whether or not to legalize marijuana, but most think it should be up to the states to decide, not the federal government.

The federal Department of Housing and Urban Development has announced plans to make the nation’s neighborhoods more diverse, but Americans strongly oppose this idea.  A plurality (49%) believes it is important for most neighborhoods in America to be racially or ethnically diverse, but just 15% think most neighborhoods in this country that are not racially or ethnically diverse are that way primarily because of racism.

Some members of Congress spend their time home during the August recess reconnecting with their constituents, but more voters than ever now believe a group of people randomly selected from the phone book could do a better job addressing the nation’s problems than the current Congress. 

But Republicans are more dissatisfied with their party leaders in Congress than Democrats are.  Thirty-eight percent (38%) of Republican voters think the average Republican in Congress shares their views, while 45% of Democrats believe the average congressional Democrat is about the same as they are.

Republicans and Democrats run even on the Generic Congressional Ballot for the week ending August 11.  Belief that U.S. elections are fair continues to fall and has now reached its lowest level in nine years of surveys.

Meanwhile, consumer and investor confidence remain near their highest levels in several years. 

But just 29% of voters say the country is heading in the right direction.

In other news last week:

– Voters are closely divided over the Justice Department’s decision to contest the merger of US Airways and American Airlines, but nearly half predict airline fares will go up if the merger goes through.

– Forty-one percent (41%) of voters share an at least somewhat favorable view of the health care law, while 53% view it unfavorably.  But 53% also say the law has had no impact on their lives. 

– Voters remain concerned about global warming, but they still express more urgency about the economy and don’t feel selfish for doing so.

– Sixty-eight percent (68%) of voters believe that policies that encourage economic growth are Very Important, compared to 49% who feel that way about policies that encourage economic fairness.  

– Americans are becoming even less enthused about the Internet’s influence on American culture, politics and journalism.

– Some professional athletes just can’t seem to stay out of trouble, and most Americans are noticing.

– Most Americans (81%) believe that their fellow Americans pay too much attention to celebrity news, and 86% say the media covers those celebrities too much.

– Eighty-one percent (81%) of Americans believe that Amber Alerts are at least somewhat effective in abduction cases.

August 3 2013




White House Employees’ Personal Email Hacked

By Aliya Sternstein

July 29, 2013 7:38 AM ET


Three White House staffers have had their personal Gmail accounts breached in what appears to be a malicious operation directed at the team responsible for the Obama administration’s social media outreach, according to individuals familiar with the incident. 

The penetrated accounts have been sending other White House digital media employees bogus emails containing fraudulent links that can extract their personal email logins and Twitter credentials. More than a dozen current and former staffers were targeted, the individuals said. The scheme was ongoing as of Sunday night. 

The goal of the intruders might be to glean sensitive government information, some cyber researchers said. White House personnel are prohibited by law from using personal Webmail accounts for business communications, but not all employees comply with the rules. The Twitter scam could be aimed at spreading misinformation to citizens through seemingly official channels.

The “phishing” links — labeled to look like legitimate BBC or CNN articles — direct users to an authentic-looking Gmail or Twitter login screen to access the news content. At this point, the users have unwittingly been rerouted to fake login forms that enable hackers to capture their sign-on information. 
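The mismatch these links exploit, visible text that claims one site while the href points somewhere else, is also the basis of a common defensive check. The sketch below is a simple heuristic of this kind (the domains are made-up examples), not a description of any White House tooling.

```python
from urllib.parse import urlparse

# Defensive sketch: flag a link whose displayed URL and actual href point
# at different hosts, the mismatch the phishing emails above rely on.
# The example domains are invented; this is an imperfect heuristic, since
# legitimate redirect and link-shortening services also trip it.

def looks_like_phish(display_url, actual_href):
    """Return True when the displayed host differs from the real target."""
    shown = (urlparse(display_url).hostname or "").lower()
    real = (urlparse(actual_href).hostname or "").lower()
    return shown != real
```

A mail client applying this check would flag a link rendered as a BBC article but pointing at a look-alike login page, while leaving a self-consistent link alone.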

White House social media employees might be relatively easy game within the administration, since their role is to make the executive branch more open to the public. “I imagine that the names and email addresses of people at the White House in digital media or anything related to media are easy to find since their job involves public access. A list of targets would be created from open sources and that’s who the phishing email would be delivered to,” said Jeffrey Carr, a cybersecurity analyst with consultancy Taia Global.

The objective for harvesting Gmail account information might be to capture administration-related email messages and contacts, he speculated.

The Presidential Records Act bars work communication outside of official email accounts. However, a 2012 House committee report showed that former White House Deputy Chief of Staff Jim Messina used his personal email account to conduct official business involving a deal between the pharmaceutical industry and the Senate Finance Committee. And in 2010, the Washington Post reported that administration officials reprimanded then White House Deputy Chief Technology Officer Andrew McLaughlin, a former Google official, after document requests revealed technology policy-related emails from Google employees in his personal Gmail account.

The purpose of assembling Twitter sign-on information might be to disseminate disruptive messages, Carr postulated. This spring, a hacked Associated Press Twitter account informed the public that explosions at the White House had harmed the president. The Dow tumbled in response. 

Sources familiar with the Gmail hack say the ploy is unique in the White House. In the past, one or two staffers who used two-step authentication to protect their Gmail accounts would receive text messages, indicating someone had entered the correct password to trigger the text authentication code. 



Air Force Asks Students to Solve Real-World Problems

NY Times


Published: July 28, 2013


THE Air Force, as part of its recruitment efforts, is approaching young people for help in solving real-world technological problems using a collaborative online platform.

The initiative, which will be introduced on Thursday, will create a digital program called the Air Force Collaboratory, in which young people will be challenged to develop technologies for search-and-rescue operations in collapsed structures; to create software code for a quadrotor, a type of unmanned aerial vehicle; and to determine where to place the newest GPS satellite.

The Air Force hopes the program will attract students in so-called STEM subjects — science, technology, engineering and mathematics — to work with its airmen on developing solutions for the three challenges, and, ideally, consider enlisting.

The initiative — which the Air Force will promote through digital advertising, social media and partnerships with groups like Discovery Education — is the latest recruiting effort created for the Air Force by GSD&M, an agency based in Austin, Tex., that is part of the Omnicom Group.

GSD&M has been the Air Force’s agency since 2001, developing campaigns to help it attract the over 28,000 recruits it needs annually; the agency said its work had helped the Air Force meet its recruiting goals each year.

GSD&M’s recruiting strategy for the Air Force — which has always sought tech-savvy candidates — previously featured an “Airman Challenge” online video game. A separate campaign included television spots whose theme was, “It’s not science fiction.”

Col. Marcus Johnson, chief of the strategic marketing division of the Air Force Recruiting Service, said the Air Force focused on “going after the best and brightest young men and women, with an emphasis on the STEM subjects. Whether they’re in high school or college, those topics translate into what we do in the Air Force.”

He said the collaboratory program was meant to appeal to men and women ages 16 to 24, including high school students still determining their future plans.

Ryan Carroll, a creative director at GSD&M, said the Air Force was “very much like the Apples and Googles of the world in recognizing the huge need for scientists and engineers. They reach out to kids at an early age and show them the amazing things they can do with science and technology.” He pointed to initiatives like the Google Science Fair, an online, annual, global science competition for teenagers, as an example.

Similarly, the collaboratory program aims to “inspire the next generation of scientists, engineers, technologists and mathematicians, and to show them all the amazing, science-related things the Air Force does,” Mr. Carroll said. The program will also allow students to “participate and solve real problems the Air Force solves every day,” he added.

Young people will be able to learn more about the initiative’s challenges at the Web site, which will act as a forum. Challenge participants will be able to use custom-built tools to share ideas and work with airmen and other experts to develop solutions.

Not surprisingly, digital media will primarily be used to promote the program. Custom editorial content is being developed for the STEM hub of, a global community of “pragmatic idealists,” while custom videos are being filmed for DNews, an online video series from Discovery Communications; the videos will feature the DNews hosts Trace Dominguez and Anthony Carboni. The technology network Technorati is asking bloggers to create custom posts on the collaboratory and related subjects, while the Air Force will pay to place videos on Web sites like YouTube, Blip and Machinima. In addition, the Air Force will promote the initiative on Facebook and Twitter.

Digital banner advertising will run on the Web sites of Scientific American, Popular Science and The Verge. One set of ads depicts an Air Force helicopter approaching a scene of destruction after a 7.0-magnitude earthquake that has trapped dozens of survivors. The copy reads, “Your idea could save them. The Air Force Collaboratory. Search and rescue 2.0 is now open. Start collaborating.”

The Air Force also is working with Discovery Education, a division of Discovery Communications, on an outreach program for high school science and math teachers.

Colonel Johnson said that although the collaboratory would run through November, new challenges could be created after that. In addition, he said the Web site would carry no overt recruiting messages, nor would the Air Force actively recruit challenge participants, since the initiative was meant to raise interest in the Air Force and possibly encourage participants to seek out more information about opportunities there.

The budget for the campaign is $3.7 million.

Diane H. Mazur, a former Air Force officer, professor emeritus of law at the University of Florida and author of “A More Perfect Military: How the Constitution Can Make Our Military Stronger,” said that although the collaboratory concept was “good, it’s not sophisticated to the degree it needs to be to attract the people they think they want to get.” She added, “This is a good direction if you do it well.”

David R. Segal, a professor of sociology at the University of Maryland who specializes in military sociology, said that while recruiting high school students to “work in military laboratories on military problems” was not new, “what seems new is having interns work online with Air Force scientists.”

“I think they will certainly recruit a good number of high school students interested in science, engineering, technology and math to work on the problems identified. That part is easy,” he said. “Recruiting the same people then to come into the Air Force as enlisted men and women might be more difficult. They are likely to want to go to college.”

As a result, he said, the collaboratory would probably be more successful recruiting Air Force Reserve Officers Training Corps students than airmen.


Small business contracting numbers inflated by errors and exclusions, data show

Washington Post

By J.D. Harrison, Published: July 28 | Updated: Monday, July 29, 5:00 AM


The federal government is required by law to try to direct nearly a quarter of all contracting dollars to small businesses, and every year since 2005, officials have reported missing the goal by the slimmest of margins.

Then again, it depends on who is counting.

A number of contractors and advocacy groups say the government has repeatedly inflated the share of contracting dollars awarded annually to small firms, masking serious problems in the procurement process that prevent small businesses from securing more government work.

In 1958, when Congress created the Small Business Administration, it tasked the agency with establishing an annual small-business contracting goal of “not less than 23 percent of the total value of all prime contract awards.”

This month, for the seventh year in a row, SBA officials reported that the government narrowly missed the goal, reporting that small firms received 22.25 percent (or $89.9 billion) of contracting dollars in fiscal year 2012 — up from 21.65 percent the year before, but down from 22.7 percent in 2010.

In a blog post announcing the report, John Shoraka, the agency’s associate administrator for government contracting, called the achievement “real progress” toward the goal.

However, the SBA’s calculations come with several caveats, in large part because the agency excludes certain contracts and entire agencies from its measure.

Officials do not take into consideration, for instance, any contract work for the Federal Aviation Administration, the Transportation Security Administration or the Central Intelligence Agency, nor do they account for any contracts for goods sold overseas or any work performed outside the United States.

In all, SBA officials have determined that about one-fifth of all federal contract spending is not “small-business eligible,” and the agency excludes that portion from its calculations.

The portion includes spending by agencies that are not subject to certain federal acquisition regulations, and those that do not report into the Federal Procurement Data System, from which the SBA pulls its data, Shoraka said. Other exclusions have been made on the basis that those contracts do not lend themselves to competition by small firms.

Shoraka noted that the current list of exclusions was finalized during the second term of former president George W. Bush. The Obama administration elected to leave them in place in order to “compare apples to apples,” he said.

Critics argue that is not what Congress mandated.

“They are simply not following the letter of the law,” said Charles Tiefer, a professor of government contracting at the University of Baltimore Law School. “It states 23 percent of all contracts, and there is no reason to think Congress wanted some of these exclusions.”

Contracts out of reach for small businesses, he said, should be considered as part of the 77 percent of government spending available to large and international companies, rather than removed from the equation altogether. He pointed out that some of the excluded contracts, including intelligence gathering and work overseas, are areas in which government spending has surged in recent years.

The SBA’s Office of Inspector General has also urged the agency to discontinue some of its exclusions, particularly for contracts performed overseas. In an advisory memorandum from December 2011, the office cited a 2008 legal opinion issued by the SBA Office of General Counsel, which states it would be “a reasonable interpretation” of the law to assume the targets include contracts performed outside the country.

Congress has recently taken issue with the exclusions, too. In the 2013 National Defense Authorization Act, lawmakers ordered the SBA administrator to review the goal guidelines to ensure that the process “does not exclude categories of contracts” based on the types of goods or services solicited or, in some cases, whether the agency is subject to federal acquisition regulations.


The nuances of the government’s measurements can sometimes get lost in public discussions about contracting.

In Shoraka’s blog post announcing the government’s performance, he wrote that 22.25 percent represented the small business share of “all” federal contracting dollars last year.

After On Small Business asked about the language, given the exclusions to the calculations, officials updated the blog to read 22.25 percent “of all small business eligible contracts.”

Two studies show small-biz getting 19% of contracts

To get a sense of what effect the exclusions have on the numbers reported, On Small Business asked Fedmine, the data analysis firm that conducts the contracting calculations for the SBA, to crunch the numbers based on total federal contract spending reported into the FPDS, without any exclusions.

The revised calculations show that small businesses received less than 19 percent of all prime contracting dollars in 2012. In contrast to the progress cited by the SBA, that was actually down from the year before (20 percent).

The House Small Business Committee conducted its own analysis of last year’s federal data, eliminating many of the SBA’s exclusions. The committee also found that the small-business share of total federal contracting was around 19 percent.
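The gap between the SBA’s 22.25 percent and the roughly 19 percent found by Fedmine and the House committee is largely denominator arithmetic. A back-of-envelope sketch (using only the figures reported in this article, and assuming the “about one-fifth” exclusion is exactly 20 percent) shows how shrinking the base lifts the reported share:

```python
# Back-of-envelope check of how the SBA's exclusions move the percentage.
# Figures from the article: $89.9B to small firms = 22.25% of "eligible"
# dollars, and roughly one-fifth of all contract spending is excluded.
small_biz = 89.9                       # billions awarded to small firms, FY2012
eligible = small_biz / 0.2225          # implied "small-business eligible" base
total = eligible / (1 - 0.20)          # add back the ~20% the SBA excludes

share_eligible = small_biz / eligible  # the SBA's reported measure
share_total = small_biz / total        # share of ALL prime contract dollars

print(f"eligible base: ${eligible:.0f}B, total spending: ${total:.0f}B")
print(f"share of eligible: {share_eligible:.1%}, share of total: {share_total:.1%}")
```

Under those assumptions the small-business share of all prime contract dollars works out to just under 18 percent, broadly consistent with the sub-19-percent figures the two independent analyses produced.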

“The administration shouldn’t be allowed to cook the books,” Committee Chairman Sam Graves (R-Mo.) said in a statement earlier this month.

Small-business advocates say the agency’s exclusions are not the only source of padding in the small-business contracting numbers. Of greater concern, they say, is the number of contracts labeled by the government as “small business” awards that actually go to large companies — a problem that has long plagued the federal government.

During his run for the presidency in 2008, then-Sen. Barack Obama emphasized small business, at one point saying that “it is time to end the diversion of federal small-business contracts to corporate giants.”

Nearly five years later, in her most recent management report, SBA Inspector General Peggy Gustafson said the agency’s top challenge is still that “procurement flaws allow large firms to obtain small-business awards and agencies to count contracts performed by large firms towards their small-business goals.”

Inspector general “audits and other governmental studies have shown widespread misreporting by procuring agencies since many contract awards that were reported as having gone to small firms have actually been performed by larger companies,” Gustafson wrote. “Most of the incorrect reporting results from errors made by government contracting personnel, including misapplication of small-business contracting rules.”

Shoraka says the agency has taken steps to stop the errant reporting. A few years ago, for example, officials began running a computerized “anomaly” process to identify red flags in the federal data system.

The program searches for conflicting reports, missing fields of information, and the names of Fortune 100 firms that were awarded small-business contracts. When potential errors are flagged, contracting agencies are asked to take a second look at their reports and fix any mistakes.

Some say the process does not appear to be working.

The American Small Business League, an advocacy group, combed through the Fedmine data following the SBA’s report earlier this month. In its analysis, the group found that more than half (57) of Fortune 100 companies or their subsidiaries won awards that were labeled in the federal data system as small-business contracts, including industry giants such as General Electric, Apple and Citigroup (the latter two declined to comment).

In one instance, General Dynamics, a defense contractor with roughly 80,000 employees based in Falls Church, received more than $230 million in small-business contracts in 2012 and roughly $2 billion in the five years prior, according to data from Fedmine.

In total, the largest 100 corporations in the country received nearly half a billion dollars in small-business contracts last year, according to ASBL.

General Dynamics spokesman Rob Doolittle directed attention to the SBA OIG report, which suggested most size-classification errors are the result of mistakes by federal contracting officials. He also noted that small firms acquired by a large company during the life of a contract are permitted to keep those contracts. However, the business is not required to ensure that the contract categorization is updated in the federal data system.

Sebastien Duchamp, a spokesman for General Electric, said the federal database sometimes erroneously shows the company as a small business, adding that the firm regularly reviews the data for errors and alerts contracting officers when necessary.

While Inspector General Gustafson suggests most of the problem stems from those types of reporting errors, some of it boils down to fraud.

In March, the chief executive of Arlington-based security contractor PSI pleaded guilty to major government fraud for operating a shell company, SAC, that shuffled more than $31 million in small-business set-asides to his much-larger company. Keith Hedman, the executive, was sentenced to six years in prison, and last month, the employee he tapped to run the front company was sentenced to four years.

One of the small-business set-aside contracts Hedman’s shell company won was for security services at Walter Reed Medical Center, edging out a competing bid by Davis-Paige Management Systems, a small service-disabled veteran-owned business in Annandale. Micheal Davis, the company’s chief executive, said his company spent around $100,000 chasing the contract and stood to gain around $11 million in revenue by winning.

When his firm lost to SAC, Davis said he was forced to let several employees go and move several others to part time.

“It takes work away from companies like ours that took the time to get the proper certifications to compete for these contracts,” Davis said in an interview, adding that the SAC case makes him wonder how many other “small” contractors are actually small.

An SBA OIG report to Congress last year documented several other examples of large firms that have been prosecuted for masquerading as small businesses to win contracts.

“It isn’t miscoding, it isn’t computer errors, it isn’t anomalies,” American Small Business League President Lloyd Chapman said. “These numbers are being inflated and misrepresented.”


Rules aren’t being followed

The elevated small-business contracting numbers help conceal a number of systemic problems in the federal procurement process, according to a former head of small-business contracting at the Defense Department.

“The real problems meeting these goals are tactical ones, down at the operations level, where contracts are being written and awarded,” said Daniel Gill, who headed the agency’s Office of Small Business Development under President Bill Clinton.


Gill, who now consults with government services firms and recently taught courses at the Defense Department’s acquisition training school for contracting officers, argued that the government does not need new regulations to meet its small-business contracting mandate. Instead, he said the goal would be “a piece of cake” if contracting officials in each agency simply adhered to existing procurement protocols.

“A lot of contracts are going to large business that should be going to small businesses, and it’s not just a matter of large businesses miscategorizing themselves,” Gill said. “It’s often that the proper set-aside determinations are not being made to reserve small contracts for small businesses.”

The most common example, he said, concerns long-standing regulations that require agencies to reserve all contracts worth between $3,000 and $150,000 for certified small businesses, unless the agency cannot identify two small businesses that can provide the product or service at a fair market price.

In the past few years, the Government Accountability Office has identified numerous instances in which federal officials either never did the market research to determine if small businesses were available to meet a contract’s requirements, or conducted the research, but failed to set aside those contracts for small businesses.

Phoenix Environmental Design, a small service-disabled veteran-owned firm in Plankinton, S.D., that provides pesticides and herbicides to the federal government, has filed more than 30 protests to the GAO in the past two years against agencies for faulty contracting practices. About half of them concerned contracts intended for small businesses that went to large corporations.

In every instance, the government has pulled back the award and solicited bids from small firms, according to the company’s owner, Chad Gill. What’s more, government documents show that the small firms that won the contracts the second time around routinely did so with a lower bid than the initial award to the large company.

“When we got them to do it right, and there is competition and accountability, it ends up costing the federal agency less money,” Gill said.

The problem, he said, is that many of the contracting officers he works with in various agencies do not understand the small-business set-aside process. He is not the only one who has made that observation.

In one of the GAO’s rulings last fall, General Counsel Lynn H. Gibson noted that the Veterans Administration, for instance, has repeatedly failed to set aside contracts reserved for small firms, later suggesting that contracting officers have demonstrated “a fundamental misunderstanding of the agency’s obligations” under contracting laws.

Charles Baker, who owns an electrical company that services the Defense Department, said his firm has suffered from similar contract classification errors. In many cases, he said, contracts that fall into the $3,000-to-$150,000 range are offered for general solicitation rather than reserved for small firms.

“The system is fundamentally broken, and it can destroy a small company like mine,” Baker, who owns MCB Lighting & Electric in Owings, Md., said. “There is no compliance with the laws, no enforcement.”

Maureen Schumann, a spokeswoman for the Defense Department, said Baker’s comment “clearly illustrates some of the frustrations felt by our industrial base” and that the agency is taking measures to “ensure that the right policies, procedures and programs are in place to increase contracting opportunities for small businesses.”

“We are constantly analyzing data and the [department] is using every available regulation to identify specific contracts that can be set aside for small businesses,” Schumann said.


She noted that the department contracted with small businesses on 68 percent of contracts in the $3,000-to-$150,000 range last year, an increase from 2011.


Funding schemes in Congress could ground drones; FAA pressured over privacy

Washington Times

Ben Wolfgang

July 28, 2013

The lagging federal effort to fully integrate drones into U.S. airspace is in danger of falling even further behind schedule.

A funding bill now before the Senate essentially would stop the process in its tracks by prohibiting the Federal Aviation Administration from moving forward until it completes a detailed report on drones’ potential privacy impact.

The report, called for in the Senate’s fiscal 2014 transportation appropriations measure, would be yet another hurdle in the FAA’s already complex, time-consuming drone integration initiative.

The agency has been charged by Congress to write rules and regulations allowing drones — now used primarily by the military, law enforcement and researchers — to operate commercially in U.S. skies by September 2015, but the industry fears that deadline is likely to be missed.

Requiring the FAA, which traditionally deals only with airspace safety and has little experience in writing Fourth Amendment protections, to craft a comprehensive privacy report would all but guarantee the date will be pushed back.

Leaders in the unmanned aerial systems sector warn that such setbacks will hamper American technological innovation and carry economic consequences.

“Privacy is an important issue, and one that deserves to be considered carefully. But further restrictions on FAA integration will only set back important progress,” said Ben Gielow, government relations manager with the Association for Unmanned Vehicle Systems International (AUVSI), the drone sector’s leading trade group.

“If we are not able to keep the integration on track, the U.S. could lose out on creating tens of thousands of jobs and undermine the growth of a new industry at a time when we need it most,” he said.

The Senate bill doesn’t explicitly call for the FAA to stop drone integration efforts, but it would establish a de facto moratorium by cutting off funding for the process.

A section of the legislation, put forward by Sen. Patty Murray, Washington Democrat, states that “none of the funds in this act may be used to issue regulations on the integration of unmanned aerial systems into the national airspace” until the privacy report is completed and presented to the House and Senate appropriations committees.

The Senate Appropriations Committee directed questions on the bill to Ms. Murray, who is chairwoman of the panel’s subcommittee on transportation. Her office did not return emails or calls seeking comment.

The House’s transportation funding bill does not include such language, and the Senate provision could be changed or dropped entirely in the coming months.

For now, however, the bill underscores the deep fear in Congress and among the American public that widespread drone use will be a serious blow to personal privacy.

Sen. Dianne Feinstein, California Democrat, said last month that she considers drones to be “the greatest threat to the privacy of Americans.”

Coming from Ms. Feinstein, chairwoman of the Senate Select Committee on Intelligence, those words carry extra weight. She is intimately familiar with classified details of the National Security Agency’s data-collection programs and other efforts that, critics say, erode Americans’ Fourth Amendment rights.

Many other members of Congress, civil liberties groups, privacy advocates and others have said drones — increasingly small, undetectable and able to be equipped with state-of-the-art cameras and other monitoring equipment — pose real privacy threats.

AUVSI and other drone industry leaders agree that the issue must be addressed, and it’s already being tackled across the nation.

More than 30 states and a growing number of local governments have drafted regulations to govern what drones can do and what types of data they can collect.

At the federal level, however, many analysts question why Congress is placing the job in the lap of the FAA.

“The FAA should focus on ensuring the safety of our skies. Safety has always been the FAA’s mission, and we believe the agency should stick to what it does best,” Mr. Gielow said.

The FAA’s experience in writing drone privacy regulations has been limited. The agency has drafted privacy guidelines to be used at drone “test sites,” congressionally mandated locations where the craft will be put through a battery of tests in preparation for airspace integration by 2015.

While widespread, private-sector drone use still is years away, the FAA has begun to make exceptions.

On Friday, the agency issued two “restricted category type certificates” to a pair of unmanned aerial systems, the first step in allowing them to operate in U.S. airspace later this summer.

A “major energy company,” the FAA said, will be allowed to use a drone to survey ocean ice floes and migrating whales in Arctic oil exploration areas off the coast of Alaska. Unmanned systems also will be permitted to support emergency response crews for oil spill monitoring and wildlife surveillance over the Beaufort Sea.



Two Drone Airframes Approved for Commercial Flights

By Bob Brewin

July 29, 2013

The Federal Aviation Administration has certified two small drones for commercial use, heralding the move as “one giant leap for unmanned-kind” that will lead to the first approved commercial flight of an unmanned air system off the Alaska coast later this summer.

The 2012 FAA Modernization and Reform Act defined Arctic operational areas and included mandates for Arctic UAS commercial operations. Until now, obtaining an experimental airworthiness certificate – which specifically excludes commercial operations – was the only way the private sector could operate UAS in the nation’s airspace, the FAA said.

In an announcement Friday, the agency said it type-certified the Scan Eagle X200, manufactured by Boeing subsidiary Insitu, and the AeroVironment PUMA under a restricted category that allows aerial surveillance. The catapult-launched Scan Eagle has a maximum weight of 44 pounds, a wingspan of just more than 10 feet and a length of 4.5 feet; it can stay in the air for 28 hours without refueling. The AeroVironment PUMA, a hand-launched drone that weighs 13 pounds, has a wingspan of just over nine feet and a fuselage of four feet.

The agency said previous military acceptance of the Scan Eagle and PUMA UAS designs allowed it to issue the restricted category type certificates. The Navy operates the Scan Eagle and the Air Force, Army, Marines and the Special Operations Command fly the PUMA.

A major energy company plans to fly the Scan Eagle off the Alaskan coast in international waters starting in August, the FAA said, without identifying the company. Plans for the initial ship-launched flights include surveys of ocean ice floes and migrating whales in Arctic oil exploration areas.

The PUMA is expected to support emergency response crews for oil spill monitoring and wildlife surveillance over the Beaufort Sea, but the agency again did not identify the operator.

The certification of the Scan Eagle and the PUMA could be the start of unbridled use of drones in the United States, according to Rep. Ted Poe, R-Texas, who in February predicted that by 2030, “30,000 drones will be cruising American skies – looking, observing, filming, and hovering over America.”

On Feb. 4 the FAA kicked off a process to set up six test sites to conduct drone research and development toward a goal of widespread use of UAS by law enforcement agencies, universities and other organizations in domestic airspace by 2015.


Government Conferences Pay for Themselves, Industry Says

By Charles S. Clark

July 29, 2013

Agency spending on travel to conferences is “vital to making government more efficient and effective,” says a study released this month by the U.S. Travel Association. It argues that current Obama administration guidelines and legislation to curb conference spending are counterproductive.

Government travel for meetings and events had a total economic impact of $24.4 billion in 2011, supported 343,800 U.S. jobs and $14.5 billion in U.S. wages, and contributed $5.5 billion in tax revenue, according to the data compiled by Rockport Analytics LLC.

The report put a number on the economic effect of canceling the 2013 Military Health System Conference, an annual training event for several thousand military medical personnel. Replacement expenses and lost revenue, it said, cost the government more than $800,000.

As a result of NASA’s decision to pull out of the April 2013 National Space Symposium, a gathering for representatives of 30 nations, “important international partnerships are jeopardized, important international programs are placed at risk, and the U.S. government places serious strain on relationships with countries around the world,” according to Elliot Pulham, CEO of the private National Space Foundation.


The study also found that government meetings are more efficient than private-sector meetings in terms of expenses, and that private-sector conferences are more productive when government employees participate. A survey found that 89 percent of government supervisors believe that government meetings and events benefit citizens, and 85 percent of government respondents agreed that meetings and events added value to employee development and training.

“We hope these new findings will encourage congressional leaders to reevaluate proposals to drastically cut government travel budgets across the country,” said Roger Dow, president and CEO of the travel association. “When conducted responsibly, federal workers who travel for conferences and meetings deliver important services and real value to our nation.”





The Data Economy Is Much, Much Bigger Than You (and the Government) Think

The Atlantic

July 25, 2013

By Michael Mandel


It’s become conventional wisdom among pundits that the tech and data boom is generating lots of wealth, but not much in the way of jobs or economic growth. The skeptics point to the lack of job gains in the “information” sector, as defined by the Bureau of Labor Statistics, and to the country’s sub-2 percent GDP growth figures.

But as the U.S. shifts to a data-driven economy, the benefits of fixed and mobile broadband are showing up in ways that are not counted by traditional statistics. For just one example, take the number of jobs generated by the development and deployment of mobile apps. According to a new calculation by the Progressive Policy Institute, employment in the App Economy now comes to 752,000 jobs, up roughly 40% over the past year. This is a conservative estimate, based on tracking online help-wanted ads.

Auto companies are hiring software developers and testers to turn their vehicles into highly connected data platforms. Drugstores are going online to let their customers know when prescriptions are ready. Hospitals are ramping up their employment of clinical data managers to help handle the shift to electronic health records. Bed and breakfasts have shifted their entire booking operations online, driven by digital ads.

More broadly, demand for tech workers in the New York City region outstrips every other metro area, including San Francisco and San Jose, according to figures from The Conference Board. That reflects demand in finance, advertising, and media.

The data-driven economy is built on several pillars: Broadband providers, mobile phone operators, and other communications companies are investing almost $100 billion annually to vastly improve their networks. Makers of smartphones, routers, sensors, wireless medical gear, and the like are upgrading and extending the capabilities of their equipment. Meanwhile new applications and uses are coming out of app developers, online game and entertainment companies, web companies like Facebook and Google, content providers, electronic health record providers, and “Internet of Everything” companies that connect the physical world with the data world. Tableau Software, a Seattle-based data visualization company that just went public, increased its full-time employees from 188 to 749 from the end of 2010 to the end of 2012.

What’s more, data is also the fastest-growing component of trade. Consider the United States and Europe: telecom providers have doubled transatlantic cable capacity over the past five years, according to figures from Telegeography. Meanwhile imports and exports of goods and services between the U.S. and Europe are barely above pre-recession peaks.

These flows of data do not show up in the monthly trade report released by the Census Bureau and the BEA. Indeed, most of the growth of data domestically is not counted in the economic statistics either. For example, fixed broadband traffic in North America rose by 39% in the first half of 2013 over a year earlier, according to Sandvine, a Canadian-based network management company. This number does not show up in any official measures.


Will all this growth continue? People still remember the tech bust of the early 2000s, when the unemployment rate in Silicon Valley surged to over 9 percent. This time, though, the surge in data-related jobs is not likely to stop soon. A 2010 policy brief from the Progressive Policy Institute showed that the jobs and industries that grow during a recession are the ones that lead the expansion, and that’s exactly what is happening here.

Before the financial crisis, the housing and debt boom made the U.S. economy look better than it really was, especially since housing construction is very visible and easy to measure. By contrast, we may be in the opposite situation now. Data is intangible and difficult to count, so the benefits of the tech and data boom may be underestimated.



Is Sugar Really Toxic? Sifting through the Evidence

Scientific American

By Ferris Jabr | July 15, 2013


Our very first experience of exceptional sweetness—a dollop of buttercream frosting on a parent’s finger; a spoonful of strawberry ice cream instead of the usual puréed carrots—is a gustatory revelation that generally slips into the lacuna of early childhood. Sometimes, however, the moment of original sweetness is preserved. A YouTube video from February 2011 begins with baby Olivia staring at the camera, her face fixed in rapture and a trickle of vanilla ice cream on her cheek. When her brother Daniel brings the ice cream cone near her once more, she flaps her arms and arches her whole body to reach it.

Considering that our cells depend on sugar for energy, it makes sense that we evolved an innate love for sweetness. How much sugar we consume, however—as well as how it enters the body and where we get it from in the first place—has changed dramatically over time. Before agriculture, our ancestors presumably did not have much control over the sugars in their diet, which must have come from whatever plants and animals were available in a given place and season. Around 6,000 BC, people in New Guinea began to grow sugarcane, chewing and sucking on the stalks to drink the sweet juice within. Sugarcane cultivation spread to India, where by 500 BC people had learned to turn bowls of the tropical grass’s juice into crude crystals. From there sugar traveled with migrants and monks to China, Persia, northern Africa and eventually to Europe in the 11th century.

For more than 400 years, sugar remained a luxury in Europe—an exotic spice—until manufacturing became efficient enough to make “white gold” much more affordable. Christopher Columbus brought sugarcane to the New World in 1493 and in the 16th and 17th centuries European powers established sugarcane plantations in the West Indies and South America. Sugar consumption in England increased by 1,500 percent between the 18th and 19th centuries. By the mid 19th century, Europeans and Americans had come to regard refined sugar as a necessity. Today, we add sugar in one form or another to the majority of processed foods we eat—everything from bread, cereals, crunchy snacks and desserts to soft drinks, juices, salad dressings and sauces—and we are not too stingy about using it to sweeten many raw and whole foods as well.


By consuming so much sugar we are not just demonstrating weak willpower and indulging our sweet tooth—we are in fact poisoning ourselves, according to a group of doctors, nutritionists and biologists, one of the most prominent members of which is Robert Lustig of the University of California, San Francisco, famous for his viral YouTube video “Sugar: The Bitter Truth.” A few journalists, such as Gary Taubes and Mark Bittman, have reached similar conclusions. Sugar, they argue, poses far greater dangers than cavities and love handles; it is a toxin that harms our organs and disrupts the body’s usual hormonal cycles. Excessive consumption of sugar, they say, is one of the primary causes of the obesity epidemic and metabolic disorders like diabetes, as well as a culprit in cardiovascular disease. More than one-third of American adults and approximately 12.5 million children and adolescents in the U.S. are obese. In 1980, 5.6 million Americans were diagnosed with diabetes; in 2011 more than 20 million Americans had the illness.

The argument that sugar is a toxin depends on some technical details about the different ways the human body gets energy from different types of sugar. Today, Americans eat most of their sugar in two main forms: table sugar and high-fructose corn syrup. A molecule of table sugar, or sucrose, is a bond between one glucose molecule and one fructose molecule—two simple sugars with the same chemical formula, but slightly different atomic structures. In the 1960s, new technology allowed the U.S. corn industry to cheaply convert corn-derived glucose into fructose and produce high-fructose corn syrup, which—despite its name—is almost equal parts free-floating fructose and glucose: 55 percent fructose, 42 percent glucose and three percent other sugars. Because fructose is about twice as sweet as glucose, an inexpensive syrup mixing the two was an appealing alternative to sucrose from sugarcane and beets.

Regardless of where the sugar we eat comes from, our cells are interested in dealing with fructose and glucose, not the bulkier sucrose. Enzymes in the intestine split sucrose into fructose and glucose within seconds, so as far as the human body is concerned sucrose and high-fructose corn syrup are equivalent. The same is not true for their constituent molecules. Glucose travels through the bloodstream to all of our tissues, because every cell readily converts glucose into energy. In contrast, liver cells are one of the few types of cells that can convert fructose to energy, which puts the onus of metabolizing fructose almost entirely on one organ. The liver accomplishes this primarily by turning fructose into glucose and lactate. Eating exceptionally large amounts of fructose taxes the liver: it spends so much energy turning fructose into other molecules that it may not have much energy left for all its other functions. A consequence of this energy depletion is production of uric acid, which research has linked to gout, kidney stones and high blood pressure.

The human body strictly regulates the amount of glucose in the blood. Glucose stimulates the pancreas to secrete the hormone insulin, which helps remove excess glucose from blood, and bolsters production of the hormone leptin, which suppresses hunger. Fructose does not trigger insulin production and appears to raise levels of the hormone ghrelin, which keeps us hungry. Some researchers have suggested that large amounts of fructose encourage people to eat more than they need. In studies with animals and people by Kimber Stanhope of the University of California Davis and other researchers, excess fructose consumption has increased fat production, especially in the liver, and raised levels of circulating triglycerides, which are a risk factor for clogged arteries and cardiovascular disease. Some research has linked a fatty liver to insulin resistance—a condition in which cells become far less responsive to insulin than usual, exhausting the pancreas until it loses the ability to properly regulate blood glucose levels. Richard Johnson of the University of Colorado Denver has proposed that uric acid produced by fructose metabolism also promotes insulin resistance. In turn insulin resistance is thought to be a major contributor to obesity and Type 2 diabetes; the three disorders often occur together.

Because fructose metabolism seems to kick off a chain reaction of potentially harmful chemical changes inside the body, Lustig, Taubes and others have singled out fructose as the rotten apple of the sugar family. When they talk about sugar as a toxin, they mean fructose specifically. In the last few years, however, prominent biochemists and nutrition experts have challenged the idea that fructose is a threat to our health and have argued that replacing fructose with glucose or other sugars would solve nothing. First, as fructose expert John White points out, fructose consumption has been declining for more than a decade, but rates of obesity have continued to rise during the same period. Of course, coinciding trends alone do not definitively demonstrate anything. A more compelling criticism is that concern about fructose is based primarily on studies in which rodents and people consumed huge amounts of the molecule—up to 300 grams of fructose each day, which is nearly equivalent to the total sugar in eight cans of Coke—or a diet in which the vast majority of sugars were pure fructose. The reality is that most people consume far less fructose than is used in such studies and rarely eat fructose without glucose.

On average, people in America and Europe eat between 100 and 150 grams of sugar each day, about half of which is fructose. It’s difficult to find a regional diet or individual food that contains only glucose or only fructose. Virtually all plants have glucose, fructose and sucrose—not just one or another of these sugars. Although some fruits, such as apples and pears, have three times as much fructose as glucose, most of the fruits and veggies we eat are more balanced. Pineapples, blueberries, peaches, carrots, corn and cabbage, for example, all have about a 1:1 ratio of the two sugars. In his New York Times Magazine article, Taubes claims that “fructose…is what distinguishes sugar from other carbohydrate-rich foods like bread or potatoes that break down upon digestion to glucose alone.” This is not really true. Although potatoes and white bread are full of starch—long chains of glucose molecules—they also have fructose and sucrose. Similarly, Lustig has claimed that the Japanese diet promotes weight loss because it is fructose-free, but the Japanese consume plenty of sugar—about 83 grams a day on average—including fructose in fruit, sweetened beverages and the country’s many meticulously crafted confectioneries. High-fructose corn syrup was developed and patented in part by Japanese researcher Yoshiyuki Takasaki in the 1960s and ’70s.

Not only do many worrying fructose studies use unrealistic doses of the sugar unaccompanied by glucose, it also turns out that the rodents researchers have studied metabolize fructose in a very different way than people do—far more differently than originally anticipated. Studies that have traced fructose’s fantastic voyage through the human body suggest that the liver converts as much as 50 percent of fructose into glucose, around 30 percent of fructose into lactate and less than one percent into fats. In contrast, mice and rats turn more than 50 percent of fructose into fats, so experiments with these animals would exaggerate the significance of fructose’s proposed detriments for humans, especially clogged arteries, fatty livers and insulin resistance.

In a series of meta-analyses examining dozens of human studies, John Sievenpiper of St. Michael’s Hospital in Toronto and his colleagues found no harmful effects of typical fructose consumption on body weight, blood pressure or uric acid production. In a 2011 study, Sam Sun—a nutrition scientist at Archer Daniels Midland, a major food processing corporation—and his colleagues analyzed data about sugar consumption collected from more than 25,000 Americans between 1999 and 2006. Their analysis confirmed that people almost never eat fructose by itself and that for more than 97 percent of people fructose contributes less daily energy than other sugars. They did not find any positive associations between fructose consumption and levels of triglycerides, cholesterol or uric acid, nor any significant link to waist circumference or body mass index (BMI). And in a recent BMC Biology Q&A, renowned sugar expert Luc Tappy of the University of Lausanne writes: “Given the substantial consumption of fructose in our diet, mainly from sweetened beverages, sweet snacks, and cereal products with added sugar, and the fact that fructose is an entirely dispensable nutrient, it appears sound to limit consumption of sugar as part of any weight loss program and in individuals at high risk of developing metabolic diseases. There is no evidence, however, that fructose is the sole, or even the main factor in the development of these diseases, nor that it is deleterious to everybody.”

To properly understand fructose metabolism, we must also consider in what form we consume the sugar, as explained in a recent paper by David Ludwig, Director of the New Balance Foundation Obesity Prevention Center of Boston Children’s Hospital and a professor at Harvard. Drinking a soda or binging on ice cream floods our intestines and liver with large amounts of loose fructose. In contrast, the fructose in an apple does not reach the liver all at once. All the fiber in the fruit—such as cellulose that only our gut bacteria can break down—considerably slows digestion. Our enzymes must first tear apart the apple’s cells to reach the sugars sequestered within. “It’s not just about the fiber in food, but also its very structure,” Ludwig says. “You could add Metamucil to Coca Cola and not get any benefit.” In a small but intriguing study, 17 adults in South Africa ate primarily fruit—about 20 servings with approximately 200 grams of total fructose each day—for 24 weeks and did not gain weight, develop high blood pressure or imbalance their insulin and lipid levels.

To strengthen his argument, Ludwig turns to the glycemic index, a measure of how quickly food raises levels of glucose in the blood. Pure glucose and starchy foods such as Taubes’s example of the potato have a high glycemic index; fructose has a very low one. If fructose is uniquely responsible for obesity and diabetes and glucose is benign, then high glycemic index diets should not be associated with metabolic disorders—yet they are. A small percentage of the world population may in fact consume so much fructose that they endanger their health because of the difficulties the body encounters in converting the molecule to energy. But the available evidence to date suggests that, for most people, typical amounts of dietary fructose are not toxic.

Even if Lustig is wrong to call fructose poisonous and saddle it with all the blame for obesity and diabetes, his most fundamental directive is sound: eat less sugar. Why? Because super sugary, energy-dense foods with little nutritional value are one of the main ways we consume more calories than we need, albeit not the only way. It might be hard to swallow, but the fact is that many of our favorite desserts, snacks, cereals and especially our beloved sweet beverages inundate the body with far more sugar than it can efficiently metabolize. Milkshakes, smoothies, sodas, energy drinks and even unsweetened fruit juices all contain large amounts of free-floating sugars instantly absorbed by our digestive system.

Avoiding sugar is not a panacea, though. A healthy diet is about so much more than refusing that second sugar cube and keeping the cookies out of reach or hidden in the cupboard. What about all the excess fat in our diet, so much of which is paired with sugar and contributes to heart disease? What about bad cholesterol and salt? “If someone is gaining weight, they should look to sugars as a place to cut back,” says Sievenpiper, “but there’s a misguided belief that if we just go after sugars we will fix obesity—obesity is more complex than that. Clinically, there are some people who come in drinking way too much soda and sweet beverages, but most people are just overconsuming in general.” Then there’s all the stuff we really should eat more of: whole grains; fruits and veggies; fish; lean protein. But wait, we can’t stop there: a balanced diet is only one component of a healthy lifestyle. We need to exercise too—to get our hearts pumping, strengthen our muscles and bones and maintain flexibility. Exercising, favoring whole foods over processed ones and eating less overall sounds too obvious, too simplistic, but it is actually a far more nuanced approach to good health than vilifying a single molecule in our diet—an approach that fits the data. Americans have continued to consume more and more total calories each year—average daily intake increased by 530 calories between 1970 and 2000—while simultaneously becoming less and less physically active. Here’s the true bitter truth: Yes, most of us should make an effort to eat less sugar—but if we are really committed to staying healthy, we’ll have to do a lot more than that.



US Appeals Court Upholds Warrantless Collection of Phone Location Data

The court has ruled that cell site information is business data collected by the service provider

By John Ribeiro

Wed, July 31, 2013

IDG News Service (Bangalore Bureau) — The U.S. government does not need a warrant to access historical cell site information, an appeals court ruled in an order.

The Fourth Amendment to the U.S. Constitution, which protects against unreasonable searches and seizures, safeguards only reasonable expectations of privacy, the U.S. Court of Appeals for the Fifth Circuit wrote in a 2-1 ruling on Tuesday.

“Because a cell phone user makes a choice to get a phone, to select a particular service provider, and to make a call, and because he knows that the call conveys cell site information, the provider retains this information, and the provider will turn it over to the police if they have a court order, he voluntarily conveys his cell site data each time he makes a call,” the court added.

Cell site information is clearly a business record, collected by the service provider for its own business purposes, and without being asked to do so by the government, the court said in the order.

The dispute hinged on whether law enforcement agents can access cell site data with a relatively easy-to-obtain order under section 2703(d) of the Stored Communications Act, which is based on a showing of “specific and articulable facts,” instead of using a search warrant after showing probable cause.

Rights groups American Civil Liberties Union and Electronic Frontier Foundation and others have argued that the government should be required to seek a warrant to access the location information, because it is sensitive and can reveal a great deal about a person. The groups argued in court that SCA grants courts the discretion to require the government to obtain a warrant based upon probable cause before accessing historical cell phone location data.

Ruling that compelled warrantless disclosure of cell site data violates the Fourth Amendment, a magistrate judge earlier denied a government request for the historical cell site data in three applications filed in October 2010 under the SCA, each seeking evidence relevant to a separate criminal investigation. The judge, however, allowed for providing subscriber information.

Following an appeal by the government, a district court held that data “disclosing the location of the telephone at the time of particular calls may be acquired only by a warrant issued on probable cause,” as the records would show the date, time called, number, and location of the telephone when the call was made, which is constitutionally protected.

The Fifth Circuit court clarified that its ruling only covered section 2703(d) orders to obtain historical cell site information, and did not address, for example, orders requesting data from all phones that use a tower during a particular interval or “situations where the Government surreptitiously installs spyware on a target’s phone or otherwise hijacks the phone’s GPS, with or without the service provider’s help.”

The Supreme Court of New Jersey ruled earlier this month that cellphone users have a reasonable expectation of privacy in their cellphone location information, and police are required to get a search warrant before accessing the information. People are not promoting the release of personal information to others when making disclosures to phone companies, the court said in a unanimous ruling.



Bradley Manning-WikiLeaks case turns to sentencing

San Francisco Chronicle

By DAVID DISHNEAU, Associated Press

Updated 6:40 am, Wednesday, July 31, 2013

FORT MEADE, Md. (AP) — Acquitted of the most serious charge against him, Army Pfc. Bradley Manning still faces up to 136 years in prison for leaking government secrets to the website WikiLeaks, and his fate rests with a judge who will begin hearing arguments Wednesday in the sentencing phase of the soldier’s court-martial.

The former intelligence analyst was convicted of 20 of 22 charges for sending hundreds of thousands of government and diplomatic secrets to WikiLeaks, but he was found not guilty of aiding the enemy, which alone could have meant life in prison without parole.

“We’re not celebrating,” defense attorney David Coombs said. “Ultimately, his sentence is all that really matters.”

Military prosecutors said they would call as many as 20 witnesses for the sentencing phase. The government said as many as half of the prosecution witnesses would testify about classified matters in closed court. They include experts on counterintelligence, strategic planning and terrorism.

The judge prohibited both sides from presenting evidence during trial about any actual damage the leaks caused to national security and troops in Afghanistan and Iraq, but lawyers will be allowed to bring that up at sentencing.

The release of diplomatic cables, warzone logs and videos embarrassed the U.S. and its allies. U.S. officials warned of dire consequences in the days immediately after the first disclosures in July 2010, but a Pentagon review later suggested those fears might have been overblown.

The judge also restricted evidence about Manning’s motives. Manning testified during a pre-trial hearing he leaked the material to expose U.S military “bloodlust” and diplomatic deceitfulness, but did not believe his actions would harm the country. He didn’t testify during the trial, but he could take the stand during the sentencing phase.

Lisa Windsor, a retired Army colonel and former judge advocate, said the punishment phase would focus on Manning’s motive and the harm that was done by the leak.

“You’re balancing that to determine what would be an appropriate sentence. I think it’s likely that he’s going to be in jail for a very long time,” said Windsor, now in private practice in Washington.

The judge, Army Col. Denise Lind, deliberated three days before reaching her verdict in a case involving the largest leak of documents in U.S. history. The case drew worldwide attention as supporters hailed Manning as a whistleblower and the U.S. government called him an anarchist computer hacker and attention-seeking traitor.

The verdict denied the government a precedent that freedom of press advocates had warned could have broad implications for leak cases and investigative journalism about national security issues.

Whistleblower advocates and legal experts had mixed opinions on the implications for the future of leak cases in the Internet age.

The advocacy group Reporters Without Borders said the verdict was a chilling warning to whistleblowers, “against whom the Obama administration has been waging an unprecedented offensive,” and threatens the future of investigative journalism because intimidated sources might fall quiet.

However, another advocate of less government secrecy, Steven Aftergood of the Federation of American Scientists, questioned whether the implications will be so dire, given the extraordinary nature of the Manning case.

“This was a massive hemorrhage of government records, and it’s not too surprising that it elicited a strong reaction from the government,” Aftergood said.

“Most journalists are not in the business of publishing classified documents, they’re in the business of reporting the news, which is not the same thing,” he said. “This is not good news for journalism, but it’s not the end of the world, either.”

Glenn Greenwald, the journalist, commentator and former civil rights lawyer who first reported Edward Snowden’s leaks of National Security Agency surveillance programs, said Manning’s acquittal on the charge of aiding the enemy represented a “tiny sliver of justice.”

But WikiLeaks founder Julian Assange, whose website exposed Manning’s spilled U.S. secrets to the world, saw nothing to cheer in the mixed verdict.

“It is a dangerous precedent and an example of national security extremism,” he told reporters at the Ecuadorean Embassy in London, which is sheltering him. “This has never been a fair trial.”

Federal authorities are looking into whether Assange can be prosecuted. He has been holed up in the Ecuadorean Embassy in London to avoid extradition to Sweden on sex-crimes allegations.

The material WikiLeaks began publishing in 2010 documented complaints of abuses against Iraqi detainees, a U.S. tally of civilian deaths in Iraq, and America’s weak support for the government of Tunisia — a disclosure Manning supporters said helped trigger the Middle Eastern pro-democracy uprisings known as the Arab Spring.

To prove aiding the enemy, prosecutors had to show Manning had “actual knowledge” the material he leaked would be seen by al-Qaida and that he had “general evil intent.” They presented evidence the material fell into the hands of the terrorist group and its former leader, Osama bin Laden, but struggled to prove their assertion that Manning was an anarchist computer hacker and attention-seeking traitor.



Secretary of Defense Chuck Hagel

Statement on Strategic Choices and Management Review


Good afternoon.

Earlier today, I briefed key congressional committee leaders on the findings of DoD’s Strategic Choices and Management Review. This afternoon, I want to discuss these findings and clarify the major options and difficult choices ahead.

I directed the Strategic Choices and Management Review four months ago to help ensure the Department of Defense is prepared in the face of unprecedented budget uncertainty. Although DoD strongly supports the President’s fiscal year 2014 request and long-term budget plan for the entire federal government, the deep and abrupt spending cuts under sequestration that began on March 1st this year are the law of the land. Sequestration will continue in the absence of an agreement that replaces the Budget Control Act.

The purpose of the Strategic Choices and Management Review – which was led by Deputy Secretary Ash Carter with the full participation of General Dempsey, Admiral Winnefeld, the Service Secretaries and Service Chiefs – was to understand the impact of further budget reductions on the Department, and develop options to deal with these additional cuts. It had three specific objectives:

• Help DoD prepare for how to deal with sequestration if it continues in FY 2014;

• Inform the fiscal guidance given to the military services for their FY 2015 through 2019 budget plans;

• Anchor the upcoming Quadrennial Defense Review, which will assess our defense strategy in light of new fiscal realities and the many threats, complexities and uncertainties of this new century.


The Strategic Choices and Management Review did not produce a detailed budget blueprint. That was not the purpose of this review. It generated a menu of options, not a set of decisions, built around three potential budget scenarios:

• The President’s FY 2014 budget, which incorporates a carefully calibrated and largely back-loaded $150 billion reduction in defense spending over the next ten years;

• The Budget Control Act’s sequester-level caps, which would cut another $52 billion from defense in fiscal year 2014, with $500 billion in reductions for the DoD over the next ten years;

• An “in-between” scenario that would reduce defense spending by about $250 billion over the next ten years, but would be largely back-loaded.

It is important to remember that all these cuts are in addition to the $487 billion reduction in defense spending over the next decade required by the initial caps in the Budget Control Act of 2011 which DoD has been implementing. If sequester-level cuts persist, DoD would experience nearly a trillion dollars in defense spending reductions over the next ten years.

To help DoD balance strategic ends, ways and means under these budget scenarios, the Strategic Choices and Management Review scrutinized every aspect of DoD’s budget, including: contingency planning, business practices, force structure, pay and benefits, acquisition practices, and modernization portfolios. Everything was on the table.

As I discussed last week at the VFW Convention in Louisville, four principles helped guide this review:

• Prioritizing DoD’s missions and capabilities around our core responsibility of defending our country;

• Maximizing the military’s combat power by looking to reduce every other category of spending first;

• Preserving and strengthening military readiness, and;

• Honoring the service and sacrifice of DoD’s people.

Those principles, and a rigorous review process, resulted in packages of options that included management efficiencies and overhead reductions, compensation reforms, and changes to force structure and modernization plans.

Allow me to share with you some of the options the review identified in each area I just mentioned.



A tenet of the review was that we need to maximize savings from reducing DoD’s overhead, administrative costs, and other institutional expenses.

For several years, DoD has been paring back overhead. About $150 billion in five-year efficiency reductions were proposed by Secretary Gates, an additional $60 billion in savings were identified by Secretary Panetta, and I submitted a $34 billion savings package in our latest budget. DoD is continuing to implement these efficiency campaigns. Despite much progress, as well as good efforts and intentions, not every proposal has generated the savings we expected, or gained the support of Congress – most notably, our request for a base realignment and closure round. The review showed that DoD will have to do more in this area, even though it is getting more difficult to find these cuts and it can take years for significant savings to be realized. After considering the results of the review, I determined that it is possible and prudent to begin implementing a new package of efficiency reforms now – ones that should be pursued regardless of fiscal circumstances.

Some of these management efficiencies and overhead reductions include:

• Reducing the department’s major headquarters budgets by 20 percent, beginning with the Office of the Secretary of Defense, the Joint Staff, Service Headquarters and Secretariats, Combatant Commands, and defense agencies and field activities. Although the 20 percent cut applies to budget dollars, organizations will strive for a goal of 20 percent reductions in government civilians and military personnel billets on headquarters staffs;

• Reducing the number of direct reports to the Secretary of Defense by further consolidating functions within OSD, as well as eliminating positions, and;

• Reducing intelligence analysis and production at Combatant Command intelligence and operations centers, which will also foster closer integration and reduce duplication across the defense enterprise.

These management reforms, consolidations, personnel cuts, and spending reductions will reduce the department’s overhead and operating costs by some $10 billion over the next five years and almost $40 billion over the next decade. They will make the Department more agile and versatile.

Past efficiency campaigns have shown that implementation can be very challenging, so effective follow-through is critical if savings targets are to be realized. This is especially true of OSD reductions. I have asked Deputy Secretary Carter to identify someone from outside DoD who is deeply knowledgeable about the defense enterprise and eminently qualified to direct implementation of the OSD reductions and report to the Deputy Secretary.

In addition to the measures I’ve described, the review identified additional consolidations and mission reductions that could be required if sequester-level caps are imposed over the long term. These measures include consolidations of regional combatant commands, defense agency mission cuts, and further IT consolidation.

These changes would be far-reaching and require further analysis and consideration. Though defense bureaucracies are often derided, the fact is that these offices perform functions needed to manage, administer, and support a military of our size, complexity and global reach.

Even over the course of a decade, the cumulative savings of the most aggressive efficiency options identified by the review are $60 billion. That is a very small fraction of what is needed under sequester-level cuts. We will have to look elsewhere for savings.



The review confirmed that no serious attempt to achieve significant savings can avoid compensation costs, which consume roughly half of the DoD budget. If left unchecked, pay and benefits will continue to eat into readiness and modernization. That could result in a far less capable force that is well-compensated but poorly trained and poorly equipped.

Any discussion of compensation should acknowledge the following:

• No one in uniform is “overpaid” for what they do for this country;

• People are DoD’s most important asset – and we must sustain compensation packages that recruit and retain the finest military in the world;

• The significant military pay and benefit increases over the last decade reflected the need to sustain a force under considerable stress – especially the Army and Marines – during the height of the Iraq and Afghanistan campaigns;

• One post-9/11 war is over, and the second – our nation’s longest war – is coming to an end;

• Overall personnel costs have risen dramatically – some 40 percent above inflation since 2001.


The Department cannot afford to sustain this growth. Reflecting these realities, the President’s Fiscal Year 2014 budget included a package of modest compensation-related reforms that have the strong support of our uniformed leadership. Congress has signaled its opposition to some of these proposals, including modest increases in TRICARE fees for working age retirees. But given our current fiscal situation, DoD has no choice but to consider compensation changes of greater magnitude for military and civilian personnel.

The review developed compensation savings options that we believe would continue to allow the military to recruit and retain the high quality personnel we need. If we were to pursue these options, we would need Congress’ partnership to implement many of them. Examples include:

• Changing military health care for retirees to increase use of private-sector insurance when available;

• Changing how the basic allowance for housing is calculated so that individuals are asked to pay a little more of their housing costs;

• Reducing the overseas cost of living adjustment;

• Continuing to limit military and civilian pay raises.

Many will object to these ideas – and I want to be clear that we are not announcing any compensation changes today. Instead, I’ve asked Chairman Dempsey to lead an effort with the Service Chiefs and Senior Enlisted Advisors to develop a package of compensation proposals that meets savings targets identified in the review – almost $50 billion over the next decade – and still enables us to recruit and retain a high-quality force. We would begin implementing this package in the FY 2015 budget. Senior OSD staff will lead a similar review for civilian pay and benefits.

The review also identified more sweeping changes to meet sequester-level targets – such as eliminating civilian pensions for retired military personnel serving in civilian government service, ending subsidies for defense commissaries, and restricting the availability of unemployment benefits. This package would yield savings of almost $100 billion over the next decade, but would have a significant impact on our service members and our workforce. But a sequester-level scenario would compel us to consider these changes because there would be no realistic alternative that did not pose unacceptable risk to national security.



The efficiencies and compensation reforms identified in the review – even the most aggressive changes – still leave DoD some $350 to $400 billion short of the $500 billion in cuts required by sequestration over the next ten years. The review had to take a hard look at changes to our force structure and modernization plans.

The President’s Defense Strategic Guidance anchored this effort. The goal was to find savings that best preserved the tenets of the President’s strategy, such as strategic deterrence, homeland defense, and the rebalance to the Asia-Pacific. The review concluded we should not take reductions proportionally across the military services. Instead, the options we examined were informed by strategy, and they will guide the services as they build two sets of budgets for FY 2015 through 2019 – one at the President’s budget level and one at sequester-level caps.

While we want to preserve flexibility for each military service to develop the best force possible given reduced resources, the review identified areas where we have excess capacity to meet current and anticipated future needs. In particular, the analysis concluded that we can strategically reduce the size of our ground and tactical air forces – even beyond the current draw down.

I have not made any program or force structure decisions, and more analysis will be required before these decisions are made. But with the end of the war in Iraq, the drawdown in Afghanistan, and a changing requirement to conduct protracted, large-scale counterinsurgency operations, it makes sense to take another look at the Army’s force structure – which is currently planned to reach 490,000 in the active component and 555,000 in the reserves.

One option the review examined found that we could still execute the priority missions determined by our defense strategy while reducing Army end-strength to between 420,000 and 450,000 in the active component and between 490,000 and 530,000 in the Army reserves. Similarly, the Air Force could reduce tactical aircraft squadrons – potentially as many as five – and cut the size of the C-130 fleet with minimal risk.

In the months ahead I will work closely with Chairman Dempsey and each of the Service Chiefs to reach agreement on the proper size of our armed forces, taking into account real-world needs and requirements.

A modest reduction in force structure, when combined with management efficiencies and compensation reforms, would enable us to meet the $150 billion in savings required by the President’s budget proposal while still defending the country and fulfilling our global responsibilities. We can sustain our current defense strategy under the President’s budget request.

Significant reductions beyond the President’s plan would require many more dramatic cuts to force structure. The review showed that the “in-between” budget scenario we evaluated would “bend” our defense strategy in important ways, and sequester-level cuts would “break” some parts of the strategy no matter how the cuts were made. Under sequester-level cuts, our military options and flexibility will be severely constrained.

Given that reality, the review examined two strategic approaches to reducing force structure and modernization that will inform planning for sequester-level cuts. The basic trade-off is between capacity – measured in the number of Army brigades, Navy ships, Air Force squadrons and Marine battalions – and capability – our ability to modernize weapons systems to maintain our military’s technological edge.

In the first approach, we would trade away size for high-end capability. This would further shrink the active Army to between 380,000 and 450,000 troops, reduce the number of carrier strike groups from 11 to eight or nine, draw down the Marine Corps from 182,000 to between 150,000 and 175,000, and retire older Air Force bombers. We would protect investments to counter anti-access and area-denial threats, such as the long range strike family of systems, submarine cruise-missile upgrades, and the Joint Strike Fighter. And we would continue to make cyber capabilities and special operations forces a high priority.

This strategic choice would result in a force that would be technologically dominant, but would be much smaller and able to go fewer places and do fewer things, especially if crises occurred at the same time in different regions of the world.

The second approach would trade away high-end capability for size. We would look to sustain our capacity for regional power projection and presence by making more limited cuts to ground forces, ships and aircraft. But we would cancel or curtail many modernization programs, slow the growth of cyber enhancements, and reduce special operations forces.

Cuts on this scale would, in effect, be a decade-long modernization holiday. The military could find its equipment and weapons systems – many of which are already near the end of their service lives – less effective against more technologically advanced adversaries. We also have to consider how massive cuts to procurement and research and development funding would impact the viability of America’s private sector industrial base.

These two approaches illustrate the difficult trade-offs and strategic choices that would face the department in a scenario where sequester-level cuts continue. Going forward, in the months ahead, DoD – and ultimately the President – will decide on a strategic course that best preserves our ability to defend our national security interests under this very daunting budget scenario.

The balance we strike between capability, capacity and readiness will determine the composition and size of the force for years to come. We could, in the end, make decisions that result in a very different force from the options I’ve described today. Our goal is to be able to give the President informed recommendations, not to pre-judge outcomes. Regardless, the decision-making process will benefit from the insights this review provided.

In closing, one of the most striking conclusions of the Strategic Choices and Management Review is that if DoD combines all the reduction options I’ve described – including significant cuts to the military’s size and capability – the savings fall well short of meeting sequester-level cuts, particularly during the first five years of these steep, decade-long reductions.

The reality is that cuts to overhead, compensation, and forces generate savings slowly. With dramatic reductions in each area, we do reach sequester-level savings – but only towards the end of a 10-year timeframe. Every scenario the review examined showed shortfalls in the early years of $30-35 billion.

These shortfalls will be even larger if Congress is unwilling to enact changes to compensation or adopt other management reforms and infrastructure cuts we proposed in our fiscal year 2014 budget. Opposition to these proposals must be engaged and overcome, or we will be forced to take even more draconian steps in the future.

A lot has been said about the impact of sequestration. Before this review, like many Americans, I wondered why a 10 percent budget cut was in fact so destructive. Families and businesses trim their costs by similar proportions. But this analysis showed in the starkest terms how a 10 percent defense spending reduction causes in reality a much higher reduction in military readiness and capability. Unlike the private sector, the federal government – and the Defense Department in particular – simply does not have the option of quickly shutting down excess facilities, eliminating entire organizations and operations, or shedding massive numbers of employees – at least not in a responsible, moral and legal way.

The fact is that half of our budget – including areas like compensation where we need to achieve savings – is essentially off limits for quick reductions. Given that reality, the only way to implement an additional, abrupt 10 percent reduction in the defense budget is to make senseless, non-strategic cuts that damage military readiness, disrupt operations, and erode our technological edge. We have already seen some of the significant effects of the $37 billion reduction occurring in this fiscal year – including halting all flying for some Air Force squadrons, cancelling ship deployments, ending Army Combat Training Center rotations for brigades not deploying to Afghanistan, and imposing furloughs for 650,000 DoD civilians.

In Fiscal Year 2014, this damage will continue if sequestration persists. DoD is now developing a contingency plan to accommodate the $52 billion sequester-level reduction in fiscal year 2014, which I outlined in a letter this month to Senate Armed Services Committee Chairman Levin and Ranking Member Inhofe. Congress will need to help us manage these deep and abrupt reductions responsibly and efficiently.

The bold management reforms, compensation changes and force structure reductions identified by the Strategic Choices and Management Review can help reduce the damage that would be caused by the persistence of sequestration in Fiscal Year 2014, but they won’t come close to avoiding it altogether.

The review demonstrated that making cuts strategically is only possible if they are “backloaded.” While no agency welcomes additional budget cuts, a scenario where we have additional time to implement reductions – such as in the President’s budget – would be far preferable to the deep cuts of sequestration. If these abrupt cuts remain, we risk fielding a force that over the next few years is unprepared due to a lack of training, maintenance, and the latest equipment.

As I mentioned last week at the VFW Convention, a top priority in future year budget plans is to build a ready force, even if that requires further reductions in force structure. No matter the size of our budget, we have a responsibility to defend the country and America’s vital interests around the world. That means crafting the strongest military possible under whatever level of resources we are provided.

DoD has a responsibility to give America’s elected leaders, and the American people, a clear-eyed assessment of what our military can and cannot do in the event of a major confrontation or crisis after several years of sequester-level cuts. In the months ahead, we will continue to provide our most honest and best assessment. And the inescapable conclusion is that letting sequester-level cuts persist would be a huge strategic miscalculation that would not be in our country’s best interests. While I’ve focused today on the impact to DoD, sequester-level cuts would equally harm other missions across government that support a strong economy and a strong national defense by providing support to our service members, veterans, and their families. DoD depends on a strong education system to maintain a pool of qualified recruits, we rely on domestic infrastructure that surrounds our bases and installations, and we count on scientific breakthroughs funded by research and development grants and a strong manufacturing base to maintain our decisive technological edge. All of these areas are threatened by sequestration.

It is the responsibility of our nation’s leadership to work together to replace the mindless and irresponsible policy of sequestration. It is unworthy of the service and sacrifice of our nation’s men and women in uniform and their families. And even as we confront tough fiscal realities, our decisions must always be worthy of the sacrifices we ask America’s sons and daughters to make for our country.


DOD strategy review paints bleak outlook

By Amber Corrin

Jul 31, 2013


In a July 31 press briefing discussing findings of a recent comprehensive strategy review, Defense Secretary Chuck Hagel underscored the tough choices Pentagon decision-makers face amid cuts that could reach nearly $1 trillion.

Hagel, accompanied by Joint Chiefs Vice Chairman Adm. James Winnefeld, said that the sweeping strategic choices and management review he directed earlier this year examined three central scenarios, and that all of them fell short of savings targets.

“To help DOD balance strategic ends, ways and means under these budget scenarios, the Strategic Choices and Management Review scrutinized every aspect of DOD’s budget, including: contingency planning, business practices, force structure, pay and benefits, acquisition practices, and modernization portfolios,” Hagel said. “Everything was on the table.”

The three different scenarios include President Barack Obama’s fiscal 2014 budget, which “backloads” $150 billion in defense cuts over a 10-year period; Budget Control Act sequester cap figures, which cut $52 billion in fiscal 2014 and $500 billion over the next 10 years; and an “in-between” scenario, in which defense spending is reduced by $250 billion over 10 years.


The review was designed to help Pentagon leadership prepare for the possibility of continued sequestration cuts into 2014, as well as to inform out-year budget planning for the services and set the stage for the upcoming Quadrennial Defense Review. Hagel stressed that the review is not a blueprint but an evaluation that produced the three different options.

One area of focus is reducing DOD’s overhead costs, which take up as much as half the Pentagon budget. Hagel said some options should be implemented immediately, regardless of the fate of sequestration.

The focus on overhead and management efficiencies includes measures such as reducing personnel at major military headquarters by 20 percent, starting with the Office of the Secretary of Defense, the Joint Staff, combatant commands and defense agencies. The measures would also consolidate functions in OSD and in intelligence analysis and production, saving up to $40 billion over the next decade.

To help overcome hurdles in executing the cuts, Hagel said he has directed Deputy Defense Secretary Ash Carter to find an expert from outside DOD to help direct implementation.

Force structure and modernization stand to take significant hits in all three scenarios; the cuts would “bend” and could possibly “break” national security strategy, the officials noted.

“Given that reality, the review examined two strategic approaches to reducing force structure and modernization that will inform planning for sequester-level cuts,” Hagel said. “The basic trade-off is between capacity – measured in the number of Army brigades, Navy ships, Air Force squadrons and Marine battalions – and capability – our ability to modernize weapons systems to maintain our military’s technological edge.”

Trading capacity for capability would result in a more technologically dominant military, but one that is smaller and able to perform fewer missions. Opting for size over capability would sustain power projection and presence, but would seriously slow or curtail modernization programs and cripple cyber development, Hagel said.

“Cuts on this scale would, in effect, be a decade-long modernization holiday,” he said. “The military could find its equipment and weapons systems – many of which are already near the end of their service lives – less effective against more technologically advanced adversaries. We also have to consider how massive cuts to procurement and research and development funding would impact the viability of America’s private sector industrial base.”

And even if DOD combined all the different options, it would leave the Pentagon well short of the targeted reduction levels under sequestration, Hagel and Winnefeld noted. They also warned that the shortfall will be even worse if Congress does not cooperate with proposed cuts – any proposal to freeze military pay, for example, could be expected to meet stiff resistance on the Hill.

“The reality is that cuts to overhead, compensation and forces generate savings slowly. With dramatic reductions in each area, we do reach sequester-level savings – but only towards the end of a 10-year timeframe. Every scenario the review examined showed shortfalls in the early years of $30-35 billion,” Hagel said. “These shortfalls will be even larger if Congress is unwilling to enact changes to compensation or adopt other management reforms and infrastructure cuts we proposed in our fiscal year 2014 budget. Opposition to these proposals must be engaged and overcome, or we will be forced to take even more draconian steps in the future.”


Army’s Problems go Deeper Than SCMR

Defense News

Aug. 1, 2013 – 05:29PM | By PAUL McLEARY


WASHINGTON — Few in the Pentagon or the defense industry liked what they heard Wednesday at Defense Secretary Chuck Hagel’s press conference announcing the findings of his four-month Strategic Choices and Management Review (SCMR).

But the Army probably didn’t like what it didn’t hear, as well.

The secretary laid out two paths if Congress and the White House remain unable to reach a budgetary “grand bargain” that would reverse the $500 billion in budget cuts that the Pentagon will face over the next decade, beginning with a $52 billion hit in 2014.

One plan would prioritize high-end capabilities over end-strength numbers, while the other would keep end-strength while sacrificing modernization and research and development on next-generation platforms.

While Hagel was short on specifics when it came to platforms that would or wouldn’t be modernized, the secretary provided a hint when he said “we would protect investments to counter anti-access and area-denial threats, such as the long range strike family of systems, submarine cruise-missile upgrades and the Joint Strike Fighter. And we would continue to make cyber capabilities and special operations forces a high priority.”

Notice anything there? No Army platforms were mentioned, save those few presumably used by special operators.

That doesn’t mean that key Army modernization priorities like the ground combat vehicle or joint light tactical vehicle are doomed under Hagel’s scenario. But being left out of a roll call of the Pentagon’s highest priorities may make some people nervous.
Not only was the Army left out of the list of critical programs, but under either plan it would also take by far the largest hit in terms of end strength.

With the end of US involvement in Iraq and the ongoing drawdown in Afghanistan, the Pentagon has concluded that the service could drop to between 380,000 and 450,000 soldiers, numbers that haven’t been seen since before World War II.

The wartime high of 570,000 grunts was always seen as a temporary spike — save for some in the Army who wanted to keep what they had gained — but the service is working on culling 80,000 troops to reach 490,000 by the end of 2017.

When it comes to prioritizing modernization vs. end-strength cuts, “I would suspect the first impulse would be to protect as much combat power as you could,” said Maren Leed, a former adviser to the US Army chief of staff who is now at the Center for Strategic and International Studies.

Doing so would mean deeply reducing headquarters elements, combining major commands, and the “further pooling of enablers at higher echelons to get those as lean as they can before they start going after combat power.”

That said, “the Army’s going to have a difficult time holding on to as much [combat power] as they might like to” if the cuts are not reversed, she said.

The dirty little secret in the rush to gain some cost savings, however, is that even letting go of 80,000 soldiers won’t actually save the Army a dime.

All of the funds earmarked for paying those soldiers over the 490,000 threshold come from supplemental war accounts, and don’t count toward any sequestration savings — which means barring deeper and faster cuts, the service won’t save any money on force reductions until the fiscal 2018 budget.


And the service desperately needs those savings. The Army already spends 46 percent of its budget on compensation, a number that service chief Gen. Ray Odierno has warned will rise to 80 percent in a decade if compensation trends continue.

What’s more, even forcibly separating soldiers won’t reduce the strain on budgets all that much. Service contracts include provisions for unemployment and other benefits for about a year after a soldier leaves the force, so the service still has to pay for former soldiers months after they separate.

Taking soldiers out of the ranks is one thing. Resizing units to reflect those reductions while still retaining combat punch is another. The Army announced in June that it was reducing the number of brigade combat teams from 45 to 33 — while protecting its overall combat wallop by keeping 95 out of its 98 combat battalions.

The plan is to take the cuts in headquarters positions across the brigades while increasing the number of maneuver battalions in each brigade from two to three and adding engineering and fires capabilities to each unit.

Odierno called the moves “one of the largest organizational changes probably since World War II” for the service.

“If we go through full sequestration there’s going to be another reduction in brigades, there’s no way around it,” Odierno warned, adding that there will likely be more cuts coming in the heavy armor brigades, sequestration or not.

Fewer brigades, fewer soldiers, less money, and an uncertain modernization profile. With all of this in flux, what missions will the Army prioritize in the future?

“The most important thing that they’ve got to be concerned about is the Korean war plan since it doesn’t necessarily align that well with all the other things the Army believes it also needs to be doing,” Leed said. Those missions include things that Army leadership has spent a lot of time talking about in recent months, such as partner engagement in Asia and Africa, humanitarian response and training for contingencies spanning counterinsurgency to peer combat.

But the continuing instability on the Korean peninsula will mean that “they will be highly reluctant to take risk [in Korea] because of the criticality of it.”

The Army National Guard and reserve — much used in the past decade of conflict but largely spared from the current round of drawdowns in ground force end strength — would also be due for a haircut absent any grand bargain. They would fall from 555,000 soldiers to between 490,000 and 530,000 under the two scenarios.

One of the key questions to be considered when taking combat power out of the active force, but trying to maintain capability in the Guard and reserve, is to what degree the Guard can mitigate various kinds of risk. “Much of the Guard is not particularly well suited to meet the short-term risk in Korea,” for example, Leed said. But “when you’re talking about missions that align well with their competencies they can step in almost immediately.” Missions such as small unit training and advising, medical support, engineering and partnership missions are things that the Guard has traditionally performed well.

When it comes down to it, under any sequestration scenario “the whole Army would be the bill payer,” said Thomas Donnelly, a defense policy analyst at the American Enterprise Institute. Any cut to end strength or modernization would affect the other in serious ways, and would also impact the way the service could respond to contingencies.

The Pentagon has now laid out its thinking — absent any major change in national defense strategy — and now Congress and the White House will have their say.


Senate Panel Approves $594.2B DoD Bill Despite Worries About Spending Caps

Defense News

Aug. 1, 2013 – 12:35PM | By JOHN T. BENNETT

The Senate Appropriations Committee on Thursday approved a $594.2 billion Pentagon spending bill, with some Republicans mounting a symbolic resistance because it surpasses federal spending caps.

The panel’s bill, approved 22-8, would give the Pentagon a $516.4 billion base budget and a $77.8 billion war-funding section. The base section’s topline aligns with the Obama administration’s request, while the Overseas Contingency Operations portion would be $8 billion smaller than the White House’s request.

The opening minutes of a full committee markup of the legislation focused on a back-and-forth about the Democratic-controlled panel’s decision to push a bill that’s nearly $20 billion larger than national defense spending caps etched into law by the 2011 Budget Control Act.

“The defense bill alone would exceed the Budget Control Act caps for defense-related spending by nearly $19 billion,” committee ranking member Richard Shelby, R-Ala., said. “That’s not even counting the defense-related spending in the committee’s other bills, which comes to an additional $35.6 billion.” All told, the panel’s 2014 spending bills are $91 billion over defense and domestic spending limits, Shelby said.

Without a major fiscal deal that addresses the spending caps, another sequestration round of across-the-board cuts would be triggered. Instead, Shelby called for “deliberate [cuts] that reflect decisions by Congress about strategic priorities.”

Shelby and seven other Republicans voted against the defense bill, with several echoing the ranking member’s comments.

Chairwoman Barbara Mikulski, D-Md., said the reason her defense and other appropriations bills exceed those caps lies with the House and Senate Budget committees — particularly the lower chamber’s panel.


“We need a topline so we can get to the bottom line,” Mikulski said. “We have marked up our bills to a topline of $1.058 trillion, the level in the American Taxpayer Relief Act, which was approved by the Senate by a vote of 89-8.”

She noted the defense bill and others “assume that sequester will be replaced with a balanced solution of revenues and smart cuts.”

Mikulski criticized the House for, in legislation it has passed, building in “a moat around defense so that all $91 billion in cuts come out of domestic funding bills.”

President Obama on Wednesday, during a meeting with Senate Democrats, indicated he will not support placing such a “moat” around Pentagon spending when fiscal negotiations kick into full steam this fall, according to lawmakers who attended.

In a sign that rank-and-file lawmakers have moved little from ideological stances taken since 2010, Shelby called for “meaningful spending cuts in mandatory accounts,” meaning “entitlement reform.” Democrats largely have opposed deep cuts to such programs; Republicans oppose more new revenues that Democrats want.


Senate Budget Committee Chairwoman Patty Murray, D-Wash., acknowledged that the difference between the House and Senate budgets is a problem.

“Everyone knows there’s going to have to be a compromise at some point,” Murray said during the markup. “We’re not going to solve this by kicking the can to someone else — it’s up to us.”

Sparks also flew during the session over a provision offered by Sen. Tom Udall, D-N.M., then amended by Sen. Dianne Feinstein, D-Calif.

Udall’s measure would cut off funding for any US operation in Syria that violates the War Powers Act. After several senators raised concerns, Feinstein inserted language specifically pointing to the US armed services and a military operation.

The measure passed via voice vote, with several, including Sen. Lindsey Graham, R-S.C., audibly voting no.

Graham argued against both the initial and the revised amendment, arguing it would trample on the president’s constitutional powers as commander in chief.

“To my Republican colleagues who suddenly support the War Powers Act, where were you during the Bush years?” Graham said.

The 1973 law requires presidents to secure congressional approval for military operations within 60 days, or withdraw forces within the next 30. Since it was passed in the wake of the Vietnam War, Congress largely has looked the other way when presidents from both political parties have launched military operations that stretched into a 61st day and beyond.


Furlough appeals keep coming

Washington Post

By Lisa Rein, Published: August 2 at 6:00 am

The small agency receiving thousands of appeals from furloughed federal employees has delayed docketing and hearing most of them until the volume slows down, officials said Thursday.

The surge of challenges arriving at the Merit Systems Protection Board hit 13,090 on Thursday, with 1,900 of those arriving by fax, snail mail and electronically on Wednesday alone. That’s double the number of appeals that had come in a week earlier.

With its support staff and administrative judges overwhelmed, the merit board has put a hold on all the cases from Defense Department civilians, the largest group of federal workers taking unpaid days to meet the budget cuts known as sequestration.

The board posted a message on its Web site this week that said: “Due to the unprecedented large volume of furlough appeals being received from employees of the military services and Department of Defense activities — and after much consideration — MSPB has decided to delay the processing and adjudication of these appeals.”

General Counsel Bryan Polisuk said that once the Defense appeals slow down after Aug. 8 — the 30-day filing deadline from the start of furloughs on July 8 — the staff will resume sorting through them.

“We’ll be in a better position [then] to see what we have, and start making decisions on how to adjudicate these cases,” he said. “Given the volume of furlough appeals, it’s going to be difficult for our regions to be as responsive in the near future as they normally are.” The merit board has eight offices across the country.

Appeals from employees furloughed from other agencies, including the Internal Revenue Service, the Environmental Protection Agency and the Department of Housing and Urban Development, are moving forward.

Administrative judges have ruled on 11 appeals so far, one from an EPA employee and 10 from the Federal Aviation Administration, which took one furlough day in April before Congress gave the agency authority to redirect money from other areas of its budget to shore up staffing and operations.

The ruling upheld the furloughs, meaning that the employees will not get back pay.

The high volume of appeals temporarily knocked out the merit board’s electronic “e-Appeal” service several times this week. Polisuk said it was quickly restored.

The appeals represent a new attack by federal employee unions on the Obama administration’s decision to furlough about 775,000 employees this fiscal year to help meet $85 billion in budget cuts.

The unions say federal agencies had other options than forcing employees to lose multiple days of pay. They are holding seminars with their locals across the country to instruct their members how to file appeals, which seek to cancel the furloughs and recover back pay.

Several unions have asked the merit board to consider the appeals as class-action cases, a request that is under consideration.

It’s unclear whether the merit board will meet its average turnaround time of 120 days to decide the cases, Polisuk said.

“It’s a very fluid situation.”



Do We Need A U.S. Department Of Technology?

A cabinet-level Dept. of Tech will help the U.S. retain its leadership position in the global high-technology sector, argues former Department of Transportation CIO Nitin Pradhan.

By Nitin Pradhan, InformationWeek

August 01, 2013



I came to the U.S. in the early 1990s on a fellowship from an American university. I was exploring several countries, including Australia, Canada and New Zealand, but I decided on the U.S. because of its leadership in technology. I have been fortunate to have a great career here, including being appointed by President Obama as the CIO of the Department of Transportation and the FAA in his first term.

Now my “iKids” are growing up and I want them to thrive in the U.S. technology industry. But will this industry continue to flourish in the U.S.? What can we do today to ensure our continued technology leadership in the future?


Why Tech Industries Grow

Tech industries grow because of the availability of research and development dollars, a high-quality education system, a tech-savvy workforce, a large local technology marketplace and government incentives. The U.S. has no intrinsic advantage in the technology industry. Past wins have been a function of dollars invested, bipartisan leadership and lack of global competition.


However, now the global competition is heating up — just ask Apple, HP, Ericsson and Boeing, and they’ll tell you Samsung, Lenovo, Huawei and Airbus are tough global competitors. South Korea, China and the European Union governments are investing heavily in the growth of the tech industry. The U.S. government is investing in the technology industry too, but its focus is mainly on defense, and with sequestration these investments are being reduced. I believe a more direct, consolidated, coordinated and planned approach toward technology investments is needed to keep our leadership strong.


Tech Remains the U.S.’s Future

According to TechAmerica Foundation’s 2013 Cyberstats report, U.S. high-tech employment totaled 5.95 million in 2012, with average wages of $93,800 — 98% more than the average private sector wages of $47,000. According to some additional recent data, U.S. high-tech jobs are growing at three times the rate of other private sectors, and each tech job creates more than four jobs in the wider economy, thanks to wages that are 17-27% higher than other sectors. If we want to create more of these jobs for our kids, we need a mechanism to support that future.

We need to create a new cabinet-level Department of Technology (USDoTech) now, while we are still leading in the technology world. The goal of the department should be to drive collaborative public-private technology innovations that maximize public value through private growth.

The notion of a cabinet-level technology department is not new. James Harold Wilson, the former Prime Minister of the United Kingdom, established MinTech, the first Ministry of Technology, in 1964. China today has a Ministry of Science and Technology; India has a Ministry of Communications and Information Technology; and South Korea has MSIP, the Ministry of Science, ICT and Future Planning, whose goal is to “build a creative economy for the happiness of all people.”


How to Create USDoTech?

President Obama, to his credit, established the first chief information officer and chief technology officer positions in the federal government. However, neither has cabinet-level authority, and with few resources available, the impact on the growth of this important sector has been limited.

One way to create a cabinet-level department is by consolidating a number of technology-centric offices spread across various federal agencies that often work in an uncoordinated and sometimes even counterproductive way. Some examples include the technology-focused sections of the Office of Management and Budget (OMB), the Office of Science and Technology Policy (OSTP), the National Science Foundation (NSF), the National Institute of Standards and Technology (NIST) and many others.

Centralizing the technology leadership functions in these departments will give government a clearer, more holistic picture of the needs, challenges, opportunities, and threats for this fast-growing sector, allowing it to more effectively craft policies, legislation, and regulations and promote appropriate public-private investment strategies to keep U.S. leadership dynamic. We don’t need a humongous new department like Homeland Security – just a right-sized, efficient, tech-savvy group that will deliver results.


Congress: Technology Oversight Needs Change Too

As mentioned in my recent article in the Fast Government Report published by The IBM Center for the Business of Government, Congressional governance of federal technology investments is fragmented and therefore not conducive to realizing the benefits of integrated approaches to technology.

Congress has 21 Senate committees, 22 House committees and many more subcommittees, which directly or indirectly have oversight over technology initiatives and investments in federal agencies. However, technology today is highly connected infrastructure, and a holistic view and investment strategy is key to future success. It is therefore essential that Congress establish a technology committee focused on maximizing transformative use of technology and effective involvement of private industry for the benefit of the country.


Next Steps: Get Involved

How do we create the “USDoTech” with this polarized Congress? Crowdsourcing, of course!

If you support the concept of a cabinet-level technology department, forward this article and talk to your friends and family today. “Like” the initiative on Facebook, and suggest the roles and responsibilities for this new department now. Call your Congressional representatives and senators and ask them to enlist Congressional Research Services (CRS), the Government Accountability Office (GAO) and the Congressional Budget Office (CBO) to study the matter and work with the private sector to build a definitive bipartisan plan.

Finally, if your company has a government affairs department, ask it to lobby the federal government to support this worthy cause. Together, we can create this necessary department with no increased cost to taxpayers and keep the U.S. a leader in high-tech for years to come.



Rasmussen Reports

What They Told Us: Reviewing Last Week’s Key Polls

Saturday, August 03, 2013


In this 50-50 nation, Americans remain closely divided over whether government is the problem or the solution.

Just over half of voters still view the federal bailouts of the financial industry negatively, and 56% believe more competition and less government regulation are better for the U.S. financial system than more regulation and less competition.

Fifty-three percent (53%) don’t like President Obama’s national health care law, but 42% do.

The president in budget negotiations with congressional Republicans has proposed cuts in corporate tax rates in exchange for a new federal jobs program. Thirty-eight percent (38%) view Obama’s combination approach as the better way to create new jobs, but nearly as many (36%) think cutting corporate tax rates alone would be more effective. After all, 66% think decisions made by U.S. business leaders to help their own businesses grow will do more to create jobs than decisions made by government officials.

Voters are almost evenly divided these days when asked which party they trust more to handle the 15 important issues regularly tracked by Rasmussen Reports. They continue to trust Republicans most on the number one issue, the economy, and other money issues like taxes, job creation and government spending. They trust Democrats more in areas such as energy policy, the environment, health care and education.

This ambivalence is also reflected in the Generic Congressional Ballot where the two parties have been separated by two points or less every week since mid-April.

Obama’s total job approval held steady at 47% in July, tying the president’s lowest approval rating since December 2011. Fifty-one percent (51%) disapproved of the president’s performance last month.

The president’s daily job approval ratings now are more in line with findings during most of his first term in office.

Still, just 10% of voters now rate Congress’s performance as good or excellent, although this is the first time Congress’s positives have reached double digits this year. Sixty-three percent (63%) believe most members of Congress are willing to sell their vote for cash or campaign contributions, and 55% think it’s at least somewhat likely that their own representative in Congress has sold his or her vote for cash or a campaign contribution.

“To see where the country is heading, ignore Washington,” Scott Rasmussen says in his latest weekly newspaper column, “because most of what the [political] parties talk about is yesterday’s news and is largely irrelevant to the realities of the 21st century.”


And what are those realities?

For the second week in a row, 26% of voters say the country is heading in the right direction. This is the lowest level of confidence since mid-January 2012.

The Rasmussen Employment Index, which measures worker confidence, dropped 10 points in July to its lowest level since last November. As the Index projected, the government released another tepid jobs report on Friday.

Most consumers and investors believe the U.S. economy is currently in a recession.

Thirty-three percent (33%) of Americans believe the U.S. economy will be stronger in a year, but slightly more (39%) feel the economy will be weaker one year from now.

Following news of Detroit’s bankruptcy filing, 55% believe some states may soon be filing for bankruptcy, too.

Thirty-seven percent (37%) think now is a good time for someone in their area to sell a home.

Despite big gains made in the stock market in July, only 22% think the stock market will be higher than it is today a year from now.

Twenty-nine percent (29%) of Working Americans are now looking for work outside of their current company, the highest finding measured since March 2011. But they remain closely divided as to whether staying or leaving offers them the best career opportunities.

Seventy-seven percent (77%) of all Americans think the current minimum wage of $7.25 an hour is not a wage someone can live on, and 61% favor raising the minimum wage to $10.10 an hour by 2015, a proposal that is currently being considered by Congress.

But while Americans recognize that more minimum-wage fast-food jobs are now being held by workers who are over 20, 74% oppose making the minimum wage different for those in different age groups.


In other surveys last week:

– Even before a military judge handed down his guilty verdict against Bradley Manning on Tuesday, 46% of voters said the former Army intelligence analyst was a traitor for leaking government secrets. But just 33% now want Manning to spend the rest of his life in prison. That’s down from 41% two months ago.


– Thirty-two percent (32%) think Edward Snowden, the private contractor who exposed the National Security Agency’s spying on domestic phone calls, is a traitor, unchanged from last month.

– Forty-six percent (46%) favor putting greater restrictions on the NSA’s tracking of Americans’ telephone calls.

– A recent major study suggests that an increasing number of American women are now the primary breadwinners in married couples, but for most men and women, the man remains the chief earner.


August 10 2013




A Call for Cyber Diplomacy

By Joseph Marks

August 2, 2013


LAS VEGAS — If the United States’ goal is peace and security in cyberspace, officials should think less about cyberwar and more about cyber diplomacy, a scholar from the Atlantic Council said on Thursday.

For all of the talk about lightning attacks that come out of nowhere and the often inscrutable language of experts, cyber conflicts at the national level tend to mirror traditional conflicts much more than you’d expect, Jason Healey, director of the council’s Cyber Statecraft Initiative, said during the Black Hat security convention here.

By the time a major attack, say, against the U.S. financial system is being dealt with at the top levels of government, the stakes and strategies are very similar to a traditional national security crisis, said Healey, who was director for cyber infrastructure protection at the White House from 2003 to 2005.

“What’s happening here is not that different from a coup in Pakistan,” he said. On a tactical level, government responders will be calling up bankers and trying to help secure their servers rather than securing embassy staff. But, at a strategic level, the response should be similar, he said.

That means “the president needs to get on the phone with Mr. Putin,” or whichever leader is likely behind the attack, and figure out a way to make it stop, he said.

Healey was editor of the Atlantic Council book A Fierce Domain: Conflict in Cyberspace, 1986 to 2012. While individual cyberattacks may happen with lightning speed, large cyber conflicts such as Russia’s 2007 attacks on Estonia and the Stuxnet attack, reportedly launched by the U.S. against Iranian nuclear facilities, tend to unfold over time and allow for thoughtful decision-making by top leaders.

Cyber diplomacy, he argued, can also be strategic and thoughtful. Some of this diplomacy has taken place in multilateral contexts, he said, such as recent discussions about whether the laws of war apply in cyberspace. More often, it will happen in a bilateral context such as the recent agreement between the U.S. and Russia to install a cyber hotline.


Surveillance scandal rips through hacker community

The good ol’ days of chummy games of “Spot the Fed” at Defcon are finished as hackers and security entrepreneurs plan next steps in the wake of government spying revelations.

by Seth Rosenblatt August 6, 2013 4:00 AM PDT



LAS VEGAS — It used to be that the playful Defcon contest of “Spot the Fed” gave hackers and the government agents tracking them a chance to interact in a less serious manner.

Hackers who found a government agent among the conference attendees would wear with pride T-shirts that read, “I spotted the Fed.” The agent would be given a shirt that read, “I am the Fed.” And by flipping the cat-and-mouse dynamic for at least one weekend a year, the two groups more or less came to a greater understanding of each other.

The relationship had gotten friendly enough so that when Gen. Keith Alexander, the director of the National Security Agency, visited Defcon for the first time last year, the conference’s founder Jeff Moss told CNET, “It’s eye-opening to see the world from their view,” and noted that he had wanted to score an official visit from the NSA since Defcon began.

It would go too far to say that the uneasy marriage of friendship between the two groups now lies torn asunder in an acrimonious divorce. Hackers, information security professionals, and security experts looking to turn their knowledge into businesses won’t stop working or communicating with the U.S. government. But the response to the scandal has driven many of them back to their more skeptical roots.

“What we need to realize is that [Gen. Alexander] is asking us to destroy ourselves,” said Moxie Marlinspike, a well-known information security professional and entrepreneur who has had equipment seized and returned and been detained but never charged by the U.S. government.

“The types of programs that he’s developing are part of this trend of more efficient law enforcement, toward this world of 100 percent effective law enforcement,” said Marlinspike, who uses the alias to protect his legal name.

Marlinspike told CNET that he thinks the NSA is interested in hiring hackers because they tend to have an approach to critical thinking that produces an unusual mindset. Hackers are people, he said, who are “not always abiding by the letter of the law, but are not exactly harmful, either.”

“The problem is that he’s asking us to participate in the destruction of the zone where hackers exist,” Marlinspike said.


No single hacker voice on NSA

Information security professionals are not unified in their interpretation of Alexander’s attempt at a mea culpa at last week’s Black Hat conference here. Alex Stamos, a network infrastructure and security expert and the chief technical officer of Artemis, the company proposing the .secure top-level domain for a safer Internet, said that Alexander was actually aiming his talk not at independent security researchers but at the security teams at security companies.

“If you’re a security researcher at a large cloud company, you have to include the NSA on your list of threats that you have to protect against,” he said. Stamos has done government security consulting in the past, although he told an audience at his Defcon session discussing professional ethics for “white hat” or “good guy” hackers that he would reconsider doing so in the future.

Christopher Soghoian, a senior policy analyst and principal technologist with the American Civil Liberties Union, agreed. “I think you’ve got an extra threat in your threat model, and that threat is the NSA.”

Marc Maiffret, a white hat hacker who narrowly avoided serious legal consequences for his teen hacking and has testified in front of Congress on security issues, said that the situation is more than a little ironic.

“We don’t want the NSA to monitor anything, but the whole goal of what [security professionals] do for the most part is to monitor everything. We should have the same safeguards to make sure that those abuses aren’t happening,” he said, referring to the recent surveillance revelations leaked by Edward Snowden.

The ACLU’s Soghoian said that the lack of public discussion is at the core of the problem and has impeded the government’s achieving its stated security-and-safety goals.

“The FBI has a unit now that does nothing but hack into people’s computers, extract documents, control Webcams,” he said. “The FBI’s role as an offensive cyber actor significantly undermines their cause. How can an agency warn people about malware when it’s using malware itself?”

One security start-up that had an encounter with the FBI was Wickr, a privacy-forward text messaging app for the iPhone with an Android version in private beta. Wickr’s co-founder Nico Sell told CNET at Defcon, “Wickr has been approached by the FBI and asked for a backdoor. We said, ‘No.'”

The mistrust runs deep. “Even if [the NSA] stood up tomorrow and said that [they] have eliminated these programs,” said Marlinspike, “How could we believe them? How can we believe that anything they say is true?”

Where does security innovation go next?

The immediate future of information security innovation most likely lies in software that provides an existing service but with heightened privacy protections, such as webmail that doesn’t mine you for personal data.


Wickr’s Sell thinks that her company has hit upon a privacy innovation that a few others are also doing, but many will soon follow: the company itself doesn’t store user data.

“[The FBI] would have to force us to build a new app. With the current app there’s no way,” she said, that they could incorporate backdoor access to Wickr users’ texts or metadata.

“Even if you trust the NSA 100 percent that they’re going to use [your data] correctly,” Sell said, “Do you trust that they’re going to be able to keep it safe from hackers? What if somebody gets that database and posts it online?”
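The architecture Sell describes, where the service itself holds nothing readable, can be sketched as a toy end-to-end scheme. This is illustrative only: the `RelayServer` class and one-time-pad cipher below are hypothetical stand-ins, not Wickr's actual protocol. The point is structural: clients encrypt before sending, and the relay stores only opaque ciphertext it has no key for.

```python
import secrets

class RelayServer:
    """Toy message relay: stores and forwards opaque blobs only."""
    def __init__(self):
        self.mailbox = []

    def store(self, blob: bytes):
        self.mailbox.append(blob)   # the server never sees a key or plaintext

    def fetch(self) -> bytes:
        return self.mailbox.pop(0)

def encrypt(key: bytes, msg: bytes) -> bytes:
    # One-time-pad XOR: illustrative only, NOT a real messaging cipher.
    return bytes(k ^ m for k, m in zip(key, msg))

decrypt = encrypt  # XOR with the same pad is its own inverse

# Sender and recipient share a key out of band; the server never does.
msg = b"meet at noon"
key = secrets.token_bytes(len(msg))

server = RelayServer()
server.store(encrypt(key, msg))

assert server.mailbox[0] != msg             # the server's copy is unreadable
assert decrypt(key, server.fetch()) == msg  # the recipient recovers the text
```

Under this design, a subpoena or a backdoor request aimed at the server yields only ciphertext; compromising users' messages would require changing the client itself, which matches Sell's point that the FBI "would have to force us to build a new app."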

To that end, she said, people will start seeing privacy innovation for services that don’t currently provide it. Calling it “social networks 2.0,” she said that social network competitors will arise that do a better job of protecting their customers’ privacy, and predicted that some of those that succeed will do so because of their emphasis on privacy.

Abine’s recent MaskMe browser add-on and mobile app for creating disposable e-mail addresses, phone numbers, and credit cards is another example of a service that doesn’t have access to its own users’ data.

Stamos predicted changes in services that companies with cloud storage offer, including offering customers the ability to store their data outside of the U.S. “If they want to stay competitive, they’re going to have to,” he said. But, he cautioned, “It’s impossible to do a cloud-based ad supported service.”

Soghoian added, “The only way to keep a service running is to pay them money.” This, he said, is going to give rise to a new wave of ad-free, privacy protective subscription services.

The problem with balancing privacy and surveillance, he said, is that the wireless carriers are not interested in privacy. “They’ve been providing wiretapping for 100 years. Apple may in the next year protect voice calls,” he said, adding that the best hope for ending widespread government surveillance lies with the makers of mobile operating systems like Apple and Google.

Not all upcoming security innovation will be focused on that kind of privacy protection. Security researcher Brandon Wiley showed off at Defcon a protocol he calls Dust that can obfuscate different kinds of network traffic, with the end goal of preventing censorship.

“I only make products about letting you say what you want to say anywhere in the world,” such as content critical of governments, he said. Encryption can hide the specifics of the traffic, but some governments have figured out that they can simply block all encrypted traffic, he said. The Dust protocol would change that, he said, making it hard to tell the difference between encrypted and unencrypted traffic.

It’s hard to build encryption into pre-existing products, Wiley said. “I think people are going to make easy-to-use, encrypted apps, and that’s going to be the future.”
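As a rough illustration of the traffic-shaping idea Wiley describes (a toy re-encoding, not Dust's actual protocol), high-entropy ciphertext can be re-encoded into an innocuous alphabet, so a censor keying on "random-looking" binary payloads sees only lowercase text, while the receiver reverses the transform exactly:

```python
import secrets
import string

LETTERS = string.ascii_lowercase  # 26-symbol "innocuous" alphabet

def disguise(data: bytes) -> str:
    """Encode arbitrary bytes as two lowercase letters per byte (base 26),
    hiding the statistical signature of an encrypted binary payload."""
    return "".join(LETTERS[b // 26] + LETTERS[b % 26] for b in data)

def reveal(text: str) -> bytes:
    """Invert disguise(): rebuild each byte from its two-letter pair."""
    pairs = [text[i:i + 2] for i in range(0, len(text), 2)]
    return bytes(LETTERS.index(a) * 26 + LETTERS.index(b) for a, b in pairs)

ciphertext = secrets.token_bytes(16)   # stands in for encrypted traffic
wire = disguise(ciphertext)            # what an observer on the path sees
assert wire.isalpha()                  # looks like plain text, not a cipher
assert reveal(wire) == ciphertext      # the endpoint gets the bytes back
```

A real system would also mimic the timing, lengths, and byte distributions of a chosen cover protocol; this sketch shows only the core reversible-disguise step.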

Longer-term solutions

Right now, the intersection of individual information security experts, governments, and private companies large and small is at a crisis point. How they untangle and compromise their competing interests could have far-reaching ramifications for all.

Maiffret, the teen hacker turned respected infosec expert both inside and outside the infosec community, thinks that the government is going to have to give up some ground.

“I think they know they need to say more. How do you say more, and not jeopardize things,” is the question, he said. “There is a better middle ground. It’s just like businesses accepting social media, it won’t happen overnight.”

Companies could face severe consequences from their security experts, said Stamos, if the in-house experts find out that they’ve been lied to about providing government access to customer data. You could see “lots of resignations and maybe publicly,” he said. “It wouldn’t hurt their reputations to go out in a blaze of glory.”

Perhaps not surprisingly, Marlinspike sounded a hopeful call for non-destructive activism on Defcon’s 21st anniversary. “As hackers, we don’t have a lot of influence on policy. I hope that’s something that we can focus our energy on,” he said.


One Kickstarter Could Revolutionize 3D Printing

By Rachel Feltman

August 2, 2013


One of the biggest obstacles to at-home 3D printing is the difficulty of designing objects to print, but the days of struggling to learn CAD or trawling for templates online may be over. Fuel3D, Inc. has just launched (and, in one day, successfully funded) a Kickstarter for a handheld, point-and-shoot, full color 3D scanner. That means you can now print anything you can take a picture of—for about $1,000.

3D scanners aren’t new, but other handheld devices cost upwards of $15,000, according to Fuel3D (they can’t be too far off, since all our attempts to verify led to intimidating “call for pricing” pages). The tech of this particular model was first developed for medical imaging at Oxford University, and unlike a traditional camera, which just captures enough data to provide the illusion of depth, the Fuel3D actually records the geometry of an object. It uses geometric stereo (capturing an image from two different locations to perceive depth) and photometric stereo (capturing an image under different lights to perceive depth) simultaneously.
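The geometric-stereo half of that approach rests on the standard two-view relation: a feature's apparent shift between two camera positions (its disparity) is inversely proportional to its depth, Z = f * B / d, where f is the focal length in pixels, B the camera baseline, and d the disparity. A minimal sketch with made-up numbers (not Fuel3D's calibration):

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Classic two-view stereo relation: depth Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 1000 px focal length, lenses 5 cm apart,
# and a feature that shifts 25 px between the two views.
z = depth_from_disparity(1000.0, 0.05, 25.0)
assert abs(z - 2.0) < 1e-9   # the point sits about 2 meters away
```

Photometric stereo complements this by recovering fine surface orientation from how brightness changes under different lights, which is why combining the two yields both overall shape and surface detail.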





States Consider Regulation of Drones in US Skies

Associated Press

By LISA CORNWELL Associated Press

CINCINNATI August 4, 2013 (AP)


Thousands of civilian drones are expected in U.S. skies within a few years and concerns they could be used to spy on Americans are fueling legislative efforts in several states to regulate the unmanned aircraft.

Varied legislation involving drones was introduced this year in more than 40 states, including Ohio. Many of those bills seek to regulate law enforcement’s use of information-gathering drones by requiring search warrants. Some bills have stalled or are still pending, but at least six states now require warrants, and Virginia has put a two-year moratorium on drone use by law enforcement to provide more time to develop guidelines.

Domestic drones often resemble the small radio-controlled model airplanes and helicopters flown by hobbyists and can help monitor floods and other emergencies, survey crops and assist search-and-rescue operations. But privacy advocates are worried because the aircraft can also carry cameras and other equipment to capture images of people and property.

“Right now police can’t come into your house without a search warrant,” said Ohio Rep. Rex Damschroder, who has proposed drone regulations. “But with drones, they can come right over your backyard and take pictures.”

Since 2006, the Federal Aviation Administration has approved more than 1,400 requests for drone use from government agencies and public universities wanting to operate the unmanned aircraft for purposes including research and public safety. Since 2008, approval has been granted to at least 80 law enforcement agencies.

But the FAA estimates that as many as 7,500 small commercial unmanned aircraft could be operating domestically within the next few years. A federal law enacted last year requires the FAA to develop a plan for safely integrating the aircraft into U.S. airspace by September 2015.

Damschroder’s proposed bill would prohibit law enforcement agencies from using drones to get evidence or other information without a search warrant. Exceptions would include credible risks of terrorist attacks or the need for swift action to prevent imminent harm to life or property or to prevent suspects from escaping or destroying evidence.

The Republican said he isn’t against drones but worries they could threaten constitutional protections against unreasonable searches and seizures.

“I don’t want the government just going up and down every street snooping,” Damschroder said.

The Ohio House speaker’s office says it’s too soon to comment on the chances for passage. But similar legislation has been enacted in Florida, Tennessee, Idaho, Montana, Texas and Oregon.

The sponsor of Tennessee’s bill said the law was necessary to ensure that residents can maintain their right to privacy.

“Abuses of privacy rights that we have been seeing from law enforcement recently show a need for this legislation,” said Republican Sen. Mae Beavers.

Beavers and Damschroder modeled their bills after one signed into law this year by Florida Gov. Rick Scott, who said then that “we shouldn’t have unwarranted surveillance.”

But the industry’s professional association says regulating law enforcement’s use of unmanned aircraft is unnecessary and shortsighted. It wants guidelines covering manned aircraft applied to unmanned aircraft.

“We don’t support rewriting existing search warrant requirements under the guise of privacy,” said Mario Mairena, government relations manager for the Arlington, Va.-based Association for Unmanned Vehicle Systems International.

The association predicts unmanned aircraft systems will generate billions of dollars in economic impact in the next few years and says privacy concerns are unwarranted.

In Maine, Gov. Paul LePage vetoed the state’s drone-regulating legislation, saying “this bill steps too far” and would lead to lawsuits and harm Maine’s opportunities for new aerospace jobs. He plans to establish guidelines allowing legitimate uses while protecting privacy.

The American Civil Liberties Union supports legislation to regulate drone use and require search warrants, but it would also like weapons banned from domestic drones and limits on how long drone-collected data could be kept, said Melissa Bilancini, an ACLU of Ohio staff attorney.

In North Dakota, Rep. Rick Becker’s bill to ban weapons from drones and require search warrants failed, but the Republican says he plans to try again because “we must address these privacy concerns.”

Democratic Rep. Ed Gruchalla, formerly in law enforcement, opposed Becker’s bill out of concern it would restrict police from effectively using drones.

“We are familiar with drones in North Dakota, and I don’t know of any abuses or complaints,” he said.

Drones can be as small as a bird or have a wingspan as large as a Boeing 737, but a program manager with the International Association of Chiefs of Police says most law enforcement agencies considering unmanned aircraft are looking at ones weighing around 2 pounds that only fly for about 15 minutes.

“They can be carried in the back of a car and put up quickly for an aerial view of a situation without putting humans at risk,” Mike Fergus said, adding that they aren’t suited for surveillance.

Medina County Sheriff Tom Miller in northeast Ohio says his office’s 2-pound drone is intended primarily for search-and-rescue operations and wouldn’t be used to collect evidence without a warrant.

Cincinnati resident Dwan Stone, 50, doesn’t have a problem with some limits.

“But I don’t oppose drones if there is a good reason for using them,” she said.

Chase Jeffries, 19, also of Cincinnati, opposes them.

“I don’t want the government being able to use drones to spy on people,” he said.



Senate Bill Could Further Delay UAS Integration


by Press • 5 August 2013

By Woodrow Bellamy III


FAA’s goal of integrating unmanned aircraft systems (UAS) into the National Airspace System by 2015 could be delayed by a provision in the Senate’s 2014 transportation bill.

The bill requires Transportation Secretary Anthony Foxx to submit a detailed report on UAS integration’s privacy impact to lawmakers before FAA can issue final regulations on the integration of UAS into the NAS. The Senate’s privacy provision adds yet another hurdle to the already time-consuming process of UAS integration, which requires complex technological analysis of how these aircraft will behave within civil airspace and fly alongside commercially operated passenger aircraft, among other measures. UAS industry experts, such as Ben Gielow, government relations manager and general counsel for the Association for Unmanned Vehicle Systems International (AUVSI), are concerned that the provision will further delay integration; most industry advocates already believe FAA will miss its 2015 goal. Integration will allow UAS to be operated across a wide variety of applications, including law enforcement and border control, weather monitoring, oil and gas exploration, disaster management and aerial imaging.




Unmanned Aircraft Demonstration Highlighted at Agronomy Field Day

by Press • 7 August 2013


One of the most exciting recent developments in agriculture is the use of small Unmanned Aircraft Systems to evaluate crop conditions.  Producers, agronomists and the public can view a demonstration of this new technology at the 2013 Agronomy Field Day on Aug. 16 at Kansas State University’s Agronomy North Farm.

The use of these unmanned aircraft systems equipped with aerial optical sensing technology has gained national press attention in recent months.  Kevin Price, K-State professor of agronomy and geography, and Deon van der Merwe, associate professor of veterinary medicine and head of the diagnostic laboratory toxicology section, are among the leading researchers in the nation on this technology.  They are working to develop systems that can be used by the agricultural community.

“We’ve had an incredible reception among consultants, producers, plant breeders and others when we’ve shown them what this technology can do.  It has the potential to make their jobs much easier and will help them make better recommendations in a more timely manner,” Price said.

“At the field day, I will be showing some of the kinds of things we can accomplish with these small Unmanned Aircraft Systems in field demonstrations,” he added.

This is just one of several new technologies featured at the field day by agronomy researchers.  Other demonstrations will show the work of Dave Mengel, professor and soil fertility specialist, on optical sensors for nutrient recommendations; Vara Prasad, associate professor and crop physiologist, on stress tolerance research; and Peter Tomlinson, assistant professor and environmental quality specialist, and Chuck Rice, university distinguished professor of agronomy, on techniques used to measure greenhouse gases.

The field day will begin with registration at 9 a.m. and wrap up at 2 p.m.  There is no charge to attend, and a complimentary lunch will be available.  Preregistration is requested so that a lunch count can be made. Those interested in attending can preregister by calling Troy Lynn Eckart at 785-532-5776. To preregister online, see:

Sessions include two concurrent one-hour tours in the morning, starting at 9:45 and 11 a.m. After lunch, there will be demonstrations on sUAS flights; analyzing and interpreting images from sUAS technology; and field checking of optical sensing readings for crop nutrient status.

In addition, there will be displays from commercial companies and K-State researchers in the shed near the registration area, along with the crop garden, forage garden, and weed garden for browsing.  Extension specialists will be available to answer questions.

For more information, interested persons can contact Dallas Peterson at 785-532-0405 or


Is the unmanned grass truly greener on the civil side?


by Keven Gambold • 9 August 2013

It has been an ‘interesting’ couple of years for Unmanned Experts (UMEX), a small team of ex-military UAS/RPAS operators who have spread their wings into the civilian sector. Initially a US-based company, its personnel were hand-picked from the retiring ranks of the USAF, US Army, British Army and Royal Air Force to offer consultancy, training and managed services based on their substantial operational UAS expertise. The time seemed right to move into ‘civvy street’: the rumors swirling around the uniformed crewrooms were of endless contracts, too few companies and a yearning need. This article is designed to give a ‘peek behind the curtain’ into the current civil UAS/RPAS scene.


From a CONUS perspective, the culture shock was considerable: the far-reaching International Traffic in Arms Regulations (ITAR) effectively prevent US-based firms from even discussing most UAS-based topics with non-US persons, especially when outside the country. An enduring lack of commitment by the Federal Aviation Administration (FAA), despite a Congressional mandate to the contrary, has brought all commercial UAS operations in US airspace to a halt. If you read about a real-estate firm using unmanned aircraft to photograph properties, that activity is likely illegal, and the FAA has prosecuted a number of such enterprises. This FAA reticence has been exacerbated by a set of knee-jerk state-sponsored privacy laws (40 out of 50 states have attempted to enact some form of restrictive regulation) that severely hamstring law-enforcement use of the platforms, and a number of UAVs currently sit on the shelves at sheriff’s offices across the nation. The still substantial DoD market is justifiably competitive, especially with Big Government suffering its own ‘credit crunch’. Not much to rejoice about here.


So ‘change or die’ became the watchword, and UMEX started new companies in both England and Australia, where the respective aerospace agencies, the CAA and CASA, have positively embraced UAS integration efforts (the UK currently has nearly 300 registered civilian commercial operators). Outside of the Missile Technology Control Regime (the MTCR limits data exchange on larger platforms, i.e. those capable of carrying 500 kg over 300 km), there are few restrictions on providing international UAS services. European and international airspace integration is also woefully behind, but some countries have ‘gone it alone’ to considerable success.


Unmanned Experts UK Ltd, drawing on over 25,000 hours of cutting-edge UAS operations and instruction, produced a comprehensive set of ground school courses ranging from one-day Career Workshops to four-week UAS Maintenance Courses. Accreditation is an oft-used but little-understood concept; UMEX courses are approved by George Mason University for CEUs, and we are tied to Southampton and Lincoln Universities in the UK. Since then, UMEX has run courses in the UK, US (for ASPRS among others) and Singapore, with more planned for Turkey, the Middle East, India and Africa.


There is a growing demand for consultancy and managed services in far-flung places as the utility of inexpensive sUAS comes to bear on a number of industries: open-cast mine mapping in Australia, anti-poacher patrols in Namibia, oil platform security in Iraq and search (& rescue) operations in Canada, to name a few. An undercurrent of ‘good’ stories is emerging across the globe to show the true potential that UAS could bring to everyday lives. UMEX is fielding requests from an ever-diversifying client base, and the future looks much brighter.


Bottom Line: the civil market is coming to life and the grass is sprouting, but in the US there’s more watering required.


Keven Gambold

Chief Operations Officer

Unmanned Experts




Washington Post to be sold to Jeff Bezos

Washington Post

By Paul Farhi, Monday, August 5, 4:33 PM


The Washington Post Co. has agreed to sell its flagship newspaper to Amazon founder and chief executive Jeffrey P. Bezos, ending the Graham family’s stewardship of one of America’s leading news organizations after four generations.

Bezos, whose entrepreneurship has made him one of the world’s richest men, will pay $250 million in cash for The Post and affiliated publications to the Washington Post Co., which owns the newspaper and other businesses.

Seattle-based Amazon will have no role in the purchase; Bezos himself will buy the news organization and become its sole owner when the sale is completed, probably within 60 days. The Post Co. will change to a new, still-undecided name and continue as a publicly traded company without The Post thereafter.

The deal represents a sudden and stunning turn of events for The Post, Washington’s leading newspaper for decades and a powerful force in shaping the nation’s politics and policy. Few people were aware that a sale was in the works for the paper, whose reporters have broken such stories as the Pentagon Papers, the Watergate scandals and, in May, disclosures about the National Security Agency’s surveillance programs.

For much of the past decade, however, the paper has been unable to escape the financial turmoil that has engulfed newspapers and other “legacy” media organizations. The rise of the Internet and the epochal change from print to digital technology have created a massive wave of competition for traditional news companies, scattering readers and advertisers across a radically altered news and information landscape and triggering mergers, bankruptcies and consolidation among the owners of print and broadcasting properties.

“Every member of my family started out with the same emotion—shock—in even thinking about” selling The Post, said Donald Graham, the Post Co.’s chief executive, in an interview Monday. “But when the idea of a transaction with Jeff Bezos came up, it altered my feelings.”


Added Graham, “The Post could have survived under the company’s ownership and been profitable for the foreseeable future. But we wanted to do more than survive. I’m not saying this guarantees success but it gives us a much greater chance of success.”

The Washington Post Co.’s newspaper division, of which The Post newspaper is the most prominent part, has suffered a 44 percent decline in operating revenue over the past six years. Although the paper is one of the most popular news sources online, print circulation has dwindled, too, falling another 7 percent daily and Sundays during the first half of this year.

Ultimately, the paper’s financial challenges prompted the company’s board to consider a sale, a step once regarded as unthinkable by insiders and the Graham family itself.

With extraordinary secrecy, Graham hired the investment firm Allen & Co. to shop the paper, company executives said. Allen’s representatives spoke with a half-dozen potential suitors before the Post Co.’s board settled on Bezos, 49, a legendary tech innovator who has never operated a newspaper.

Bezos, in an interview, called The Post “an important institution” and expressed optimism about its future. “I don’t want to imply that I have a worked-out plan,” he said. “This will be uncharted terrain and it will require experimentation.”

He said, “There would be change with or without new ownership. But the key thing I hope people will take away from this is that the values of The Post do not need changing. The duty of the paper is to the readers, not the owners.”

Despite the end of the Graham family’s control of the newspaper after 80 years, Graham and Bezos said management and operations of the newspaper would continue without disruption after the sale.

Post publisher Katharine Weymouth—Graham’s niece and the fourth generation of her family involved in the newspaper—will remain as publisher and chief executive of the Bezos-owned Post; executive editor Martin Baron will continue in his job. No layoffs among the paper’s 2,000 employees are contemplated as a result of the transaction; staff will be told of the sale at a company-wide meeting Monday afternoon.

Bezos said he would maintain his home in Seattle and would delegate the paper’s daily operations to its existing management. “I have a fantastic day job that I love,” he said.

In a note to Post employees on Monday, Weymouth wrote, “This is a day that my family and I never expected to come. The Washington Post Company is selling the newspaper that it has owned and nurtured for eight decades.”

The new owner of The Post may be as much a surprise as the decision to sell the paper in the first place.

Throughout his storied business career, Bezos has been an empire builder but has shown no evident interest in the newspaper business. He has, however, maintained a long friendship with Graham, and each man has informally advised the other over the years. Graham, for example, advised Bezos on how to feature newspapers on the Kindle, Amazon’s popular e-reader.

A computer science and electrical engineering student at Princeton, Bezos used his tech savvy to rise rapidly at a New York hedge-fund company, becoming its youngest senior vice president.

He founded Amazon at 30 with a $300,000 loan from his parents, working out of the garage in his rented home in Bellevue, Wash. He called his creation Amazon in part to convey the breadth of its offerings; early promotions called the site “Earth’s Biggest Bookstore.”

Since Amazon’s founding, Bezos has devoted himself to building it into a retail behemoth that sells everything from diapers to garden equipment to data storage at rock-bottom prices with a click of a mouse. It rang up $61 billion in sales last year.


In the process, Amazon has wreaked havoc on traditional brick-and-mortar stores. Many retailers have expressed dismay, and resentment, at Amazon’s ability to sell the same products at a lower price, in part because of its efficiency but also because it wasn’t collecting sales tax in most states.

For long periods, however, Bezos frustrated investors and analysts who wanted Amazon to turn profits more quickly, or more regularly. Because of heavy investments in warehouses and new businesses, Amazon didn’t deliver a profit until the company’s ninth year of operation, and seven years after selling shares to the public.

At times, Bezos has been openly disdainful of Wall Street’s demands for ever-rising quarterly profits. He told Fortune magazine last year, “The three big ideas at Amazon are long-term thinking, customer obsession, and willingness to invent.”

Under Bezos, the company’s drive into new businesses has been relentless. To supplement its line of Kindle readers and tablets, for example, Bezos pushed Amazon into book publishing itself, upsetting rivals like Barnes & Noble and book agents alike. (Bezos himself is an avid newspaper reader; in addition to The Post, he said he reads the New York Times and Wall Street Journal.)

But Amazon’s breakneck growth has also come with a few stumbles. Among other investments, in 1999 Bezos bought a majority stake in one start-up and paid $60 million for a portion of another, a delivery service; both companies went out of business. An attempt to compete in online auctions wasn’t successful.

As such, an investment in Amazon comes with the likelihood of erratic earnings—and sometimes no earnings at all. The company lost $39 million last year.

Ultimately, however, Amazon has rewarded patient believers. Amazon’s sales have increased almost tenfold since 2004 and its stock price has quadrupled in the past five years. “We believe in the long term,” Bezos told Fortune, “but the long term also has to come.”

Friends and competitors have described Bezos as cerebral, demanding, curious, and given to asking challenging questions. He shows little tolerance for those who are poorly prepared, but can be charming and quick to laugh. “If Jeff is unhappy, wait five minutes,” his wife has said of him.

Bezos’ personal ventures have also given no hint of any interest in the news business. He started a private company called Blue Origin in 2000 to develop a space vehicle, and has acquired land in west Texas as a rocket launch site, both part of a lifelong passion for space travel. He is also reportedly spending $42 million to develop a clock inside a mountain in Texas that is designed to last 10,000 years—a symbol of Bezos’ business philosophy of thinking long-term.

In naming Bezos its “Businessperson of the Year” in 2012, Fortune called him “the ultimate disrupter…[who] has upended the book industry and displaced electronic merchants” while pushing into new businesses, such as TV and feature film production.

His drive and business creativity have earned him favorable comparisons to the late Steve Jobs, Apple’s co-founder and a confidant of Don Graham and his late mother, Post Co. chairman Katharine Graham. Earlier this year, Harvard Business Review ranked Bezos as the second best-performing chief executive in the world during the past decade, following only Jobs, who died in 2011.

In a message to employees on Monday, Don Graham quoted billionaire investor Warren Buffett, a longtime advisor to the Post Co., calling Bezos “the ablest CEO in America.”

Bezos’ reputation and smarts made him attractive as a buyer of The Post, said Weymouth. “He’s everything we were looking for—a business leader with a track record of entrepreneurship who believes in our values and cares about journalism, and someone who was willing to pay a fair price to our shareholders,” she said.


Weymouth said the decision to sell The Post sprang from annual budget discussions she had with Graham, her uncle, late last year. “We talked about whether [the Washington Post Co.] was the right place to house The Post,” she said. “If journalism is the mission, given the pressures to cut costs and make profits, maybe [a publicly traded company] is not the best place for The Post.”

Any buyer, she said, “had to share our values and commitment to journalism or we wouldn’t sell it.”

The sale to Bezos involves The Post and its website, along with the Express newspaper, the Gazette Newspapers and Southern Maryland Newspapers in suburban Washington, the Fairfax County Times, the Spanish-language El Tiempo Latino newspaper, and the Robinson Terminal production plant in Springfield. Bezos will also purchase the Comprint printing operation in Gaithersburg, which publishes several military publications.

The deal does not include the company’s headquarters on 15th St. NW in Washington (the building has been for sale since February), Foreign Policy magazine, the WaPo Labs digital-development operation or Post-owned land along the Potomac River in Alexandria.

The Post, founded in 1877, has been controlled since 1933 by the heirs of Eugene Meyer, a Wall Street financier and former Federal Reserve official. Meyer bought the paper for $825,000 at a bankruptcy auction during the depth of the Depression.

After years of financial struggle, Meyer and his successor as publisher of The Post, son-in-law Philip L. Graham, steered the paper into a leading position among Washington’s morning newspapers. They began enlarging the company, notably by acquiring TV stations and Newsweek magazine in 1963 (the company sold the magazine for a nominal fee to the late billionaire Sidney Harman in 2010 after years of losses). In later years, the company added cable TV systems and the Kaplan educational division, currently the company’s largest by revenue.

Upon Graham’s death in 1963, his widow (and Meyer’s daughter) Katharine Graham took over management of the company. Despite her inexperience as a corporate executive, Mrs. Graham ably led the company through a colorful and expansive period.

The newspaper rose to national stature under Benjamin C. Bradlee, whom Katharine Graham had hired from Newsweek in 1965 as a deputy managing editor and promoted to editor in 1968. Bradlee oversaw the opening of new reporting bureaus around the nation and the world, started the Style section, and ignited the paper’s long run of Pulitzer Prize-winning reporting.

The Post’s and New York Times’ publication in 1971 of stories based on the Pentagon Papers—a secret government study of American military and political involvement in Vietnam—led to a landmark legal case in which the Supreme Court prohibited the government from exercising “prior restraint,” or pre-publication censorship, against the newspapers.

The arrest of seven men accused of breaking into the Democratic National Committee’s headquarters at the Watergate office complex in 1972 triggered the newspaper’s unearthing of a series of illegal activities orchestrated by President Nixon and his closest advisers. The revelations eventually led to Nixon’s resignation. The events were memorialized by the movie “All the President’s Men,” which turned The Post—as well as Bradlee and reporters Bob Woodward and Carl Bernstein—into household names.

Seven years after Nixon’s resignation, however, the paper suffered one of its darkest hours. It was forced to give back a Pulitzer Prize awarded to reporter Janet Cooke in 1981 after she admitted that her story about an eight-year-old heroin addict in Washington named Jimmy was a fabrication.

Katharine Graham, who died in 2001, was succeeded as Post publisher by her son, Donald, in 1979. He also succeeded her as chief executive of the Washington Post Co. in 1991.

During the 1990s and into the new century, under Bradlee’s successor, Leonard Downie Jr., the paper enjoyed arguably its most successful run in terms of profits, circulation and journalism. With little direct competition, the newspaper division’s revenue and profit soared. The Post won 25 Pulitzers under Downie, including six in 2008, the year he retired and was succeeded by Marcus Brauchli as editor.

The Grahams are among the last of a dwindling number of multigenerational family owners of metropolitan newspapers. Most major newspapers were once owned by local families with decades-long ties to their town or city, but that ownership profile has faded with succeeding generations and has largely disappeared in the Internet era.

Many of the heirs to great newspaper fortunes have sold their holdings to corporations or wealthy investors with little connection to the regions that the newspapers helped shape or, in some instances lately, to local businesspeople whose wealth was more recently acquired.

Over the past 20 years, the list of family-owned companies that have sold their newspaper holdings includes the Chandlers (owners of the Los Angeles Times, among others), the Cowles (Minneapolis Star Tribune), the Copleys (San Diego Union-Tribune) and the Bancrofts (Wall Street Journal).

The New York Times, controlled by the Sulzberger family, is among the last major dailies still operated by descendants of its early proprietor. The Times Co. acquired The Boston Globe from members of the Taylor family in 1993 for $1.1 billion; last week it announced it was selling the paper for a mere $70 million to John W. Henry, a Boston businessman who owns the Boston Red Sox.

Following the sale to Bezos, the Graham family will continue to control the renamed Washington Post Co. through its closely held stock, known as Class A shares. The A shares can’t be sold on the open market, but out-vote a second class of public stock, called Class B shares. The New York Times Co. has a similar stock structure, ensuring the Sulzbergers’ control.

Bezos, who ranks 11th on the Forbes 400 list of wealthiest individuals in America with a net worth of $23.2 billion, has given little indication of his ideological leanings over the years. He hasn’t been a heavy contributor to political campaigns, although he and his wife have regularly donated to the campaign of Sen. Patty Murray (D-Wash). In years past, they had given modest contributions to a handful of Republican and Democratic senators.

Bezos’ political profile rose suddenly and sharply when he and his wife agreed last year to donate $2.5 million to help pass a referendum that would legalize same-sex marriage in Washington State, catapulting them to the top ranks of financial backers of gay rights in the country. The donation doubled the money available to the initiative, which was approved last November and made Washington among the first states to pass same-sex marriage by popular vote.

Perhaps the single biggest item on Amazon’s legislative agenda is a bill that would empower all states to collect sales tax from online retailers.

Amazon is only required to collect sales taxes in states where it maintains a physical presence such as a warehouse. But Amazon now is supporting the bill, which has passed the Senate and is pending in the House. State sales taxes no longer pose a real threat to Amazon: With an emphasis on same-day shipping, the company is building distribution warehouses across the country and would have to pay the tax anyway. Last month, the company announced it would hire 5,000 employees at these warehouses, an ambitious growth strategy that is hurting profits in the short run.

Bezos’ most notable charitable donations have been twin $10 million contributions to two Seattle-based institutions, the Museum of History & Industry and the Fred Hutchinson Cancer Research Center. The gift to the museum was for the creation of a center for innovation that would be situated a few blocks from a new Amazon headquarters campus.

Baron, the former editor of the Boston Globe who joined The Post as its editor in January, said he was surprised to learn last week that the newspaper was being sold.

But he added, “I’m encouraged that the paper will be in the hands of a successful business person who understands the world of technology as well as anyone. He’s expressed his commitment to the organization and to its continued independence…I came here because I wanted to join a great news organization, and it will continue to be one.”



OPM delays goal to eliminate pension claims backlog

Federal Times

Aug. 5, 2013 – 05:20PM | By SEAN REILLY


The Office of Personnel Management missed its goal last month of eliminating a backlog of new retirement claims and of processing most new retirees’ first full pension checks within 60 days.

The 7,724 new retirement claims processed last month marked an 11 percent drop from the June total and were down by almost half from February, when OPM processed more than 15,300 new retirement claims, according to statistics released by OPM Monday. The backlog of pending claims rose slightly last month to 25,601, up from 25,542 the previous month.

The drop reflects the continuing repercussions of a sequester-related budget crunch that forced OPM’s retirement services office to end employee overtime at the end of April.

In January 2012, then-OPM Director John Berry set a target of eliminating the existing claims backlog by last month and processing 90 percent of retirees’ first full pension checks within 60 days of their leaving government service. At that point, OPM was taking more than 150 days to process the average new claim; that time has since fallen to about 90 days.

On Monday, OPM revised its schedule for achieving those goals. It now says it will eliminate the backlog by March and hit the 60-day processing target by next May on the assumption that the agency will be able to restart overtime for retirement services employees on Oct. 1 when the new fiscal year begins.

At present, however, it is far from certain whether Congress will approve the administration’s request for an 8 percent funding increase for OPM’s retirement services division in fiscal 2014.

In an interview Friday, Ken Zawodny, associate director for retirement services, could not say how the agency’s latest timetable to improve retirement claims processing will be affected if overtime cannot resume in October.

“I have to be optimistic that we will obtain our budget,” Zawodny said. He also said OPM has taken some steps on its own to improve performance, such as reorganizing retirement services employees, in order to cut the average claims processing time by 40 percent since the beginning of last year. In the updated plan, OPM cited U.S. Postal Service early-out programs that added more than 20,000 claims to the agency’s workload as another reason for its failure to hit the 60-day processing goal.




Why Insiders, Not Hackers, Are the Biggest Threat to Cybersecurity



By Brian Fung

National Journal

June 10, 2013

The National Security Agency leaks by Edward Snowden will easily go down as one of the biggest revelations of the year, if not the decade. But the episode also raises new questions about the risk that insiders pose to government and corporate cybersecurity, in spite of the attention lavished on foreign hackers.

Snowden’s case is unique in that it uncovered a previously unknown surveillance apparatus that’s massive in size and scope. The way the whistle-blower did his deed, however, is not unique. Two-thirds of all reported data breaches involve internal actors wittingly or unwittingly bringing sensitive information to outsiders, according to industry analysts.

“It’s not an either-or proposition,” said Mike DuBose, a former Justice Department official who led the agency’s efforts on trade-secret theft. “But amidst all the concern and discussion over foreign hacking, what gets lost is the fact that the vast majority of serious breaches involving trade secrets or other proprietary or classified information are still being committed by insiders.”

DuBose is now the head of the cyber investigations unit at the risk-management firm Kroll Advisory Solutions. In February, his team authored a report warning that contractors, information-technology personnel, and disgruntled employees—all descriptors that fit Snowden pretty well—pose a greater threat than hackers, “both in frequency and in damage caused.”

Not everyone agrees. Even though insiders generally play an outsized role across all reported data breaches, their role in confirmed data breaches is rather small, according to an annual study by Verizon. In 2012, specifically, internal actors accounted for 14 percent of confirmed data breaches. Of those, system administrators were responsible for 16 percent.

“Our findings consistently show,” the Verizon report read, “that external actors rule.”

However common they are, cases like Snowden’s show how devastating one insider can be. The extent of the damage depends on what’s being exfiltrated and from where, and there aren’t many standards for calculating losses. Most companies estimate the value of their trade secrets based on how much money they sank into the research and development of that knowledge. But for the government, it’s the potential security impact that takes precedence—and that turns the question into a matter of subjective debate.

Last month, The Washington Post reported that Chinese spies compromised the designs for some of the Pentagon’s most sensitive weapons systems, including the F-35 Joint Strike Fighter, the V-22 Osprey tiltrotor aircraft, and the Navy’s new Littoral Combat Ship.

If true, the report could have major consequences for national security. But Snowden’s case is equally consequential, if for different reasons, and it bolsters DuBose’s point about the relevance of insiders. Snowden may have rightfully uncovered evidence of government overreach, but if a mid-level contractor can steal top-secret information about the NSA and give it to the public in a gesture of self-sacrifice, someone else could do the same—but hand the intelligence to more nefarious actors.



DHS Set to Tap New Cybersecurity Leader

Phyllis Schneck’s Road to Deputy Undersecretary

By Eric Chabrow, August 7, 2013. Follow Eric @GovInfoSecurity


Phyllis Schneck, the next deputy undersecretary for cybersecurity at the Department of Homeland Security, comes to the job with a different set of experiences than her predecessors – and that could prove valuable.

Schneck will leave her job as chief technology officer for the public sector at security provider McAfee to become the top cybersecurity policymaker at DHS, a senior DHS official confirms.

Unlike her predecessors, Schneck has a strong academic and research background, having earned a Ph.D. in computer science from Georgia Tech. She holds three patents in high-performance and adaptive information security and has six research publications in the areas of information security, real-time systems, telecom and software engineering. Before becoming McAfee’s public sector CTO, Schneck served as the company’s vice president of threat intelligence.

“Her underpinning background is in security, which gives her credibility,” says Chris Buse, chief information security officer for the state of Minnesota. “She is very poised and is an excellent communicator [who] has experience dealing with the political processes, something that is vital to this job.”

Schneck will take over the job Bruce McConnell has held on an acting basis since April, when Mark Weatherford resigned to join the consultancy The Chertoff Group [see DHS’s Mark Weatherford Resigns]. McConnell is leaving DHS at week’s end after four years as the department’s senior counselor on cybersecurity [see Another Senior Cybersecurity Leader to Exit DHS].


Awaiting Official Announcement

DHS has not officially announced Schneck’s appointment, first reported by The Hill newspaper. Schneck has not commented publicly on her new job. A spokeswoman for Schneck says she will not say “anything about any potential job change unless it were to be official.”

The appointment does not require Senate confirmation.

Schneck’s academic and research background reflects an out-of-the-box thinking about IT security. She champions, for example, developing IT security technology to emulate a human body with a strong immune system to battle infections.

“Just as your body defends against thousands of colds every year and you maybe only get one, that’s what these systems are designed to do: push off the enemy and push off malicious traffic, without it having to have a name, and certainly without it having to have a signature – just understanding what is good or legitimate and what is not well intended or not needed and being able to defend against that and get smarter as you do so,” Schneck said in an interview with Information Security Media Group [see Infosec Evolution Mimics Immune Systems].
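The immune-system approach Schneck describes corresponds roughly to anomaly-based detection: learn a baseline of normal behavior, then flag traffic that deviates sharply from it, no malware signature required. A minimal, purely illustrative sketch (the features, sample values and threshold here are hypothetical and not drawn from any McAfee product):

```python
from statistics import mean, stdev

class AnomalyDetector:
    """Signature-free detector: learns a baseline of 'normal' observations
    (e.g. requests per second), then flags values that deviate sharply."""

    def __init__(self, threshold=3.0):
        self.threshold = threshold  # how many standard deviations counts as anomalous
        self.baseline = []

    def train(self, samples):
        # Learn what legitimate traffic looks like.
        self.baseline.extend(samples)

    def is_anomalous(self, value):
        mu, sigma = mean(self.baseline), stdev(self.baseline)
        if sigma == 0:
            return value != mu
        # No signature needed: anything far from the learned norm is flagged,
        # even if the attack has never been seen or named before.
        return abs(value - mu) / sigma > self.threshold

detector = AnomalyDetector()
detector.train([100, 98, 103, 101, 99, 102, 97, 100])  # normal request rates
print(detector.is_anomalous(101))   # typical traffic -> False
print(detector.is_anomalous(5000))  # flood-like spike -> True
```

The point of the analogy is that, like an immune system, the defense gets smarter as it observes more "healthy" behavior, rather than waiting for a named threat to be catalogued.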


Hefty Curriculum Vitae

Her resume is filled with mostly private-sector jobs, but early in her career she worked at the NASA Goddard Space Flight Center and the University of Maryland’s Department of Meteorology. Her corporate life has been consistently intertwined with government. As McAfee’s public sector CTO, she oversees the company’s technical vision for public-sector applications of security and global threat intelligence.

Schneck served as vice chair of the National Institute of Standards and Technology’s Information Security and Privacy Advisory Board, a panel charged with identifying emerging managerial, technical, administrative and physical safeguard issues as they relate to cybersecurity and privacy. She was the working group co-chair for public-private partnership for the Commission on Cybersecurity for the 44th Presidency, a panel that produced a report that served as the outline for President Obama’s cyberspace policy.



Dwayne Melancon, chief technology officer for risk-based security and compliance management solutions provider Tripwire, says Schneck’s experience should prove vital as DHS reconfigures itself. “Her technology background will be well-received by those in the information security industry – a kindred spirit is always more welcome,” he says.

And that spirit will be challenged as DHS implements Obama’s cybersecurity framework being developed by a public-private partnership under NIST’s auspices [see NIST Unveils Draft of Cybersecurity Framework].


Major Challenge

Melancon says the public-private partnership is not fully realized, and that presents Schneck with one of her biggest challenges as she assumes her new role.

“This effort hasn’t yet delivered on its promise, and I believe our national security will pay the price if this isn’t solved,” Melancon says. “The problem isn’t the lack of capability – it’s our inability to actually get that capability delivered to the places where it’s most needed, which is securing our critical cyber-infrastructure in both the public and private sectors. … Dr. Schneck can’t just commission new efforts; she must also take a critical eye to existing programs and decommission those which aren’t delivering.”

Schneck comes aboard as the leadership within cybersecurity and infrastructure protection at DHS is in flux. Secretary Janet Napolitano, one of the administration’s chief spokespersons on cybersecurity, will leave in September, and the deputy secretary, Jane Holl Lute, resigned this spring. President Obama nominated Alejandro Mayorkas, director of DHS’s Citizenship and Immigration Services, to be deputy secretary, but his nomination is stalled in the Senate.

Rand Beers, undersecretary for the National Protection and Programs Directorate, is serving as DHS acting deputy secretary, and could become acting secretary if neither Mayorkas nor Napolitano’s successor is confirmed by the time she leaves. Beers will leave DHS when either the new secretary or deputy secretary is confirmed. Suzanne Spaulding, deputy undersecretary for infrastructure protection, has been nominated as undersecretary to replace Beers; she’s now serving as acting undersecretary. Schneck will report to Spaulding.


Making the Transition

Schneck’s presence at DHS could provide some stability to a department experiencing much tumult in its higher ranks in recent months [see Another Leadership Shakeup at DHS]. But Eugene Spafford, executive director of the Center for Education and Research in Information Assurance and Security at Purdue University, wonders about Schneck’s adjustment to government bureaucracy, coming from McAfee, which he describes as a more diverse and security-focused environment than DHS.

“I’m not sure how much exposure she will have had to the kinds of pressures and trade-offs likely to be found in DHS, especially given some of the churn there in recent months,” Spafford says.

“Phyllis does like to get things done, so that could be a problem when inside DHS,” Spafford says. “Or, it could be a good thing – we will have to wait and see. If she is frustrated about getting things done, she’s not likely to stick around for a long time.”

If the Chinese army is trying to hack a Missouri water plant, what else is it infiltrating?


By Gwynn Guilford @sinoceros August 6, 2013


The question of whether the Chinese military is on a hacking offensive has largely been answered—and, despite Chinese government protestations, it sure looks like a pretty big “yes.” However, beyond the widely reported infiltration of foreign companies, the question of what else it’s hacking remains hazy.

But new research confirms one of the scarier possibilities: that the Chinese army is going after critical US infrastructure.

This all came out of a project by Kyle Wilhoit, a researcher at a security company called Trend Micro, who set up decoy “honeypots,” as decoy infrastructure systems are known, in 12 different countries. Wilhoit’s construction of a rural Missouri water plant honeypot attracted the notorious APT1, a crackerjack Chinese military hacking team sometimes known as Comment Crew, according to research he presented at the Black Hat security conference in Las Vegas. The group gained access in December 2012 through a Microsoft Word document containing malware unique to APT1.

“I actually watched the attacker interface with the machine,” he told the MIT Technology Review, referring to APT1. “It was 100% clear they knew what they were doing.”

Wilhoit noted that systems like power grids and water plants are inherently vulnerable to hacking threats because they’re using outdated technology and don’t necessarily prioritize security. Of the “critical” honeypot attacks—meaning those that could have taken control of the system—half came from China. Examples include turning the pump’s water temperature to 130˚ Fahrenheit and shutting down the system.
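Trend Micro’s actual honeypots emulated full industrial control systems, down to realistic web interfaces and ICS protocols. The core idea, though, can be sketched as a decoy service that accepts connections on a port an ICS protocol would use and logs every probe for later attribution, never acting on the data. This is a minimal illustration, not Wilhoit’s setup; the port (5020 rather than the privileged Modbus/TCP port 502) and the sample payload are assumptions:

```python
import logging
import socket
import threading

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")

def run_honeypot(host="127.0.0.1", port=5020, ready=None):
    """Decoy listener: accepts one connection, records the source address
    and payload, and never acts on the data. Real ICS honeypots go much
    further, emulating protocol replies so attackers believe the plant is real."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(5)
    if ready:
        ready.set()  # signal that the decoy is accepting connections
    conn, addr = srv.accept()
    data = conn.recv(1024)
    logging.info("probe from %s:%d, %d bytes: %r", addr[0], addr[1], len(data), data[:64])
    conn.close()
    srv.close()
    return addr, data

# Simulate an attacker probing the decoy.
ready = threading.Event()
result = {}

def serve():
    addr, data = run_honeypot(ready=ready)
    result["addr"], result["data"] = addr, data

t = threading.Thread(target=serve)
t.start()
ready.wait()
attacker = socket.create_connection(("127.0.0.1", 5020))
attacker.sendall(b"\x00\x01\x00\x00\x00\x06\x01\x03\x00\x00\x00\x01")  # Modbus-style read request
attacker.close()
t.join()
print(result["addr"], result["data"])
```

Everything the decoy records is forensic evidence; the value of the technique lies in watching who connects and what they try, exactly as Wilhoit did when he observed APT1 interacting with his fake water plant.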


This suggests that Chinese hackers are taking control of critical infrastructure. “These attacks are happening and the engineers likely don’t know,” Wilhoit told MIT Technology Review. Here’s a look at APT1’s past activity:



To date, the only publicly disclosed cyberattacks on US control systems have been on a factory and a state government building, both in New Jersey, reports MIT Technology Review. It’s unclear where those attacks originated.


Advice from the CIA: Keep your eye on your laptop…

Wed, 2013-08-07 05:14 PM

By: Jacob Goodwin

John Mullen, a longtime senior operations officer with the CIA, caught the attention of his audience at the SINET Innovation Summit in New York City on August 6 as he matter-of-factly recited the ways in which foreign intelligence services routinely steal industrial secrets and intellectual property from naïve traveling U.S. business people visiting their countries.

Mullen noted that advanced technologies have certainly helped these foreign intelligence services to steal valuable information from Americans (and other visiting travelers) but that their true advantage was what he dubbed “human frailty.” The naïve willingness of an ill-informed visitor to allow himself to be physically separated from his laptop, to willingly accept a computer file from his host, or to avail himself of computer services in the local market are the true Achilles’ heels, Mullen suggested.

“Human frailty is often the weakest link in the chain,” he declared.

Mullen emphasized that unlike the U.S. Government, many foreign governments — he declined to name them — have identified as part of their national economic strategy a concerted effort to steal whatever intellectual property they can lay their hands on. To that end, these assertive governments “will manipulate your relationships and your friendships” to achieve their goals, he advised.

“When you’re on their turf,” said Mullen, “they own you.”

He said some foreign governments — as well as some foreign businesses, which work extremely closely with their own governments — will use mobile technologies to keep you under constant surveillance. “They’ll hot mic your cell phone,” said Mullen, “and they’ll track your movements.”

Some governments are not above blackmail to achieve their goals, he added. He told the group of cybersecurity professionals who gathered for a one-day conclave at Columbia University about one such instance he recalled. An employee of a U.S. software company was traveling on business overseas when he was seduced by a “femme fatale,” working for a foreign intelligence service. The U.S. traveler was filmed during his exploits and later blackmailed into sharing some of his proprietary information, for fear that the incriminating film would otherwise be made public.

Of course, said Mullen, people are still naively clicking on attachments to incoming email messages (some of which contain dangerous malware), and still beginning online relationships with people they’ve never met face-to-face. “Intelligence services don’t have to meet people to recruit them,” he warned. “They can meet them online.”

Mullen suggested that most of these techniques — and more — are practiced by many of the more-aggressive foreign governments. How does he know? Because he has practiced some of the same data gathering techniques himself over the years…on behalf of Uncle Sam.


Rasmussen Reports

What They Told Us: Reviewing Last Week’s Key Polls

Saturday, August 10, 2013


Who’ll be the next in line?

Hillary Clinton is the overwhelming favorite among Democratic voters for their party’s presidential nomination in 2016. New Jersey Governor Chris Christie holds a narrow lead among Republicans for the 2016 GOP nomination. 

But not so fast. Clinton was also the clear favorite in August 2005, three years before the 2008 Democratic National Convention. She subsequently lost in the primaries to Illinois Senator Barack Obama. Meanwhile, GOP voters who say Christie is the candidate they least want to see nominated outnumber those who support him by 10 percentage points.

History has shown us time and again that 2016 is not likely to end up as a matchup of the current front-runners, Clinton and Christie, Scott Rasmussen says in his latest weekly newspaper column. 

Democrats’ fortunes in 2016 – and in next year’s mid-term elections – are likely to be tied in large part to perceptions of how President Obama’s national health care law is working out.

Roughly half (48%) of all voters now want their governor to oppose implementation of the health care law in their state. Forty percent (40%) want their governor to support implementation. That’s a complete reversal from January when 47% wanted their governor to support implementation of the law and 39% were opposed.

The economy is sure to be in play in the upcoming national elections as well. Despite the continuing bad national job numbers, 52% of Americans believe it is still possible for anyone who really wants to work to find a job, but only a plurality (44%) now thinks it’s possible for just about anyone in the United States to work their way out of poverty.

Sixty-five percent (65%) of working Americans consider themselves members of the middle class, but only a third of all Americans believe the economy is fair to the middle class.

At the same time, consumer and investor confidence remain near their highest levels in several years. 

But just 27% of voters say the country is heading in the right direction.

Job approval ratings for the president himself have settled back to the levels seen during most of his first term in office after a surge just before – and for three or four months after – Election Day last November. 

Gun control was one of the top issues on the president’s agenda at the first of the year, but support for tougher anti-gun laws is down from just after the elementary school massacre in December. Americans now are evenly divided when asked if the United States needs stricter gun laws. 

When the National Rifle Association proposed armed guards in schools rather than more gun control, it was mocked by many in the media. But 62% of Americans with children of elementary or secondary school age would feel safer if their child attended a school with an armed security guard.

The president faced a couple of foreign policy flare-ups this past week, so we asked voters what they thought about them.

Following Obama’s decision to snub Russian President Vladimir Putin during an upcoming trip to Europe, U.S. voters’ views of Putin have worsened, but most think it’s unlikely the United States and Russia will reenter a Cold War period. Views of Russia are unchanged.

The Obama administration opted to temporarily close embassies and consulates in several Middle Eastern countries because of terrorist threats. Fifty percent (50%) of voters think U.S. involvement in Middle East politics is bad for America.

Yet while 49% now believe most Muslims around the world view the United States as an enemy, just 19% think American Muslims living in this country are treated unfairly because of their religion and ethnicity.

Fifty-eight percent (58%) think the Muslim-American U.S. Army officer now being tried for killing 13 of his fellow soldiers at Fort Hood, Texas should be sentenced to death if convicted, but that’s consistent with attitudes toward other suspected mass killers in recent years.

Republicans now hold a three-point lead over Democrats on the Generic Congressional Ballot. This is the largest gap between the two parties since mid-April.

In other surveys last week: 

– Most voters continue to think gaining control of the border is more important than legalizing the status of undocumented workers already living in the United States. Only 32% believe that if a woman comes to the country illegally and gives birth to a child here, that child should automatically become a citizen. That’s the lowest finding measured since November 2011.

– One in four Americans (25%) think mass transportation in this country is not as safe as it was 10 years ago. Most Americans seldom, if ever, use mass transit, but they still tend to believe the government should back mass transit projects as long as they don’t lose money.

– Sixty-three percent (63%) of voters believe global warming is at least a somewhat serious problem, including 35% who consider it a Very Serious one.

– Fifty-eight percent (58%) of Americans, when given the choice, still prefer to read a printed version of a newspaper over the online version. But the number of adults who prefer print newspapers is the lowest measured in several years of tracking. Just 24% say they buy a print copy of their local paper every day or nearly every day, down from 31% four years ago.  

– Detroit Tigers slugger Miguel Cabrera and veteran catcher Yadier Molina of the St. Louis Cardinals are the early fan favorites for MVP in the American and National leagues respectively.

– With the second half of the Major League Baseball season underway, the Tigers are now the favorite to win the World Series. 


July 27 2013




Pentagon chief can’t offer hope in budget cuts

Updated: 3:45 a.m. Monday, July 22, 2013 | Posted: 3:44 a.m. Monday, July 22, 2013


The Associated Press


The audience gasped in surprise and gave a few low whistles as Defense Secretary Chuck Hagel delivered the news that furloughs, which have forced a 20 percent pay cut on most of the military’s civilian workforce, probably will continue next year, and it might get worse.

“Those are the facts of life,” Hagel told about 300 Defense Department employees, most of them middle-aged civilians, last week at an Air Force reception hall on a military base in Charleston.

Future layoffs also are possible for the department’s civilian workforce of more than 800,000 employees, Hagel said, if Congress fails to stem the cuts in the next budget year, which starts Oct. 1.

On the heels of the department’s first furlough day, and in three days of visits with members of the Army, Navy, Air Force and Marine Corps, Hagel played the unenviable role of messenger to a frustrated and fearful workforce coping with the inevitability of a spending squeeze at the end of more than a decade of constant and costly war.

The fiscal crunch also lays bare the politically unpopular, if perhaps necessary, need to bring runaway military costs in line with most of the rest of the American public that has struggled economically for years.

“Everybody’s bracing for the impact,” Army Master Sgt. Trey Corrales said after Hagel spoke with soldiers during a quick stop at Fort Bragg, N.C.

Corrales’ wife, a military civilian employee, is among those furloughed, and they have cancelled their cable TV and started carpooling to work to save money.

“The effects of the economy have started to hit the military,” Corrales said. “It was late in coming to us.”

The furloughs have hit about 650,000 civilian employees but also have slowed health care and other services for the uniformed military, which has stopped some training missions and faces equipment shortages due to the budget shortfalls. Troops were told this month they will no longer receive extra pay for deployments to 18 former global hot spots no longer considered danger zones.

Troops already are facing force reductions, and the Army alone has announced plans to trim its ranks by 80,000 over the next five years.

Officials agree that the military has undergone cycles of expanding and shrinking the force over generations. But Hagel said this time is different, and worse, because of what he described as a “very dark cloud” of uncertainty hanging over the Pentagon as Congress considers whether to reverse $52 billion in spending cuts that are set to go into effect in 2014.

At the Naval Air Station in Jacksonville, Fla., Hagel told an estimated 100 civilians gathered in a bustling jet maintenance hangar that the military had not been prepared for the $37 billion in cuts that took effect this year, forcing the furloughs. While he said he was deeply sorry for the strain the crunch has put on families, he said he would not slash troops’ training or other readiness budgets any further to prevent huge gaps in national security.

“I’m sure you realize how disruptive the furlough is to our productivity. So I’m hoping that we’re not going to do it again next year,” Elizabeth Nealin, a research and engineering manager at the navy base’s fleet readiness center, told Hagel.

“Have you planned for a reduction in force?” Nealin asked bluntly.

Hagel said if the $52 billion cut remains in place, “there will be further cuts in personnel, make no mistake about that.”

“I don’t have any choice,” he said.

The spending cuts this year may feel more dramatic than in times past because of a vast growth in Defense Department personnel and equipment costs over the past decade, said Todd Harrison, a senior fellow at the Center for Strategic and Budgetary Assessments in Washington. But current spending levels are close to what they were in 2007, when the war in Iraq was at its peak.

“So we’re not even back to a pre-9/11 level,” he said.

Since 2000, the number of U.S. troops has grown by about 3 percent to fight the wars in Iraq and Afghanistan, Harrison said. But the number of civilian Defense employees hired to support the troops has far surpassed that, growing by 14 percent in the same time.

Hagel said he is taking a hard look at where fat can be trimmed from the Pentagon and said the military has been “guilty of wasting a lot of money on a lot of things.” But he also said he “can’t lead this institution based on hope, based on I think, or based on maybe” — and predicted more dollar cuts ahead.

In Charleston, where the hopeful crowd quickly turned worried, Sandra Walker pointedly asked Hagel what might be in store for her job security, retirement benefits and security clearances if the shortfalls continue.

“I’ve taken a second job to compensate, because I have several children at home,” said Walker, who works in education and training at a medical clinic on base. “And if we are going to have future furloughs, will those things be taken into consideration for the future of our jobs?”

Sticking to his message, and stopping short of directly answering her question, Hagel offered little hope.

“There’s no good news,” he said.


Sandy-ravaged regions will never get landlines back

By Katie Lobosco @KatieLobosco July 22, 2013: 6:06 AM ET


Last fall, Superstorm Sandy wiped out landline telephone service for thousands of people. Many of them are never getting those landlines back.

Verizon (VZ, Fortune 500) is still in the process of repairing the telephone infrastructure that was damaged by the massive storm in late October. But in many cases, the telecom giant is replacing the old copper-based systems with new technologies — including wireless.

Those changes are coming for the industry as a whole, whether or not telecom giants like Verizon and AT&T (T, Fortune 500) want them to. And they were coming long before Sandy struck. The parts needed to repair the old landline technology are hard to find, sending companies to some odd places to purchase equipment, such as eBay (EBAY, Fortune 500).

“It can’t be that our critical infrastructure is relying on eBay for replacement parts,” said Bob Quinn, head of AT&T’s regulatory affairs.

Manufacturers that once made the required components, such as Nortel and Lucent, have gone out of business or been bought out, noted Danielle Coffey, a vice president at the Telecommunications Industry Association.

“It’s not only eBay, there’s a whole secondary market for these parts,” she said.

Related story: Are landlines doomed?

That’s because landlines are a dying business. Many customers have switched to cell phones or VoIP services like Microsoft’s (MSFT, Fortune 500) Skype to make calls. More than 36% of Americans use cell phones as their only telephone service, about ten times the rate from a decade ago, according to a Centers for Disease Control study.

Still, many telephone customers in Sandy-ravaged areas are displeased about the prospect of losing their landlines.

On Fire Island, N.Y., off the southern coast of Long Island, Verizon is replacing its copper landlines with a wireless telephone system called Voice Link. The new system consists of a small modem-sized device that plugs into an electrical outlet and a standard telephone jack in the wall of the home. That device connects to Verizon’s wireless cellular network, which brings phone service and a dial tone to the existing corded or cordless phones in the home. Customers can use it to make calls, and it offers services like call waiting, caller ID and voice mail.

But, at least for now, Voice Link can’t connect customers to the Internet. That means medical alert services often used by senior citizens will not work. Those kinds of systems allow a customer to press a medical alert button that immediately contacts a monitoring center. Alarm services, fax machines, and DSL Internet won’t work either.

Related story: AT&T isn’t nearly as bad as you think

Hundreds of Fire Island residents have filed complaints with the New York Public Service Commission about the service.

“It’s not quite ready for prime time,” said Harold Feld, the senior vice president of Public Knowledge, an advocacy group that opposes the all-wireless Voice Link system. “If we do switch to wireless as an alternative, then we want this wireless alternative to be as good or better than what we have now.”

Verizon offers the only telephone service on the island, so the hundreds of residents of the popular vacation spot have little choice but to accept Verizon’s Voice Link plan. Verizon said it intends to improve the system as time goes on.

Fire Island is a “unique situation,” said Tom Maguire, the senior vice president for national operations at Verizon.

Wireless is not the only path forward for swapping out copper lines for new technology. AT&T and Verizon in many cases are replacing copper with fiberoptic cable and upgrading their networks from a series of routers and switches to a modern digital network.

Related story: Femtocell hack reveals mobile phones’ calls, texts and photos

But Fire Island isn’t the only place where Verizon is installing Voice Link. The company began working on the Voice Link system well before the storm, testing it in places like Florida and Virginia as a way to connect customers without having to repair existing copper lines.

In areas other than Sandy-ravaged communities, Maguire said, the Voice Link system will be available as an option — and not every customer is a suitable candidate. It is for people who do not want DSL Internet service and do not have services like alarm and life support systems. And if a Verizon technician goes to a home to install Voice Link and there is weak cellular network signal, Verizon would in that case repair the copper wire instead of installing the wireless system.

Verizon has deployed Voice Link in Mantoloking, N.J., which was also heavily damaged by Hurricane Sandy, and the company has plans to deploy it in areas like the Catskills where the copper infrastructure is badly damaged.


Future Carriers Built to Carry Drone Fleets


by KRIS OSBORN on JULY 19, 2013


Navy planners anticipated the historic steps the Navy has recently taken toward outfitting carrier decks with fleets of unmanned drones, designing current and future carriers to support the technological advances these aircraft will bring, officials said.

The U.S. Navy’s new Ford-class aircraft carriers are engineered with the ability to accommodate more carrier-launched unmanned aircraft systems similar to the X-47B that landed on the deck of the USS George H.W. Bush July 10, service officials explained.

The Ford-class carriers are built with a series of technological advances compared to their predecessors — including a slightly larger flight deck, upgraded nuclear power plants, dual-band radar, improved landing gear and vastly increased on-board electrical capacity, including a new electromagnetic launch system for aircraft taking off from the deck, said Rear Adm. Thomas J. Moore, Program Executive Officer, Carriers.

“The Ford Class will be around until about 2110. The flight deck has been designed to be bigger and have a higher sortie generation rate. The ship itself is built with three times the electrical generating capacity of the Nimitz [the Ford’s predecessor] class – so it is not hard to envision that we are going to be flying unmanned aircraft off that ship,” said Moore.

Citing the recent historic touchdown of the X-47B demonstrator aircraft aboard the USS George H.W. Bush, Moore said the Ford-class carriers are engineered with a specific mind to next-generation aviation and ship-based technologies.

The Ford class of carriers is being built with emerging technological trends in mind, including the expected increase in unmanned systems and electrically powered weapons systems.

Moore said the aircraft that initially flew on Nimitz-class carriers when the class first emerged in the 1970s are very different from what is flying on those carriers today. In fact, the Joint Strike Fighter will fly on the Nimitz-class carriers before they retire, he said.

“Unmanned aircraft will certainly be part of our portfolio moving forward – they will not replace manned aircraft but will play an important role.”

The USS Ford is slated to enter the water at a christening ceremony in November of this year and begin formal service by late 2016. It is the first-in-class in a planned series of next-generation Ford-class aircraft carriers designed to replace the current Nimitz-class carriers on a one-for-one basis over roughly the next 50 years.

The USS Gerald R. Ford (CVN 78), now nearing completion, will be followed by the second and third Ford-class carriers: the USS John F. Kennedy (CVN 79), to enter service by 2025, and the USS Enterprise (CVN 80), slated to enter service by 2027.

The Ford-class carriers will have four 26-megawatt electrical turbine generators, designed in part to power key systems on the ship, including dual-band phased array radar and the Electromagnetic Aircraft Launch System, or EMALS – installed on carriers to replace the current steam catapults on the flight deck.

“On a Nimitz-class carrier, outside of the propulsion plant we use a lot of steam to run the catapults and heat the water. We made a conscious decision on the Ford class to better electrify the ship,” Moore explained.

Moore also emphasized that the electrical backbone and capacity of the Ford-class carriers will better equip the ships to accommodate directed energy weapons in the future, should they be added to the ship.

For example, it is conceivable that directed energy or laser weapons might complement the defense systems currently in place to defend the ship, such as the Phalanx Close-In Weapons System, Rolling Airframe Missile and NATO Sea Sparrow, Moore explained.

“The Ford has huge margins of ability to generate electrical power that no other ship has,” he said.

In fact, increased automation, computer technology and electrical capacity will reduce manpower requirements aboard the ship, dramatically increasing capability and lowering life-cycle costs, said Mike Petters, President and Chief Operating Officer, Huntington Ingalls Industries.

Moore explained that the technology-inspired manpower reductions will result in as much as $4 billion in savings over the 50-year life-cycle of the ship.

Petters and Moore explained how the Ford-class carriers are designed with a slightly smaller island to allow for more deck space, thus increasing the ship’s ability to launch and recover larger numbers of aircraft.

“You have created an electrical distribution system that is going to allow for lower maintenance cost – then you have advanced arresting gear and the radars. You are really talking about a ship that has substantially more capability. The flight deck was all part of how do we get more sorties – changing the location of the footprint is all about how do you get the flight deck more efficient,” Petters said.

One analyst said that increasing the ability to project power at greater distances through the increased use of unmanned aircraft on carriers is exactly how the Navy should be thinking about its future.

“The aircraft carrier is relevant today and it will be relevant for decades,” said Bryan McGrath, managing director at FerryBridge Group LLC, a defense consulting firm based in Easton, Md.

In particular, longer reach or operating ranges — for strike possibilities and Intelligence, Surveillance and Reconnaissance missions – is likely to grow in importance in light of what Pentagon strategists refer to as Anti-Access/Area-Denial, or A2/AD, challenges, he argued.

A2/AD is a strategic approach to current and future conflict based upon the premise that potential adversaries have increased technological capability to challenge the U.S. military’s ability to operate in certain areas in an uncontested manner – such as closer to shore.

“Sortie generation rate as a virtue will decline in importance in the years to come largely because any opponent of worth will hold us a little further off shore. Sortie generation rose in importance as we came to dominate the oceans. Now we may have a near-peer competitor, so what we really need is range from the wing – the ability to operate from further away and bring strike power,” McGrath said.

The increased sortie-generation rate capability with the Ford-class carriers is designed to increase the flexibility to launch manned and unmanned systems with greater ease and frequency, a Navy official said.

“The deck has been built to provide the air wing of the future with greater flexibility,” the official said.

Nevertheless, any effort to configure aircraft carriers to house and launch longer-range platforms, both manned and unmanned, is something McGrath would like to see more of.

“The Navy should begin thinking about designing an aircraft carrier that is devoted to the launch and recovery of unmanned aviation,” McGrath said. “It will need to do its job for 50 years, so you have to think about what you get. You get a very powerful symbol and the means for the delivery of American power. There is no substitute in our arsenal.”

Moore said the Ford class of carriers is being built with long-term service in mind, an approach that has, by design, engineered the ship with growth potential so it can accommodate emerging technologies as they arise.

“Big-deck carriers are by far what we need in terms of power projection and presence. For a lot of the missions we want and the presence we have around the world, there’s nothing like it. Why are the Russians, Indians and Chinese building a carrier? Countries know that carriers bring something to the table that nothing else can bring in terms of an instrument of national power. There is a reason we build these things,” Moore added.

DOE study: Fracking chemicals didn’t taint water

Jul 19, 5:48 PM EDT


Associated Press

PITTSBURGH (AP) — A landmark federal study on hydraulic fracturing, or fracking, shows no evidence that chemicals from the natural gas drilling process moved up to contaminate drinking water aquifers at a western Pennsylvania drilling site, the Department of Energy told The Associated Press.

After a year of monitoring, the researchers found that the chemical-laced fluids used to free gas trapped deep below the surface stayed thousands of feet below the shallower areas that supply drinking water, geologist Richard Hammack said.

Although the results are preliminary – the study is still ongoing – they are the first independent look at whether the potentially toxic chemicals pose a threat to people during normal drilling operations. But DOE researchers view the study as just one part of ongoing efforts to examine the impacts of a recent boom in oil and gas exploration, not a final answer about the risks.

Drilling fluids tagged with unique markers were injected more than 8,000 feet below the surface at the gas well bore but weren’t detected in a monitoring zone at a depth of 5,000 feet. The researchers also tracked the maximum extent of the man-made fractures, and all were at least 6,000 feet below the surface.

That means the potentially dangerous substances stayed about a mile away from surface drinking water supplies, which are usually at depths of less than 500 feet.
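The "about a mile" claim can be checked with simple arithmetic; this sketch only restates the depth figures reported in the study:

```python
# Sanity check of the depth figures from the DOE monitoring study.
# All depths are in feet below the surface.
injection_depth = 8000      # where tagged drilling fluids were injected
monitoring_depth = 5000     # monitoring zone where no tracers were detected
shallowest_fracture = 6000  # closest any man-made fracture came to the surface
aquifer_depth = 500         # typical maximum depth of drinking water supplies

separation_ft = shallowest_fracture - aquifer_depth
separation_miles = separation_ft / 5280

print(f"Separation: {separation_ft} ft (~{separation_miles:.1f} miles)")
```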

“This is good news,” said Duke University scientist Rob Jackson, who was not involved with the study. He called it a “useful and important approach” to monitoring fracking, but he cautioned that the single study doesn’t prove that fracking can’t pollute, since geology and industry practices vary widely in Pennsylvania and across the nation.

The boom in gas drilling has led to tens of thousands of new wells being drilled in recent years, many in the Marcellus Shale formation that lies under parts of Pennsylvania, New York, Ohio and West Virginia. That’s led to major economic benefits but also fears that the chemicals used in the drilling process could spread to water supplies.

The mix of chemicals varies by company and region, and while some are openly listed the industry has complained that disclosing special formulas could violate trade secrets. Some of the chemicals are toxic and could cause health problems in significant doses, so the lack of full transparency has worried landowners and public health experts.

Over the last four years the debate over fracking chemicals has attracted tremendous attention from state and federal agencies, public health experts, and opponents of fracking. Yet while many people have focused on the potential threat from the chemicals, experts have come to believe that more routine aspects of the drilling process are more likely to cause problems. Poor well construction that allows excess gas to escape, spills of chemicals or other fluids that take place at the surface, and disposal of wastewater are all issues of concern.

Jackson said most of the problems that the Duke researchers have seen have been related to well construction, not fracking chemicals.

The study done by the National Energy Technology Laboratory in Pittsburgh marked the first time that a drilling company let government scientists inject special tracers into the fracking fluid and then continue regular monitoring to see whether it spread toward drinking water sources. The research is being done at a drilling site in Greene County, which is southwest of Pittsburgh and adjacent to West Virginia.

Eight Marcellus Shale wells were monitored seismically and one was injected with four different man-made tracers at different stages of the fracking process, which involves setting off small explosions to break the rock apart. The scientists also monitored a separate series of older gas wells that are about 3,000 feet above the Marcellus to see if the fracking fluid reached up to them.

The industry and many state and federal regulators have long contended that fracking itself won’t contaminate surface drinking water because of the extreme depth of the gas wells. Most are more than a mile underground, while drinking water aquifers are usually close to the surface.

Kathryn Klaber, CEO of the industry-led Marcellus Shale Coalition, called the study “great news.”

“It’s important that we continue to seek partnerships that can study these issues and inform the public of the findings,” Klaber said.

While the lack of contamination is encouraging, Jackson said he wondered whether the unidentified drilling company might have consciously or unconsciously taken extra care with the research site, since it was being watched. He also noted that other aspects of the drilling process can cause pollution, such as poor well construction, surface spills of chemicals and wastewater.

Jackson and his colleagues at Duke have done numerous studies over the last few years that looked at whether gas drilling is contaminating nearby drinking water, with mixed results. None has found chemical contamination but they did find evidence that natural gas escaped from some wells near the surface and polluted drinking water in northeastern Pennsylvania.

Scott Anderson, a drilling expert with the Environmental Defense Fund, said the results sound very interesting.

“Very few people think that fracking at significant depths routinely leads to water contamination. But the jury is still out on what the odds are that this might happen in special situations,” Anderson said.

One finding surprised the researchers: Seismic monitoring determined one hydraulic fracture traveled 1,800 feet out from the well bore; most traveled just a few hundred feet. That’s significant because some environmental groups have questioned whether the fractures could go all the way to the surface.

The researchers believe that fracture may have hit naturally occurring faults, and that’s something both industry and regulators don’t want.

“We would like to be able to predict those areas” with natural faults and avoid them, Hammack said.

Jackson said the 1,800-foot fracture was interesting but noted it is still a mile from the surface.

The DOE team will start to publish full results of the tests over the next few months, said Hammack, who called the large amount of field data from the study “the real deal.”

“People probably will be looking at the data for years to come,” he said.

On Friday, DOE spokesman David Anna added that while nothing of concern has been found thus far, “the results are far too preliminary to make any firm claims.”



Draft of a House Bill Restricting FAA Licensing of UAVs Unveiled

by Press • 25 July 2013


A House member from Vermont unveiled the draft of a bill that would require those seeking to operate an unmanned aerial vehicle in U.S. airspace to provide detailed information about the use of the UAV before receiving a federal license to fly the drone.

Rep. Peter Welch, D-Vt., unveiled the draft document, which would prohibit the Federal Aviation Administration from issuing a drone license unless the application includes a statement that lists who will operate the drone, where it will be flown, what kind of data will be collected, how that data will be used, whether the information will be sold to third parties and the period for which the information will be retained.

The draft would also require police applicants, as well as their contractors and subcontractors, to provide a “data minimization statement” that says how data collected by the UAV that is unrelated to the investigation of a crime will be minimized. The draft also requires the FAA to create a website listing all approved UAV licenses, data collection and data minimization statements, any data security breaches suffered by a licensee and the times and locations of flights.

In addition, under the draft bill, even if a police agency had a license to operate a drone, the agency would still have to obtain a warrant to fly the UAV except in special circumstances such as search and rescue operations and the imminent danger of death or serious injury.

“As drones emerge in the airspace over the United States, I am deeply concerned that Americans’ constitutional right to privacy and their right to know are protected,” Welch said. “Drones can be a valuable tool for law enforcement, public safety and other commercial uses, but it is essential that the law be updated to cover this technology and to ensure individual privacy and transparency in its operation and use,” he said.

Welch made the draft of the bill public in Chittenden County, Vt., on June 12, but he has not yet introduced it in the House.


The Creepy, Long-Standing Practice of Undersea Cable Tapping

The Atlantic

July 10 2013

By Olga Khazan


In the early 1970s, the U.S. government learned that an undersea cable ran parallel to the Kuril Islands off the eastern coast of Russia, providing a vital communications link between two major Soviet naval bases. The problem? The Soviet Navy had completely blocked foreign ships from entering the region.

Not to be deterred, the National Security Agency launched Operation Ivy Bells, deploying fast-attack submarines and combat divers to drop waterproof recording pods on the lines. Every few weeks, the divers would return to gather the tapes and deliver them to the NSA, which would then binge-listen to their juicy disclosures.

The project ended in 1981, when NSA employee Ronald Pelton sold information about the program to the KGB for $35,000. He’s still serving his life prison term.

The operation might have ended, but for the NSA, this underwater strategy clearly stuck around.

In addition to gaining access to web companies’ servers and collecting phone metadata, both U.S. and U.K. spy agencies, we have now learned, are tapping directly into the Internet’s backbone: the undersea fiber optic cables that shuttle online communications between countries and servers. For some privacy activists, this process is even more worrisome than the monitoring of call metadata, because it allows governments to make copies of everything that traverses these cables.

The British surveillance programs have fittingly sinister titles: “Mastering the Internet” and “Global Telecoms Exploitation,” according to The Guardian.

A subsidiary program for these operations, Tempora, sucks up around 21 million gigabytes per day and stores the data for a month. The data is shared with the NSA, and reportedly 550 NSA and GCHQ analysts are poring over the information gathered from at least 200 fiber optic cables so far.
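The reported intake translates into a sustained data rate and a rolling buffer size; this back-of-envelope sketch uses only the figures above (and assumes decimal gigabytes):

```python
# Back-of-envelope scale of the Tempora intake, from the reported figures.
gb_per_day = 21_000_000   # ~21 million gigabytes ingested per day
retention_days = 30       # data reportedly stored for about a month

gb_per_second = gb_per_day / 86_400                  # seconds in a day
buffer_pb = gb_per_day * retention_days / 1_000_000  # petabytes held at once

print(f"~{gb_per_second:,.0f} GB/s sustained, ~{buffer_pb:.0f} PB buffered")
```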

The scale of the resulting data harvest is tremendous. From The Guardian:


This includes recordings of phone calls, the content of email messages, entries on Facebook and the history of any internet user’s access to websites — all of which is deemed legal, even though the warrant system was supposed to limit interception to a specified range of targets.

In an interview with online security analyst Jacob Appelbaum, NSA leaker Edward Snowden called the British spy agency GCHQ “worse than” the NSA, saying it represents the first “full take” system, in which surveillance networks catch all Internet traffic regardless of its content. Appelbaum asked Snowden if “anyone could escape” Tempora:

“Well, if you had the choice, you should never send information over British lines or British servers,” Snowden said. “Even the Queen’s selfies with her lifeguards would be recorded, if they existed.”

The U.S.’s own cable-tapping program, known by the names OAKSTAR, STORMBREW, BLARNEY and FAIRVIEW, as revealed in an NSA PowerPoint slide, apparently functions similarly to Tempora, accessing “communications on fiber cables and infrastructure as data flows past,” according to The Washington Post. The slide indicates that Prism and these so-called “upstream” programs work together somehow, with an arrow saying “You Should Use Both” pointing to the two operations.


So how does one tap into an underwater cable?

The process is extremely secretive, but it seems similar to tapping an old-fashioned, pre-digital telephone line — the eavesdropper gathers up all the data that flows past, then deciphers it later.


A map of undersea cables. (TeleGeography)

More than 550,000 miles of flexible undersea cables about the size of garden watering hoses carry all the world’s emails, searches, and tweets. Together, they shoot the equivalent of several hundred Libraries of Congress worth of information back and forth every day.

In 2005, the Associated Press reported that a submarine called the USS Jimmy Carter had been repurposed to carry crews of technicians to the bottom of the sea so they could tap fiber optic lines. The easiest place to get into the cables is at the regeneration points — spots where their signals are amplified and pushed forward on their long, circuitous journeys. “At these spots, the fiber optics can be more easily tapped, because they are no longer bundled together, rather laid out individually,” Deutsche Welle reported.

But such aquatic endeavors may no longer even be necessary. The cables make landfall at coastal stations in various countries, where their data is sent on to domestic networks, and it’s easier to tap them on land than underwater. Britain is, geographically, in an ideal position to access the cables as they emerge from the Atlantic, so the cooperation between the NSA and GCHQ has been key. Beyond that partnership, the other members of the “Five Eyes” — the Australians, the New Zealanders, and the Canadians — also collaborate with the U.S., Snowden said.

The tapping process apparently involves using so-called “intercept probes.” According to two analysts I spoke to, the intelligence agencies likely gain access to the landing stations, usually with the permission of the host countries or operating companies, and use these small devices to capture the light being sent across the cable. The probe bounces the light through a prism, makes a copy of it, and turns it into binary data without disrupting the flow of the original Internet traffic.

“We believe our 3D MEMS technology — as used by governments and various agencies — is involved in the collection of intelligence from … undersea fibers,” said a director of business development at Glimmerglass, a government contractor that appeared, at least according to a 2010 Aviation Week article, to conduct similar types of interceptions, though it’s unclear whether they took part in the British Tempora or the U.S. upstream programs. In a PowerPoint presentation, Glimmerglass once boasted that it provided “optical cyber solutions” to the intelligence community, offering the ability to monitor everything from Gmail to Facebook. “We are deployed in several countries that are using it for lawful interception. They’ve passed laws, publicly known, that they will monitor all international traffic for interdiction of any kind of terrorist activity.”

The British publication PC Pro presented another theory: that slightly bending the cables could allow a receiver to capture their contents.

One method is to bend the cable and extract enough light to sniff out the data. “You can get these little cylindrical devices off eBay for about $1,000. You run the cable around the cylinder, causing a slight bend in the cable. It will emit a certain amount of light, one or two decibels. That goes into the receiver and all that data is stolen in one or two decibels of light. Without interrupting transfer flow, you can read everything going on on an optical network,” said Everett, quoted in the PC Pro piece.

The loss is so small, said Everett, that anyone who notices it might attribute it to a loose connection somewhere along the line. “They wouldn’t even register someone’s tapping into their network,” he added.
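The “one or two decibels” figure can be converted into the share of optical power diverted using the standard decibel relation. This sketch is purely illustrative, not a description of any actual tap hardware, but it shows why a drop that small in dB terms is easy to dismiss as a loose connection while still carrying a substantial share of the signal:

```python
def power_fraction_lost(db_loss: float) -> float:
    """Fraction of optical power diverted for a given decibel drop.

    Uses the standard relation: remaining power = 10 ** (-dB / 10).
    """
    return 1 - 10 ** (-db_loss / 10)

for db in (1.0, 2.0):
    print(f"{db} dB drop -> {power_fraction_lost(db):.1%} of the light diverted")
```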

Once it’s gathered, the data gets sifted. Most of it is discarded, but the filters pull out material that touches on one of the 40,000 search terms chosen by the NSA and GCHQ — that’s the content the two agencies inspect more closely.
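The sifting step described above can be caricatured as simple keyword matching. The selector terms below are invented for illustration; the real selector lists and matching logic are not public:

```python
# Toy sketch of selector-based filtering. The terms are made up;
# nothing here reflects the agencies' actual systems.
SELECTORS = {"example-term-a", "example-term-b"}

def matches_selector(message: str) -> bool:
    """Return True if any word in the message is on the selector list."""
    return bool(set(message.lower().split()) & SELECTORS)

intercepts = ["routine traffic", "contains example-term-a payload"]
flagged = [m for m in intercepts if matches_selector(m)]
print(flagged)  # only the matching message is kept for closer inspection
```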

The British anti-surveillance group Privacy International has filed a lawsuit against the U.K. government, arguing that such practices amount to “blanket surveillance” and saying that British courts do “not provide sufficiently specific or clear authorization for such wide-ranging and universal interception of communications.” Their argument is that the existing surveillance laws are from the phone-tapping days and can’t be applied to modern, large-scale electronic data collection.

“If their motivation is to catch terrorists, then are there less intrusive methods than spying on everyone whose traffic happens to traverse the U.K.?” said Eric King, head of research at Privacy International.

Meanwhile, the British agency GCHQ has defended its practices by saying that it is merely looking for a few suspicious “needles” in a giant haystack of data, and that its techniques have allowed it to uncover terrorist plots.


If groups like Privacy International are successful, it may put an end to the capture of domestic Internet data within the U.K., but as NSA expert Matthew Aid recently told me, since 80 percent of the fiber optic data flows through the U.S., it wouldn’t stop the massive surveillance operations here or in other countries — even if the person on the sending end was British.

It’s also worth noting that this type of tapping has been going on for years — it’s just that we’re now newly getting worked up about it. In 2007, the New York Times thus described President Bush’s expansion of electronic surveillance: “the new law allows the government to eavesdrop on those conversations without warrants — latching on to those giant switches — as long as the target of the government’s surveillance is ‘reasonably believed’ to be overseas.”

Want to avoid being a “target” of this “switch-latching”? A site called “Prism-break” recently released a smorgasbord of encrypted browsing, chat, and email services that supposedly allow the user to evade government scrutiny.

The only platform for which there is no encrypted alternative is Apple’s iOS, a proprietary operating system, for which the site had this warning:

“You should not entrust neither your communications nor your data to a closed source device.”


Pentagon Factors Operational Energy into Acquisition


by KRIS OSBORN on JULY 25, 2013

The Pentagon has taken steps to formally integrate “energy efficiency” as a metric in the requirements process it uses for procurement and weapons development, senior officials said.

Energy performance has now become a mandatory key performance parameter in a part of the Pentagon’s developmental process known as the Joint Capabilities Integration and Development System, or JCIDS, said Sharon E. Burke, Assistant Secretary of Defense for Operational Energy Plans and Programs.

The JCIDS process, a collaborative analytical look at acquisition and development, is used to establish and refine requirements for procurement programs, often paving the way forward to next-step milestones in the process.

“We’ve become a force that uses a lot more energy than it used to,” she said. “We’re building energy efficiency into modernization. We have a long way to go because with a lot of the platforms that are entering the acquisition process — their fuel consumption is quite a bit higher. We’re increasing our fuel demand.”

In particular, Burke mentioned increasing power demands for next-generation electronics, ships, aircraft, weaponry and developmental items such as the Navy’s prototype Laser Weapon System that depends upon large “pulses” of energy to operate.

There are more than 300 operational energy initiatives across the Department of Defense, Burke explained. About $1.6 billion was spent on these programs in fiscal year 2013 and, depending upon budget developments, as much as $2 billion is expected for fiscal year 2014.

The Pentagon office for Operational Energy Plans and Programs was stood up by the National Defense Authorization Act of 2009 in response to requests from fielded forces, members of Congress and Pentagon leaders.

One analyst described the effort as something of a double-edged sword: it may bring great rewards but also runs the risk of adding too many requirements to an already taxed procurement process.

“On one hand, you are looking across the entire force and doing an in-depth analysis. This effort can bring lower costs, better performance, improved operational flexibility and a reduced logistics tail — which can save lives,” said Daniel Goure, vice president of the Lexington Institute, a Va.-based think tank.

“At the same time, are we now adding more requirements, more oversight and more reviews to a process that many believe is already too long and too cumbersome? Will this add complexity to getting stuff out the door?”

Some members of Congress have at times criticized the military’s operational energy platform, pushing back on various “green” efforts on the grounds that some of them may raise costs. Many members have also raised questions about the costs of specific programs, such as the Navy’s effort to power the fleet with biofuels.

Overall, the drawdown in Afghanistan, which leaves forces and forward operating bases more distributed and dispersed, and the “re-balance” to the Asia-Pacific both underscore the unyielding appetite for greater energy efficiency in combat circumstances and across increasingly greater distances, Burke explained.

While saving money by increasing energy efficiency remains a huge part of the calculus in today’s budget environment, the tactical and logistical advantages provide an edge on the battlefield, Burke explained.

“Anti-access/Area denial means that the supply chain is fully in play in the battlefield. That is going to be true going forward,” she said. “How do you build energy performance into the future force, which will have much bigger fuel requirements and much more sophisticated anti-access challenges? What are your options for making energy an advantage rather than a limiting factor?”

Considering these dynamics and the need for longer-dwell intelligence, surveillance and reconnaissance and longer-range strike capability necessitated by A2/AD, energy considerations are a key part of the equation in the Pacific re-balance and the stepped-up development of unmanned systems across the services.

“Unmanned systems give you a totally different way of looking at energy security. Not only is there much lower fuel costs but you can be a little more experimental with the way you power them,” Burke said.

Meanwhile, supply lines, fuel and energy efficiency have proven to be of paramount importance during the land wars in Iraq and Afghanistan, Burke added.

For example, increased efficiency means fewer convoys are needed to deliver fuel to remote outposts in Afghanistan, which reduces both expenditures and the logistical footprint.
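A hypothetical worked example shows how an efficiency gain translates into fewer resupply runs; none of these figures appear in the article, and all are invented for illustration:

```python
# Illustrative only: hypothetical figures, not from the article.
daily_fuel_gal = 10_000      # hypothetical daily fuel demand at an outpost
truck_capacity_gal = 5_000   # hypothetical tanker truck capacity
efficiency_gain = 0.20       # hypothetical 20% cut in fuel demand

trucks_before = daily_fuel_gal / truck_capacity_gal
trucks_after = daily_fuel_gal * (1 - efficiency_gain) / truck_capacity_gal

# Every truck trip avoided is cost and exposure removed from the supply line.
print(f"{trucks_before:.1f} truck runs/day -> {trucks_after:.1f} truck runs/day")
```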

At the same time, greater fuel efficiency for ships, UAS and aircraft will help offset what many refer to as the “tyranny of distance”: the vast geographical expanse of the Pacific theater.

Building energy efficiency parameters more formally into the process will help weapons and program developers perform the needed integration earlier in the developmental process, thus reducing costs and risks typically associated with the acquisition process.

As a result, Burke and other senior Pentagon officials have been working with combatant commanders and service acquisition personnel on this integration effort.

“You want to be able to put a value on better energy performance, so you need to know the value of a technology in operation. What does it cost you to support that technology? What is the unit cost? The only way you can know this is if you have the right analysis to bring to the process,” Burke said.

The Pentagon has already had some successes with the development and implementation of energy-efficient emerging technologies across the services. The effort spans a wide range of technologies from small, portable solar-powered blankets and lightweight batteries for the Army to hybrid-electric Navy amphibious assault ships and much more in between, Burke explained.

In addition, one key example of building energy efficiency more formally into the acquisition process is the ongoing procurement of the Air Force’s Combat Rescue Helicopter, a new program designed to replace the service’s HH-60s currently in service.

“In the RFP [Request for Proposal] we were looking for better energy performance. It will be a criteria in the contract,” Burke explained.

B-52 CONECT: A reboot for the Digital Age

Posted 7/26/2013, updated 7/25/2013

by Airman 1st Class Joseph Raatz

Air Force Global Strike Command Public Affairs


7/26/2013 – BARKSDALE AIR FORCE BASE, La. (AFNS) — One thing is certain: it’s not your father’s B-52.

The B-52 Stratofortress has been the Air Force’s star long-range strategic heavy bomber and premier standoff weapon for more than 50 years. For generations, the B-52 has successfully executed missions all over the globe.

But in the 21st century, the pace of things has accelerated beyond the wildest dreams of the original designers who first put plans for the aircraft on the drawing board more than 60 years ago.

“Things change so quickly now, that you simply can’t take 20- to 30-hour-old data into the fight with you any longer,” said Alan Williams, the deputy program element monitor at Air Force Global Strike Command.

With digital display screens, computer network servers and real-time communication uplinks, the B-52 of the future will be far removed from the final batch that was delivered to Minot Air Force Base, N.D., in 1962.

The Combat Network Communications Technology, or CONECT, program will help thrust the B-52 into the modern era.

“Now the crews will be able to do final mission planning en route,” Williams said. “They will be able to get targeting updates; they will be able to get intelligence updates, all while they are en route so that they can get the most current data.”

The beyond line of sight, or BLOS, communications ability introduced in the CONECT upgrades will allow a central air operations center to pass along updated threat and targeting data to the aircraft for rapid machine-to-machine retargeting, rather than leaving the crew and mission dependent solely upon information that was available at takeoff.

“The aircraft will be much more effective and safer for the crew because of being able to receive those threat and target updates,” Williams said, adding that CONECT will also allow the aircrew to receive last-minute updates so that they are able to strike the most current or necessary targets and do it rapidly because of the new machine-to-machine targeting capability.

CONECT also brings an unprecedented networking ability to the B-52.

“It provides us with a digital backbone so that we can pass data all the way around the aircraft,” Williams said, explaining that with the upgrades, any data available to one crew member will now be available to every other member instantaneously via the new digital displays at each workstation.

These new upgrades will provide a foundation that may help guarantee the aircraft’s viability through the remainder of its life span, which is currently projected to extend beyond 2040.

“Now when we add additional systems to the aircraft at some future date, we will be going from a digital component, across our new digital backbone, to another digital component elsewhere in the aircraft,” Williams said. “In the future, it will make upgrades easier to do because we’ll already have that digital infrastructure in the aircraft.”

Williams summed up the CONECT upgrades by saying they would help convert the B-52 from an analog aircraft to a digital platform for today’s warfighter.

“It is taking the B-52 from a rotary-dial phone to a smartphone,” Williams said.

With the CONECT upgrades in place, the B-52 will be well-equipped to enter the Digital Age, and the aircraft will continue to be an adaptable powerhouse for decades to come.


Carter, Winnefeld to Brief House Panel on SCMR Findings


Jul 26, 2013

By Marcus Weisgerber


Two top Pentagon officials are set to appear before House lawmakers next week to discuss the findings of the Strategic Choices and Management Review, an effort that will help determine how the US Defense Department operates amid budget cuts.

Deputy Defense Secretary Ashton Carter and Adm. Sandy Winnefeld, vice chairman of the Joint Chiefs of Staff, are scheduled to appear before the House Armed Services Committee on Aug. 1. Pentagon officials have yet to give a detailed look inside SCMR, which defense insiders have nicknamed the “skimmer” or “scammer.”

Lawmakers have been pressing DoD for several months to give them insight into the review, which began in March. The project is supposed to tee up decisions for programs, missions and troop level changes depending on the level of spending cut levied on the Pentagon.

Defense Secretary Chuck Hagel in July told senators that the SCMR “seeks to adhere … to the tenets” of DoD’s Pacific-focused Defense Strategic Guidance.


S&T Community Evaluating Funding Gaps For FY-15 POM Planning

The office of the Air Force’s chief scientist has been working through a budget exercise focused on identifying gaps between funding and priority science and technology investment areas.

The Air Force’s former chief scientist, Mark Maybury, released a forward-looking report earlier this summer titled “Global Horizons” that looks at opportunities, threats and emerging capabilities that the service’s science and technology community needs to address now and over the next two decades. Mica Endsley, the service’s new chief scientist who assumed her role in June, told Inside the Air Force after an Air Force Association event in Arlington, VA, on July 11, that her office has been using the work compiled in Global Horizons to identify which priorities are funded and where the service may need to shift its focus to invest in these capabilities.

“One thing we’ve just gone through and done is an analysis of where the gaps are for the [fiscal year 2015 program objective memorandum] to identify where we need to make some strategic changes to try and address these high-priority items,” Endsley said. “It’s really trying to direct our current efforts to make sure that we’re making these trade-offs.”

Endsley would not discuss where some of the shortfalls lie but said that some of the service’s plans laid out in the “Global Horizons” report and identified as near-term priorities — to be accomplished between FY-13 and FY-17 — are not currently budgeted for.

“Some of these things were already slated and in the budget, so they’re already marching ahead,” Endsley said. “There are other areas where we may want to redirect funds.”

One priority investment area involves the service rethinking its approach to and design of autonomous systems. Endsley said during the event that today’s automated systems lack human-machine teaming — an approach to automation that plays on the strengths of the machine to process and deliver data and the ability of the human operator to take that data and use it to make decisions.

“One of the things we’ve found in the past is that you can’t just throw automation at a problem and solve it,” Endsley said. “In fact, you can make a problem worse if you’re not very careful about how you do that. So what we really need to focus on are effective, synergistic teamings of people and automated agents in order to make this more successful.”

Endsley said that traditional automated systems have left humans out of the loop with very little understanding of the processes a machine is performing, which makes the operator less likely to recognize when a problem arises and respond to it in a timely manner. In many cases, she said, the reason for inaction is that the operator lacks full awareness of the situation because the processes aren’t transparent or are hard to understand — even for people who are actively trying.

“We’ve found that this type of automation puts people at a disadvantage because it makes you a passive processor of information as opposed to actively processing,” Endsley said. “And when you’re a passive processor, you don’t really understand what’s going on.”

The solution to this, she said, is an automated system that is trustworthy and serves as an aid to a process. Endsley told ITAF that one area where the service needs to improve its automation is data exploitation. The service’s command-and-control systems collect large amounts of very detailed information, she said, but its processes for integrating and distributing that data need to be improved.

“We have to be able to integrate it. We have to be able to process video data, for example, and recognize what key information is happening,” she said. “That’s rapidly going to increase the ability to have detailed manual analysis.”

This new approach to automation will require both modernizing existing systems and creating new autonomous systems. She told ITAF that, even in the near term, the service is going to need to take a dual-minded approach to the issue.

She added that one problem the science and technology community faces in this and other research areas is that the Air Force invests so much of its budget in operations and maintenance. It will take an initial investment to reshape and expand the service’s use of automated systems, she said, but these systems will save the service money and time in the long term.

“If we can improve the efficiency of some of these systems, then we can do a lot to reduce these costs,” Endsley told ITAF. “We can do it with two airmen instead of 10 airmen to get a particular job done. I think that’s something we have to take a good look at.” — Courtney Albon



What They Told Us: Reviewing Last Week’s Key Polls

Rasmussen Reports


Saturday, July 27, 2013


President Obama attempted this week to shift the nation’s attention away from what he calls “phony” scandals and back to the economy.

He is again pushing Congress to prime the pump with more spending, but most voters (62%) continue to think the government should cut spending in reaction to the nation’s economic problems. Not that there’s much optimism: During budget negotiations in the spring, just 15% thought any plan agreed to by the president and Congress would really cut federal spending.

The economy continues to send off mixed signals. The Rasmussen Consumer and Investor Indexes, which measure daily confidence, are both down from recent highs but still well ahead of where they have been in the years since the Wall Street meltdown.

Homeowners continue to express optimism about the housing market. Only 12% now think the value of their home will go down over the next year. That’s the lowest finding since Rasmussen Reports began regular tracking on this question in April 2009.  Just over half of U.S. homeowners still say their home is worth more than what they owe on it.

Yet while 55% of Americans think the institution of marriage is Very Important to U.S. society, they see that institution at risk in the current economy. Only 29% believe that in a family with children, it is good for both parents to work full-time.

A federal judge on Wednesday cleared the way for Detroit’s bankruptcy to move ahead, and 74% of Americans think other major cities may soon be following the same path to bankruptcy. Twenty-five percent (25%) think the federal government should provide bailout funding for cities with serious financial problems.

Just 26% of Likely U.S. Voters now say the country is heading in the right direction. That’s down from a high of 43% the week just before Election Day and the lowest level of confidence since mid-January 2012.

Only 17% think U.S. public schools now provide “a world-class education,” down from 26% in August 2011 when the president first set this as the goal to achieve. But then just 25% think most high school graduates have the skills needed for college. Only 22% believe most of these graduates have the necessary skills to go into the workforce.

The president’s daily job approval ratings have tumbled from the euphoric highs surrounding his reelection to the levels found for most of his first term.

As the rollout of the president’s national health care law stumbles along, voters continue to give high marks to the health care they now receive but are more pessimistic than ever about the short-term future of the health care system in this country. Just 24% now expect the U.S. health care system to get better over the next couple of years.

Obama and his supporters may characterize the Internal Revenue Service’s targeting of Tea Party and conservative groups as a “phony” scandal, but 59% of voters still think it’s likely the president or his top aides were aware of what the IRS was up to. Sixty-six percent (66%) believe the decision to target the groups came out of Washington, DC, with 26% who think it was made by someone at IRS headquarters and 40% who think it was decided by someone at the White House. Belief in the Washington connection is unchanged from a month ago.

In an era when there is increasing skepticism about crony capitalism, voters are more critical of the president’s policies toward small business and continue to believe he favors big business instead.

Like responses to the George Zimmerman verdict itself, voters give the Obama administration mixed reviews for its reaction to the outcome of the Trayvon Martin case. But voters share the president’s concern about racial profiling. 

Still, only 24% of Americans believe Zimmerman’s actions which led to the shooting death of Martin were motivated primarily by racism. Just 21% believe he should now be charged with a hate crime by the U.S. Department of Justice.

Voters overall continue to believe the U.S. justice system is fairer to blacks and Hispanics than it is to Americans in general. Forty-six percent (46%) of all voters think the U.S. justice system is generally fair to black and Hispanic Americans, but 87% of black voters disagree. Voters are almost evenly divided when asked if the U.S. justice system is fair to most Americans: 43% say yes, 41% no, and 16% are not sure. Just 36% say the system of justice in this country is fair to poor Americans.

Forty-four percent (44%) of Americans agree with the jury’s verdict that Zimmerman was not guilty in the shooting death of Martin. Thirty-five percent (35%) disagree, including 80% of black Americans.

But 52% of all voters think if all of the other facts were the same as those in the Zimmerman/Martin case, the jury would have found a black shooter not guilty of murder if the victim was white.  Thirty-one percent (31%) disagree and believe the jury would have found the black shooter guilty of murder. Among blacks, however, 79% think the jury would have found the black shooter guilty. Sixty percent (60%) of whites believe the jury would have reached the same not guilty verdict. 

Just 19% of all voters believe it’s a good idea for states, cities, organizations and individuals to boycott Florida over the Zimmerman verdict.

Most voters (51%) do not know if their state is one of the many that has a “stand your ground” self-defense law. Voters favor such a law by a 45% to 32% margin, but given the lack of awareness on the subject, opinion is unlikely to be settled at this time.

In other surveys last week:

– Democrats have regained their lead over Republicans on the Generic Congressional Ballot.

– Eighty-one percent (81%) of voters consider a candidate’s personal life at least somewhat important to how they will vote. Fifty-seven percent (57%) say they are less likely to vote for a candidate who has been involved in a sexual scandal.

– Even before the latest news of sexual “texting” by New York City mayoral candidate Anthony Weiner broke this week, half of the city’s registered voters had an unfavorable opinion of him. Fifty-one percent (51%) have an unfavorable view of Eliot Spitzer, who resigned as governor of New York in 2008 following exposure of his regular use of prostitutes. Spitzer is now running for city comptroller, New York’s chief financial officer.

– Forty-eight percent (48%) of Likely U.S. Voters believe research by private businesses seeking a profit will do more to meet the U.S. need for affordable energy sources than research by the government. Thirty-eight percent (38%) believe government research is the better way to go.

– Following outrage over Rolling Stone magazine’s decision to put a sympathetic picture of the Boston Marathon bomber on its cover, 62% of Americans think the media pay too much attention to the personal lives of violent criminals.

Most Americans see Rolling Stone’s decision to put the Boston suspect on its cover as a publicity stunt, but they recognize that the magazine has a right to publish what it chooses.
