
The Future of Wireless is Wired


4/01/2012 @ 10:29AM

Laetitia Garriott de Cayeux, Contributor


A spectrum deficit will become a reality in the US in 2013 if we don’t act soon. There is little debate that the Federal Communications Commission and the open spectrum auctions recently mandated by Congress will provide much-needed breathing room to help avert an imminent “spectrum crunch.” But this on its own will bring only temporary relief, since wireless bandwidth is by definition finite. Regulatory action alone cannot provide a complete long-term solution to our wireless spectrum deficit.

Interestingly, the winning investment play around the future of wireless may not be speculating on the ever-increasing value of an intrinsically finite resource. That’s because scarcity of wide-area spectrum will drive a significant migration toward local area networks such as femtocells (small, low-power radio transmission stations) and wifi, which will eventually find a relay in the infinitely expandable, wired backhaul—the link to a provider’s core network. Wired fiber infrastructure can still carry vastly more data than any wireless system. So while it is commonly accepted that scarcity increases value, and many have noted that wireless spectrum can represent over 50% of the enterprise value of cellular carriers, as in the cases of Leap Wireless (NASDAQ: LEAP) and Clearwire (NASDAQ: CLWR), a migration to local networks may in fact cap the value of wireless spectrum, foiling investment strategies premised on spectrum scarcity propping up the market cap of cellular carriers.

This leads us to the following paradox: the true savior, and arguably only reliable investment play, will be backhaul technology service companies that will allow our wireless future to be wired, moving the paradigm from one of intrinsic scarcity (wireless bandwidth) to one of abundance (wired backhaul). Here, the universe of investable names includes companies providing transmission technology (copper, optical fiber, and microwave relay) as well as digital services at both ends of the wire such as sorting and distributing data packets.

In the latter category, companies like Juniper Networks (NASDAQ: JNPR) and Cisco Systems (NASDAQ: CSCO) that offer IP-based infrastructure solutions will greatly benefit from this emerging theme, along with related themes such as rising cyber security threats. However, while Cisco currently trades at 11 times calendar 2012 earnings, Juniper Networks trades at a much higher multiple, implying much higher growth expectations that investors would have to weigh. Among the wireless companies most likely to go public this year, Ruckus Wireless, led by the computer networking entrepreneur Selina Lo, is also well positioned, since in addition to providing broadband access points and backhaul devices the company offers solutions that help offload 3G and 4G traffic onto wifi.

In the former category, the optical communications group trades at around 18 times calendar 2012 earnings and earlier this week saw the announcement of Oclaro’s (NASDAQ: OCLR) acquisition of Opnext (NASDAQ: OPXT). If consummated, this merger will leave three dominant companies—the others being Finisar (NASDAQ: FNSR) and JDS Uniphase (NASDAQ: JDSU)—which today command a combined capitalization of $5.5 billion and will control a majority of the global optical communications market.

Looking further down the road, even with better compression and faster digital switches, there remains a speed bump in the otherwise speed-of-light process. Data flow slows dramatically at the connection nodes where signals are decoded into information packets, interpreted for rerouting, and then turned back into a new signal on a new line heading in a new direction. Companies such as Oclaro, which earlier this month announced technological advances in all-optical switches, could radically reduce this potential logjam: such technology allows data to keep moving at the speed of light, directly through the switch, never needing to be stored, interpreted, and retransmitted. Companies that own significant patent portfolios in this type of technology are well positioned to solve key long-term problems of massive global data flow in the wired future of wireless.


Note: A private fund managed and controlled by the author, an investment manager, holds a position in Oclaro and Cisco stocks.



U.S. Military Doesn’t Drive Energy Innovation, Except When It Does


William Pentland, Contributor

4/02/2012 @ 12:32AM

On Friday, the Bipartisan Policy Center released an odd study evaluating the role the U.S. Department of Defense (DoD) is likely to play in accelerating the pace of clean energy innovation.

The study, “Energy Innovation At The Department Of Defense,” proposes to answer the question: “Can policymakers successfully apply the [DoD’s innovation] capabilities to the context of the nation’s civilian energy needs?”

The study’s answer, in effect, is “no.”

The report, which the BPC commissioned from the Consortium for Science, Policy, and Outcomes at Arizona State University, concludes that the DoD “is unlikely to become an all-purpose engine of energy innovation.”

This is a shame. In 2011, the DoD pumped about $233 billion into the gears of its innovation juggernaut.

While the DoD expects to reduce energy demand at military facilities, this affects only about one-quarter of DoD’s energy usage; the remaining 75% is consumed in operations. The idea is that the DoD is all about mission and will concentrate its innovation capabilities only “on technologies that are most likely to contribute to the military’s mission.”

In other words, while the DoD may install an occasional solar panel on military bases in Nevada or California, it is unlikely to change its preference for effectiveness first and efficiency second in operational areas, which account for nearly 75% of the military’s energy consumption.

This is where the analysis goes sideways. After explaining why the military’s operations are unlikely to drive innovation in clean energy, the report explains why fuel cells would contribute to the DoD’s core mission of enhanced fighting capabilities, why fuel cells are ripe for the kind of bumper-to-bumper innovation support only the DoD can deliver – and then proceeds to conclude that the DoD is unlikely to drive clean-energy innovation.

Here are a few quotes about the promise of fuel cell technology taken from the study:


Practical small fuel cells would provide a basis for lightweight power packs for soldiers. Larger units could replace towed diesel generators and serve as auxiliary power units to minimize inefficient low-load operation of the main engines in ground vehicles and naval vessels.


Like batteries, fuel cells convert chemical energy directly (i.e., in a single step) into electrical power. In fuel cells, as in primary batteries, the chemical reactions that produce electricity cannot be reversed; but unlike such batteries, a fuel cell can be refilled—e.g., with hydrogen [alcohol, methane, natural gas, propane, biofuels, etc.]—to provide a new ‘charge’ of chemical energy . . . Because all of these contain far more energy per unit of weight than the chemicals in even the best batteries, fuel cells promise advantages in energy density of 20 to 50 or even 100 times.


For DoD, the attractions [of fuel cells include] . . . quiet stationary power in remote areas, lighter loads for dismounted soldiers [and] justify much higher costs than civilian markets will accept.


Fuel cells and batteries offer relatively high efficiency (the fraction of energy theoretically available that can be converted into useful work) compared to most other energy converters. The best diesel engines, for example, approach 40 percent efficiency under optimal conditions (i.e., the load-speed combination that gives the highest efficiency). While this is better than gasoline engines or gas turbines can achieve, some batteries approach 90 percent efficiency.


The BPC study notes a previous analysis by the Air Force Scientific Advisory Board (SAB) that claimed that eliminating the need for batteries with fuel cells would “change the game” in military operations.

Despite these observations, the study concludes that the “DoD should put greater emphasis on low-power design” – dismissing as presumably irrelevant the promise of fuel cells it listed in the previous section. To summarize, the report recognizes that innovations in fuel cell technology – an example of the low-carbon energy technologies the report claims the DoD is unlikely to support – would advance the military’s core mission in a clear and compelling manner. Despite this linkage, the report urges the military to pursue low-power designs rather than advances in fuel cell technology.

The logic for this strange treatment of fuel cells is explained like so: “Unlike solar and wind power, however, and despite massive investments in R&D over the past two decades motivated chiefly by prospective applications to electric vehicles, fuel cells have not been commercialized to any great extent.”


The majority of solar and wind power projects have resulted from the massive demand created by state Renewable Portfolio Standards. Solar and wind were “commercialized” primarily because of legal mandates – not economics. Absent the government-induced demand resulting from state renewable energy mandates and similar policies, it is unlikely that the advances in wind and solar power would have occurred.

Daniel Sarewitz, a professor at Arizona State University and one of the report’s lead authors, is a very, very smart fellow, but I am baffled as to why he would describe fuel cells as mission-critical applications and then dismiss them as cost-prohibitive. The primary reason the military has been a catalyst for innovation is that mission – and not economics or environmental goals – drives the decision-making process.

What gives?




EPA climate proposal could limit coal’s future, at least in US


Miami Herald

McClatchy Newspapers


WASHINGTON — The Obama administration’s proposal last week to put the first limits on greenhouse gases from new power plants probably will mean that no new coal-fired U.S. plants will be built after this year, but that won’t slash coal use anytime soon.

The rules require future power plants to keep their emissions of heat-trapping gases under a limit. Most natural gas plants would meet the standard easily, but coal-fired plants would have to reduce emissions by about half. The equipment to capture and store those emissions underground isn’t commercially viable.

“This is not a sudden death for the coal industry by any means,” said David Pumphrey of the Center for Strategic and International Studies, a center-right research center. “I think there are a lot of people out there who are going to make sure it doesn’t work that way. But it says the future of coal is limited and probably isn’t going to grow more than it is now.”

Coal today accounts for about 40 percent of electric power generation, down from 45 percent in 2010. The Energy Department’s latest forecast figures that it will remain roughly the same through 2035. Part of that forecast saw new coal use increasing slowly after 2015. But that calculation was made before the Environmental Protection Agency announced the proposal for new plants Tuesday.

Kevin Book, the managing director for research at ClearView Energy Partners, said coal probably would retain its place as the biggest source of electricity in 2035, but barely. It remains to be seen whether natural gas surpasses coal or state laws continue to support an increase in renewable energy, he said.

It’s also likely that some older coal plants will be kept around longer than originally intended, Book said. “If you can’t build more, you will love the ones you have a lot longer.”

“I think the intent in the rule is to make the future of coal extremely problematic,” said Luke Popovich, the vice president for external communications at the National Mining Association. The plan essentially would make new coal plants impossible to build until carbon capture and storage for part of their emissions becomes commercially available, and that’s likely to be more than a decade away, given the slow start on testing and deployment globally during the recession, he said.

Popovich and others expect the EPA to put greenhouse gas limits on existing power plants later, but they don’t anticipate that move until after the November elections.

President Barack Obama’s Republican opponents probably would put the EPA’s proposed rule on ice. They don’t agree that there’s a need for policies to reduce greenhouse gas emissions.

The EPA’s long-term plan for existing plants, however, is no secret. The agency is operating under the terms of a consent agreement reached after states and environmental groups sued it to force it to reduce emissions from new and existing power plants and refineries.

Jeff Holmstead, an EPA air administrator under the George W. Bush administration who now represents companies that oppose greenhouse gas limits and other Obama administration pollution rules, said that even if the new rule survived in court, it was hard to know what it would mean for future coal use.

At least three planned plants can no longer be built, Holmstead said.

Other major companies say they’re not affected. North Carolina’s Duke Energy and Progress Energy, which plan to merge, don’t have plans for more coal-fired plants.

“It’s important to keep coal in the equation long term,” Duke spokesman Thomas Williams said.

Carbon capture and storage isn’t ready, and the storage part is the biggest challenge, Williams said. If scientists find a way to recycle the carbon dioxide instead of storing it underground, “that would be huge for coal.”

Scientists have been working on how carbon dioxide could be recycled as fuel and building materials, but while there have been laboratory breakthroughs, the work is far from commercial use.

While it’s hard to predict whether U.S. use of coal will remain stable or decline, the global use of coal is growing.

“We see an enormous market for coal building in Asia,” Popovich said.

Exports would help offset the impact of lower U.S. coal use, he said.

U.S. coal exports have increased over the past five years, even though there was a decline at the height of the recession. The U.S. also has a bigger market for coal in Europe, as other countries that traditionally have shipped coal there have shifted to Asia, Popovich said.


(Bruce Henderson of The Charlotte Observer contributed to this report from Charlotte, N.C.)




Colleges Shedding Non-Core Operations

Updated April 1, 2012, 7:02 p.m. ET


Ohio State University wants to drop out of the parking business.


In an era of stagnant or declining state support for higher education nationwide, the university is seeking investors who might pay hundreds of millions of dollars to lease its parking system of nearly 36,000 spaces.

The university—one of the nation’s largest, with a nearly 57,000-student main campus in Columbus—is also considering privatizing a roster of other assets that could make a midsize city envious, including two 18-hole golf courses, a small airport and a power grid.

“Teaching students and [conducting] research is the core of what we do,” said Ohio State Chief Financial Officer Geoffrey Chatas, a former infrastructure investment manager at J.P. Morgan Chase & Co.

“The question is, do we really need to operate parking? Do we really need to own and operate an airport, or own and operate golf courses?”

Public universities across the U.S. have seen financial support from state governments decline, and have faced growing criticism for raising tuition to help cover the gap.

Tuition and student fees at four-year public universities rose more than 70% over the past decade, according to the College Board, while state appropriations per full-time student fell more than 20%.

That has prompted some universities to turn to the private sector for services beyond common areas such as bookstores and restaurants.

The University of Kentucky announced plans in December to transfer control of its dormitories to a private company in exchange for up to $500 million in upgrades and new construction. Portland State University in Portland, Ore., signed a deal with American Campus Communities Inc. in 2010 to build a $90 million, nearly 1,000-bed dorm on campus that is scheduled to open this fall.

Ohio State estimates it could get at least $375 million up front for a lease of up to 50 years of its parking facilities, and says seven groups that expressed initial interest—including Carlyle Group and Macquarie Group Ltd.—have been invited to make bids. Final proposals are due in May. Carlyle and Macquarie declined to comment.

The plan illustrates how “institutions don’t have the resources they once did, and so they are trying to leverage every dollar possible,” said Peter Eckel, vice president at the Association of Governing Boards of Universities and Colleges.

Ohio cut funding for instruction at Ohio State last year by 15%, or nearly $63 million, as part of efforts to plug a multibillion-dollar state budget hole.

The university’s overall budget nonetheless climbed 4%, to $5 billion, buoyed by a 3.5% tuition increase.

Local governments across the country have faced resistance to plans for privatized parking. Chicago agreed in 2008 to sell rights to operate its parking meters for 75 years for $1.16 billion. Critics have complained the price was too low, and that rates have risen too quickly since.

Some students and staffers worry that a private-sector operator would be tougher with parking fines, among other concerns, and contend officials haven’t provided enough information to show the school wouldn’t be better off financially in the long run managing parking itself.

“The first thing people say is, ‘Oh God, our prices are going to go up,’ ” said Brittany Fennell, a 20-year-old sophomore studying pre-med and psychology, who notes that the current parking administration has a reputation for flexibility with cash-strapped students on fines. But, she said, the plan could be positive if it mitigates tuition increases.

Ohio State’s parking system reaped about $28 million in revenue last fiscal year from permits, fines and other sources, and generated about $19 million in earnings before interest, taxes, depreciation and amortization, according to the university.

If the potential privatization goes through, the successful bidder would be limited to maximum annual rate increases of 7.5% during the first decade, after which prices could rise by either 4% or the rate of inflation, whichever is greater.
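The cap schedule described above implies a simple compounding rule for the maximum permitted price. A minimal sketch of that arithmetic (the function name, the $100 starting permit price, and the 3% inflation figure are hypothetical illustrations, not figures from the deal):

```python
def max_rate(base_rate, year, inflation=0.03):
    """Maximum permitted price in a given lease year under the proposed cap:
    7.5% annual increases for years 1-10, then the greater of 4% or inflation.
    All inputs here are hypothetical; only the cap schedule comes from the plan."""
    rate = base_rate
    for y in range(1, year + 1):
        cap = 0.075 if y <= 10 else max(0.04, inflation)
        rate *= 1 + cap
    return rate

# With a hypothetical $100 annual permit and 3% inflation:
print(round(max_rate(100, 10), 2))  # after the first decade, roughly 206.10
print(round(max_rate(100, 20), 2))  # ten more years at max(4%, 3%) = 4%
```

Under these assumed inputs, the first decade alone permits the price to roughly double, which helps explain why students worry more about the early 7.5% cap than the later inflation-linked one.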

Ohio State said it likely would put proceeds into an endowment, using disbursements to fund scholarships, bus services and other purposes.

Write to Bob Sechler.

A version of this article appeared Apr. 2, 2012, on page A6 in some U.S. editions of The Wall Street Journal, with the headline: Colleges Shedding Non-Core Operations.




Docs plead with CMS for relief from deadlines, penalties

HealthCare IT News

By Bernie Monegain, Editor

Created 03/30/2012

CHICAGO – The American Medical Association (AMA) and state and national medical specialty societies are pleading with the government for relief from regulation and impending penalties.

The groups sent a letter to the Centers for Medicare & Medicaid Services (CMS) saying they are worried about an onslaught of overlapping regulations that affect physicians. Programs with overlapping timelines include the value-based modifier, penalties under the e-prescribing program, the physician quality reporting system (PQRS), the electronic health record incentive program that is part of the meaningful use program – and the transition to ICD-10.

“Facing all of these deadlines at once is overwhelming to physicians, whose top priority is patients,” said AMA President-elect Jeremy Lazarus, MD. “We have asked CMS to develop solutions for implementing these regulations in a way that reduces the burden on physicians and allows them to keep their focus where it should be – caring for patients.”

Physicians said that without needed changes from CMS, they must transition to the ICD-10 coding system; spend significant time and resources implementing EHRs into their practices; work to successfully participate in the Medicare e-prescribing program; meet EHR meaningful use standards; and participate in the Physician Quality Reporting System (PQRS) – all within a short amount of time.

“In addition to these upcoming deadlines, physicians who treat Medicare patients are also currently facing a drastic cut of about 30 percent on January 1 from the broken Medicare physician payment formula,” Lazarus said. “The combination of these financial burdens could prevent physicians from making the investments needed to transition to new models of care delivery and improve the value and quality of care in the Medicare system.”



Defense lobby chief wants ‘eyes wide open’ to impact of sequestration cuts

The Hill

By Jeremy Herb – 03/27/12 05:00 AM ET

The defense industry is counting on David Hess, the chairman of the Aerospace Industries Association (AIA), to prevent the unthinkable.


His mission: convince a Congress mired in partisan gridlock to work together — in an election year, no less — to reverse the hundreds of billions in sequestered defense cuts set in motion by the failure of the supercommittee last year.

Fail, and the defense industry could take a $500 billion hit from the Pentagon budget over the next decade, a reduction that defense leaders say would have devastating consequences.

“People joke it’s like asking, ‘How do you plan for the end of the world?’ ” said Hess, who was appointed chairman of AIA for 2012. “You’re not sure how bad it’s going to be, or when exactly it’s going to occur.”

While AIA serves primarily as the lobby for the aerospace industry, the group has been the loudest voice in the defense sphere on sequestration and has cranked out a series of widely cited studies on how the automatic cuts would affect industry.

Hess, the president of the aerospace giant Pratt & Whitney, hopes to goad Congress into action by focusing on the jobs that would be lost from a severe downsizing of the defense industry.

“We just want to go into this with everybody’s eyes wide open so they understand what it means,” Hess said. “There’s a huge economic impact to the U.S. economy with some of the very premium jobs in the United States: high-tech jobs, high-paying jobs, high-skilled jobs.

“To see those kind of jobs go away would, I think, have a severe impact on the unemployment rate,” he said.

But Hess and Marion Blakey, AIA’s president and CEO, are fighting an uphill battle to reach the top of the congressional agenda amid larger debates over taxes and deficits.

Sequestration, which was triggered when the supercommittee failed to find $1.2 trillion in deficit reduction in November, is set to take effect in January 2013, giving the defense industry a mere nine months to stop the cuts from taking effect.

But while most Republicans and Democrats agree sequestration should not occur, there’s deep disagreement about how to replace it, with Republicans pushing for cuts to entitlement spending and Democrats calling for tax increases. Few observers expect any action on stopping sequestration until after the 2012 election.

“The industry is really caught up in a much larger political debate that is ultimately about the two parties’ philosophies in governance,” said Loren Thompson, a defense expert at the Lexington Institute. “The jobs issue is important, but for the parties going into an election, this is about taxes and entitlements.”

That doesn’t mean the aerospace association is going down without a fight. After the August debt-limit deal cut $487 billion from the Pentagon budget over the next decade — and the supercommittee’s failure started the countdown to an additional $500 billion reduction — AIA’s lobbying skyrocketed.

The group spent $887,000 on lobbying in the third quarter and more than $900,000 in the fourth quarter, quadrupling its lobbying spending from the year prior, according to lobbying disclosure records. The amounts were more than double what AIA had spent on lobbying in any period in the last decade.


AIA launched a lobbying campaign last year called “Second to None” that focused on selling the defense industry’s importance to the larger economy. The group released a study in October that found 1 million defense-related jobs are at risk if sequestration occurs, though some questioned the figure.

The association followed that up with a report this month that provided a state-by-state breakdown of the jobs at risk.

A week later, the group held a lunch on Capitol Hill where Lockheed Martin CEO Bob Stevens told lawmakers and aides that sequestration would cause a “massive disruption” that would halt production on all current and future military programs.

The industry has many defenders in the Pentagon and the Capitol, and Defense Secretary Leon Panetta has used numerous colorful analogies, from “meat ax” to “gun to the head,” to describe the impact of sequestration.

House Armed Services Committee Chairman Buck McKeon (R-Calif.) has sponsored a bill that would push sequestration back one year, and he says Congress must act now because the industry will have to start preparing for the cutbacks with budget cuts and layoffs.

“This idea of not fixing things around here until the very last minute is crazy,” McKeon said in an interview with The Hill last month. “If we don’t fix it, we’re causing big problems for [the industry] right now, and then we’ll cause major problems for the military in January.”

With a background in mechanical engineering, Hess has been in the aerospace industry for 33 years, taking over as president of Pratt & Whitney in 2009.

He was named vice chairman of AIA in 2011, succeeding Jim Albaugh, president and CEO of Boeing’s commercial aircraft division, for his one-year term as chairman.

The Pratt & Whitney president is no stranger to big lobbying fights, as his company, a Connecticut-based subsidiary of United Technologies Corp., was locked in an expensive, years-long battle with General Electric and Rolls-Royce over the second engine for the F-35 fighter.

Pratt & Whitney appeared to emerge victorious in December, as GE and Rolls-Royce announced they were no longer funding their alternate-engine program.

Hess downplayed the F-35 fight, saying it was just business.

“There’s no hard feelings,” he said.

While AIA did not pick sides in its intramember fight on the F-35 second engine, the group’s views will be heard clearly in Congress on sequestration, Hess said.

“We’re a pretty effective group with a very loud voice,” he said.

The defense industry is in a more precarious position than it was during the last military drawdown in the 1990s, Hess said, as the industry has seen further consolidation. He warned there’s a risk that defense companies would be unable to ramp production back up again if the cuts are too deep.

“We have to start worrying about falling below a critical mass,” Hess said.

Thompson said he’s seen more of a united effort by defense companies in recent months, with little movement in Congress on sequestration.

Hess acknowledged his industry would likely be in the dark about sequestration’s fate until after the election, when there will be little time left for Congress to find a way out.

“I don’t think until recently the companies had begun to focus on how likely sequestration is,” Thompson said. “There’s nothing like a gun pointed at your head to focus the mind.”




GOP, Dems at impasse over defense cuts, despite calls for action

The Hill

By Jeremy Herb 04/01/12 05:30 AM ET

Democrats and Republicans are talking past each other when it comes to reversing $500 billion in automatic defense cuts set for Jan. 1, despite an increasingly loud push to cancel them.

Lawmakers in both parties say the automatic cuts through sequestration cannot take effect, and that they want them to be undone now. But even if both sides agree that sequestration cannot stand, the two parties remain unwilling to budge an inch on the $1.2 trillion-pound gorilla that stands in the way: taxes.

The partisan disconnect played out on Thursday in dueling public appearances.

Sen. John McCain (R-Ariz.) led a group of seven Republican senators who warned, at a Capitol Hill news conference, of devastating consequences if Congress did not act soon, and McCain said the GOP was “stretching out our hand” to sit down and negotiate with Democrats.

Across the Potomac River, House Armed Services ranking member Adam Smith (D-Wash.) gave a speech at the RAND Corporation where he also said Congress should avoid the “blunt instrument” of sequestration, and said Congress needed a “sense of urgency” about its fiscal health.

However, Smith boldly predicted that Congress would likely stop sequestration from going into effect next year. “I do not think [it] will happen,” he said during the Thursday speech.

Just how lawmakers plan to avoid it is a much more difficult question.

The problem is that the two sides still deeply disagree about how to achieve the $1.2 trillion in deficit reduction that would replace the automatic cuts to defense and non-defense discretionary spending — cuts that were designed to be so bad as to force the two sides to reach a deal.

“The military’s budget is a hostage to a fight that’s much larger, centered around tax cuts and entitlement reform,” said Mackenzie Eaglen, a fellow at the conservative-leaning American Enterprise Institute.

“The problem Sen. McCain and all members who want to address this right now are facing is that nothing has changed,” she said. “The dynamic of the debate and a deal to change the law are the same as they were a year ago.”

Democrats insist that Republicans must put tax increases on the table in a deficit deal, which Republicans have said they will not do. Republicans, meanwhile, accuse Democrats of being unwilling to touch entitlement spending.

The divide is what doomed the supercommittee in November. Its failure set sequestration in motion.

As the clock ticks closer to January 2013, Pentagon officials and defense industry leaders continue to ratchet up their rhetoric about sequestration’s potentially disastrous impact.

At a Senate hearing this week, the Pentagon’s chief weapons buyer Frank Kendall said “hundreds of thousands” of jobs were at risk if sequestration hits and warned about broken contracts on big weapons systems.

“Across the department there are places where a devastating impact would occur, and of course that ripples down through all tiers of the industrial base,” Kendall said at his confirmation hearing Thursday. “There would be hundreds of thousands of jobs impacted by it.”

Most defense and budget analysts and many members of Congress say sequestration will not get dealt with until a lame-duck session after the November election. Sequestration is one of numerous big-ticket items that will likely get tackled in the final weeks of the year, including the expiration of former President George W. Bush’s tax rates.

Loren Thompson, an analyst at the Lexington Institute, said it’s quickly becoming more plausible that sequestration will actually occur — particularly if neither party gets a knockout blow in the November elections.

“It’s a measure of how paralyzed the political system is, that everybody agrees sequestration is a bad idea, and yet” it could happen, Thompson said.

“Depending on the outcome of the election, we could get sequestration not just for a few months, but a few years,” he said. “The explanation is really simple: If we get a split decision in which each party has a veto, they won’t be able to repeal the law.”

GOP defense hawks have pushed for Congress to avoid throwing sequestration to the lame-duck session, with bills from House Armed Services Committee Chairman Buck McKeon (R-Calif.) and from McCain and Senate Minority Whip Jon Kyl (R-Ariz.) that would push it back one year.

“I have never seen a lame-duck session that ended up in anything but a disaster,” McCain said. “For us to somehow say, ‘Ok, we’ll wait until after the November election,’ is crazy.”

McCain and the other GOP senators raised the need for bipartisanship at their news conference, and called on President Obama to hold a meeting with Democrats and Republicans in Congress to hash out a deal.

“We’re open to any ideas — this has to be a bipartisan exercise,” Kyl said. When asked about Democrats insisting that taxes be on the table, however, McCain said, “Of course we are against tax increases.”

Senate Armed Services Committee Chairman Carl Levin (D-Mich.), who said earlier this year he believes sequestration will force the GOP to give on taxes, told The Hill he and McCain have yet to sit down to discuss a fix to the automatic cuts.

“Revenues have to be on the table, including tax increases on upper-income folks,” Levin said. “There’s no way you can do serious deficit reduction without including a significant amount of revenues, including, I believe, a tax restoration on the upper bracket.”

After appearing at McCain’s news conference, Sen. Marco Rubio (R-Fla.) accused Democrats of “playing chicken” with national defense spending.

“They’re willing to say that if we don’t concede the tax increases they’re willing to gut America’s national defense,” Rubio said of Democrats. “I think that’s a very dangerous proposition to take.”

Republicans have frequently quoted Defense Secretary Leon Panetta’s doomsday characterizations of sequestration in their pleas for not letting the defense cuts occur, while blasting Obama for not proposing his own solution.

Obama has said he will veto legislation undoing sequestration without providing its equivalent in deficit reduction.

Both Obama’s budget and the House Republican budget that passed Thursday would negate sequestration: Obama’s through tax increases and House Budget Committee Chairman Paul Ryan’s (R-Wis.) by shifting the discretionary cuts out of defense.

While neither has a real shot at becoming law on its own, Sen. Lindsey Graham (R-S.C.) suggested Thursday that the two budgets could be a starting point for negotiations.

“Both budgets, the president’s budget and the House budget, avoid the cuts to the Defense Department, they just do it differently,” Graham said. “OK, we’ve got two different ways of setting aside defense sequestration — let’s find common ground.”



Air Force eyes commercial cloud for unclassified thin client network


By Bob Brewin

The Air Force said it will consider commercial cloud computing services for 1 million users of its unclassified networks as they shift from desktop computers to dumb terminals.

The Air Force Space Command, which manages the service’s networks, announced early last month it was considering the move to thin or zero clients to cut operations and maintenance costs and improve security.

Desktop computers store files and applications on local hard drives while thin clients and zero clients access applications stored on remote servers. Zero clients consist of a keyboard, mouse and monitor with no local processing power, while thin clients have some built-in processing power to support rich graphics displays and multimedia applications.

In response to questions posted last week by potential vendors, the Air Force said a commercial cloud solution for its thin-client-based network “may be considered” if it can meet security requirements and provide users with the same experience they have on a desktop.

At an April 2011 hearing, Defense Department Chief Information Officer Teri Takai told the House Armed Services Subcommittee on Emerging Threats and Capabilities that “there will be instances where we [can] use commercial cloud providers . . . [if] they meet our standards.”

Bernie Skoch, a retired Air Force brigadier general who has extensive security experience, said that with “appropriate protection” he could see the Air Force and the other services moving unclassified and eventually classified thin-client networks to a commercial cloud rather than hosting them internally.

Thin-client security will have to include user authentication — handled by the Common Access Card, a computer chip-based card used to log on to military networks — and “robust connectivity” to withstand distributed denial-of-service attacks, Skoch said.

Thin clients inherently offer more security than desktops because users cannot load their own software. The devices are easier and quicker to update than desktops, which require physical distribution of master disks. Thin or zero clients also make it easier for an individual to move from one terminal to another by saving work in the cloud, Skoch said.

Paul Mancini, marketing vice president for Devon IT, a Pennsylvania thin-client vendor, said since thin clients have no local storage they are more resistant to viruses and other forms of malware.

While a commercial cloud service could save the Air Force money, Mancini said a government cloud structure would provide more security.



Apple plans nation’s biggest private fuel cell energy project at N.C. data center

By John Murawski

Published in: Business

Posted: Tuesday, Apr. 03, 2012

North Carolina will be home to the nation’s largest private fuel cell energy project, a nonpolluting, silent power plant that will generate electricity from hydrogen.

Apple (yes, that Apple) filed its plans with the N.C. Utilities Commission last week to build the 4.8-megawatt project in Maiden, about 40 miles northwest of Charlotte. That’s where Cupertino, Calif.-based Apple has built a data center to support the company’s iCloud online data storage system and its Siri voice-recognition software.

The fuel cell project, the nation’s largest such project not built by an electric utility company, will be developed this year. It will be located on the same data complex that will host a planned 20-megawatt solar farm – the biggest ever proposed in this state.

But it’s the fuel cell project that’s generating buzz, eclipsing anything ever dreamed of in California, the nation’s epicenter for fuel cell projects.

“That’s a huge vote of confidence in fuel cells,” said James Warner, policy director of the Fuel Cell and Hydrogen Energy Association in Washington.

Fuel cells generate electricity through an electrochemical process and are often compared to batteries, producing power for as long as they have a source of hydrogen.

They are exorbitantly expensive and in the past have been used only in experimental realms, such as NASA moon launches. But in the past decade the price has been coming down, and commercial projects have been proliferating in California, a state that offers an incentive program to cover roughly half the cost of the cells, said Shane Stephens, manager of research development at the National Fuel Cell Research Center at the University of California, Irvine.

The feds offer a 30 percent tax credit, but no incentive is available for fuel cells in North Carolina, making Apple’s project all the more intriguing. Apple is also developing miniature fuel cells to power laptop computers.

According to a recent report by the U.S. Energy Information Administration, fuel cells are among the world’s most expensive forms of electricity, costing $6.7 million per megawatt, which would put Apple’s project in the $30 million range.

North Carolina’s fuel cell exposure is limited to tiny demonstration projects that are one-forty-eight-hundredth the size of Apple’s fuel cells. Microcell Corp. is the Raleigh company behind the demos here.
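The cost and scale figures above can be sanity-checked with quick arithmetic, using only numbers from the article itself:

```python
# Back-of-envelope check of the figures cited in the article.
capacity_mw = 4.8      # Apple's planned fuel cell capacity, megawatts
cost_per_mw = 6.7e6    # EIA cost estimate, dollars per megawatt

project_cost = capacity_mw * cost_per_mw
print(f"Estimated project cost: ${project_cost / 1e6:.1f} million")  # ~$32 million

# "One-forty-eight-hundredth the size" implies demo projects of about:
demo_kw = capacity_mw * 1000 / 4800
print(f"Demo project size: {demo_kw:.0f} kW")  # 1 kW
```

The product lands a little above $32 million, consistent with the article's "in the $30 million range."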

According to information on the company’s website, the fuel cell facility could be in operation toward the end of the year. Beyond that information, Apple officials would not comment on the project. Nor would Bloom Energy, the Sunnyvale, Calif.-based company that will build it. The fuel cell modules, called Bloom Boxes, are used also by Walmart, Google, Staples, eBay, Cox Enterprises, FedEx, Bank of America, Coca-Cola, AT&T and Adobe, according to Bloom’s web site.

Clean energy for Duke

Charlotte-based Duke Energy, which is likely to buy the electricity output from Apple, referred all questions to Apple.

Apple does stand to receive bonus payments from Duke Energy if it puts clean energy on Duke’s grid, offsetting electricity from conventional power plants. The amount Apple would receive for selling renewable energy certificates to Duke would be privately negotiated. Duke is required under a 2007 state law to buy electricity generated from renewable resources to meet the state’s green energy targets.

Word of Apple’s project first dribbled out in February in Apple’s corporate sustainability report. But North Carolina regulatory filings provide new details.

The facility will consist of 24 fuel cell modules. It will extract hydrogen from natural gas supplied by Piedmont Natural Gas. But it’s not clear how much gas will be required.

To qualify as a renewable facility, Apple or Bloom will arrange to produce landfill methane gas or some other biogas to offset its natural gas use. The biogas supplier has not been named, but that information will have to be disclosed to win approval from the N.C. Utilities Commission.

Apple currently has a 500,000-square-foot data center on the 11.5-acre site. Construction recently began on a second building on the campus, but whether it will be another data center or a building related to the fuel cell investment is unclear.

Printed from the Charlotte Observer.


Architects float answers to rising seas around the world

April 3, 2012


By Margriet Faber, AP


BANGKOK (AP) – A floating mosque and golf course for the submerging Maldives islands. Amphibious homes in the Netherlands lifted to safety as waters surge beneath them. A hospital perched on 400 stilts to protect patients from Thailand’s devastating floods and the encroaching sea.

Around the world, architects and city planners are exploring ways mankind and water may be able to coexist as oceans rise and other phenomena induced by climate change, including extreme, erratic floods, threaten land-rooted living.

With the Dutch at the helm, projects in the cutting-edge field of aqua-architecture are already in place, including a maritime housing estate, floating prison and greenhouses in the Netherlands. An increasing number are coming on stream, and while earlier blueprints appeared to be the stuff of science fiction, advocates say leaps of imagination are still needed given the magnitude of the danger.


“The focus on floating solutions has grown enormously. It has shifted from freak architecture to more sustainable, flexible alternatives,” says Dutch architect Koen Olthuis, citing growing support by governments and interest among private investors in Asia and Russia.

“We will have to live with a more watery environment. There is no choice,” says Danai Thaitakoo, a Thai landscape architect whose own Bangkok house was swamped last year as the country suffered its worst floods of modern times.

The Thai capital is also among the mega coastal cities projected to lie totally or partially under water by the end of this century as global warming boosts sea levels, according to the U.N. Intergovernmental Panel on Climate Change. Others include Tokyo, London, Jakarta, Sydney and Shanghai — an apocalyptic prospect of mass migrations and economic crises.

While in earlier decades architects and planners, particularly Japanese and Americans, dreamed of entire marine cities housing millions, most today are proposing a mix of defending communities with barriers and building on water using floating platforms, raised or amphibious structures and solutions still being devised.

“Climate change will require a radical shift within design practice from the solid-state view of landscape urbanism to the more dynamic, liquid-state view of waterscape urbanism,” says Danai, who is involved in several projects based on this principle. “Instead of embodying permanence, solidity and longevity, liquid perception will emphasize change, adaptation.”

In a study for low-lying New York, Olthuis says he envisioned Manhattan ringed by a sea wall with outlying boroughs allowing water to enter and adapting. The world’s Londons and Bangkoks, he says, may become “hydro-cities,” their historic hearts and concentrated core development waterproofed and other areas “going with the flow.”

The Netherlands, a third of which lies below sea level, has been managing water since the Middle Ages and is thus a pioneer in the field. It has exported its expertise to Indonesia, China, Thailand, Dubai and the Republic of the Maldives, an Indian Ocean archipelago that with a maximum elevation of about 8 feet is the world’s lowest country. The sea-battered city of New Orleans has also sought advice from Olthuis’s Waterstudio.

In the Maldives, Waterstudio has designed a network of floating islands, the first to be put in place next year, to accommodate hotels, a convention center, yacht club and villas. The “islands,” secured by steel cables, are made up of pontoons with a foam core encased in concrete that can be joined together like Lego blocks. An 18-hole golf course will also be set on such platforms, each with two to three holes, connected by underwater tunnels. The $500 million project, paid for by the Maldivian government and private investors, is slated for completion in 2015.

A floating mosque, originally destined for Dubai before an economic downturn hit, is also part of the master plan, Olthuis said in an interview.

Following the principles of “water will always find its way” and “collaborating with nature,” the Dutch have reversed some of their earlier strategy of tightly defending their land with dikes by allowing the sea to penetrate some areas on which housing has been constructed.


One pioneering effort was the placement of amphibious and floating homes on the River Maas in 2005. All survived major 2011 floods that forced the evacuation of villages along rain-swollen rivers.

Construction recently began on the Olthuis-designed New Water estate, 600 homes and a luxury apartment complex on land purposely inundated. Interest in water-based living and work space has accelerated over the past decade, he says, and Waterstudio’s drawing boards are stacked with plans for local and international projects.

Typical amphibious houses, like the two-story ones on the Maas, consist of a structure that slides into a steel framework over a hollow foundation which, like the hull of a ship, buoys up the building when water enters.
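The hollow foundation works by Archimedes' principle: the house rises once it has displaced its own weight in water. A minimal sketch of the calculation, with illustrative numbers that are assumptions rather than figures from the article:

```python
# Illustrative buoyancy check for a hollow-foundation amphibious house.
# All numbers here are assumed for the sketch, not taken from the article.
WATER_DENSITY = 1000.0  # kg/m^3, fresh water

def draft_metres(house_mass_kg, foundation_area_m2):
    """Depth to which a flat-bottomed hollow foundation sinks before it floats.

    By Archimedes' principle the house floats once the weight of displaced
    water equals the weight of the house:
        rho * area * draft = mass  =>  draft = mass / (rho * area)
    """
    return house_mass_kg / (WATER_DENSITY * foundation_area_m2)

# e.g. a 120-tonne house on a 10 m x 12 m foundation
print(f"{draft_metres(120_000, 10 * 12):.2f} m")  # draft of 1.00 m
```

The steel guide framework then only needs to keep the floating structure from drifting sideways, not carry its weight.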

The Maas houses sell from $310,000, about 25% more than equivalent homes, in part due to the cost of connecting them to utilities and drainage. But Olthuis says such linkages are simple and present no inconvenience to owners.

“Just proven technology of plug-and-play systems. All tested and used for years in Holland,” he says.

“The only time you will see a difference between a floating house and the traditional one is during floods — when your house rises above the water and your neighbor’s stays put,” Olthuis says.

Along similar lines will be Britain’s first amphibious house, recently granted planning permission along the banks of the Thames River in Buckinghamshire. The 2,421-square-foot home will be able to rise to 8.2 feet in the event of flooding.

Thai architect Chutayaves Sinthuphan, who will be unveiling a pilot amphibious house for the Thai government in September, says interest in such projects has grown since last year’s floods, which killed more than 600 people and affected more than a fifth of the country’s 64 million people.

“We have had proposals out for some time, but nobody paid much attention to them until the floods came,” he says.

His Site-Specific Company has already built such houses for private clients, using modern techniques and materials but, like other architects in Asia, looking to a past when communities adapted well to annual monsoon-season inundations.

They point to a riverside village in the southern province of Surat Thani, where everyone lived in homes atop bamboo rafts until all but three families moved onto land. Those three homes were the only ones that survived last year’s floods.

In the mid-19th century, almost all of Bangkok lived in houses built atop stilts or rafts. Since then, most canals have been paved over and the stilt houses replaced by a concrete urbanscape that holds back water instead of allowing it to flow through.

Architect Prisdha Jumsai has borrowed from traditional methods to design Thailand’s first hospital for the aged. Work has begun on the 300-bed hospital over a permanently flooded area near Bangkok that is also subject to tides from the nearby Gulf of Thailand. Concrete stilts will raise its first floor about 13 feet above average water levels.


“We hope this will influence people not to just fill in land but to build on water. I think it will open up new ideas for Thais who can look to traditional architecture and make it more up-to-date in design,” Prisdha says.

But this still appears to be a minority view.

“Most Thais look to Western, land-based models and most architects still don’t talk about environmental concerns. They talk about how a house will look and make you feel good,” says Danai. “But this will have to change. It’s about survival.”



Cyber Defence Slow Due to Generation Gap: U.S. Official


Defense News

Apr. 2, 2012 – 01:13PM


TALLINN, Estonia — Sluggish moves to counter the rising threat of cyber-attacks can be blamed on a generation of policymakers out of touch with rapid technological change, a senior U.S. official said April 2.

“The truth is there are a lot of senior officials in many countries who barely even know how to use an email,” Rose Gottemoeller, U.S. acting undersecretary for arms control and international security, said during a visit to Estonia.

“The change will come with the new generation,” she told the audience at a lecture delivered at the Estonian IT College, in the Baltic state’s capital Tallinn.

Estonia is one of the world’s most wired nations, and its high-tech savvy has earned it the nickname “E-Stonia”.

Home to NATO’s cyber defense center, founded in 2008, the nation of 1.3 million has been at the forefront of efforts to preempt cyber-attacks.

Estonia has bitter experience in the field.

A politically charged dispute with its Soviet-era master Moscow in 2007 was marked by a blistering cyber-attack blamed on Russian hackers — though the Kremlin denied any involvement.

Gottemoeller also said governments should consider incorporating open-source IT and social networking into arms control verification and monitoring.

“In order to pursue the goal of a world free from nuclear weapons, we are going to have to think bigger and bolder,” she explained.

“New concepts are not invented overnight, and we don’t understand the full range of possibilities inherent in the information age, but we would be remiss if we did not start thinking about whether new technologies can augment over half a century of arms control negotiating expertise,” she added.


Situational Awareness During Homeland Disasters

April 2012 – By Rita Boland, SIGNAL Magazine

SIGNAL Magazine


Brig. Gen. Christopher Coates, CF (l), Continental United States North American Aerospace Defense Command Region deputy commander, is greeted by John Salvador (2nd from r), Civil Air Patrol (CAP) National Headquarters assistant executive director, at CAP’s National Headquarters at Maxwell Air Force Base, Alabama. Gen. Coates presented CAP with five Geospatial Information Interoperability Exploitation-Portable (GIIEP) go kits courtesy of Air Forces Northern. GIIEP is designed to improve situational awareness and information sharing among the military, National Guard and state, local and tribal authorities.

A dynamic system promotes better collaboration and effective decision making among citizen soldiers and first responders.

In the battle to share information effectively among local, state and federal partners, the National Guard Bureau has employed a tool designed to give personnel an edge. This geospatial information technology is deployed across the United States and its territories, enabling better coordination during emergency situations. With Google Earth as its base, it already has proved valuable in large-scale responses, and officials are planning future improvements.

The Geospatial Information Interoperability Exploitation-Portable (GIIEP) is predominantly a geospatial information service, but it also includes go kits, which contain communications and visual record-making tools for use at the scene of an incident. Thomas Merrill, the National Guard Bureau’s GIIEP program manager, explains that the system is hosted in North Dakota at a U.S. Geological Survey office. Project personnel have populated a Google Earth globe—a plug-in that allows users to navigate and explore geographic data on a three-dimensional globe via a Web browser—with imagery covering all 54 states and territories where the Guard has units. Merrill says GIIEP is “very recent; it’s very accurate; and it’s trusted.”

GIIEP also pulls in information from other government sources such as U.S. Northern Command’s Situational Awareness Geospatial Enterprise. Merrill describes that as a treasure trove of information that National Guard task force commanders, units and state partners can pull data from during incidents. Another source is the Homeland Security Infrastructure Program (HSIP) Gold, which gives National Guard leaders and state officials the ability to look at infrastructure in their areas. During the response to a violent tornado that hit Joplin, Missouri, responders accessed satellite data to see the scar caused by the storm, then overlaid information from HSIP Gold to see what infrastructure had been damaged or destroyed. GIIEP also incorporates sources providing near-real-time weather data.
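The article does not describe GIIEP's internal data format, but imagery draped onto a Google Earth globe is conventionally delivered as a KML GroundOverlay. A minimal, hypothetical sketch of such an overlay (the coordinates and image URL are invented for illustration):

```python
# Sketch of layering a damage-assessment image onto a Google Earth globe
# as a KML GroundOverlay -- the generic mechanism Google Earth uses for
# draped imagery. GIIEP's actual format is not described in the article;
# coordinates and the image URL below are made up for illustration.
def ground_overlay_kml(name, image_href, north, south, east, west):
    """Return a minimal KML document draping an image over a lat/lon box."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <GroundOverlay>
    <name>{name}</name>
    <Icon><href>{image_href}</href></Icon>
    <LatLonBox>
      <north>{north}</north><south>{south}</south>
      <east>{east}</east><west>{west}</west>
    </LatLonBox>
  </GroundOverlay>
</kml>"""

kml = ground_overlay_kml("Joplin damage swath (example)",
                         "https://example.gov/joplin.png",
                         37.10, 37.02, -94.45, -94.55)
print(kml)
```

A file like this, opened in Google Earth, stretches the referenced image across the bounding box, which is how storm-scar imagery can sit underneath an infrastructure layer.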


Though the National Guard has been using GIIEP in full deployment for more than a year, it continues to increase the system’s information sources and uses. “There are many, many other government partners that we’re bringing in,” Merrill explains. One of the new partners is the Federal Aviation Administration. GIIEP partners now can pull data from and provide data to that organization. Commanders can learn about temporary flight restrictions or other similar actions.

With the system, various incident responders within a state are able to access decision-making resources more quickly, and everyone retrieves the same information. State National Guard authorities can hand out software to their local and tribal partners along with user names and passwords. When those groups turn on their software, the capabilities are available for them on the Internet. Merrill says that for users, the resource looks the same as Google Earth, but actually the data resides on a government website.

The go kits hold the equipment to enable much of the on-site information gathering and sharing that populates GIIEP. Each National Guard Joint Forces Headquarters has received two kits. Distributed Common Ground Station units also have received some, and the National Guard Bureau has a set for training purposes. Each kit contains a ruggedized laptop with a cell phone connection and a 3G connection, a satellite telephone, a handheld video camera and a still-frame camera. The still frame can provide the longitude and latitude of the image and is Global Positioning System-capable. It also will record the direction it is pointing. Basically, it tracks “all the metadata that an analyst would need,” Merrill states.
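Camera GPS metadata of the kind described is typically stored as degrees/minutes/seconds in EXIF tags, while an analyst's mapping tool wants decimal degrees. A small conversion sketch (the sample coordinates are illustrative assumptions, not data from the article):

```python
# Convert EXIF-style GPS degrees/minutes/seconds to the decimal degrees
# a mapping tool expects. Sample values below are illustrative only.
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert D/M/S plus a hemisphere ref ('N', 'S', 'E', 'W')."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if ref in ("S", "W") else value

# Joplin, Missouri is roughly 37d 05m 02s N, 94d 30m 48s W
lat = dms_to_decimal(37, 5, 2, "N")
lon = dms_to_decimal(94, 30, 48, "W")
print(round(lat, 4), round(lon, 4))  # 37.0839 -94.5133
```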

The video camera can give the location of the video, which can be stitched onto the Google globe. A Rover 4 receiver pulls full-motion video from unmanned aerial vehicles and feeds it into the computer and then into GIIEP, enabling decision makers to see the video in near real time, Merrill explains. He adds that video taken with GIIEP does not look the same as a Predator feed. “It’s not designed to,” he explains. But it does offer a first-blush view of conditions to officials.

Which pieces of the go kits would travel on an aircraft depends on the room available and on the mission. The Civil Air Patrol, which flies many of the GIIEP missions, has trained aerial observers and photographers who can employ the cameras, or the National Guard can send its own personnel. “It’s the option of the units involved,” Merrill explains. One concern for everyone using the technology is protecting the privacy of citizens as well as complying with all laws, regulations and directives. Merrill says that personnel go to great lengths to ensure that people’s rights are protected. Users also have to respect state sovereignty. Each state maintains its own information but can give another state permission to access it.

GIIEP has proven valuable during natural and manmade disasters across the United States. Merrill says it was brought to bear during the Deepwater Horizon oil spill crisis, when the full-motion video asset helped incident commanders and decision makers obtain aerial photographs. He explains that the system gives incident commanders a ready asset, saving them time because they no longer have to call other states or services to obtain information.

Merrill also says the system is easy to employ. Regarding the Google Earth portion, “Any analyst can go in there and start using it very, very quickly,” he states. Nonanalysts should pick it up within a day or so of training. On the hardware side, personnel generally learn how to assemble and operate it in about a day and a half.



The GIIEP go kits include self-contained communications equipment and other hardware that enable full-motion video, digital imagery and in-flight chat capability with federal, state and local emergency operations centers supporting emergency situations.

Though the National Guard has expanded GIIEP’s information holdings and partners, as well as conducted software upgrades, Merrill says the system fundamentally is the same as its original format. “We’re talking right now about what our second generation would look like, and we have a working group right now looking at needs for the next few years,” he explains. The time frame for a new version is still indistinct.

As more partners begin employing the system, it will grow in usefulness. “We are going to be sharing open-source information between agencies, and this will be of mutual benefit to everyone involved with the program,” Merrill explains. His organization is talking with the U.S. Coast Guard, which he says wants to field it soon to obtain a remote sensing capability. Other agencies also have expressed interest in the system.

The multiuser nature of GIIEP is reflected in its development. Though the National Guard Bureau owns the system and served as the operational manager for the development effort, the U.S. Army Space and Missile Defense Battle Lab provided the technical services to integrate and deliver the system along with an integrated training program. The bureau’s J-2 supplied the original system requirements and maintained responsibility for coordinating information access agreements with numerous domestic event response partners. David K. Cox, chief, Integrated Missile Defense Division, Space and Missile Defense Battle Lab, explains that, “The GIIEP system was developed for the National Guard Bureau to support its incident awareness and assessment mission.”

The lab already had developed and demonstrated many of the network-centric information-sharing capabilities the bureau and its partners wanted in their system. Personnel primarily used commercial and government off-the-shelf products they had adapted to meet the needs of military and civilian first responders in the Defense Support of Civil Authorities (DSCA) mission. Other partners involved in the effort include the U.S. Air Force’s A-2 staff, the Air Force Electronic Systems Center and the 1st Air Force, which falls under Northern Command. According to Cox, Air Force organizations took part in all aspects of the project from the point of refining requirements to providing access to data sources such as commercial satellite and other overhead imagery products as well as evaluating the utility of the system.

“The GIIEP system addresses several capability gaps that were identified by the National Guard Bureau as a direct result of requirements identified in lessons from the Hurricane Katrina response mission,” Cox says. He adds that, “This system provides each state and territory headquarters with the tools to organize the best available relevant unclassified information such as geospatial products, facilities, response assets and infrastructure information in advance of domestic events, so the information is available when required in an accessible sharing environment.” GIIEP also enables on-scene responders to provide timely and accurate information to decision makers at operation centers where response activities are being planned and coordinated.


GIIEP integrates Space and Missile Defense Command’s Advanced Warfare Environment (AWarE), AWarE Video Elements and Virtual Alabama software products. “Each of these systems had been designed to experiment with different aspects of information sharing to support military command and control,” Cox says. They provide unique capabilities such as video management and exploitation, friendly force situational awareness, datalink interfaces and high-quality geospatial viewers that had been evaluated in experiments and used as a contingency capability to support warfighter needs. When integrated, the software met the requirements articulated by the bureau’s J-2 and could be delivered as an integrated system quickly; in this case, approximately one year.

Lab personnel coordinated with the bureau to send mobile training teams to each state’s Joint Force headquarters and the four U.S.-territory National Guard headquarters. The teams provided operator training including a series of operational vignettes. “The training was phased to ensure that states with potential events such as the Midwest floods and Gulf-Atlantic hurricanes were fielded as early as practical,” Cox explains. “This was primarily an integration activity since most of the individual components were open-source systems that had been previously evaluated in Army and Air Force experimentation.”

The most challenging aspect of the integration was designing the system to meet U.S. Defense Department information assurance standards while supporting effective information sharing with the National Guard’s numerous civilian first-responder partners at all levels, he adds. Both Cox and Merrill say the technology has been well-received by Guard units and their partners.

The Battle Lab continues to provide technical and sustainment support to the bureau. It also supported use of a developmental version at the Coalition Warrior Interoperability Demonstration. There, personnel exercised and evaluated it in a series of DSCA scenarios. Though intended for and so far used only in homeland missions, the geospatial system does have other potential applications. In principle, it could solve problems with processing, assessing and disseminating unclassified full-motion video and geospatial information products with nonconventional foreign allies and with partners responding to overseas humanitarian assistance and disaster situations.








Advanced power-grid research finds low-cost, low-carbon future in West

Energy Daily

by Staff Writers

Berkeley CA (SPX) Apr 04, 2012


One possible scenario for the electricity system in the Western US in 2026-29. Pie charts show the proportion of different types of energy sources generating power and flowing between load areas if there were a carbon tax of $70 per ton. According to the SWITCH model, such a tax could allow the West to reach a goal of 54 percent of 1990 emissions by 2030. Credit: Daniel Kammen lab, UC Berkeley.

The least expensive way for the Western U.S. to reduce greenhouse gas emissions enough to help prevent the worst consequences of global warming is to replace coal with renewable and other sources of energy that may include nuclear power, according to a new study by University of California, Berkeley, researchers.

The experts reached this conclusion using SWITCH, a highly detailed computer model of the electric power grid, to study generation, transmission and storage options for the states west of the Kansas/Colorado border. The model will be an important tool for utilities and government planners.

“Decarbonization of the electric power sector is critical to achieving greenhouse gas reductions that are needed for a sustainable future,” said Daniel Kammen, Distinguished Professor of Energy in UC Berkeley’s Energy and Resources Group. “To meet these carbon goals, coal has to go away from the region.”


To achieve this level of decarbonization, policy changes are needed to cap or tax carbon emissions to provide an incentive to move toward low-carbon electricity sources, Kammen and the other study authors said.

While some previous studies have emphasized the high cost of carbon taxes or caps, the new study shows that replacing coal with more gas generation, as well as renewable sources like wind, solar and geothermal energy, would result in only a moderate increase to consumers in the cost of electric power – at most, 20 percent.

They estimate a lower ratepayer cost, Kammen said, because the evolution of the electrical grid over the next 20 years – with coordinated construction of new power plants and transmission lines – would substantially reduce the actual consumer cost of meeting carbon emission targets.

“While the carbon price required to induce these deep carbon emission reductions is high – between $59 and $87 per ton of CO2 emitted – the cost of power is predicted to increase by at most 20 percent, because the electricity system will redesign itself around a price or cap on carbon emissions,” said Kammen. “That is a modest cost considering that the future of the planet is at stake.”
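The mechanism Kammen describes, with the electricity system reordering itself around a price on carbon, can be sketched in a few lines of Python. The per-MWh costs and emission rates below are hypothetical placeholders for illustration, not figures from the study; only the $70-per-ton carbon price comes from the article:

```python
# Minimal sketch: how a carbon price reorders the dispatch "merit order",
# pushing coal from first to last. Costs and emission rates are assumed.

def merit_order(plants, carbon_price):
    """Sort generators by operating cost including the carbon price."""
    return sorted(plants, key=lambda p: p["cost"] + carbon_price * p["tco2"])

plants = [
    {"name": "coal", "cost": 30.0, "tco2": 1.0},  # $/MWh, tCO2/MWh (hypothetical)
    {"name": "gas",  "cost": 50.0, "tco2": 0.4},
    {"name": "wind", "cost": 70.0, "tco2": 0.0},
]

no_tax = [p["name"] for p in merit_order(plants, 0.0)]
taxed = [p["name"] for p in merit_order(plants, 70.0)]  # $70/ton, as in the article
print(no_tax)  # coal is dispatched first with no carbon price
print(taxed)   # coal drops to last once carbon is priced
```

With these placeholder numbers, a $70-per-ton price makes wind the cheapest option ($70/MWh vs. $78 for gas and $100 for coal), which is the qualitative result the study reports.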


Coal hazards

Burning coal, a non-renewable resource, produces about 20 percent of the world’s greenhouse gases and releases harmful chemicals into the environment such as mercury, sulfur dioxide, nitrogen oxides and sulfuric acid, which are responsible in some areas for acid rain and respiratory illness.

California has few coal-fired power plants, but gets about 20 percent of its electricity from coal-burning plants in neighboring states. About 46 percent of the state’s power comes from gas-burning plants, 11 percent from hydroelectric, 14 percent from nuclear and 11 percent from other renewables: geothermal energy, wind and solar.

The study, published in the April issue of the journal Energy Policy, highlights an analysis using the SWITCH electricity planning model. SWITCH, which stands for Solar, Wind, Hydro and Conventional generation and Transmission Investment, models the generation, transmission and storage of electricity in unprecedented detail.

The model was developed by Matthias Fripp to study California’s renewable energy options while he was a Ph.D. student at UC Berkeley. Kammen and his group extended the model’s capabilities and used it to study Western North America.

“We use the SWITCH model to identify low-carbon supply options for the West, and to see how intermittent generation may be deployed in the future,” said first author James Nelson, a UC Berkeley graduate student. “We show that it is possible to reach our goals of reducing carbon emissions using many possible mixes of power, whether natural gas, nuclear, solar, wind, biomass or geothermal.”

“Models like this are eagerly anticipated by many of the agencies involved in planning,” Kammen said, noting that SWITCH is a power-system model that can be fine-tuned for many different types of studies.

Setting targets for 2030 emissions

Mandates called Renewable Portfolio Standards (RPS) currently dominate carbon reduction policy in the United States. These standards require that a certain fraction of electricity generation come from renewable sources.

While California has a relatively high RPS target of 33 percent renewable sources by 2020, other Western states have less ambitious targets. Additional policy action throughout Western North America will be required to meet climate targets, Kammen said.

The UC Berkeley study concluded that current RPS targets are not sufficient to put electric power sector emissions on track to limit atmospheric levels of carbon to less than 450 ppm, a climate stabilization target recommended by the Intergovernmental Panel on Climate Change.

That target requires carbon emissions from electricity production in industrialized countries to drop to no more than 54 percent of 1990 emissions by 2030.

However, the study finds that the right mix of renewable energy sources can meet climate goals given stronger carbon policy.


Of all 50 states, California has been the most aggressive in setting goals for reducing carbon emissions, with a target to return to 1990 levels by 2020. The first step along the path of changing the balance of energy sources is the establishment of a carbon trading market in California, which will be up and running in September 2012, said Kammen.


Coauthors of the study are Josiah Johnston, Ana Mileva, Ian Hoffman, Autumn Petros-Good and Christian Blanco of UC Berkeley’s RAEL lab and the Energy and Resources Group; and Matthias Fripp of the Environmental


Pentagon, Industry Brace for Big Cuts


The Pentagon has told congressional leaders it will start planning this summer to make the sharp reductions in military spending set to begin in January if lawmakers don’t reach a broad deal to reduce the federal budget deficit.

Several major defense contractors already have begun contingency planning, anticipating they may have to lay off employees, close facilities and reduce other costs if the spending cuts take effect.

Many lawmakers on Capitol Hill don’t expect to start serious negotiations over how to avoid the defense cuts until after the November elections. They realize that alternative deficit-reducing measures that would be needed to compensate for restoring the defense funds—such as cutting other programs or raising taxes—are likely to be politically unpopular.

But the Pentagon has warned it can’t wait that long to begin preparations for the possible layoffs of tens of thousands of federal workers and other fallout from major cuts in spending.

Some defense contractors worry the Pentagon could begin delaying certain decisions on new projects as it tries to reprioritize its spending.

The potential Pentagon cuts would be one of the first material manifestations of the acrimonious debate in Washington over how to deal with the government’s growing debt.

The military spending cuts—which both Democrats and Republicans say they want to avoid—were an outgrowth of last August’s deal to raise the federal debt ceiling. Since Congress was unable to follow that deal with a deficit-reduction agreement, the Pentagon cuts and other spending reductions— known as the “sequester”—will take effect automatically unless lawmakers vote to override them by the end of the year.

“It’s become clear to the companies that sequestration is a real possibility,” said Stan Soloway, chief executive of the Professional Services Council, a trade association for government contractors.

The Pentagon declined to comment for this article.

If no deficit-reduction deal is reached, the law calls for the defense budget to be cut by more than $50 billion a year, or roughly 10% of the agency’s $531 billion base budget. The cut would last for 10 years, and Defense Secretary Leon Panetta has said the impact would be disastrous.

Several senior Republicans, including House Armed Services Committee Chairman Howard “Buck” McKeon of California, have proposed delaying the cuts in the first year.

While Democrats and Republicans have offered different proposals to replace the reductions with other budget changes, they mostly have staked out positions to appeal to their political bases. They appear to be making no progress toward a compromise.

The uncertainty has alarmed government officials and defense companies, which now have less than eight months to prepare for cuts and possible layoffs.

At issue is more than just a post-war drop in defense spending, which President Barack Obama has proposed as the interventions in Iraq and Afghanistan wind down.

And while Republicans have criticized the administration’s efforts to trim Pentagon spending, both parties generally agree the sequestration cuts would be jarring both to defense manufacturers and the military.

Frank Kendall, the Pentagon’s acting under secretary for acquisition, technology and logistics, told Congress last week that the automatic cuts were “intended to be so crazy that nobody would ever do it.”

Mr. Panetta would have little discretion to direct where the cuts would occur, meaning virtually every defense program would be hit.

The aerospace and defense industry is pleading with Congress to reach a deal soon to avert the cuts. The industry employs about one million workers. Last year, the Aerospace Industries Association, a trade group, released a study that projected the sector would lose in the neighborhood of 350,000 direct and supplier jobs under the automatic cuts.

Robert J. Stevens, chairman and CEO of Lockheed Martin Corp., told lawmakers and congressional aides last month that the spending cuts’ impact on the defense industry would be “devastating.”

“We ask that we not let an automatic budget trigger, a default position, become the dominant force for allocating resources that will shape our nation’s security posture and our industry, and we strongly urge action to stop this process,” Mr. Stevens said at a Senate Aerospace Caucus luncheon.

Dan Beck, a Boeing Co. spokesman, said his company already had begun to prepare for the impact of a possible “worst-case scenario” of Pentagon cuts.

Credit-rating agencies have raised questions about the effect of cuts on defense companies’ finances.

Fitch Ratings said last week that the “threat of across-the-board defense cuts” was a concern and could affect Northrop Grumman Corp., a large defense contractor, but Fitch said it “expects a legislative resolution could reduce this risk, particularly with respect to the timing of the cuts.”

Write to Damian Paletta at

A version of this article appeared April 4, 2012, on page A6 in some U.S. editions of The Wall Street Journal, with the headline: Pentagon, Industry Brace for Big Cuts.


State Department seeks a global drone fleet


By Bob Brewin

The State Department wants to acquire its own fleet of unmanned aerial vehicles to help protect diplomats posted to Iraq and other dangerous countries, and in March it issued a request for proposals for contractors to provide the aircraft, crew and support on a turnkey basis.

State already operates UAVs over Iraq to help provide protection for the 2,000 diplomats and 14,000 contractors at the $750 million, 440,000-square-foot embassy, The New York Times reported in January.

The procurement released last month and updated Monday marks the start of a project to provide the department with UAV assets that could be deployed anywhere in the world. State did not say how many aircraft it eventually planned to deploy.

In its 2011 annual report, State’s Diplomatic Security Bureau said it tested UAVs in December 2010 in cooperation with the Defense Department and planned to deploy them to Iraq in 2011.

The mission of the UAV program is to provide real-time air surveillance of fixed installations and the ground routes that diplomats travel “thereby improving security in high-threat environments,” State said. The UAVs will help identify operational problems and potential threats, the department noted in the performance work statement.

State intends to acquire two types of aircraft in conformance with standards established by the Air Force. It wants to operate Tier I hand-launched UAVs such as the Gnat-750, manufactured by General Atomics, which can operate at altitudes of 500 to 2,000 feet and at speeds up to 40 miles per hour. These aircraft should be equipped with video and heat sensors that downlink still and streaming video and use built-in GPS navigation with a range of 250 miles.

The RFP also calls for contractors to supply Tier II UAVs such as the General Atomics Predator, which can fly as high as 18,000 feet and has a range of 250 miles. The original RFP sought aircraft with a range of 900 miles. Tier II UAVs must also be able to downlink still and streaming video and use GPS navigation.

In addition, the winning bidder should provide pilots, analysts, navigators and global logistics support, and should deploy the UAVs on a task-order basis, State said.

The State UAV project has attracted 62 interested bidders, including manufacturers such as General Atomics and a number of aerospace companies, as well as systems integrators such as Computer Sciences Corp., General Dynamics Information Technology, L-3 Communications and Lockheed Martin Corp. Bids are due April 23.


SecAF: Air Force trading size for quality

by Mike W. Ray

72nd Air Base Wing Public Affairs


4/4/2012 – TINKER AIR FORCE BASE, Okla. — In the current economic environment, the best course of action for the U.S. Air Force is to “trade size for quality,” said Secretary of the Air Force Michael Donley March 30, 2012, in a keynote address to attendees of the 39th Tinker and Community Dining Out at the Tinker Club.

The secretary said that budget reductions have compelled all branches of the military to balance competing needs, resulting in some tough choices for the Air Force.

“Although the Air Force is downsizing, we must still provide the force structure and capability and be prepared to respond to a dynamic environment,” said Secretary Donley.


“Readiness is essential because if we’re going to be smaller, we must be prepared.”

Despite the cutbacks, the Air Force took action to protect its top priorities, such as the long-range strike family of systems, including the new bomber, as well as the KC-46 tanker acquisition, key space programs, and advanced ISR, the secretary said.

“Modernization remains a key concern, especially as our fleets age and new technologies drive new investment needs,” he said. “In this year’s budget proposal, we slow modernization while protecting programs that are critical to future capabilities.”

For example, the Air Force has slowed the ramp to full rate production of the fifth generation F-35 Joint Strike Fighter while it works through additional testing, he said. Secretary Donley also highlighted taking care of Airmen and their families as a top priority.

“We recognize that our Airmen are our most valuable, most precious asset,” the secretary said. “We have a responsibility to do right by our service members … even as we adjust to strategic and economic realities.”

Secretary Donley also discussed the proposed reorganization of the Air Force Materiel Command, from a dozen centers to five, by Oct. 1 this year. The “Five Center Construct” plan calls for the Air Force Sustainment Center to be headquartered at Tinker AFB.

President Obama has nominated Maj. Gen. Bruce Litchfield, commander of the Oklahoma City Air Logistics Center, for promotion to lieutenant general in order to command the center.

“This effort to streamline the command structure is expected to generate more than $100 million in savings across the command,” Secretary Donley said.

In a related matter, the secretary praised the OC-ALC for saving $36 million on the purchase of 1,036 sets of fan blades for F108 jet engines by procuring durable used equipment instead of costlier new fan blades.

“This is the kind of innovative thinking we applaud and encourage,” he said.

Secretary Donley also cited two prestigious awards won last year by Tinker military and civilian personnel:

· The Robert T. Mason Award for the best depot-level maintenance program in the Defense Department, presented to Tinker for 45 process-improvement events that resulted in KC-135 flow days being reduced from 216 to 179.

· The Dr. James G. Roche Sustainment Excellence Award presented to the B-1 System Program Office at Tinker, in recognition of the AFMC Aircraft Program Office that demonstrates the most improved performance in fleet sustainment for a given fiscal year.

The secretary said he and Air Force Chief of Staff Gen. Norton Schwartz “lead a world-class institution” and the capability that Team Tinker generates “is part of this success.”

In closing, Secretary Donley declared: “An Air Force ready to fly, fight and win. That’s our obligation.”


The New iPad’s Wi-Fi Problem Is Real

Atlantic Wire

Rebecca Greenfield

With Apple’s acknowledgment that some iPads have a Wi-Fi problem, we have confirmation that the problem isn’t one of those iPad issues made up by whiners. Looking for things to complain (and write) about, iPad users (and bloggers) created non-issues, like a battery charging “problem” and Warmthgate, in which the tablet warmed to “uncomfortable” temperatures. Considering all the possibilities for why a device might not connect to wireless, the Wi-Fi problem seemed to fall under that category. But per an Apple Care document obtained by 9to5Mac, the Wi-Fi thing is real, and Apple’s on it, instructing its Geniuses to “capture” the renegade Pads.

iPad owners have complained of this issue since as early as the weekend after the gadget went on sale. The issue, as its name suggests, is that some iPads will not connect to Wi-Fi. “The Wi-Fi is driving me nuts. My wheel keeps spinning waiting for pages to connect while lying on my bed,” said one annoyed Apple Forum commenter back in March. At the time, we suggested taking the tablet back, which worked for some forum users. Now, the “Poor Wi-Fi Reception on New iPad” forum has 49 pages of similar complaints. Considering these are Wi-Fi-only devices, without mobile data capabilities, the tablet is kind of useless without a Wi-Fi connection.

Unlike the fake Warmthgate issue, which Apple (rightfully) brushed off with a comment that the iPad runs “within our thermal specifications,” Apple has a real solution for this problem. “Contact stores and retail centers should capture iPad (3rd generation) Wi-Fi only devices if they exhibit any Wi-Fi issues,” reads the internal memo. “Capture” sounds like an intense procedure to us, and according to 9to5Mac’s Mark Gurman it kind of is. “According to one source,” he writes, “‘Captured’ is code for the device to be immediately packed up and shipped to Apple’s engineering centers for examination and investigation.” Apple will replace the defective iPads. And, hopefully with a little investigation, the company will figure out the underlying issues.



Cybersecurity index aims to penetrate the fog of marketing hype


By Dawn Lim

It’s impossible to fix something if you can’t even gauge that it’s broken. It’s a classic problem that systems engineers and defense contractors face: they are staring into a fog of elusive threats made worse by marketers trying to make a sale on security hype.

Frustrated by this lack of clarity, Dan Geer, chief information security officer of CIA venture capital arm In-Q-Tel, launched a personal project that aims to measure threats in a meaningful way. Teaming up with a financial industry professional, he built a monthly sentiment index to capture the security community’s impressions on whether risks to IT systems and networks were rising or falling.

The project, called The Index of Cyber Security, highlights a young, growing movement within the security community to craft metrics that can give professionals direction if they are groping in the dark. The experiment was driven in part by “the despair of the security metrics guy thinking, ‘Where am I going to get the kind of aggregate data that allows us to get the big picture?’ ” Geer said.

“What perpetuates the fog is when different people who try to quantify technology risks may have an ax to grind,” added his partner Mukul Pareek, a risk professional working in New York. “So they want to present a numeric representation of risk to sell a product or create marketing gimmicks.”

Here’s how the year-old index is constructed. Every month, roughly 200 people are polled on how they feel about myriad security threats from industrial espionage to insider threats. The duo doesn’t rate actual risks, but evaluates if perceptions of risks are growing or decreasing. “In this way, we do not have to calibrate one respondent to the next such as to ensure that each of them has identical definitions and tastes,” Geer said.
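The article doesn’t publish the formula behind the Index of Cyber Security, but the described construction, net sentiment about rising versus falling risk compounded month over month, can be sketched as follows. The base value of 1000 and the scaling factor are assumptions for illustration only:

```python
# Hedged sketch of a sentiment ("diffusion") index: each month, respondents
# say whether risk is "rising", "flat", or "falling"; the net share of
# "rising" answers moves the index. The exact ICS formula is not public.

def net_sentiment(responses):
    """Fraction saying risk is rising minus fraction saying it is falling."""
    rising = responses.count("rising")
    falling = responses.count("falling")
    return (rising - falling) / len(responses)

def update_index(index, responses, scale=0.1):
    """Compound last month's index by the scaled net sentiment.
    The scale parameter is a hypothetical tuning knob."""
    return index * (1.0 + scale * net_sentiment(responses))

index = 1000.0  # arbitrary base value
month = ["rising"] * 120 + ["flat"] * 60 + ["falling"] * 20  # 200 respondents
index = update_index(index, month)
print(round(index, 1))  # 1050.0: net sentiment of +0.5 scaled to a 5% rise
```

Note that, as Geer describes, this construction never asks respondents for calibrated risk numbers; it aggregates only the direction of each person’s perception.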

To protect the anonymity and privacy of those polled, the survey is electronically configured to not allow anybody — including Geer and Pareek — visibility into any respondent’s answers. Many are CISOs and risk officers from banks, government agencies and institutions that house sensitive data. The two cast their net by getting referrals from people on the front lines of security. Their goal is to reach 300 respondents. Those who fill in the survey get a monthly aggregate snapshot and analysis of what the pool has submitted, as well as excerpts of comments from respondents, who sometimes reveal how they are tweaking their security budgets.

When the index was first conceived, the pair envisioned that cybersecurity insurance providers could use it to bring transparent pricing models to a market that is notoriously opaque. Another possibility was to propose the index as the basis of a tradable financial product that companies could buy and sell as a hedging tool in an investment portfolio. But they will not fully develop those applications until the index is more mature.

The duo behind The Index of Cyber Security plans to do a formal review of what they have learned in the past year. “We designed the index in a way that adjustments could be made to its components without damaging its continuity,” said Geer. “There is a lot of maturity that can happen in the meantime.”

Geer is no stranger to the difficulty of creating security metrics. In a separate 2007 collaboration, he created a price index for stolen passwords to routers and credit card information auctioned on websites. His goal was to measure how much hacked data was worth and see how financial incentives for hacking were changing over time. That became tricky when law enforcement officials started shutting down the websites that were put in the limelight, he recalls. It is also difficult to put a price tag on data sought by politically motivated actors.

While the appetite for sounder metrics is growing in the security industry, Geer admits, “it is not the roar of the crowd, but more like the hum of a cocktail party, frankly as evidenced by the reception our index has received.”

The index is up 26.4 percent since it was launched, and has risen every month during the past year.

Dawn Lim, a financial reporter in New York, was formerly an intern at NextGov.


RFID print software enhances document security

Dayton Daily News

By Dave Larsen, Staff Writer

Updated 11:13 AM Wednesday, April 4, 2012

A Washington Twp. company has released software that can track the movement of sensitive documents through radio frequency identification.

Plus Technologies announced that it has developed a version of its OM Plus print software for secure radio frequency identification (RFID) tag printing.

RFID technology transmits a pre-programmed serial number using radio waves. The programmed RFID tags can then be scanned to track the movement of products.

Plus Technologies developed the software for a law firm that wanted to use RFID to track confidential documents, but needed to create unique serial numbers on a page-by-page basis to enhance security.

The company’s print system captures unique information about the document, such as the author, date or number of pages, to create a “unique identifier.” This identifier is then “printed” to the RFID tag using special Lexmark printers. The same serial number also is stored in a database that tracks document check-in and check-out.
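The page-by-page identifier scheme described above can be sketched as a hash of the document metadata plus the page number. The field names and the use of SHA-256 are assumptions for illustration; the article does not specify Plus Technologies’ actual scheme:

```python
# Hedged sketch: derive a per-page serial from document metadata, suitable
# for writing to an RFID tag and storing in a tracking database.
import hashlib

def page_serial(author, date, total_pages, page):
    """Hash document attributes and the page number into a short serial.
    Identical inputs always reproduce the same serial, so the database
    and the tag stay in sync."""
    fields = f"{author}|{date}|{total_pages}|{page}"
    return hashlib.sha256(fields.encode("utf-8")).hexdigest()[:16]

s1 = page_serial("J. Doe", "2012-04-04", 12, 1)
s2 = page_serial("J. Doe", "2012-04-04", 12, 2)
print(s1, s2)  # distinct serials for each page of the same document
```

The key property, which any real implementation would need, is that each page of a document gets a unique serial while the mapping back to the source document stays reproducible.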

Plus Technologies officials said the technology also has applications in health care for patient wristbands, as well as other industries.



Chinese Counterfeit COTS Chaos Continues

April 1, 2012

By Paul Kruczkowski, Editor

On March 26, the U.S. Government Accountability Office (GAO) released a new report from its ongoing investigation into counterfeit electronic parts in the Department of Defense (DOD) supply chain. The report provides alarming details about the flow of counterfeit parts out of the epicenter of electronic part counterfeiting: China. As I reported in November, Chinese counterfeit electronics seriously threaten the performance and reliability of U.S. defense systems. Senator Carl Levin recently stated that counterfeit parts “put our national security and the safety of our military men and women at risk. Not only that, but it also costs thousands of American jobs.”

In order to delve deeper into the counterfeit problem, GAO created a fictitious company with an owner and employees, a website, and even a listing on the Central Contractor Registration. Using this fake company, GAO attempted to gain membership on three Internet platforms that provide access to vendors selling military-grade electronic parts. One of these platforms rejected the GAO application, but the other two granted access, and one did so despite not receiving all requested supporting documentation.

GAO then requested quotes from vendors on both platforms for 16 parts focused on three categories: (1) authentic part numbers for obsolete and rare parts, (2) authentic part numbers with post-production date codes, and (3) fake or bogus part numbers that are not associated with any authentic parts.

They received responses from 396 vendors: 334 located in China, 25 in the United States, and 37 in other countries, including the United Kingdom and Japan. There were 40 responses to the part inquiries using bogus part numbers, all from Chinese vendors. (The fact that 84% of the total responses — and more importantly, 100% of the responses to the bogus part inquiries — came from China indicates that this is specifically a Chinese counterfeit problem.) The fictitious company proceeded to purchase 16 parts from 13 Chinese vendors: seven with the correct part number (category 1), five with the correct part number but incorrect date codes (category 2), and four with a fake part number (category 3).
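The response shares quoted above check out, as a quick computation shows:

```python
# Verify the vendor-response percentages reported by GAO.
responses = {"China": 334, "United States": 25, "other": 37}

total = sum(responses.values())
china_share = responses["China"] / total
print(total, round(china_share * 100, 1))  # 396 84.3 — the "84%" in the text

# All 40 responses to the bogus part numbers came from Chinese vendors.
bogus_from_china = 40 / 40
print(bogus_from_china == 1.0)
```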

GAO contracted SMT Corp. to conduct thorough authentication analysis on these parts, beginning with a visual inspection and followed by these additional tests:

  • Resistance to solvent and scrape test
  • Package configuration and dimensions
  • X-ray fluorescence element analysis
  • Real-time X-ray analysis
  • Scanning electron microscopy analysis
  • Solderability test
  • Dynasolve test
  • Delidding and die microscopy

The four parts ordered with bogus part numbers were obviously counterfeit. The other 12 parts were all deemed suspect counterfeit since they all failed many of the tests performed. (One part failed seven of the nine authentication tests.) Therefore, SMT determined that all 16 parts were suspect counterfeit.

Authentic versions of the counterfeit parts acquired in the GAO’s investigation are used in systems by every branch of the U.S. armed services. To name a few:

  • F-15, F/A-18E, AC-130H, B-2B, and many more aircraft
  • Los Angeles Class and Trident submarines
  • Arleigh Burke guided missile destroyer
  • Maverick AGM-65A missile
  • Joint Surveillance and Target Attack Radar System

The failure of counterfeit parts — due to poor form and fit, reduced MTBF (mean time between failures), or performance and tolerances that do not meet the specifications of the real component — can result in a wide range of problems, from unreliable operation of a platform’s subsystems to the risk of overall weapon system failure. This is unacceptable from a battle readiness standpoint, because the systems must be taken off-line to troubleshoot and repair the problem. Even worse, a counterfeit part failure can be deadly if a critical system fails during a mission.

Thus far, the Chinese government has been unable or unwilling to intervene to stem the tide of counterfeit electronic parts from its country, so that means the DOD must do everything it can do to secure its supply chain. It has already started taking steps in this direction: The Defense Supply Center Columbus (DSCC) has been qualifying distributors and their procurement processes, and government contractors are being encouraged to use legitimate sources to acquire parts. Now that the problem has attracted the attention and scrutiny of Congress, the consequence for allowing counterfeit parts to find their way into finished product will be much greater.

Such measures may stop the flow of new counterfeits into the DOD supply chain, but what about the parts that entered prior to the heightened awareness of counterfeits? Does anyone have a grasp of how many counterfeit parts reside in inventories at DSCC and OEMs? Defense contractors need a plan to conduct authentication tests on these parts before they are implemented, or else they will encounter costly delays.

The DOD may also want to make changes to its procurement process for weapon systems. During a recent conversation with Dale Lillard, president of Lansdale Semiconductor, we talked about how the DOD could look to the auto industry for one possible model. The DOD could procure the weapons system and support for parts and/or maintenance for a certain number of years. The unit cost of the weapons system would cover the additional parts and support and would give the OEMs both the funds and authorization to procure additional parts needed under the contract before those parts become obsolete.

After the support/warranty period is over, the DOD or OEM could rely on reputable aftermarket part manufacturers for future parts. This approach provides the opportunity for aftermarket part manufacturers to identify which components will be required in advance, and gives them ample time to prepare to properly support the industry. The most important benefit of an approach like this is that it chokes off the long-term demand for counterfeit parts from China. The only way to put the counterfeiters out of business is to stop buying their product.

I think the DOD’s penny-wise, dollar-foolish approach to procuring obsolete electronic parts will become a thing of the past, now that the counterfeit problem has gotten so big that it has become the elephant in the room. Why spend millions of dollars on pilot training and billions on the most advanced planes in the world if you are going to put both at risk to save a few thousand dollars by purchasing counterfeit parts? Why conduct exhaustive testing and emulation of components, subsystems, and system in new weapon systems, and then use replacement parts that aren’t held to those same standards?

I think this new GAO report is a wake-up call to the defense industry, and the remedy to the problem is quick and thorough action, plus a large dose of common sense. I wouldn’t accept counterfeit parts in the antilock braking system or stability control modules in my car, and I certainly wouldn’t put $5 brake pads on my car when I know that reliable pads cost $50. Why? Because I put my family in that car. The young men and women that defend freedom around the world are somebody’s family. Start buying the $50 brake pads!


Google ‘Project Glass’ Replaces the Smartphone With Glasses


ARTICLE DATE : April 4, 2012

By Chloe Albanesius


Having to hold your phone in front of your face is so 2011. What if you could slip on some Doc Brown-style glasses and have your texts, emails, music, weather, and more beamed directly to your field of vision?

That’s the idea behind Google Project Glass, a concept device that puts your smartphone into a pair of slim glasses and projects its contents for some futuristic, voice-activated fun.

The effort, which is still in the planning stages, comes from Google X, a secret lab of future products first unveiled in November.

“A group of us from Google[x] started Project Glass to build this kind of technology, one that helps you explore and share your world, putting you back in the moment,” three Googlers wrote in a Wednesday Google+ post. “We’re sharing this information now because we want to start a conversation and learn from your valuable input. So we took a few design photos to show what this technology could look like and created a video to demonstrate what it might enable you to do.”

In that video, a Project Glass user looks down and various Google icons magically appear before him – calendar, Google+, the time, temperature, camera, chat, location, and more. Reminders pop up about meetings later that evening, and text messages come in as he’s eating breakfast. He speaks a response and sends it off without ever putting down the egg and cheese.

Leaving the house, an alert notifies him of a subway disruption, so he switches to walking directions, which pop up arrows as he approaches a street on which he’s supposed to turn. He passes a poster for an upcoming concert and asks his Project Glass device to remind him to buy tickets. Arriving at The Strand bookstore, he queries the location of his friend, and finds that he’s 400 feet away. The duo hit up a food truck, where they check in and take photos without ever touching the Project Glass device.

Later, the user has a video chat with a friend and shows off his ukulele skills.

It does not appear that Project Glass is one of Google’s elaborate April Fools’ jokes, but it probably won’t hit store shelves anytime soon. Google+ users appeared to be enthusiastic about the feature (“I must get this as soon as possible,” wrote one). What do you think? Let us know in the comments, and check out PCMag’s Tech You Can Wear slideshow.


Google Co-Founder Sergey Brin Spotted Wearing Project Glass

ARTICLE DATE : April 6, 2012

By Adario Strange

The reaction from the technology community to Google’s Project Glass augmented reality glasses video earlier this week ranged from the amazed to the overtly cynical. But one group of naysayers in particular—those who thought the glasses were merely the product of a fancy concept video—appears to be wrong. Google co-founder Sergey Brin was recently seen at an event wearing Project Glass, and there’s photographic proof.

Blogger Robert Scoble caught Brin wearing the glasses at an event for The Foundation Fighting Blindness. Posting to his Google+ page, Scoble said, “[Brin] quickly told me it is a prototype. I saw a bluish light flashing off of his right eyeball. I could only guess that my Google+ profile flashed up, or maybe some PR voice said ‘stay quiet’ or something like that. But the glasses are real. Very light looking. Most of the people around us had no idea that these glasses are pretty special.”

This will undoubtedly come as a surprise to many people in the tech and blogging community who assumed the video was as fictional as Microsoft’s slick Productivity Future Vision video. Well-known, Apple-focused blogger John Gruber wrote, “Google’s transition into the new Microsoft is now complete: fancy-pants sci-fi concept video to promote stunningly awkward augmented reality glasses.”

Blogger Joe Stracci responded to the video by writing, “…this isn’t a product that is in beta testing, alpha testing, or even a concept model phase. It’s just a somewhat cool video.” Gruber picked up on Stracci’s blog post and added, “Let’s pretend Google could actually build and ship something exactly like what they show in their concept video. Think about the data Google is collecting about the video’s protagonist.”

However, when you’re talking about a company that actually has working self-driving cars traveling around California, second-guessing that company’s ability to deliver on futuristic products isn’t advisable.

Other responses to the Project Glass video were more visual, including one video by Jonathan McIntosh called “Admented Reality,” in which the original Project Glass video is shown with clever Google AdWords-style advertisements layered over various parts of the video to match the wearer’s experience. Another, more humorous take came from London-based Tom Scott, who created an original video mimicking Google’s called “Google Glasses: A New Way to Hurt Yourself,” in which the glasses are shown as possibly too impractical to be used in everyday life.

According to Scoble, another high-profile Google executive, Vic Gundotra, is also walking around testing out his own pair of the glasses. Google hasn’t revealed a release date or price for the augmented reality glasses, but we now have good reason to believe that Project Glass is a real product that will be available long before Star Trek transporters and Jetsons anti-gravity shoes.

AFMC book commemorates command’s 20th anniversary

Posted 4/2/2012 Updated 4/2/2012

by Monica D. Morales
Air Force Materiel Command Public Affairs

4/2/2012 – WRIGHT-PATTERSON AIR FORCE BASE, Ohio — Air Force Materiel Command officials have released the command’s limited-edition anniversary book, highlighting AFMC’s 20 years of providing expeditionary capabilities to the warfighter.

The release of this commemorative book, entitled ‘Air Force Materiel Command: 20 Years of Warfighter Support,’ comes during the command’s anniversary year and just ahead of the anniversary date of July 1.

“Air Force Materiel Command has matured from a fledgling organization created by the integration of Air Force Logistics Command and Systems Command to become a preeminent provider of warfighting capability,” said AFMC Commander Gen. Donald Hoffman.

The anniversary book highlights not only the command’s four core mission areas of science and technology, acquisition management, test and evaluation and sustainment but also features profiles of the Airmen executing these missions day to day.

“From our laboratories and test ranges to our acquisition offices and depots, AFMC people have made a difference during the past two decades and will continue to do so in the years ahead,” said General Hoffman.

The book also features interviews with former AFMC commanders, a historical look at the command’s development and little-known facts about AFMC bases. This 144-page commemorative edition is printed in full color on high-quality paper stock.

The AFMC 20th anniversary book was published and printed by Faircount LLC of Tampa, Fla., in conjunction with content provided through the AFMC Headquarters and installation public affairs and history offices. It was printed at no cost to the Air Force since advertising in the book covered production costs.

The book is available in its entirety online.

AFMC personnel interested in obtaining a printed copy of the AFMC anniversary book can contact their local public affairs office.


Who Is Waging Cyberwar Against the Jihadi Networks?

By LISA ABEND | – April 6, 2012

“The enemies of Allah who boast of their freedoms have not spared any effort to eradicate our blessed media.” After two weeks of silence, the jihadist forum Shamukh al Islam came back online yesterday with a gloat: an apparent cyberattack against Shamukh and four similar sites had failed to shut it down permanently. But terrorism analysts see the event in a different light. As they investigate the mystery of who caused the outage and why, most can’t help seeing in the blackout one more piece of evidence that al-Qaeda is in disarray.

Websites like Shamukh al Islam perform a critical function in jihadist circles. Loaded with videos that depict alleged Western atrocities against Muslims, they recruit supporters, while their chat rooms and forums allow jihadists around the globe to communicate with one another and exchange information, including instructions on bomb construction and chemical warfare.

So when Shamukh al Islam, perhaps the most prominent jihadist forum, suddenly fell silent on March 22 or 23, terrorism analysts took notice. That interest only grew over the next few days as four other sites went down and, with one exception, stayed that way. “For four of these sites to be off-line for two weeks is unprecedented,” says Aaron Zelin, a researcher at Brandeis University. “We’ve seen other cyberattacks on these sites before, but they’ve never managed to keep them down for that long.”

However significant the outage may be, no one is quite sure who caused it or why.

Because Shamukh went down right after French authorities cornered and killed Mohammed Merah, the 23-year-old jihadist who shot seven people in Toulouse, some analysts have suggested a connection. “Our first suspicion was that the blackout was somehow connected to Merah, just based on the timing,” says Evan Kohlmann, a terrorism analyst at Flashpoint Partners, a consulting agency. “The presumption here is that someone is intent on thwarting, or at least complicating, al-Qaeda’s efforts to release a particular piece of media” — perhaps the Merah video that was sent to al-Jazeera but never aired.

Yet a French connection is not the only possibility. On March 27, Spanish authorities arrested Muhrad Hussein Almalki in the coastal city of Valencia. Known as “the Librarian” for his work administering and archiving jihadist websites, Almalki supervised one of the downed sites and posted frequently under various aliases on at least two others. In a 2011 post to Shamukh, he answered a call for “enemy names” with a list of targets that included the two George Bushes, Bill Clinton and Tony Blair.

For Manuel Torres, a terrorism expert at Seville’s Pablo de Olavide University, Almalki’s arrest suggests that the sites’ operators may have taken down the forums. “Almalki was an administrator, and that means his arrest posed a significant danger: in both this case and a similar one in 2010, police found a list of passwords,” says Torres. “They might have taken down the sites themselves for protection.”

When sites have voluntarily gone dark in the past, however, their administrators have usually posted messages to that effect — something that did not happen in the Shamukh case until April 2. But if the evidence indeed points to a cyberattack, who was behind it?

On April 4, Pelayo Barro, a journalist for the Spanish digital newspaper El Confidencial, reported that the U.S. government had something to do with it. “My source, who works as an outside consultant for Spain’s National Intelligence Center, told me that U.S. intelligence agents got in touch with their Spanish counterparts in late March,” Barro told TIME. “They told them that a few days earlier a team of 10 hackers working for the Obama government had broken the passwords of several of the principal Islamist forums. They said it was the biggest cyberattack yet against these sites.” According to Barro, information gleaned from this attack enabled Spanish authorities to locate and arrest the Librarian, a figure they had been interested in for over a year.

Kohlmann, however, questions U.S. involvement. “Generally speaking, the U.S. government does not shut down jihadi websites,” he says. “Most of the people that I know in U.S. law enforcement and intelligence agencies believe it is more fruitful to leave the websites online and use them for intelligence purposes.” And if it wasn’t the U.S. government? “Other possible responsible parties might include the governments of France and Israel, as well as more skilled cybervigilantes.”

Britain took down jihadist websites in 2010 but did not admit to doing so until this year. So it is likely that the responsible party, whoever it is, will not be confessing anytime soon. But more significant than its origins may be what the attack reveals about al-Qaeda. “People in some intelligence agencies believe the organization is very weak,” says analyst Zelin. “And the length and breadth of this outage seems to support that.”

That’s not to say, however, that these jihadist networks no longer pose a threat. “It’s definitely a setback for al-Qaeda’s communication network,” says Kohlmann. But “if the past can serve as example, other trusted, authenticated forums will simply step up and take its place.”


NSA to crack codes with big data


By Aliya Sternstein 03/30/2012

Part of the Pentagon’s $250 million investment in finding significance within the gazillions of disparate data points produced worldwide will go toward decoding foreign secrets, according to White House officials.

Under a governmentwide big data enterprise launched Thursday, the Defense Department will continue with a program that lets companies demonstrate to the National Security Agency new commercial tools that could aid military code makers and code breakers. The attempt to draw meaningful trends from a constantly increasing abyss of electronic information is the crux of the big data problem, technologists say, and adversaries are taking advantage of that digital maze to ferry secrets.

The NSA-Central Security Service Commercial Solutions Center program will be “hosting vendor capabilities presentations that showcase new commercial technology developments that meet the strategic needs of NSA-CSS and the national security community,” noted the White House big data agenda. CSS employs the government’s top cryptology specialists.

Other computer security programs under Defense’s big data initiative include a quest to uncover cyberspies.

At the Defense Advanced Research Projects Agency, the funding will help support the Cyber-Insider Threat, or CINDER, program, which identifies network activity consistent with the hallmarks of cyberespionage.

CINDER will run simulations of adversary missions over Defense networks to expose hidden operations, according to White House officials. The goal is to develop ways of more quickly detecting advanced persistent threats that linger silently in U.S. networks to carry out strikes — such as data extraction or manipulation.

“Such threats take many forms but include variants like trojaned code, backdoors in embedded systems, worms and logic bombs, all of which could prove detrimental to the warfighter if not discovered,” the CINDER website states.
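CINDER’s actual detection methods are not public, but one commonly cited hallmark of the advanced persistent threats described above is “beaconing”: an implant phoning home at suspiciously regular intervals. The following toy sketch (purely illustrative, and not based on CINDER itself) flags a host whose outbound connection timing is too regular to look human:

```python
from statistics import mean, pstdev

def looks_like_beaconing(timestamps, max_jitter=0.1):
    """Flag a host/destination pair whose connections are suspiciously regular.

    timestamps: sorted connection times in seconds for one host/destination pair.
    max_jitter: flag if the standard deviation of the gaps between connections
                is under 10% of the average gap (near-clockwork timing).
    """
    if len(timestamps) < 5:
        return False  # too few events to judge
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    avg = mean(intervals)
    return avg > 0 and pstdev(intervals) / avg < max_jitter

# A human browsing produces irregular gaps; an implant emits ~60s heartbeats.
human = [0, 7, 31, 45, 120, 300]
implant = [0, 60, 119, 180, 241, 300]
print(looks_like_beaconing(human), looks_like_beaconing(implant))  # False True
```

Real systems would combine many such weak signals (destination reputation, payload sizes, process lineage) rather than rely on timing alone, but the low-variance-interval test illustrates the kind of statistical hallmark the program description refers to.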

Another big data-fueled project at DARPA — Insight — fills in for the human brain to find relationships among morasses of intelligence from myriad sources. The automated thinking uses information from sensors and prediction algorithms, among other things, to support commander decision-making, according to the program’s site.

In February, DARPA melded data from multiple intelligence sources to provide 135 terabytes of information — about 270 years’ worth of nonstop MP3s — to users across government, industry and academia.
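The “270 years’ worth of nonstop MP3s” comparison is easy to sanity-check. Assuming a typical 128 kbit/s MP3 bitrate (my own back-of-the-envelope assumption, not a figure from DARPA), the arithmetic lands close to the article’s number:

```python
terabytes = 135
bytes_total = terabytes * 10**12          # 135 TB of data
mp3_bytes_per_sec = 128_000 / 8           # 128 kbit/s = 16,000 bytes of audio/sec
seconds_of_audio = bytes_total / mp3_bytes_per_sec
years = seconds_of_audio / (365.25 * 24 * 3600)
print(round(years))  # 267 -- close to the article's "about 270 years"
```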

NSA also will wield big data to fight cyberassaults, but the agency intends to delegate some of the heavy lifting to volunteers. “‘Vigilant Net: A Competition to Foster and Test Cyber Defense Situational Awareness at Scale’ will explore the feasibility of conducting an online contest for developing data visualizations in the defense of massive computer networks,” the White House’s agenda states.



Confused about big data and cloud? So’s everyone.

By Alice Lipowicz

Mar 30, 2012

If you think “big data” and “cloud” are confusing terms, you are not alone.

A new analytical study from an Austin, Texas research firm puts those terms on its 2012 list of “Most confusing technology buzzwords of the decade.”

The Global Language Monitor compiled the most recent list of popular yet fuzzy tech terms for a period spanning 2010, 2011 and 2012. The company releases an annual update on the list each spring.

Global Language uses a proprietary algorithm to track the frequency of English-language tech words and phrases in global media, on the Internet, and in proprietary databases. The weighted index also factors in long-term trends, short-term changes, momentum and velocity.

To measure confusion, the algorithm looks at how many different ways a word is defined, how many questions are raised about it, and how many different words are used to explain it, Paul Payack, president of Global Language, told Federal Computer Week in an interview on March 30.

“What we do is try to find the words that have the most questions with them,” Payack said. “You can see pretty easily that these are confusing to people.” Another key metric is how many words are needed to define the term: The bigger the data set, the more confusing, he said.

For example, “big data” heads the 2012 list as the most confusing buzzword of the decade so far. Its meaning has become more diffuse as it has grown in popularity, Payack said. “People have said big data will save us by increasing productivity, but what does that mean?” Payack said. “Once it becomes a marketing term, then it is bent in so many ways.”

In second place on the 2012 list is “the cloud,” which also ranked in first place on the “most confusing” list in 2008.

“It has been on the list for a few years, with so many interpretations and so many definitions,” Payack said. “Over time, it has seemed to degenerate.”

“Cloud just keeps morphing and gets new definitions and that is because as it gets bigger and bigger, it gets bent by the different marketing and strategic pulls,” Payack added. “It is a hot button right now. If you are doing business, someone will definitely ask, ‘how does that work in the cloud?'”

Other terms that made the 2012 list, including “Web 2.0/Web 3.0” (in fourth place) and “3G/4G/5G” (in tenth place), along with confusing acronyms such as “SOA,” tend to generate multiple definitions over time, Payack said.

“When we first looked at SOA [service-oriented architecture], there were 200 books on it. Now there are 40,000 books,” Payack said. “If you look at SOA and cloud, you can see that there is no set definition.”

Payack noted that federal executives and vendors, as well as foreign IT executives, are very interested in his research and try to home in on precise meanings for the terms. At the same time, many are relieved to find that others consider the terms to be poorly understood.

“They are happy to see that others are confused as well,” Payack said.

Payack, who has been studying technology terms since 2003, said some terms, such as virtualization, might appear to be new, but actually have been around for decades.

“Virtualization is still not understood,” Payack said. The term was in use with computer innovators in the 1970s, and then became popular with “cyberpunk” writers. “Now, 15 years later, people are still saying ‘this is the first use of virtual,'” Payack said.

Rounding out Global Language Monitor’s 2012 list of most confusing terms are “The Next Big Thing,” “social discovery,” “solid state,” “CERN,” “solar max,” “de-dupe,” and “SoLoMo.”



Schmidt takes charge of cybersecurity priorities

By Michael Hardy

Mar 30, 2012

Howard Schmidt, cybersecurity coordinator and special assistant to the president, is personally taking charge of a Cross-Agency Priority Cybersecurity goal to support implementation of the Obama administration’s three key cybersecurity priorities.

Schmidt announced his involvement in a March 30 entry on the White House blog.

The three priority areas are Trusted Internet Connections, continuous monitoring of federal information systems and strong authentication.

TIC consolidates external telecommunications connections, leaving a smaller number of vulnerable connection points that are thereby easier to monitor. Continuous monitoring of federal systems turns cybersecurity into “a dynamic risk mitigation program that provides essential, near real-time security status and remediation, increasing visibility into system operations and helping security personnel make risk-management decisions based on increased situational awareness,” he wrote.

And regarding authentication, Schmidt wrote, “Passwords alone provide little security. Federal smartcard credentials such as PIV (Personal Identity Verification) and CAC (Common Access Card) cards provide multi-factor authentication and digital signature and encryption capabilities, authorizing users to access federal information systems with a higher level of assurance.”
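Schmidt’s point about passwords versus smartcards comes down to challenge-response: the card proves possession of a key without ever transmitting a reusable secret, so a captured response cannot be replayed. The sketch below illustrates that idea in miniature, using an HMAC shared secret as a stand-in for the card’s key (real PIV and CAC cards use asymmetric certificates, where only the card holds the private key):

```python
import hmac
import hashlib
import os

def card_sign(card_secret: bytes, challenge: bytes) -> bytes:
    """What the 'card' does: sign the server's fresh random challenge."""
    return hmac.new(card_secret, challenge, hashlib.sha256).digest()

def server_verify(card_secret: bytes, challenge: bytes, response: bytes) -> bool:
    """The server recomputes the expected response. A replayed old response
    fails, because every login attempt uses a fresh challenge."""
    expected = hmac.new(card_secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

secret = os.urandom(32)      # provisioned onto the card at issuance
challenge = os.urandom(16)   # generated fresh for each login attempt
response = card_sign(secret, challenge)

assert server_verify(secret, challenge, response)           # genuine card passes
assert not server_verify(secret, os.urandom(16), response)  # replayed response fails
```

Contrast this with a password, which is the same static string on every login: anyone who observes it once can reuse it, which is exactly the weakness Schmidt flags.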

Schmidt wrote that his goal is to greatly improve federal agency adoption of the three priorities and other critical cybersecurity capabilities. He wants to see 95 percent utilization of them by the end of 2014.

“Many departments and agencies have been working on these areas for several years, and there has been much progress,” he wrote. “By focusing on these priorities we plan to push adoption past the tipping point of adoption for all federal systems.”


3.8 Million Records From the 1940 Census Go Live Online

By Rebecca J. Rosen

The National Archives has digitized and published online the 72-year-old records of more than 130 million Americans, but finding your family will require a bit of legwork.

This morning the National Archives unveiled an ambitious three-year project to provide complete online access to the 1940 census — the census edition that details America during the Great Depression. This is the first census to be made fully and freely available over the web.

During World War II, the Bureau of the Census took the 1940 census and put it all on microfilm, where it has remained for the last 72 years — the duration of a legal waiting period that has just officially come to its end. Over the last three years, staffers at the National Archives have taken those microfilm strips, scanned them, and split the long scans into individual images, more than 3.8 million of them, representing 131,669,275 people in the continental U.S. and another 2.5 million in U.S. territories including Alaska and Hawaii.

For those who want to track down their parents, grandparents, and other family members from that era, you’ll need a bit more than their names, unfortunately. For the time being, the census is only searchable by “enumeration district numbers” — small geographical units that structure the organization’s paperwork. If you know an address where your family lived, you’ll be able to find the enumeration district on the National Archives website or on this great index site created by Steve Morse.

For those with only a name, you’ll need to hang tight for now. The National Archives is asking for volunteers to help index the census to be searchable by names and other bits of information, but that is still a ways off.



LightSquared question is in FCC’s hands

The agency could go ahead and kill the carrier's LTE plan or take other action, and it has no deadline

By Stephen Lawson

April 2, 2012 09:29 PM ET


IDG News Service – With the end of the public comment and response periods on LightSquared’s plan to operate a 4G LTE service near GPS frequencies, it’s now the FCC’s turn to decide how to proceed.

LightSquared wants to offer cellular broadband over spectrum that’s next to the band used by GPS (Global Positioning System), but tests have shown that its network would interfere with GPS receivers. On Feb. 15, the U.S. Federal Communications Commission proposed cancelling a conditional waiver it had given to LightSquared and indefinitely suspending its permission to build a land-based network.

The agency then set aside time for public comments on its plans, followed by responses to those comments. Thousands of comments came in, arguing on both sides of the issue, and the FCC extended the opportunity to comment. The response period finally ended last week.

In its formal response, filed on Friday, LightSquared slammed its opponents’ comments as “revisionist history” and said the FCC’s proposal was unsupported by law, science or precedent. LightSquared said the agency could either test more possible solutions to the interference, acknowledge LightSquared’s spectrum rights and make GPS vendors pay for solving the problem, or give LightSquared a different block of spectrum. The carrier’s loudest critic, the industry group Coalition to Save Our GPS, said LightSquared shouldn’t get to use the spectrum because it didn’t fulfill a requirement to resolve interference issues.

After reviewing the comments and responses, the FCC could go ahead with its plans or take some other course of action. It has no deadline to act. If it decides to kill LightSquared’s plan, the carrier will be given time to file a protest.

It appears unlikely the FCC will act soon.

“I don’t think the FCC is going to rush to do anything,” said Maury Mechanick, an attorney at White & Case and a former executive of satellite provider Comsat. The agency is out of the public eye for now on this case, and as soon as it makes a decision, it will be thrust back into the limelight, he said. Mechanick believes it is likely that LightSquared would sue if the agency shot down its network. The carrier’s recent hiring of high-profile Washington lawyers suggests he may be right.

The FCC’s options are limited, according to Scott Pace, director of the Space Policy Institute at George Washington University. The conditional waiver it gave LightSquared, to let it sell LTE apart from satellite service, required the carrier to prove it wouldn’t interfere with GPS.

“That condition has not been met, so I don’t know that the commission really has a choice here,” Pace said. It’s also not clear that the FCC could give LightSquared another spectrum block except as part of a larger rulemaking process, he said. Whatever it may do, the timing of the next step is open-ended, he said. “They are an independent regulatory agency, and they will pick and choose their time.”

Stephen Lawson covers mobile, storage and networking technologies for The IDG News Service. Follow Stephen on Twitter at @sdlawsonmedia.


Air Force to continue legacy of innovation, Donley says

by Jim Garamone

American Forces Press Service


4/5/2012 – WASHINGTON — People have viewed the U.S. Air Force, more than any other military service, as an innovative and “leap ahead” organization, the service’s top civilian leader said here today.

The pieces are in place to continue that legacy, Air Force Secretary Michael B. Donley told the Defense Writers Group.

Even before the Air Force became a separate service, strategists looked on aviation as a game changer, Donley said. Often the reach exceeded the grasp, but in the long run most of the strategies have proven correct, he added.

In World War I, the use of aviation underwent hothouse growth as a way to leap over the morass of the trenches. In World War II, bombing offensives and control of the air were prerequisites for successful military operations. That war also demonstrated the agility that air transport provided to ground and sea forces.

In 1947, the Air Force became a separate service, but took with it the legacy of innovation, Donley said. Intercontinental missiles, globe-spanning bombers, air-to-air refueling, heavy airlift, early warning radars, computers, space operations and drones are just a few of the leap-ahead technologies that airmen adapted and developed.


Now, with fifth-generation fighters, unmanned aerial vehicles, B-2 bombers and more, the service is well-prepared for the future, Donley said. “We’re planning for and building those capabilities,” he said. “Clearly, we will be fielding a modern tanker (and) a fifth-generation joint strike fighter in numbers.”

With the F-22, the United States is the only nation that has fielded a fifth-generation fighter, the Air Force secretary said. “We’ll be delivering the F-35 (Lightning II joint strike fighter) in numbers by the end of the teens,” he said.

Innovation is an important theme for the Air Force as budgets become tighter, Donley told the defense writers. “Modernization is an overhanging requirement,” he said, “and we have to be in a position to fund when those programs mature and they are ready for production.”

All budget decisions keep in mind the F-35, the new tanker aircraft, the next-generation bomber and space-based capabilities to ensure the Air Force can get more capable over time, he said. Also growing are capabilities in intelligence, surveillance and reconnaissance assets, space-based opportunities and cyber capabilities, Donley told the group.

The Air Force is looking at other “leap ahead” technologies, Donley said, but he declined to discuss them, noting that the Air Force developed the U-2 spy plane and F-117 stealth fighter in secret.

Modernization of bombers, fighters and mobility forces is one side of the coin, but the Air Force may make its greatest contribution in information management, Donley said.

“A lot of our capabilities are involved with moving information and maintaining advantages in our ability to collect it, to move it and to exploit it for operational purposes,” he said. “We’ve all lived this communications revolution of the past 25 years, and that’s where those new innovations and capabilities are finding themselves. It’s not just on the platforms of the tanker or bomber or fighter.”

The Air Force is on a good path for mobility, bomber and fighter forces, Donley said. The capabilities are built around the communications revolution, he added, and “those are very important capabilities for the joint force moving forward.”


