Sunday, February 2, 2014

GIS BRICS Industry

Expanding role for geospatial data and technology in the utility sector in BRICS countries

The BRICS countries (Brazil, Russia, India, China, and South Africa) are geographically, culturally and economically diverse, but share one common point on their agenda: the rapid development of the energy industry as a national priority. This is because the primary contribution to the projected increase in world energy consumption (the International Energy Outlook 2013 projects 56% growth between 2010 and 2040) comes from the BRICS. In 2012, the BRICS countries represented 36% of total global renewable power capacity and almost 27% of non-hydro renewable capacity.
BRICS CO2 emissions, 1990 and 2010 (IEA)

The BRICS face a wide range of challenges with respect to energy, the critical ones being universal electrification, especially in rural areas; rapidly increasing demand; the need to decrease energy intensity by deploying more renewable energy sources; reducing the high rate of energy losses, especially non-technical losses; and improving energy efficiency.

BRICS countries have been employing geospatial technology in various capacities in planning, generating, transmitting and distributing electric power. In a just-published article in Geospatial World, we've provided a snapshot of the application of geospatial technology in these emerging countries and how we see it evolving in the future.

Utilities in BRICS countries are uniquely positioned: they have been using GIS as an operational tool for some time and are familiar with its capabilities, yet they are not encumbered to the same extent by the old, siloed legacy IT systems that remain a challenge for utilities in developed economies. Their workforces are younger, more internet savvy, and more willing to adopt new technologies. The dawn of the data-driven, geospatially aware era promises new opportunities to deliver improved availability, efficiency and affordability. If utility leaders in the BRICS understand the vision and seize the opportunity, they could propel these countries into a leadership position in the electric power utility sector.

NanoSAR, Tricorders, and EO and IR Fused Cameras


A SiN-VAPOR sensor is mounted in a pin grid array package. The sensor is made of silicon nanowires and is roughly the size of a quarter. (Jamie J. Hartman / U.S. Navy)
http://www.c4isrnet.com/article/M5/20131115/C4ISRNET08/311150023/There-s-relentless-push-mini-sensors-give-soldiers-new-tools?odyssey=nav|head

Written by
ERIK SCHECHTER

Bigger is most definitely not better when it comes to military sensors.

Indeed, in the ceaseless scramble to push every aspect of situational awareness down to the individual war fighter, government and industry have been working to get radar, cameras and other equipment onto smaller and smaller platforms — from hand-thrown UAVs to cellphones. Hence, miniaturization.
Synthetic aperture radar: mini and nano

Synthetic aperture radar (SAR) emerged as a battlefield application in the 1980s with improvements in data processing speeds, notes Bryan Burns, a senior scientist at Sandia National Laboratories. Since then, the ability to see at long ranges, through cloud, rain and other obscurants, and to deliver that radar imagery in near real time has driven the desire to get SAR onto smaller airborne platforms.

To accommodate this demand, SAR has been shedding weight. For example, in 1990, a typical unit weighed 500 pounds and offered imagery with a 6-inch resolution. Eight years later, Sandia and General Atomics Aeronautical Systems pushed that weight down to 120 pounds, (and improved resolution to 4 inches) with the Lynx SAR.

Today, the multimode Lynx Block 30 comes in at under 85 pounds, light enough to be employed with other payloads on an MQ-9 Reaper UAV.

Sandia also developed MiniSAR in the early 2000s. Weighing some 25 pounds, the radar can fly on tactical UAVs like the RQ-7 Shadow. Alternately, MiniSAR could share space with other sensors on a larger platform for multi-intelligence collection.

“It’s a very desirable position to be in,” Burns says.

Burns sees SAR weight coming down even further. For example, solid-state transmitters — now used for communications — could take the place of heavier traveling-wave tube amplifiers and microwave power modules, while electronically steerable arrays would eliminate the “gimbals often used to point antennas for SAR systems in various directions.”

In the meantime, some radar companies have been making dramatic advances with weight and form factor. Utah-based ImSAR, for example, has been developing a series of NanoSAR systems for small UAVs like the ScanEagle and RQ-20A Puma AE. The resolution of these NanoSARs is “very fine,” says ImSAR LLC CEO Ryan Smith, but he won’t get into details, citing International Traffic in Arms Regulations.

The original, now-discontinued NanoSAR A debuted in 2009. This lightweight system had a 12-inch resolution and could image at 1,000-1,500 feet. But it took a lot of fixes to get it there.

The first retooling took place in 2006-’07, when ImSAR discovered problems with the system’s processors. “We actually did a full reset and went back and redesigned every component,” Smith says. “It was kind of a gutsy move.”

Then the system had to be reworked again in 2008-’09. “U.S. government customers” — Smith won’t specify agencies or offices — wanted a NanoSAR with higher resolution radar imagery and a precision gimbal.

The NanoSAR A has since been replaced by the multimode NanoSAR B, which weighs around 6 pounds, can operate at higher than 6,000 feet and has a better than 12-inch resolution. And at the 2013 Association for Unmanned Vehicle Systems International show in Washington, D.C., ImSAR unveiled its NanoSAR C, which is half the size of the B, but with all the same capabilities.

Commenting on the weight difference between ImSAR’s nano systems and Sandia’s much heavier MiniSAR, Burns insists, “It’s kind of an eggs and apples comparison because the data they send is the raw radar data, or the phase history data,” adding that while such an approach reduces airborne processor weight, it also creates issues of data bandwidth usage.

Smith brushes aside this characterization of his systems. No raw data is ever sent down to the ground, he says; it is either fully or partially processed in the air.

“We don’t take up any more bandwidth than what a processed image takes,” he emphasizes.
A thermal sensor in every eyepiece

As SAR migrates to tactical and mini-UAVs, DARPA’s Low Cost Thermal Imager-Manufacturing (LCTI-M) program has been trying to get cheap thermal sensors into eyepieces, weapon sights and smartphones.

“The vision is to drive the cost and form factor down to a point where every soldier can own one,” says Nibir Dhar, the program manager in DARPA’s Microsystems Technology Office leading LCTI-M.

A thermal weapon sight can cost in the $9,000-$20,000 price range because of the tedious manufacturing process involved in making it.

“The optics, the lens, is made separately. Then the focal plane array, the actual sensor part is made separately. They are assembled manually,” Dhar explains.

The vision is to miniaturize thermal cameras — even their pixel sizes must shrink — and manufacture them for less than $500, which necessitates new ways of producing IR sensors. He says the wafer-scale manufacturing process used with cheap computer microprocessors is the way forward, but there are challenges in following that model.

For example, the thermal sensors “have to be inside a vacuum. A Pentium chip doesn’t need a vacuum,” Dhar notes, adding that making the germanium or other specialized material for thermal lenses also is tricky at the wafer scale.

In September 2011, the LCTI-M program awarded contracts to BAE Systems ($12.8 million), DRS ($11.1 million) and Raytheon Vision Systems ($13.4 million) to develop inexpensive miniature thermal cameras. In November, the companies are scheduled to demonstrate their prototypes, show price models for getting below $500, and prove that they’ve shrunk their pixels from 17 microns down to 12 or 10 microns.

Once past that hurdle, the final field test for all three companies will be the following September. There will be no downselect for LCTI-M, Dhar says, “because we want to have more manufacturers; that will drive the cost down even farther.”
Drones with all-purpose goggles

While BAE Systems is working with DARPA to develop tiny infrared sensors, the company also plans to begin production later this year on its Digitally Fused Sensor System (DFSS) for unmanned systems and unattended ground sensors. Weighing 6 ounces and measuring 2.5 inches wide, DFSS is a miniaturized, wide-field spinoff of BAE’s offering for the U.S. Army’s Enhanced Night Vision Goggle Program. (Army Night Vision Labs chose the ITT Exelis solution for that program.)

Like its Enhanced Night Vision Goggle predecessor, DFSS layers low-light and thermal camera views in one scene for increased situational awareness, but it does so in a form factor so small that it can fit on a Puma AE.

“DFSS is the smallest EO and IR fused camera that is available today for unmanned airborne, ground, maritime and unattended systems,” says Roman Hachkowski, technical director of Intelligence, Surveillance & Reconnaissance Solutions at BAE Systems.

DFSS can operate in a wide range of tricky conditions, from a blindingly bright day in the high desert to an illuminated parking lot to a moonless night. In the illuminated parking lot scenario, a law enforcement officer using DFSS could rely on the low-light EO camera to see through the windshield of a car, while the IR sensor could see past the overhead parking lot lights and identify suspects and what they are carrying once the vehicle door is opened.

In addition to demonstrating these capabilities, BAE Systems has been working with Army Night Vision Labs to see how well the DFSS can identify the beams of laser designators and peer into a “brown out” generated by helicopters.

“We were able to identify into the cloud of sand much farther than our customers expected, and with the fusion, we were able to see elements of the scene within the cloud,” Hachkowski says.

In January, DFSS participated in the Army Expeditionary Warrior Experiments at Fort Benning, Ga., and BAE Systems has been talking with companies to get DFSS on new unmanned airborne, ground and maritime platforms as well as unattended sensors. (It’s currently not on any program of record.)

Production of the fused sensor is set for later this year, with a next-generation DFSS planned for release in 18 months.

Building a Star Trek tricorder

In addition to shrinking radars and optics, the Naval Research Laboratory is developing a sensor called the Silicon Nanowires in a Vertical Array with a Porous Electrode (SiN-VAPOR), for mobile, real-time distributed chemical sensing.

“It’s a big, long title, but basically it means putting a sensor on a cellphone so that everyone can do chemical detection, so that everyone becomes a point detector on the battlefield, at the airport, at a football stadium, wherever,” explains Christopher Fields, the Naval Research Laboratory lead on the project.

Mounted on a cellphone, the platform offers high processing power, an easy interface and wireless communications in one small, low-power package. The sensor, working with others, could identify and map a chemical plume in any area.

“Really, what we’re trying to build is the Star Trek tricorder,” Fields notes, adding that for standoff detection, the device could be mounted on a robot used to approach a suspected IED or parachuted by the dozens into an area that might be suffused with harmful chemical gases.

Leveraging the work that others have done with silicon and microprocessors, the Naval Research Laboratory has shrunk SiN-VAPOR down to a 1-centimeter-by-1-centimeter pin grid array — and as tiny as that sounds, that’s just the chip carrier, which was designed for easy grasping with tweezers. The actual sensing part is only 5 millimeters by 5 millimeters.

Within that area are 100 million nanowires lined in a vertical configuration to create maximum surface area for maximum sensitivity.
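
Those figures imply a remarkable packing density. A quick back-of-the-envelope check, using only the numbers quoted above:

```python
# Back-of-the-envelope check of the quoted SiN-VAPOR figures:
# 100 million nanowires packed into a 5 mm x 5 mm sensing area.
nanowires = 100_000_000
area_mm2 = 5 * 5  # sensing area in square millimeters

density_per_mm2 = nanowires / area_mm2
print(f"{density_per_mm2:,.0f} nanowires per mm^2")  # 4,000,000 per mm^2

# Average footprint available to each wire, in square micrometers
um2_per_wire = (area_mm2 * 1_000_000) / nanowires
print(f"{um2_per_wire:.2f} um^2 per wire")  # 0.25 um^2, i.e. roughly a 0.5 um pitch
```

That sub-micron spacing is what yields the large total surface area, and hence the sensitivity, of the array.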

According to Fields, SiN-VAPOR has already demonstrated TNT trace detection at the parts per billion and parts per trillion levels, and has performed under conditions that have thwarted other lab prototypes.

“We have data that suggests that our sensor actually performs better in a humidified environment,” he says.

The next step is to not just detect TNT but to detect, identify and quantify various types of chemicals and explosives in a target area. This requires coating groups of nanowires with different chemicals and employing pattern recognition algorithms, like those used in facial recognition software, to identify compounds. That will be challenging, but Fields nevertheless expects to see a SiN-VAPOR prototype tested as early as January.
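
The article doesn’t specify which algorithm the lab will use; a minimal nearest-neighbor sketch over hypothetical sensor-response “fingerprints” illustrates the general idea of matching a reading from differently coated nanowire groups against a reference library:

```python
import math

# Hypothetical response fingerprints: each coated nanowire group reports a
# normalized signal, so a compound becomes a vector of group responses.
# These reference vectors are illustrative, not measured data.
library = {
    "TNT":   [0.9, 0.2, 0.1, 0.7],
    "RDX":   [0.3, 0.8, 0.6, 0.2],
    "clean": [0.0, 0.1, 0.0, 0.1],
}

def identify(reading):
    """Return the library compound whose fingerprint is nearest (Euclidean distance)."""
    def dist(ref):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(reading, ref)))
    return min(library, key=lambda name: dist(library[name]))

print(identify([0.85, 0.25, 0.15, 0.65]))  # prints "TNT" (closest fingerprint)
```

Real pattern-recognition pipelines, like those in facial recognition, add feature extraction and trained classifiers on top of this basic matching step.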

With their tricorders, thermal eyepieces and mini-drones bristling with SAR and fused sensors, tomorrow’s war fighters will be bombarded with data about friend, foe and battlefield. One wonders whether it will ultimately create super-soldiers or just information overload.

Spatialized 4-channel 1080p-60 *Made in Colorado*

ODYSSEY 7Q REVIEW from Matthew Allard on Vimeo.

Monday, January 27, 2014

Methane takes a runner on society - getting it back in the can

Spatial analysis finds that methane emissions from fossil fuel extraction may be 5X larger than current estimates

Methane emissions by source (EPA)

Methane is one of the more potent greenhouse gases driving global warming, but just how much more potent it is than CO2 remains unsettled. The EPA has estimated a factor of 21 times that of carbon dioxide, but Robert Howarth, an environmental biology professor at Cornell University, has argued that methane is actually 72 times as powerful as carbon dioxide in terms of its warming potential. Furthermore, Howarth has argued that the type of shale gas drilling taking place in Texas, New York and Pennsylvania generates particularly high emissions of methane. A study has estimated that between 3.6% and 7.9% of the methane from shale-gas production escapes to the atmosphere in venting and leaks over the lifetime of a well.
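
The choice of warming factor matters enormously for the accounting. A quick illustration, using only the figures cited above:

```python
# CO2-equivalent of 1 tonne of leaked methane under the two warming
# potentials cited above (EPA's factor of 21 vs. Howarth's factor of 72).
methane_tonnes = 1.0

co2e_epa = methane_tonnes * 21      # 21 tonnes CO2-equivalent
co2e_howarth = methane_tonnes * 72  # 72 tonnes CO2-equivalent

print(f"{co2e_howarth / co2e_epa:.1f}x difference in accounted warming impact")

# Leakage range cited for shale-gas wells: 3.6%-7.9% of lifetime production.
for leak_fraction in (0.036, 0.079):
    co2e = leak_fraction * 72  # t CO2e leaked per tonne of methane produced
    print(f"{leak_fraction:.1%} leakage -> {co2e:.1f} t CO2e per t CH4 produced")
```

Under the higher factor, even the low end of the leakage range carries a substantial climate penalty per tonne of gas produced.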

A study just published has assessed the spatial distribution of anthropogenic methane sources in the United States by combining comprehensive atmospheric methane observations, extensive spatial datasets, and a high-resolution atmospheric transport model. Based on the results of this analysis, the authors conclude that the US Environmental Protection Agency (EPA) underestimates methane emissions nationally by a factor of ∼1.5.

Generally the study finds that methane emissions due to the animal husbandry and fossil fuel industries have larger greenhouse gas impacts than indicated by existing inventories.
The study concludes that there is wide regional variation in these discrepancies, and that the discrepancy is particularly large in the south-central United States, where the study’s estimates are ∼2.7 times greater than the EPA’s. South-central emissions account for 24 ± 3% of national emissions. Based on their analysis of the spatial patterns of methane emissions and correlations between methane and propane, the authors conclude that fossil fuel extraction and refining are major contributors (45 ± 13%) to methane emissions in the south-central United States.

Furthermore, based on their analysis the authors suggest that regional methane emissions due to fossil fuel extraction and processing could be 4.9 ± 2.6 times larger than in the Emissions Database for Global Atmospheric Research (EDGAR), an international inventory of past and present-day anthropogenic greenhouse gas emissions maintained by the European Commission Joint Research Centre (JRC) and the Netherlands Environmental Assessment Agency (PBL).

Sunday, January 19, 2014

Google I/O May 2014

Google I/O is an annual two-day developer-focused conference, held by Google at Moscone Center in San Francisco, California.

Initiated in 2008, the event features highly technical, in-depth sessions focused on building web, mobile, and enterprise applications with Google and open web technologies such as Android, Chrome, Google APIs, Google Web Toolkit, App Engine, and more.

The "I" and "O" stand for "Innovation in the Open" and input/output. The letters also resemble the "1" and "0" of binary code, though that is not the intended meaning. The format of the event is similar to that of the Google Developer Day.

Thursday, January 2, 2014

VP9

Google’s VP9 Video Codec Gets Backing from ARM, Nvidia, Sony And Others, Gives 4K Video Streaming A Fighting Chance

Google’s VP9 video codec is getting a major boost today. While Mozilla, Google’s own Chrome browser and a few video players like FFmpeg started supporting VP9 over the course of the last year, what was mostly missing from Google’s ecosystem for this highly efficient video codec was hardware support. As Google announced today, however, virtually all major hardware vendors will soon support VP9 natively in their products and allow Google’s YouTube to stream HD content up to 4K directly to computers, TVs and mobile devices.
These new hardware partners include ARM, Broadcom, Intel, LG, Marvell, MediaTek, Nvidia, Panasonic, Philips, Qualcomm, RealTek, Samsung, Sigma, Sharp, Sony and Toshiba.
As Francisco Varela, Google’s global director for platform partnerships told me, we should see native support for VP9 in TVs and Blu-ray players in many of the 2015 models of these manufacturers, and computers and mobile devices will also start supporting it over the course of 2014. For most laptops and high-end mobile devices, hardware support is optional, as they can use a software decoder. For the best results, though – and the longest battery life – hardware support is necessary. Virtually all of these manufacturers already offer this support for H.264.
Google argues that encoding videos with VP9 results in about 50% bandwidth savings compared to its older VP8 codec or the H.264 standard.
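
At a claimed ~50% savings, the numbers add up quickly for high resolutions. A rough illustration, assuming a hypothetical 20 Mbps H.264 bitrate for a 4K stream (the exact bitrates YouTube uses are not stated here):

```python
# Rough data-volume comparison for an hour of 4K streaming, assuming a
# hypothetical 20 Mbps H.264 bitrate and Google's claimed ~50% VP9 savings.
h264_mbps = 20.0
vp9_mbps = h264_mbps * 0.5  # ~50% bandwidth savings claimed for VP9

def gigabytes_per_hour(mbps):
    """Convert a megabits-per-second bitrate to gigabytes transferred per hour."""
    return mbps * 3600 / 8 / 1000

print(f"H.264: {gigabytes_per_hour(h264_mbps):.1f} GB/hour")  # 9.0 GB/hour
print(f"VP9:   {gigabytes_per_hour(vp9_mbps):.1f} GB/hour")   # 4.5 GB/hour
```

Halving the per-hour data volume is what makes streaming 4K plausible on ordinary broadband connections, which is why Varela calls more efficient codecs “absolutely necessary” for 4K.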
As Varela told me, support for VP9 on YouTube means that videos will start faster (there’s less data to move, after all), though it will take a while before the site has converted all videos to VP9. While the new codec will make streaming at any resolution faster, HD – and especially 4K video – will see the biggest benefits. For 4K, Varela argues, more efficient codecs are “absolutely necessary.” While 3D obviously didn’t go the way the industry wanted, he believes 4K will be adopted very quickly, especially as prices for both 4K screens and cameras drop to more reasonable levels over the next few years.
Signing up industry partners, he argued, was pretty easy, given that VP9 is unencumbered by complicated licensing issues. Google is also making the codec available for free, while hardware and software vendors who want to use the H.264 standard have to pay a licensing fee to MPEG LA (which then distributes it to the various patent holders).
LG, Panasonic and Sony will demonstrate YouTube in 4K at CES this year and YouTube says that it has been working with a number of video creators to get them to record in 4K as well.
Google has been struggling to get others to adopt its WebP image format, which is based on the same technology as the VP8 and VP9 codecs. VP9, however, seems to have struck a chord with hardware manufacturers. Google is mostly interested in having them support it in order to deliver a better YouTube experience, but in the long run, other video sites will also profit from the company’s work in getting the format adopted by OEMs.
If you’re interested in the technical details surrounding VP9, here is Google’s I/O session from 2013 that covers the topic in more detail: