Tuesday, January 27, 2015

Mexico Volcano of Fire eruption caught on camera - Video



Dramatic webcam video showed eruptions sending clouds of smoke above the crater of the Volcan del Fuego (Volcano of Fire), which sits on the border between the states of Colima and Jalisco.

Three separate bursts were seen: one on Wednesday (January 21), one on Sunday (January 25) and a nocturnal one on Monday (January 26). Webcams de Mexico.com captured the dramatic images.

The Volcan del Fuego, which rises 3,860 meters (about 12,660 feet) above sea level and is one of Mexico's most active volcanoes, produces frequent moderate explosions.

Activity at the volcano was also reported in January.

Thursday, May 22, 2014

CSA DEXTRE: Space Station Robotic Handyman To Replace Canadarm2 Camera - Video



Dextre, the Canadian robotic handyman on board the International Space Station, has done several repair and maintenance jobs to date, as well as the Robotic Refueling Mission technology demonstration, when he became the first robot to refuel a mock satellite in space.

The space bot is now poised to claim a first for robotkind: self-repair. This animation shows how Dextre will swap two cameras on Canadarm2 and the mobile base, which together form the three main components of Canada's Mobile Servicing System.

Dextre will start by retrieving a faulty camera located near Canadarm2's elbow joint. Since the camera is still functional but produces hazy images, Dextre will move it to a less critical location on the mobile base.

Dextre will then head over to Japan's Kibo module to fetch a camera from the module's transfer airlock, a type of sliding drawer that can be depressurized, where the station's crew will place it for Dextre to retrieve.

Dextre will install the new camera on Canadarm2's elbow joint, where it will provide critical views of the robotic arm's movements.

In addition to repairing and replacing two valuable cameras used for robotic operations, Dextre's task has far-reaching implications for what robots could do in the future.

Technologies for on-orbit robotic servicing, that is, repairing and refueling satellites in space, hold great potential for addressing the issue of space debris, a growing concern for the world's space agencies.

The work done by Dextre today is laying the foundation for a future in which robots will be sent to repair, refuel and reposition orbiting satellites.

On-orbit robotic servicing could therefore save satellite operators from the significant costs of building and launching new replacement satellites, and help reduce space debris.

Wednesday, April 9, 2014

Cherenkov Telescope Array (CTA): CHEC, a cutting-edge camera

This colourful piece of electronics is a photomultiplier module for the CHEC camera undergoing testing. 

Credit: Fabricio Sousa/SLAC

Key components for a new type of camera that will collect only the faintest, fastest flashes of light in the night sky are being assembled and tested now at SLAC.

Their eventual destination: the first Compact High-energy Camera (CHEC), which will be installed in a prototype telescope for the Cherenkov Telescope Array (CTA).

The CTA is a ground-based gamma-ray observatory currently under development by an international consortium with more than 1000 members from 27 countries.

The CTA will detect ultra-high-energy gamma rays, which are beyond even the reach of the Fermi Gamma-ray Space Telescope.

Current plans call for the observatory to comprise two separate arrays, one in the Northern Hemisphere and one in the Southern Hemisphere, totaling more than 100 telescopes of three different sizes.

The telescopes are now under development. Researchers at SLAC are testing modules of electronic components for the first CHEC camera, which will be installed on a prototype telescope later this year.

But most gamma rays from cosmic sources are blocked by the Earth's atmosphere. What will the camera be looking at?

'Seeing' Gamma Rays
Gamma rays are the most energetic form of electromagnetic radiation – energetic enough that they cause showers of secondary particles when they hit the atmosphere.

The particles race toward the ground so fast that they exceed the speed of light in our atmosphere.

That speed is considerably slower than the speed of light in a vacuum, but exceeding it is enough to make the particles emit a form of radiation called Cherenkov radiation.
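For a rough sense of the threshold involved, here is a minimal Python sketch of the Cherenkov condition, v > c/n; the refractive index of air used below is an illustrative sea-level value, not a figure from the article.

```python
# Minimal sketch of the Cherenkov condition: a charged particle radiates
# when its speed v exceeds the local phase velocity of light, c / n.
# The refractive index below is an illustrative value for air at sea level.

C = 299_792_458.0          # speed of light in vacuum, m/s
n_air = 1.0003             # approximate refractive index of air (assumption)

threshold_speed = C / n_air            # phase velocity of light in air
threshold_beta = 1.0 / n_air           # same threshold as a fraction of c

print(f"Cherenkov threshold in air: v > {threshold_speed:,.0f} m/s "
      f"(beta > {threshold_beta:.6f})")
```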

All 32 of the photomultiplier modules for the CHEC camera. 

Credit: Tobias Jogler/SLAC

"It's really just light – bluish light," said Stefan Funk, an astrophysicist at the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC), a joint SLAC-Stanford institute.

It's a faint blue light that flashes on and off in about five nanoseconds: "If we had nanosecond eyes, we could see it," he said.

Unfortunately we don't have nanosecond eyes, and neither do CCD cameras, which are the type of camera generally used at observatories to collect light. That's where the CHEC camera comes in.

Nanosecond Eyes
Each CHEC camera contains modules of customized electronic components, beginning with photomultipliers.

A photomultiplier can capture a single photon, or particle of light, and amplify its signal for a detector to read.

But the real heart of each module is a special integrated circuit chip called a TARGET chip, developed at the University of Hawaii in collaboration with SLAC researchers.

Each TARGET chip can read the signals from 16 individual pixels on a photomultiplier one billion times a second, fast enough to capture the flashes of Cherenkov light.

"The prototype camera we're building uses 32 modules, each with a 64-pixel photomultiplier and four TARGET chips," said KIPAC postdoctoral researcher Luigi Tibaldo.

"It will be installed on a prototype of the smallest telescope," which represents the majority of instruments needed for the arrays; the TARGET chips are also under consideration for some of the mid-sized telescopes under development.

This adds up to around 60 telescopes to equip, making cost an important factor in the design.
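As a rough illustration of the scale implied by those numbers, the sketch below multiplies out the figures quoted above (32 modules, 64 pixels each, one billion samples per second per pixel); it is only arithmetic, not a description of the actual CHEC data path.

```python
# Back-of-the-envelope numbers for the prototype CHEC camera,
# using only the figures quoted in the article above.

modules = 32               # modules in the prototype camera
pixels_per_module = 64     # 64-pixel photomultiplier per module
chips_per_module = 4       # TARGET chips per module
pixels_per_chip = 16       # each chip reads 16 pixels
samples_per_second = 1e9   # one billion samples per second per pixel

assert chips_per_module * pixels_per_chip == pixels_per_module

total_pixels = modules * pixels_per_module
raw_samples_per_second = total_pixels * samples_per_second

print(f"Total pixels in the camera: {total_pixels}")          # 2048
print(f"Raw samples per second: {raw_samples_per_second:.2e}")  # ~2e12
```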

Luigi Tibaldo of SLAC (center) and collaborators Shigeki Hirose of the University of Nagoya (left) and Mark Bryan of the University of Amsterdam (right) in Building 84, where they're testing the photomultiplier modules for the CHEC camera. 

Credit: Fabricio Sousa/SLAC

To address this issue, the modules are designed to be easy to mass-produce, Funk said.

Their colleague Gary Varner of the University of Hawaii works with industry partners to produce the TARGET chips, and the photomultipliers are supplied by Hamamatsu Photonics, a Japanese firm.

Monday, March 24, 2014

Hasselblad 500EL: Unique camera from NASA's moon missions sold at auction

A Hasselblad 500EL "Data Camera HEDC NASA", dating from 1968 and used on the moon by Apollo lunar module pilot Jim Irwin, was sold at auction for 550,000 euros ($760,000) at the Westlicht Gallery in Vienna on March 21, 2014.

The Hasselblad 500EL is the only camera to return from NASA's moon missions in 1969-1972.

It was sold at an auction in Vienna on Saturday for 550,000 euros ($760,000), far exceeding its estimated price.

The boxy silver-coloured camera, which was sold to a telephone bidder, was initially valued at 150,000-200,000 euros.

The Hasselblad model was one of 14 cameras sent to the moon as part of NASA's Apollo 11-17 missions but was the only one to be brought back.

As a rule, the cameras, which weighed several kilos and could be attached to the front of a space suit, were abandoned to allow the astronauts to bring back moon rock, weight being a prime concern on the missions.

"It has moon dust on it... I don't think any other camera has that," Peter Coeln, owner of the Westlicht gallery which organised the auction, said of the rare piece.

The camera, which was being sold by a private collector, was used by astronaut Jim Irwin to take 299 pictures during the Apollo 15 mission in July-August 1971.

A small plate inside is engraved with the number 38, the same number that appears on Irwin's NASA snapshots.

Close to 600 objects were on sale on Saturday at the Westlicht gallery, which is the world's largest auction house for cameras.

It has overseen the sale of some of the most expensive photographic equipment in history, including a 1923 Leica camera prototype that sold for 2.16 million euros, a world record.

Saturday, February 1, 2014

NASA Moon Camera: Hasselblad Used by Apollo Astronaut Up for Auction

A 70-millimeter Hasselblad EDC camera claimed to be the camera used on the moon by Apollo 15 astronaut James Irwin in 1971 will be auctioned in March by Austria's WestLicht Gallery.

Credit: WestLicht Gallery

A camera purported to have been used by the eighth man to walk on the moon while he was exploring the lunar surface in 1971 is now heading for the auction block in Austria, the sale's organizers announced on Thursday (Jan. 30).

A 70-millimeter Hasselblad Electric Data Camera (EDC), described by the WestLicht Gallery in Vienna as having flown to the moon and back on NASA's Apollo 15 mission, is included in the gallery's March 22 auction of vintage and collectible cameras.

The lunar-flown Hasselblad is said to have been used by astronaut James Irwin, as identified by the registration number "38" on a small plate found inside the camera.

Peter Coeln
"[The plate number] is 100-percent proof that this camera is the real thing and really was on the moon," Peter Coeln, owner of the Westlicht Gallery, told the AFP wire service.

Coeln said the camera is estimated to sell for $200,000 to $270,000 (150,000 to 200,000 euros).

The number 38 appears on the camera's Reseau plate, a transparent piece of glass used to superimpose calibration crosshairs on the photographs taken with the camera.

The "38" also appeared on each of the 299 photos captured by Irwin.

Tuesday, January 7, 2014

Gemini Planet Imager: Powerful exoplanet camera turns skyward

Gemini Planet Imager's first light image of Beta Pictoris b, a planet orbiting the star Beta Pictoris. 

The star, Beta Pictoris, is blocked in this image by a mask so its light doesn't interfere with the light of the planet. 

In addition to the image, GPI obtains a spectrum from every pixel element in the field of view to allow scientists to study the planet in great detail. 

Beta Pictoris b is a giant planet, several times larger than Jupiter, and is approximately ten million years old. 

These near-infrared images (1.5-1.8 microns) show the planet glowing in infrared light from the heat released in its formation. 

The bright star Beta Pictoris is hidden behind a mask in the center of the image. 

Credit: Processing by Christian Marois, NRC Canada.

After nearly a decade of development, construction, and testing, the world's most advanced instrument for directly imaging and analyzing planets around other stars is pointing skyward and collecting light from distant worlds.

The instrument, called the Gemini Planet Imager (GPI), was designed, built, and optimized for imaging faint planets next to bright stars and probing their atmospheres.

It will also be a powerful tool for studying dusty, planet-forming disks around young stars. It is the most advanced such instrument to be deployed on one of the world's biggest telescopes – the 8-meter Gemini South telescope in Chile.

Bruce Macintosh
"Even these early first-light images are almost a factor of 10 better than the previous generation of instruments. In one minute, we are seeing planets that used to take us an hour to detect," says Bruce Macintosh of the Lawrence Livermore National Laboratory who led the team that built the instrument.

GPI detects infrared (heat) radiation from young Jupiter-like planets in wide orbits around other stars, those equivalent to the giant planets in our own Solar System not long after their formation. Every planet GPI sees can be studied in detail.

"Most planets that we know about to date are only known because of indirect methods that tell us a planet is there, a bit about its orbit and mass, but not much else," says Macintosh.

"With GPI we directly image planets around stars – it's a bit like being able to dissect the system and really dive into the planet's atmospheric makeup and characteristics."

Stephen Goodsell
GPI carried out its first observations last November – during an extremely trouble-free debut for an extraordinarily complex astronomical instrument the size of a small car.

"This was one of the smoothest first-light runs Gemini has ever seen" says Stephen Goodsell, who manages the project for the observatory.

This is Gemini Planet Imager's first light image of the light scattered by a disk of dust orbiting the young star HR4796A. 

This narrow ring is thought to be dust from asteroids or comets left behind by planet formation; some scientists have theorized that the sharp edge of the ring is defined by an unseen planet.

The left image (1.9-2.1 microns) shows normal light, including both the dust ring and the residual light from the central star scattered by turbulence in the Earth's atmosphere. 

The right image shows only polarized light. Leftover starlight is unpolarized and hence removed from this image. 

The light from the back edge of the disk is strongly polarized as it scatters towards us.

Credit: Processing by Marshall Perrin, Space Telescope Science Institute.

For GPI's first observations, the team targeted previously known planetary systems, including the well-known Beta Pictoris system; in it GPI obtained the first-ever spectrum of the very young planet Beta Pictoris b.

The first-light team also used the instrument's polarization mode – which can detect starlight scattered by tiny particles – to study a faint ring of dust orbiting the very young star HR4796A.

With previous instruments, only the edges of this dust ring (which may be the debris remaining from planet formation) could be seen, but with GPI astronomers can follow the entire circumference of the ring.

Thursday, July 18, 2013

NASA Astronaut Chris Cassidy Takes a Photo during EVA

Astronaut Chris Cassidy Takes a Photo

NASA astronaut Chris Cassidy, Expedition 36 flight engineer, uses a digital still camera during a session of extravehicular activity (EVA) as work continues on the International Space Station. 

A little more than one hour into the spacewalk on July 16, 2013, European Space Agency astronaut Luca Parmitano (out of frame) reported water floating behind his head inside his helmet. 

The water was not an immediate health hazard for Parmitano, but Mission Control decided to end the spacewalk early.

Image Credit: NASA

Tuesday, June 25, 2013

Raytheon rocket onboard camera - Video



Onboard video footage shows a rocket’s flight during the International Rocketry Challenge, held during the Paris Air Show on June 21, 2013, at Le Bourget Airport.

The goal of the challenge was to launch a rocket 750 feet in the air within a 48- to

Credit: Raytheon Company

Thursday, May 16, 2013

NASA Mars HiRISE: Camera reveals two hundred impacts each year

This image shows one of many fresh impact craters spotted by the UA-led HiRISE camera, orbiting the Red Planet on board NASA's Mars Reconnaissance Orbiter since 2006. 

Credit: NASA /JPL-Caltech /MSSS /UA

Scientists using images from NASA's Mars Reconnaissance Orbiter, or MRO, have estimated that the planet is bombarded by more than 200 small asteroids or bits of comets per year, forming craters at least 12.8 feet (3.9 meters) across.

Researchers have identified 248 new impact sites on parts of the Martian surface in the past decade, using images from the spacecraft to determine when the craters appeared.

The 200-per-year planet-wide estimate is a calculation based on the number found in a systematic survey of a portion of the planet.
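The logic of that extrapolation is simple area-and-time scaling; the sketch below illustrates it with placeholder numbers (the crater count, surveyed fraction and time baseline are made up for illustration, not taken from the study).

```python
# Illustrative scaling from a partial survey to a planet-wide annual rate.
# All three input values below are hypothetical placeholders; only the
# scaling logic mirrors the estimate described in the article.

craters_found = 44          # hypothetical count from a systematic survey
surveyed_fraction = 0.01    # hypothetical fraction of Mars covered
survey_years = 22.0         # hypothetical cumulative time baseline, years

planet_wide_per_year = craters_found / surveyed_fraction / survey_years
print(f"Estimated new craters per year, planet-wide: {planet_wide_per_year:.0f}")
```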

The University of Arizona's High Resolution Imaging Science Experiment, or HiRISE, camera took pictures of the fresh craters at sites where before-and-after images had been taken.

This combination provided a new way to make direct measurements of the impact rate on Mars and will lead to better age estimates of recent features on Mars, some of which may have been the result of climate change.

"It's exciting to find these new craters right after they form," said Ingrid Daubar of the UA, lead author of the paper published online this month by the journal Icarus.

"It reminds you Mars is an active planet, and we can study processes that are happening today."

These asteroids or comet fragments typically are no more than 3 to 6 feet (1 to 2 meters) in diameter.

Space rocks too small to reach the ground on Earth cause craters on Mars because the Red Planet has a much thinner atmosphere.

MRO has been examining Mars with six instruments since 2006. Daubar is an imaging targeting specialist who has been on the HiRISE uplink operations team from the very beginning.

She is also a graduate student in the UA's department of planetary science and plans on graduating with her doctorate in spring 2014.

"There are five of us who help plan the images that HiRISE will take over a two-week cycle," she explained.

"We work with science team members across the world to understand their science goals, help select the image targets and compile the commands for the spacecraft and the camera."

"The longevity of this mission is providing wonderful opportunities for investigating changes on Mars," said MRO Deputy Project Scientist Leslie Tamppari of NASA's Jet Propulsion Laboratory in Pasadena, Calif.

Thursday, February 28, 2013

NASA Mars Rover Curiosity: Rock Dust Sample

Two compact laboratories inside NASA's Mars rover Curiosity have ingested portions of the first sample of rock powder ever collected from the interior of a rock on Mars. 

The powder comes from Curiosity drilling into rock target "John Klein" on Feb. 8. 

One or more additional portions from the same initial sample may be delivered to the instruments as analysis proceeds.


This image from NASA's Curiosity rover shows the first sample of powdered rock extracted by the rover's drill. 

Image credit: NASA/JPL-Caltech/MSSS

This image from the Mars Hand Lens Imager (MAHLI) on NASA's Mars rover Curiosity shows details of rock texture and colour in an area where the rover's Dust Removal Tool (DRT) brushed away dust that was on the rock. 

This rock target, "Wernecke," was brushed on the 169th Martian day, or sol, of Curiosity's mission on Mars (Jan. 26, 2013). 

This image was recorded on Sol 173 (Jan. 30, 2013).

The image shows nine small pits created by the rover's Chemistry and Camera (ChemCam) laser during its analysis of the target, one of four potential drill targets considered. Ultimately, this site was not chosen for the rover's first drilling. 

The rest of the features are natural to the rock, and include fractures, white veins, gray and white nodules, pits and tiny dark grains. Remaining clumps and specks of dust can also be seen. The scale bar at lower left is 0.12 inches (3 millimeters).

Image credit: NASA/JPL-Caltech/MSSS/Honeybee Robotics/LANL/CNES

Thursday, December 6, 2012

NORUSCA II camera: First-ever hyperspectral images of Earth's auroras

The aurora seen as a color composite image from the NORUSCA II camera.

Three bands were combined to make the image. 

Each band was assigned a different color - red, green, and blue - to enhance the features of the aurora for analysis. 

Credit: Optics Express.

Hoping to expand our understanding of auroras and other fleeting atmospheric events, a team of space-weather researchers designed and built NORUSCA II, a new camera with unprecedented capabilities that can simultaneously image multiple spectral bands, in essence different wavelengths or colors of light.

The camera was tested at the Kjell Henriksen Observatory (KHO) in Svalbard, Norway, where it produced the first-ever hyperspectral images of auroras, commonly referred to as "the Northern (or Southern) Lights", and may already have revealed a previously unknown atmospheric phenomenon.

Details on the camera and the results from its first images were published in the Optical Society's (OSA) open-access journal Optics Express.

Auroras, nature's celestial fireworks, are created when charged particles from the Sun penetrate Earth's magnetic field. These shimmering displays in the night sky reveal important information about the Earth-Sun system and the way our planet responds to powerful solar storms.

Current-generation cameras, however, are simply light buckets, meaning they collect all the light together into one image, and they lack the ability to separately capture and analyze multiple slivers of the visible spectrum.

That means if researchers want to study auroras by looking at specific bands or a small portion of the spectrum they would have to use a series of filters to block out the unwanted wavelengths.

The red arrow points to the unidentified low-intensity wave pattern, which the researchers suspect is an auroral-generated wave interaction with airglow. 

For contrast, the blue arrow points to the faint emission of the Milky Way. Credit: Optics Express.

The new NORUSCA II hyperspectral camera achieves the same result without any moving parts, using its advanced optics to switch among all of its 41 separate optical bands in a matter of microseconds, orders of magnitude faster than an ordinary camera.

This opens up new possibilities for discovery by combining specific bands of the same ethereal phenomenon into one image, revealing previously hidden details.
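As a rough illustration of the band-combination idea, here is a minimal numpy sketch that normalizes three band images and stacks them into the red, green and blue channels of a single false-color composite; the random arrays stand in for real NORUSCA II data.

```python
import numpy as np

# Sketch of the band-combination idea described above: three spectral-band
# images of the same scene are normalized and stacked into the red, green
# and blue channels of one false-color composite. The random arrays below
# are placeholders, not real NORUSCA II band images.

def normalize(band):
    """Scale a band image to the 0-1 range."""
    band = band.astype(float)
    return (band - band.min()) / (band.max() - band.min() + 1e-12)

height, width = 512, 512
band_a = np.random.rand(height, width)   # placeholder for band 1
band_b = np.random.rand(height, width)   # placeholder for band 2
band_c = np.random.rand(height, width)   # placeholder for band 3

composite = np.dstack([normalize(band_a),   # -> red channel
                       normalize(band_b),   # -> green channel
                       normalize(band_c)])  # -> blue channel

print(composite.shape)   # (512, 512, 3), ready to display or save as RGB
```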

"A standard filter wheel camera that typically uses six interference filters will not be able to spin the wheel fast enough compared to the NORUSCA II camera," said Fred Sigernes of the University Centre in Svalbard (UNIS), Norway.

"This makes the new hyperspectral capability particularly useful for spectroscopy, because it can detect specific atmospheric constituents by their unique fingerprint, or wavelengths, in the light they emit."

These spectral signatures can then reveal subtle changes in atmospheric behaviour, such as the ionization of gases during auroras. This form of multispectral imaging also will enable scientists to better classify auroras from background sky emissions and study the way they cluster in the atmosphere.

Wednesday, September 12, 2012

NASA Mars Rover Curiosity: Rover turns on its Martian Geology cam - MAHLI

To take these latest, highly detailed images, NASA's Curiosity Mars rover, which landed on the Red Planet just over a month ago, relied on its most homely instrument: the Mars Hand Lens Imager, or MAHLI.

The 4-centimetre-wide camera was designed to play the role of a geologist's hand lens, a coin-sized magnifying glass typically worn on a lanyard around the neck.

In the field, geologists use them to get close-up views of the size and texture of the grains that make up rocks, letting them distinguish between cemented sands and solidified soils.

MAHLI's powerful lens leaves all human aids in the dust. The 2-megapixel colour camera can resolve features down to 12.5 micrometres wide.

The microscopic imager on the previous Mars rovers, Spirit and Opportunity, could only see down to a few hundred micrometres.

"MAHLI's resolution was designed to resolve down to the size of a grain of talcum powder," said Aileen Yingst of the Planetary Science Institute in Tucson, Arizona, in a press conference last week.

Tuesday, September 11, 2012

NASA Mars Rover Curiosity: MAHLI Image of Rover Wheels and a Destination

This view of the three left wheels of NASA's Mars rover Curiosity combines two images that were taken by the rover's Mars Hand Lens Imager (MAHLI) during the 34th Martian day, or sol, of Curiosity's work on Mars (Sept. 9, 2012).

In the distance is the lower slope of Mount Sharp.

The camera is located in the turret of tools at the end of Curiosity's robotic arm.


A photo of the calibration target for the Mars Hand Lens Imager (MAHLI) aboard NASA's Mars rover Curiosity, taken by that camera on Mars.

The calibration target includes colour references, a metric bar graphic, a 1909 VDB Lincoln penny, and a stair-step pattern for depth calibration.

The penny is a nod to geologists' tradition of placing a coin or other object of known scale as a size reference in close-up photographs of rocks.

The Sol 34 imaging by MAHLI was part of a week-long set of activities for characterizing the movement of the arm in Mars conditions.

The main purpose of Curiosity's MAHLI camera is to acquire close-up, high-resolution views of rocks and soil at the rover's Gale Crater field site.

The camera is capable of focusing on any target at distances of about 0.8 inch (2.1 centimeters) to infinity, providing versatility for other uses, such as views of the rover itself from different angles.

Image Credit: NASA/JPL-Caltech/Malin Space Science Systems

Sunday, July 8, 2012

NASA MARS Rover Opportunity: Greeley Haven Panorama Image

The so-called Greeley Panorama was produced by combining 817 photographs NASA was able to capture via the panoramic camera, or Pancam, on the Mars Exploration Rover Opportunity between Dec. 21 and May 8.

Source: NASA/JPL-Caltech/Cornell/ Arizona State University 

NASA released this week a stunning image produced by combining 817 photographs it was able to capture via the panoramic camera, or Pancam, on the Mars Exploration Rover Opportunity between Dec. 21 and May 8.

North is at the center of the so-called Greeley Panorama image, and south is at both ends, according to NASA.

"During the recent four months that Opportunity worked at Greeley Haven, activities included radio-science observations to better understand Martian spin axis dynamics and thus interior structure, investigations of the composition and textures of an outcrop exposing an impact-jumbled rock formation on the crater rim, monitoring the atmosphere and surface for changes, and acquisition of this full-color mosaic of the surroundings," NASA reported.

"The panorama combines exposures taken through Pancam filters centered on wavelengths of 753 nanometers (near infrared), 535 nanometers (green) and 432 nanometers (violet)." NASA said. "The view is presented in false color to make some differences between materials easier to see."

Opportunity has been working on Mars since January 2004.

Tuesday, April 3, 2012

Tepco space camera detects radiation


 Image credit: TEPCO

The device, the 'super-wide angle Compton Camera', uses technology that originates from space exploration: it monitors radiation in the same manner as the ASTRO-H satellite (also known as NEXT, or New X-ray Telescope).

Japanese researchers have developed a new way to detect and monitor potentially dangerous radiation.

Scientists based at the Japan Aerospace Exploration Agency (JAXA) have been working in partnership with the Tokyo Electric Power Company (TEPCO) and the Japan Atomic Energy Agency (JAEA).

According to a recent press release, the collaborative project, designed to detect radiation more efficiently, has been a success.

In the aftermath of Fukushima and subsequent concerns over radiation and nuclear reactor safety, the team has designed a new gamma camera that can be used to help alleviate some of these worries.

Radiation is detected via this spectrum- and sensor-based technology. The camera is capable of creating images of gamma-ray-emitting radioactive particles through advanced sensors with 180-degree coverage.

What makes the camera useful for more land-bound activities is that it can detect radiation that has collected in elevated places.

These can include areas such as building roofs, where it is normally difficult for measurements to be collected with existing survey meters.
 

The Compton Camera was field-tested this year to measure radiation levels.

At the Kusano area of Iitate village in Fukushima, the camera measured both radiation and concentration levels.

According to the release, the trial was successful, covering a broader area and detecting radiation with a higher degree of accuracy than other gamma cameras can.

In conjunction with TEPCO, JAXA and JAEA will develop the camera towards feasible use in radioactive material monitoring and decontamination work.

Not only can it be used in dangerous areas (such as at the Fukushima nuclear power plant), but it could also be used to monitor nearby areas and assess their safety levels.

Monday, March 19, 2012

Build your own motion-triggered "Internet of Things" camera

Adafruit has come up with a mash-up of Arduino components to create an internet-enabled motion-triggered camera (Photo: Adafruit)

Adafruit's "Internet of Things Camera" is a neat mashup of existing Arduino components into a versatile remote monitoring camera.

The key here is the word remote, a capability granted by the inclusion of a first-generation Eye-Fi card, an SD card with built-in Wi-Fi that can upload images to your computer or other device, or better yet to a variety of photo-sharing websites such as Flickr.

The Internet of Things: It's a term that basically describes the notion of objects - potentially all objects - having some sort of uniquely identifiable online presence and, in more recent years, the ability to report data.

This might be data that it's designed to collect (as is obviously the case with this camera), or merely information about its own wellbeing - like a vending machine asking to be restocked.

It's this ability to report online, to Flickr, yes, but also to Twitter, or via email if you prefer, that ensures Adafruit's camera lives up to its name.

Crucially, no coding is required to get online functionality up and kicking - it's simply a case of entering your log-in information into the accompanying Eye-Fi application.

Unfortunately, the camera doesn't come assembled. In fact it doesn't even come as a kit. You'd need to buy each of the required components and assemble them yourself, though Adafruit gives a lot of guidance as to how this is done.

The main components are an Arduino Uno microcontroller, a TTL Serial JPEG Camera (or a weatherproof variant, if required), Adafruit's Data Logging Shield for Arduino, an Eye-Fi wireless SD card, and some sort of power supply.

By default the Internet of Things Camera is a motion-sensing camera - but because it's built from Arduino components, this isn't set in stone.

Adafruit suggests that a time-lapse device or a camera triggered by a laser trip wire are relatively simple modifications. The recommended camera outputs video, from which stills are then logged and shared; a sketch of the basic trigger-and-log flow appears below.
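The real project runs as an Arduino sketch, but the overall control flow can be sketched in a few lines of Python; every helper function below is a hypothetical stand-in, not part of Adafruit's code or the Eye-Fi API.

```python
import time
import random

# Python sketch of the trigger-and-log flow only. The actual Adafruit
# project is an Arduino sketch; all helpers here are hypothetical stubs.

def motion_detected():
    """Stand-in for reading a motion sensor."""
    return random.random() < 0.1

def capture_still():
    """Stand-in for grabbing a JPEG frame from the serial camera."""
    return b"...jpeg bytes..."

def save_to_card(image_bytes):
    """Stand-in for writing to the Eye-Fi SD card, which then uploads
    the file over Wi-Fi on its own."""
    print(f"saved {len(image_bytes)} bytes; Eye-Fi card handles the upload")

def run(trigger, poll_seconds=0.5, max_iterations=20):
    """Poll a trigger function and save a still whenever it fires.
    Swapping `trigger` for a timer or a laser trip-wire check gives the
    time-lapse and trip-wire variants mentioned above."""
    for _ in range(max_iterations):
        if trigger():
            save_to_card(capture_still())
        time.sleep(poll_seconds)

if __name__ == "__main__":
    run(motion_detected)
```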

If you're already dreaming up potential applications for this thing then you're probably wondering about the power rating. Adafruit says a 9 V wall adaptor is the easiest way to keep the camera ticking, but for some uses that isn't going to work; in that case, a battery pack of six AA batteries will apparently power the camera "for several hours".

Wednesday, December 21, 2011

MIT Camera Snaps at the speed of light



MIT Media Lab researchers have created a new imaging system that can acquire visual data at a rate of one trillion frames per second. That's fast enough to produce a slow-motion video of light traveling through objects. Video: Melanie Gonick.

Scientists at the Massachusetts Institute of Technology report they have developed a camera fast enough to capture the movement of light itself across a scene.

The researchers from MIT's Media Lab suggest the new technology and imaging techniques could prove to be highly beneficial, maybe 10 years down the line, in hospitals and testing centers as a hand-held medical scanner.

The imaging system can acquire visual data and capture images at a rate of 1 trillion frames per second and also produce a slow-motion video of light travelling through an object.
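A quick sanity check shows why that frame rate makes light itself visible: between consecutive frames at one trillion frames per second, light travels only about a third of a millimeter.

```python
# How far does light travel between frames at one trillion frames per second?
C = 299_792_458.0          # speed of light in vacuum, m/s
frames_per_second = 1e12   # frame rate reported for the MIT system

distance_per_frame_m = C / frames_per_second
print(f"{distance_per_frame_m * 1000:.3f} mm per frame")   # ~0.300 mm
```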

"Watching this it looks like light in slow motion. It is so slow you can see the light itself move across the distance," said Ramesh Raskar, an Associate Professor of Media Arts at the Media Lab.

Wednesday, November 30, 2011

All-seeing Ball Camera snaps panoramas in mid-air



Throw a typical camera in the air and you're unlikely to capture anything stunning. But now a new ball-shaped camera, created by Jonas Pfeil from the Technical University of Berlin and colleagues, is designed to be tossed upwards to snap panoramas in mid-air.

The rubber ball, which contains 36 cellphone cameras, barely moves at the highest point of its trajectory so the photos it captures aren't blurred by movement.

A built-in accelerometer measures in-flight speed to detect this stationary point and trigger the cameras. Software then stitches the pictures together into a spherical panoramic image.
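Under simple projectile assumptions, the highest point arrives roughly v/g seconds after release, where v is the upward launch speed; the sketch below illustrates that timing with a made-up launch speed and is not the team's actual firmware.

```python
# Sketch of apex timing under simple projectile assumptions. The launch
# velocity here is a made-up example; the real device estimates it from
# its accelerometer during the throw.

G = 9.81                     # gravitational acceleration, m/s^2
launch_velocity = 4.0        # hypothetical upward launch speed, m/s

time_to_apex = launch_velocity / G         # upward velocity reaches zero
apex_height = launch_velocity**2 / (2 * G)

print(f"Trigger the cameras about {time_to_apex:.2f} s after release "
      f"(apex roughly {apex_height:.2f} m above the launch point)")
```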

The team developed the system to simplify panoramic photography, which normally involves painstakingly connecting several different images.

The ball will be demonstrated at Siggraph Asia, a conference on computer graphics and interactive techniques, from December 13-15 in Hong Kong.

Monday, October 24, 2011

GoPro Camera: sharper-pictured, easier-to-use HD HERO2 actioncam

If you've been holding off on getting a GoPro HD HERO actioncam, you were right to wait.

Today, the California-based company announced the release of its HD HERO2 camera, which is claimed to be "2X as Powerful in Every Way."

More specifically, it has an 11-megapixel image sensor (as opposed to the regular camera's 5 mp), along with a new processor that is said to be twice as fast, and a redesigned lens that GoPro claims is both twice as sharp and capable of a complete 170-degree field of view even in widescreen 1080p mode.

There's also good news for consumers who think the existing HD HERO is all the camera they need.

The HD HERO2 is the most advanced GoPro camera yet. To celebrate its release, GoPro traveled the world with some of their favorite athletes, adventurers, and filmmakers to see what they could capture and create with the HD HERO2.

GoPro hopes this film inspires you to get out and do the same.



Shot 100% on the new HD HERO2® camera from GoPro.com.

Tuesday, October 11, 2011

Judge Denies NASA Astronaut's Motion to Dismiss Moon Camera Lawsuit

This Data Acquisition Camera, which was flown to the moon's surface by Apollo 14 in 1971, is now the focus of a lawsuit against the astronaut who tried to sell it.
CREDIT: Bonhams

Just who owns a camera flown to the moon - the astronaut who saved it as a souvenir or the government that wanted it left on the lunar surface - will need to be settled in court, a judge ruled this week.

U.S. District Court Judge Daniel Hurley denied Apollo 14 astronaut Edgar Mitchell's motion to dismiss the lawsuit brought against him by the U.S. government last June. At issue is a 16-millimeter data acquisition camera (DAC) that Mitchell returned to Earth in 1971 and then attempted to sell 40 years later.

The government contended it has no record of the camera being given to Mitchell, who elected to remove it from the lunar module (LM) before parting ways with the spacecraft and returning to Earth. The LM, which Mitchell and Apollo 14 commander Alan Shepard used to land on and launch off the moon, was destroyed after it was allowed to fall back to the lunar surface.

Apollo 14 astronaut Ed Mitchell

Mitchell's attorney argued that too many years have gone by for the government to pursue the camera as stolen and besides, it was given to the now 80-year-old moonwalker as a gift in line with NASA's then-policies governing spent equipment.

Judge Hurley ruled that the statute of limitations as cited in Mitchell's motion and as defined by the state of Florida where the case is being heard did not apply to the federal government's claim.