The Guardian view on the automated future: fewer shops and fewer people | Editorial

Low-paid and unskilled jobs in retail will soon be automated away. What will happen to the people?

Automation may put a third of a million retail employees out of work in the next eight years, according to the British Retail Consortium. Across the sector as a whole fewer people are now working, and are paid less, than in 2008. Competition, the move online and the welcome rise in the minimum wage are all accelerating the job losses. But it is the rise in technology that will go furthest. Robotics and artificial intelligence are moving to eliminate all kinds of work that had seemed reserved for humans, even those tasks that had appeared too personal or lowly paid to be vulnerable. Most of the automations of the 20th century eliminated unskilled male labour. Now it is the turn of unskilled women. The process is already under way. Salespeople are increasingly regimented and scripted in their interactions, a process that might be called artificial stupidity, while ever greater intelligence and ingenuity is demanded of the customer who tries to navigate an automated checkout. That “unexpected item in the bagging area” is your vestigial humanity.

Related: UK retail sector predicted to cut 900,000 jobs

Continue reading...
via The Guardian view on the automated future: fewer shops and fewer people | Editorial

Could drone-guided robots replace binmen?

A new prototype rubbish lorry built by Volvo uses drone technology to locate bins and send a robot to pick them up, all without the driver having to leave the cab

Volvo’s latest research prototype uses a small robot to pick up and empty bins into a rubbish truck, guided by an overhead drone and without the need for humans – but Britain’s binmen should not fear for their jobs just yet.

The Robot-based Autonomous Refuse (Roar) handling system is the product of Volvo’s collaboration with Sweden’s Chalmers University of Technology and Mälardalen University, Penn State University in the US, and recycling company Renova.

Continue reading...
via Could drone-guided robots replace binmen?

10 ways to beat loneliness

People of all ages are discovering new and old ways to connect – from co-housing schemes to friendship apps and chicken care. Here’s a taste of what’s going on

It was a chance remark by a resident of a Gateshead care home that sparked one of the UK’s most innovative schemes to tackle loneliness among older people and those with dementia. The man told carers he was “missing his girls, Joan and Betty and Doreen and Pat”. It turned out that the girls in question were his hens. He’d kept chickens all his life and missed the daily routine and sense of purpose that came from caring for them.

We have two Iranian refugees, a Muslim and a Scandinavian, all different classes

Men ask us, why can't we have a co-housing scheme too? Do it yourselves, I tell them

Continue reading...
via 10 ways to beat loneliness

Announcing Solo Update 2.1.0: Now adjust exposure value

If you couldn’t tell, we’re burning through a series of new features and updates. Last week we dropped the long-awaited Solo flight simulator app (for iOS and Android), and the week before that we released our biggest update yet — Solo 2.0, with multipoint cable cam, Free Look and in-app airspace safety information.

This week’s Solo update is for controlling your GoPro® camera. Now you can adjust your camera’s exposure value (EV) through the Solo app, while you’re flying. To get this new feature, you’ll need to update your Solo app through either the App Store or Google Play. (You don’t need to update your Solo firmware.) We also suggest enabling “auto updates” for your apps, so whenever we release an app update your phone takes care of it for you.

To use EV, open the GoPro control settings menu in the Solo app (it’s next to the record button on your FPV screen; the icon looks like a couple of faders/sliders). Next open the ProTune menu; exposure value is in there.

Exposure determines how bright/dark your video or photo is. GoPro’s exposure is set to “0” as the default. Remember that EV compensation is exponential: Set it to +2.0, for example, and your images will be four times brighter. Dropping the exposure just a touch, however, can help draw out some cool contrasts, like the variegated patches of lighter and darker hues of blue in a lake or the ocean. You can monitor the effects of your adjustments in real time using the HD feed.
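To see why +2.0 means four times brighter: each full stop of EV doubles the light, so the brightness multiplier is 2 raised to the EV value. A quick sketch (a hypothetical helper for illustration, not part of the Solo app):

```python
def ev_brightness_multiplier(ev: float) -> float:
    """Relative brightness for an EV compensation value.

    Each full stop doubles the light, so the multiplier is 2**ev.
    """
    return 2.0 ** ev

print(ev_brightness_multiplier(2.0))   # → 4.0 (four times brighter)
print(ev_brightness_multiplier(-1.0))  # → 0.5 (half as bright)
```

So a +0.5 nudge brightens by about 1.4×, which is why small adjustments are usually enough.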

What exactly is exposure value, and why would I need to adjust it?

When you capture a photo or video, your camera assesses the amount of light in that scene and automatically makes adjustments to keep the exposure even, given how bright or dark the setting is. This is called exposure value compensation, and it ensures your image has the best range of light and dark tones. It’s kind of like a bell curve built around the class average: Sometimes the class average is too high, so scores are dropped; sometimes the average is too low, so scores are raised.

Sometimes, though, you and the camera see things differently. This is when you might want to adjust the EV yourself to ensure your video or photo is lit the way you want it. For very bright scenes (like snow, for instance, or filming the roof of a Walmart at high noon), your camera will automatically adjust EV to bring those brights down. Now your whites will look gray. To compensate for this, boost EV to a positive number (+.5, +1, etc.); for super dark scenes do the opposite, and set EV to a negative number.

Say you’re shooting a white sheep on the beach with sand dunes behind it. The central exposure value of this scene is very, very bright. The GoPro will adjust its bell curve accordingly to make “white” the center value, which tones all that white down to a gray middle point. But you want white. Boost EV and the scene will whiten. (“All of you did so well on that test that I’m overriding the bell curve and leaving your scores high, so you guys can see just how great you all are! I mean, really! Wow! A+!”)

But here comes a black sheep. When you put your camera on it, the camera wants to boost EV to compensate. But not you. You’re a human. You know that in this instance you want contrast. You want a truly black sheep. Knock the EV down a little and you get better contrast. (“You kids really blew it this time. The plane has crashed into the mountain. You don’t even deserve a curve. Look around and notice how very clear the minute differences are between your utterly abysmal grades. I’m calling your parents.”)
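Stripped of the classroom metaphor, the metering logic is simple: the camera picks the EV shift that would map the scene’s average brightness onto photographic middle grey, and your EV setting is an offset on top of that. A rough sketch (the 0.18 middle-grey constant is the standard photographic convention; the function is illustrative, not GoPro’s actual metering algorithm):

```python
import math

MIDDLE_GRAY = 0.18  # standard 18% reflectance target used by light meters

def metered_ev_shift(scene_mean_luminance: float, user_ev: float = 0.0) -> float:
    """EV shift the camera applies: map the scene's average luminance onto
    middle gray, then add the user's EV compensation on top."""
    meter_shift = math.log2(MIDDLE_GRAY / scene_mean_luminance)
    return meter_shift + user_ev

# A bright beach scene (average luminance 0.72) gets pulled down two stops,
# turning white sand gray; dialing in +2.0 EV cancels the meter's correction.
print(metered_ev_shift(0.72))               # → -2.0
print(metered_ev_shift(0.72, user_ev=2.0))  # → 0.0
```

The black-sheep case is the mirror image: a dark scene gets a positive meter shift, and a small negative user EV pulls it back toward true black.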

Mission: Now that you have control over the light, go out and shoot some photos and video in some challenging lighting conditions.

The post Announcing Solo Update 2.1.0: Now adjust exposure value appeared first on 3DR | Drone & UAV Technology.


via Announcing Solo Update 2.1.0: Now adjust exposure value

Mercedes-Benz swaps robots for people on its assembly lines

Car makers switch to smaller and safer robots working alongside humans for greater flexibility

Bucking modern manufacturing trends, Mercedes-Benz has been forced to trade in some of its assembly line robots for more capable humans.

The robots cannot handle the pace of change and the complexity of the key customisation options available for the company’s S-Class saloon at the 101-year-old Sindelfingen plant, which produces 400,000 vehicles a year from 1,500 tons of steel a day.

Continue reading...
via Mercedes-Benz swaps robots for people on its assembly lines

Is this the future of work? Scientists predict which jobs will still be open to humans in 2035

Australian science agency CSIRO says workplaces will be increasingly digitally focused and automated. Who wants to be an ‘online chaperone’?

Workers looking for jobs in 2035 might consider retraining as remote controlled vehicle operators or online chaperones.

Those are two of the jobs of the future suggested in a report by the CSIRO that charts 20-year trends in increasingly digitally focused and automated Australian workplaces.

Related: Automation may mean a post-work society but we shouldn't be afraid

Related: Elon Musk: Tesla cars will be able to cross US with no driver in two years

Related: What does it mean to be human in the age of technology?

Related: The superhero of artificial intelligence: can this genius keep it in check?

Continue reading...
via Is this the future of work? Scientists predict which jobs will still be open to humans in 2035

Alec Ross on the industries of the future – Tech Weekly podcast

Former innovation adviser to Hillary Clinton Alec Ross talks us through the big tech changes that are set to radically change how we work and live

Predicting the future can be a fraught process. History is littered with the grand visions of futurists who imagined us living under a dome on the moon come 2015 and yet here we are in 2016 no closer to lunar life but glued to cat videos on the screens in our hands.

One man who has had enough experience travelling the world as Hillary Clinton’s senior adviser to look into his own crystal ball is Alec Ross. His new book The Industries of the Future looks at the sorts of changes that are going to emerge in the next 20 years from the commercialisation of genomics to the impact of robotics and whether the next world war will be a cyber war.

Continue reading...
via Alec Ross on the industries of the future – Tech Weekly podcast

How real is that Atlas robot video?

Video footage of Boston Dynamics’ new Atlas robot has left Twitter in awe of our new robot overlords; but how much of these videos should we take at face value?

Boston Dynamics are a pretty cool company. The robots they produce are world-leading and they’ve made enormous strides – sorry – in robot locomotion. They’re also very good at knocking out viral videos that get half of Twitter seriously pondering the end of humanity.

The public debut of their Atlas robot in 2013 was so exciting it prompted one over-excited AI specialist to claim: “A new species, Robo Sapiens, are emerging.” That the specialist in question was the co-founder of a Silicon Valley robotics company I’m sure had no bearing on his hype.

Continue reading...
via How real is that Atlas robot video?

Atlas shrugged: Boston Dynamics robot endures 'torture' test – video

Boston Dynamics tests the newest version of its Atlas robot design. The latest generation is smaller than its predecessor and able to run entirely without wires. The footage shows the robot enduring ‘robot torture’, being prodded with a hockey stick before eventually being forcefully shoved over with a tube. But when it’s down, Atlas isn’t out, and the robot quickly gets back up on two legs

Continue reading...
via Atlas shrugged: Boston Dynamics robot endures 'torture' test – video

Watch Google torture an 80kg, 5'9" robot for science

Atlas, the latest robot from Google’s Boston Dynamics, can withstand a beating

Google’s long-standing quest to build the robot overlords which will eventually subjugate humanity and usher in a 1,000-year reign of the machines is apparently progressing nicely.

The company’s robotics subsidiary, Boston Dynamics, has revealed the latest iteration of its Atlas robot, most recently seen doing the hoovering last month.

Continue reading...
via Watch Google torture an 80kg, 5'9" robot for science

Automation will mark the end of our work-obsessed society

We live in a profoundly work-centred world; if automation is to benefit us we need to ask big critical questions about the purpose and value of our jobs

At the beginning of The Lego Movie, we meet an average Joe called Emmet. When he’s not working, Emmet spends most of his time sitting on the sofa, listening to the pop song Everything is Awesome (a sort of Lego-world equivalent of Happy by Pharrell Williams), absorbing adverts, and tuning in for a catchphrase comedy called Where Are My Pants?

Emmet showers, brushes his teeth and exercises at the exact same time every day, before hitting the same traffic jam, having the same empty conversation with his colleagues, and returning home to his best and only friend – a potted plant.

Related: Automation may mean a post-work society but we shouldn't be afraid

Continue reading...
via Automation will mark the end of our work-obsessed society

Google's portrait app is fun – but no robot will ever replace Rembrandt

Google’s Creative Lab have developed a machine for dashing off pen portraits from smartphone photos. But until machines actually choose to express themselves, these are not artworks

Can robots draw portraits? The question may appear answered by an experiment demonstrated by Google’s Creative Lab at the Mobile World Congress in Barcelona.

Related: Will automation make us happier? – live chat

Related: Tell us something we don't know: why science can't show us much about art

Continue reading...
via Google's portrait app is fun – but no robot will ever replace Rembrandt

3DR, Solo And Free Look: The world’s first and best follow drones

Many drones have follow me. Solo does, too, but it’s by far — by far — the best and most sophisticated follow on the market. Big claim. Now let me back it up.

“Follow Me” is probably the most popular autonomous capability of drones today, and it’s certainly the most talked-about. It’s nothing less than miraculous that you can have a robot follow you in the air, let alone that this flying robot has a camera and can keep you in the frame of the video.

First point: 3DR invented Follow Me. Our IRIS drone was the first consumer drone with follow me, dating back to June 2014. Other drone makers’ claims to the first follow — simply not true.

Second, and much more interesting point: 3DR has the best follow. That is, we didn’t rest with launching Follow — we’ve been constantly improving it. No other drone can touch what Solo can do.

 

Dynamic Follow

Other drones just follow you. Only Solo can follow and frame you dynamically. Change the drone’s altitude and position in real time with the controller sticks. Solo can even “lead” you — just position Solo in front of you.

What’s more, Solo can even orbit around you while you’re moving, keeping the camera on you the whole time. As an interesting use case, you can have Solo follow and orbit while you snap photos — grab a series of stills at many different angles, without worrying about piloting the copter at the same time.

 

Look at me

Think of this feature as a tripod in the sky, with the tripod head swiveling the camera to follow you wherever you go.

Just pop Solo up in the air, put it into Follow, enter your Follow options (marked “…” on the Follow screen) and toggle “Look at me.” Solo will now stay locked in one place (as opposed to flying after you) while the camera keeps you in frame as you move. This is great if you want to create distance from the camera.

 

Follow with Free Look

Now things get interesting.

For Solo 2.0 we took the Free Look feature — already available in Cable cam — and applied it to Follow. You can now shoot a moving subject from any angle, in real time, just by using the controller sticks.

Free Look turns Follow into an interactive filming experience: When you enter Free Look, Solo still follows your subject, but you now have full control of the camera. Pan and tilt freely to look anywhere you want while remaining completely confident in Solo’s position and directional heading. It’s similar to the Hollywood motion control of a Russian arm on the back of a truck: Virtually leash Solo to one vehicle, and swivel and tilt the camera manually to track the movements of even the most spontaneous subject.

Plus, if Solo isn’t exactly where you need it, Free Look also allows you to adjust the copter’s position in space with a nudge of the controls. When you know exactly where the camera will be, you can plan shots with confidence and also react in the moment.

Think of those skateboard videos you’ve seen where someone’s following his buddy, carrying and working the camera. Solo would be that buddy, flying automatically, with the pilot having full camera and position control.

To paraphrase Frank Zappa, writing about Free Look is like dancing about architecture. So for a great visual example of how Free Look works in action, check out this video at about 1:35.

 

Cool but complicated. How do I use it?

 Essentially, Free Look lets you take control of the copter and camera while Solo follows. This sounds pretty complicated, but it just means that the drone pilot can change altitude and follow distance in real time, just by using the controller sticks. Not only do you always know the trajectory of the drone and the position of the camera, you can now control both of them.

First, make sure the area is clear of objects. You’ll have a few people involved: Your subject (if you’re filming a particular subject), your drone pilot (let’s say this is you), and the person driving the vehicle.

Since Solo follows the controller, you’ll ride along in the vehicle. When you Follow, Solo is leashed to that vehicle — and to the controller and mobile device in your hands.

Okay: In the Follow options menu, enter “Free Look.” Begin your follow shot.

While Solo follows the vehicle you’re in, and your buddy drives, you can now change Solo’s position and control the camera to look anywhere you want. This means if you’ve got another subject you want to film, you’re totally free to control the camera as that subject moves. This means you can now track with even fast-moving and spontaneous subjects without having to worry about controlling the copter’s flight.

Other drones need two controllers to pull this stuff off. With you and Solo, well, that’s it.

 

Why is Free Look important and useful?

We can sum up the usefulness of Free Look in one word: Composition.

No other follow me offers you control over image composition — the camera keeps the subject in the center of the frame no matter what. With Free Look you can look anywhere you like, allowing for much more dynamic and interesting shots.

Use case: Let’s say you want to follow a subject driving an ATV. Have Solo follow your vehicle while your vehicle follows (or is near) the subject on the ATV. Control Solo from inside your vehicle, using the FPV video as guide. Go into Follow, toggle Free Look, and Solo is leashed to the vehicle you’re in. You now have complete control over the composition of your shot as you tail the ATV, putting the camera exactly where you want to put it while Solo flies along. This lets you do what no other Follow can: Actually compose your shots. You don’t always want a centered subject — this makes Follow much more visually interesting, and it turns it from a flat, automatically centered shot into a real filming tool.

So you have control over composition, but you simultaneously get the opposite, too: spontaneity. This spontaneity applies to the choices you can make creatively as a cameraman/director, moving the camera in the moment. It also applies in terms of tracking with a spontaneous, fast-moving subject. You can also now design camera moves that start with one subject in the frame, then pan or travel to another subject or area.

Again: Film even fast or unpredictable subjects steadily, and from any angle.

 

Bottom line

So it might sound like this is simply flying manually. Follow a subject yourself with the drone and you can do all of this, right?

What’s important about Free Look is that you now only concern yourself with the placement of the camera. Drone pilots will understand how useful this can be: It takes half of the work out of it. This frees you up to add an entirely new creative layer to your shots. In this way, Solo is also like a virtual two-pilot system.

Try it just once. You’ll forget the drone is up there.

The post 3DR, Solo And Free Look: The world’s first and best follow drones appeared first on 3DR | Drone & UAV Technology.


via 3DR, Solo And Free Look: The world’s first and best follow drones

Will automation make us happier? – live chat

Join experts online on Thursday 25 February 1-2pm GMT to discuss how we can ensure the best possible outcome from automation

Computer scientist Moshe Vardi recently told the American Association for the Advancement of Science that machines could put more than 50% of the world’s population out of a job in the next 30 years. “We are approaching a time when machines will be able to outperform humans at almost any task,” he said.

While we may not yet be facing total unemployment – robots still struggle to do what humans would consider basic, simple tasks such as folding towels – automation is already having profound effects on society and our personal wellbeing.

Related: Automation may mean a post-work society but we shouldn't be afraid

Continue reading...
via Will automation make us happier? – live chat

Read it and beep: what robots will learn from our greatest literature | Stephen Moss

Robots can be enlightened and civilised if they just read out stories, we’re told. We asked an especially well-read android, HOMER16, what it made of 1984, Hamlet et al

An encouraging report from the Georgia Institute of Technology argues that it is possible to inculcate moral values into robots by exposing them to the fictions and fables that underpin human cultures. “We believe story comprehension in robots can eliminate psychotic-appearing behaviour and reinforce choices that won’t harm humans and still achieve the intended purpose,” argue the researchers.

If the Georgia Institute of Technology report is to be believed, a new generation of robots combining artificial intelligence with great physical power may not, as dystopian sci-fi films always insist, wipe us out after all. We can be friends, united in a common appreciation of Middlemarch. But a less sunny outlook is suggested by a rival report from the Shepton Mallet School of Advanced Hermeneutics, of which I’ve had a sneak preview. It fed the entire world’s literature into a robot (called HOMER16) fitted with a high-powered computer; preliminary results are worrying.

Continue reading...
via Read it and beep: what robots will learn from our greatest literature | Stephen Moss

Paper Skin Mimics the Real Thing



Artificial skin made from paper, aluminum foil, and sponges could lead to new wearable electronics and robots that feel
via Paper Skin Mimics the Real Thing

Announcing The Solo Flight Simulator

You know how I know when it’s raining? My thumbs get itchy. Happens on long car rides, too. At airport gates. In long, dull, droning meetings. During the seventh inning stretch, caution flags, awaiting an elusive superfecta in a jai alai playoff. While weathering that ridiculous intervention my family and friends imposed on me to address my drone habit.

I have a feeling I’m not alone.

Never fear, fellow drone addicts: At long last, the Solo flight simulator is here!

To help new users learn how to fly without having to put their copter in the air — and to give all users the chance to hone their skills anytime, anywhere — 3DR has released the Solo flight simulator app. (It’s separate from the Solo app — get the simulator for iOS here, and Android here.) The app connects wirelessly from your mobile device to your controller, and the simulator’s responsiveness is mapped from the sticks to screen — the “sim Solo” on your screen responds to your control stick inputs just like the real thing would in the air.

In the simulator you can practice flying Solo freely, getting a feel for the flight controls as well as the camera controls without putting your investment in the air. It’s just like a video game: Your mobile screen displays the camera’s “first-person view” so you can get accustomed to seeing what Solo sees, using that to help guide your flight. Easter egg: There’s a passing speedboat. Can you land on it? (I don’t know if this is possible; I sure can’t do it.)

But perhaps best of all: You don’t need to be near the copter or even have Solo turned on to use the simulator; all you need is the app and the controller. Practice flying anywhere.

Download the Solo flight simulator app here:

The post Announcing The Solo Flight Simulator appeared first on 3DR | Drone & UAV Technology.


via Announcing The Solo Flight Simulator

Cybernetic Third Arm Makes Drummers Even More Annoying



It keeps proper time and comes with an off switch, making this robotic third arm infinitely better than a human drummer
via Cybernetic Third Arm Makes Drummers Even More Annoying

Automated farming: good news for food security, bad news for job security?

New technology is revolutionising modern farming, but this brave new world of robot farms and hi-tech sensors could have consequences for rural livelihoods

Around the world, but especially in the developing world, food and farming systems continue to rely on 20th century technology. But this is changing. The same information technologies that brought us the internet and transformations in medicine are now revolutionising farming. It’s a new era for agriculture and it’s taking off in at least two distinct areas.

On the farm, technology is changing the way farmers manage farmland and farm animals – such as the use of satellite driven geo-positioning systems and sensors that detect nutrients and water in soil. This technology is enabling tractors, harvesters and planters to make decisions about what to plant, when to fertilise, and how much to irrigate. As this technology progresses, equipment will ultimately be able to tailor decisions on a metre-by-metre basis.

Related: Gene editing could create medicines and self-fertilising crops. But are we facing another GM food-style furore?

Related: Japanese firm to open world’s first robot-run farm

Continue reading...
via Automated farming: good news for food security, bad news for job security?

Solo For Photography – Interview on Drone Panorama and Photo Editing

Hawaii

I’m back with our regular series on aerial photography, this time interviewing one of our most proficient photo and video editors, Jon Mayo-Buttry, about creating panoramas, workflow, photography, and his overall drone experience. Want these in your inbox every week? Sign up for our drone photography list here.

Hanalei Bay

You obviously edit a lot of motion graphics, much more so than stills… Do you find the processes are similar, or do you use totally different software/filters/features.
There are certainly similarities between producing video or motion graphics based content and shooting aerial photography. At the core of both, you have a concept that you want to communicate. Whether it’s a landscape, a story, or a message, you are crafting the content to relay something specific, and controlling the way that content is perceived. You have a huge amount of control over designing the look of an aerial photo in post-production. If it’s a warm landscape, you can brighten the highlighted areas and add some color temperature to make the image feel warmer, attempting to capture the feeling of the environment when you shot it. You want to guide the viewer’s eye. Figure out what areas of the photo capture the pieces that you want to accentuate, and brighten or sharpen them. Then you can subdue the other parts, by darkening or blurring them, to help guide the viewer’s eye to where you want it to go. With panoramic images, having some kind of major feature or something that stands out helps give the image character. It also helps establish scale. When you see something manmade, like a skyline or a pier, break the landscape in a panoramic image, it gives you more information about the surrounding landscape. You can start to tell a story in that single image.
I use Photoshop exclusively to assemble panoramic images. Photoshop is used in all areas that I work in, including motion graphics. It’s the best tool to explore and refine concepts, and you can easily break things into layers to be animated. These same principles can apply to aerial photography. You can break apart different shots into layers, constructing the exact shot that you want. Maybe there is a perfect sunset in one image, but there is a person or a car in that shot. You can borrow that data from a different photo, removing the elements you didn’t want in the shot and preserving the parts you want to keep. Similar to the post-production workflow for video, you have to refine those edits until they are not noticeable to the viewer.
When you’re shooting photos, what are your preferred GoPro settings?
The biggest benefit to shooting imagery with a GoPro is ProTune. You can lock ISO down to the lowest setting to reduce image noise. Shooting at 12 MP (wide) is great for the amount of data you can capture, but it does include lens distortion. You can, however, circumvent this with a modded lens. I’ve enjoyed shooting with the Peau Productions 3.97 mm. This allows you to shoot on the wide setting without getting barrel distortion. I usually set the sharpness to “medium” as I’d prefer to leave any post-sharpening to more robust desktop software, which gives you more control.
If a user wants to turn a video screenshot into a still photo, what are some tips for exporting a high quality, clear shot?
GoPro video is very sharp. Unless you are flying very fast or turning aggressively, your frames will be sharp. Film at the highest resolution possible.
Any tips for working with GoPro still photos in post?
If you shoot with the GoPro’s sharpening levels set to medium, then you can control how sharp the image is in post. Use an unsharp mask to bring out details. Don’t use so much that it’s clear to the viewer that sharpening tools were used, as that can bring them out of the photo. Push the levels until you can tell it’s a little too much, then back the settings off a bit. Sometimes closing your photos and opening them later with a fresh set of eyes will help you figure out if you’ve gone too far with any types of color correction or post-processing.
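For readers unfamiliar with the technique: an unsharp mask subtracts a blurred copy of the image from the original and adds the difference back, scaled by an “amount,” which boosts local contrast at edges. A minimal grayscale sketch in pure Python (Photoshop’s filter adds radius and threshold controls on top of this idea; this is an illustration, not his actual workflow):

```python
def box_blur(pixels, width, height, radius=1):
    """Blur a flat grayscale pixel list with a simple box filter."""
    out = [0.0] * (width * height)
    for y in range(height):
        for x in range(width):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    nx, ny = x + dx, y + dy
                    if 0 <= nx < width and 0 <= ny < height:
                        total += pixels[ny * width + nx]
                        count += 1
            out[y * width + x] = total / count
    return out

def unsharp_mask(pixels, width, height, amount=0.5, radius=1):
    """Sharpen: original + amount * (original - blurred), clamped to 0..255."""
    blurred = box_blur(pixels, width, height, radius)
    return [min(255.0, max(0.0, p + amount * (p - b)))
            for p, b in zip(pixels, blurred)]
```

On a 3×3 patch with a bright centre pixel, the centre gets pushed brighter and its neighbours darker, which is exactly the edge-contrast boost the interview describes; cranking `amount` too high is what produces the telltale over-sharpened halos.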
What are your favorite fisheye removal techniques?
I usually bring images through the camera raw import feature, which you can use on any images, not just raw images. This allows you to batch things like exposure and color temperature settings, and fisheye removal, giving all of your images the same look. That gives you a consistent base to start from when assembling aerial photos into a single panoramic image.
Honolulu
A lot of your work features some pretty dramatic lighting editing… Any favorite plugins or filters others might enjoy experimenting with?
After getting the baseline lighting close in the camera raw import settings, I light everything in post manually. My favorite technique is to use transparent gradients in quick mask mode to make feathered selections of the parts that I want to lighten, darken, or blur. Once I have the selections, I’ll manually adjust levels, curves, or simulated camera blur settings to craft the look I want and focus in on certain parts of the image.
When you’re shooting to create a Panorama, what GoPro settings do you use? 
When shooting aerial panoramic photos, I’ll sometimes shoot 4K video, rotating the drone around. In post, you can isolate frames from the video and stitch together a high-resolution panoramic image by assembling those frames in Photoshop. I use the same ProTune settings as for stills: ISO set as low as possible, sharpness set to medium. I am fine shooting with either the flat or the GoPro color profiles. The GoPro profile seems to just juice the saturation a bit, which is usually done in post anyway. On higher-end cameras, shooting flat usually means that you are shooting raw, with a higher dynamic range. But with GoPro, the flat color profile doesn’t translate to a higher dynamic range, so I don’t think it matters much.
Any tips and tricks for users on composition? How do you space your photos for proper framing?
Using traditional photographic techniques like the rule of thirds or taking advantage of dramatic one point perspectives can work. Overall, try and identify the most compelling part of your scene. Then set up the framing and camera position to accentuate that element and give it the focus you want.
What software do you use for stitching together panoramas?
I prefer to manually stitch images in Photoshop. There are plenty of auto-stitching apps, but I think part of the craft of building one image out of many is choosing which parts you want to use from each. Water is especially tricky, as waves from one shot won’t line up with waves from the next. Manually blending and rotating selections gives you maximum control, so you can fix issues like that.
Any tips or tricks on getting a great stitch?
Know when to crop. Just because you’ve shot 15 images for a panorama doesn’t mean you should use them all. Once assembled, crop in to get the framing you want, even if it means ditching parts of the photo.
What other post-processing do you do once the stitch is completed?
Traditional color correction and retouching, such as curves, levels, hue/saturation and removing unwanted objects like people, cars, etc.
Any other editing/tips tricks you can share with our users?
For landscape images, once you are finished, try flipping the entire composition horizontally. Sometimes a mirrored version of your image can give you a more interesting look.
Strangest story/encounter/demo experience with Solo?
Filming on set with Michael Bay in Malta was very memorable. Also, filming in Mexico was intense. Too many stories to recount. The best moments are adventures where you think the shoot won’t work out, or you are losing light, and then at the last moment everything comes together and you get something bigger than you imagined.
Kauai
Want our tips, tricks, interviews and inspiration for aerial photography regularly? Sign up for the email list here.

The post Solo For Photography – Interview on Drone Panorama and Photo Editing appeared first on 3DR | Drone & UAV Technology.


via Solo For Photography – Interview on Drone Panorama and Photo Editing

Automation may mean a post-work society but we shouldn't be afraid

To benefit from the automation revolution we need a universal basic income, the slashing of working hours and a redefinition of ourselves without work

When researchers Frey and Osborne predicted in 2013 that 47% of US jobs were susceptible to automation by 2050, they set off a wave of dystopian concern. But the key word is “susceptible”.

The automation revolution is possible, but without a radical change in the social conventions surrounding work it will not happen. The real dystopia is that, fearing the mass unemployment and psychological aimlessness it might bring, we stall the third industrial revolution. Instead we end up creating millions of low-skilled jobs that do not need to exist.

A low-work society is only a dystopia if the social system is geared to distributing rewards via work

Continue reading...
via Automation may mean a post-work society but we shouldn't be afraid

The last job on Earth: imagining a fully automated world – video

Machines could take 50% of our jobs in the next 30 years, according to scientists. While we can’t predict the future, we can imagine a world without work – one where those who own the tech get rich from it and everyone else ekes out a living, propped up by an increasingly fragile state. Meet Alice, holder of the last recognisable job on Earth, trying to make sense of her role in an automated world

Continue reading...
via The last job on Earth: imagining a fully automated world – video

Multipoint Cable cam: How-to + Tips and Tricks

The Smart Shot to rule all shots: Multipoint Cable cam gives you even more confidence and creative control over your shots — control you simply can’t get with any other drone. Read on to learn the basics of using Multipoint Cable cam, then how to crack it open and set up jaw-dropping shots and scenes, like the video above from Weston Reel.

 

How to start it up

First, MPCC is not a new Smart Shot; it’s the new version of Cable cam. We’ve enhanced the original: instead of two keyframes, you can now set unlimited keyframes. There’s nothing new in the Smart Shots menu and no name change; to use multipoint, just select Cable cam.

To get this functionality — available in the recently announced 2.0 release — just tap the box in the upper corner of the main screen that says “update available.” (It’s the first screen you see when you open the app.) The app will automatically update everything that needs updating.

 

How to set it up

“Multipoint” works just like the original Cable cam. (Well, almost identically — we’ll get to the minutiae later.) Take your time to fly to where you want to set your first keyframe, at any point in space, looking any direction you want. Position the copter and the camera to get that first frame perfect. Press A on the controller to save the keyframe. The onboard computer essentially has a photographic memory: It takes an internal “snapshot” of each keyframe you set, memorizing not just the position in the air, but the camera position as well, so Solo will nail that frame.

After you set your first frame, take your time to fly (doesn’t have to be a straight line) to where you want to create the next one; repeat the process for as many keyframes as you need. To set the last frame, instead of pressing A you’ll press B. Your app screen will then tell you the cable is set — Solo has connected all of those frames with straight virtual cables between them.
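3DR hasn’t published what a keyframe “snapshot” actually contains, but conceptually it has to capture both the copter’s position and the camera’s pose. A speculative Python sketch (every field name here is invented):

```python
from dataclasses import dataclass

@dataclass
class Keyframe:
    """One 'snapshot': where the copter is and where the camera
    points. Field names are invented for this sketch."""
    lat: float
    lon: float
    alt_m: float
    yaw_deg: float            # copter heading
    gimbal_pitch_deg: float   # camera tilt

cable = []  # keyframes accumulate in order as you set them

def press_a(current):
    """Append the current pose to the cable, like pressing A."""
    cable.append(current)
```

Pressing B would then close out the list and connect consecutive keyframes with virtual cable segments.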

Solo automatically adds curves to the cable at each keyframe. This makes your entry and exit for each frame feel smooth and polished.
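The exact smoothing Solo applies isn’t public, but rounding a polyline of keyframes is classically done with a spline such as Catmull-Rom, which passes through every keyframe while curving the entry and exit. An illustrative stand-in, not 3DR’s implementation:

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate one Catmull-Rom segment between p1 and p2 at
    t in [0, 1]. The spline passes through every keyframe while
    smoothing the path into and out of it.
    Points are tuples of equal length, e.g. (x, y, z)."""
    t2, t3 = t * t, t * t * t
    return tuple(
        0.5 * (2 * b
               + (-a + c) * t
               + (2 * a - 5 * b + 4 * c - d) * t2
               + (-a + 3 * b - 3 * c + d) * t3)
        for a, b, c, d in zip(p0, p1, p2, p3)
    )
```

At t = 0 the curve sits exactly on p1 and at t = 1 exactly on p2, so the copter still hits every frame you set.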

Importantly, Solo saves every cable you create, so you can return to fly your favorites or quickly and easily switch between shots at the same location. Just open “saved shots” on the Smart Shots menu and select the cable you want to fly. You’ll see a map come up with a blue dot for Solo, and another bigger blue circle, which is your target for the first frame. Fly into the circle and the app guides you to the right altitude; Solo will lock into the first frame from there, and you’re good to go.

 

How to use it

Now that your cable and frames are set you’ve got a bunch of options. Just as you could with Cable cam 1.0, you can now engage Solo in a few ways.

1) Virtual two-pilot system: you fly, Solo films. Use the right stick to send Solo up and down the cable, controlling the speed and direction of flight. As you go, Solo automatically and smoothly guides the camera from one frame to the next — you can change direction and adjust speed at any time. You don’t need to press “play”; just push the stick and you’re off.

2) Virtual two-pilot system: Solo flies, you film. You control the camera, looking around and changing tilt however you want, while Solo flies. When you’ve got your cable set, press “play” and Solo begins its flight along the cable from frame to frame. To take control of the camera, just use the left stick: press it to the right and Solo rotates right; press it to the left and Solo rotates left. Look up and down using the tilt paddle on the controller’s left shoulder. You can even adjust the tilt while you rotate Solo for some simply crazy camera action. If you want to change the direction Solo flies, just press the directional arrow on your app screen at any time.

3) All Solo. Solo controls the flight and the camerawork at the same time. To do this, just tap “play,” sit back and watch Solo go. Solo flies itself along the cable and interpolates the camera smoothly from one frame to the next. Change direction at any time by pressing the directional arrow on your app screen.

4) All you. More personal control, but a higher level of difficulty: you control both the copter and the camera along the cable. This is a combination of options one and two above. Just as in option one, use the right stick to control Solo’s direction and speed along the cable; at the same time, use the left stick and the tilt paddle to control the camera, looking anywhere you want. No matter what you do, though, Solo keeps itself locked onto that cable, so you’ll always know where you’re headed.

 

Tips and tricks

Clear line of flight. Always double-check that the straight line between each frame doesn’t pass through any objects. It’s easy to overlook this because you can take any flight path to set your keyframes; while setting your next frame you might fly around or over an object and not be aware that the straight line between the frames actually passes through the object. Make sure the line of flight between frames is well clear of any objects.

Fly the cable manually. Before you fly your cable for your actual take, you might want to fly it manually. Not only will this give you a good idea of how your video will look, it will also give you a good feel for where those automatically added curves are, and for the course that Solo and the camera will take around them. Lastly, it also helps you get a feel for the GPS lock at each frame. Because Solo uses GPS to set frames and cables, they won’t be exact every time; GPS approximates position well, but it’s not a precision navigation system. You’ll want to take this into account when you fly.

Cable cam options. The Cable cam options menu lets you further adjust and customize your shots. (To access the options menu, tap the icon that looks like “…” on your Cable cam screen.) As with the original Cable cam, the options menu allows you to adjust the speed at which Solo flies the Cable, with one difference: Instead of a slider between “fast” and “slow,” the new slider measures your speed in the time it takes to complete your Cable. This not only gives you more precise control when coordinating a scene, but it allows you to set Solo to fly at incredibly slow speeds — down to about 10 cm/second. Flying manually at precise and invariable slow speeds — with cinematic grace — is nearly impossible, but with Solo it’s as easy as a tap.
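The time-based slider comes down to simple arithmetic: cable length divided by speed. A sketch of the conversion, assuming straight segments between keyframes (the automatic curve smoothing changes the true length slightly):

```python
import math

def cable_time_s(keyframes, speed_cm_s):
    """Seconds to traverse a cable at a constant speed, treating
    the cable as straight segments between keyframes.
    keyframes: list of (x, y, z) positions in meters."""
    length_m = sum(math.dist(a, b)
                   for a, b in zip(keyframes, keyframes[1:]))
    return length_m / (speed_cm_s / 100.0)
```

At the slowest setting of roughly 10 cm/second, a 10-meter cable takes about 100 seconds to traverse.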

The options menu includes a “reverse yaw” icon. By default, Solo takes the shortest arc to turn the camera between frames; select “reverse yaw” to make Solo swing the camera around the other way. Reverse yaw opens up the chance for some amazing corkscrew shots that seem to spin forever — up to 358 degrees.
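The shortest-arc versus reverse-yaw choice is easy to state in code. A sketch (not 3DR’s implementation) that returns the signed rotation in degrees:

```python
def yaw_arc_deg(start_deg, end_deg, reverse=False):
    """Signed yaw rotation from one heading to another. Default is
    the shortest arc; reverse=True swings the long way around, as
    the 'reverse yaw' option does."""
    delta = (end_deg - start_deg) % 360.0
    if delta > 180.0:
        delta -= 360.0                             # shortest arc, (-180, 180]
    if reverse and delta != 0.0:
        delta += -360.0 if delta > 0 else 360.0    # long way around
    return delta
```

Turning from a heading of 10 degrees to 350 degrees is a 20-degree swing to the left by default; with reverse yaw it becomes a 340-degree swing to the right.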

Tilt presets. Don’t sleep on the right paddle! You can use it to save two gimbal angle presets, then toggle between them as you fly. You can adjust the speed of the tilt movement using the wheel between the two preset buttons. This allows you to use automated camera tilt while you fly your Cable, a combination that not only lets you create or improvise some really interesting and surprising shot dynamics, but it allows you to do it without the pressure of controlling the tilt manually. When you’re comfortable flying multipoint, definitely experiment with this extra dimension!

Shooting a scene. Because you can repeat Cables as many times as you want (within the allotted battery life), you can shoot multiple takes, and you’ll always know exactly where the camera will be. Once you’ve got your cable set how you want it, you can turn your attention to directing your scene or making sure that all elements hit their marks right on time. Need another take? Fly back to your start point. Need the camera to move faster or slower? Adjust the speed in the Cable cam options menu. Need to change speed mid-cable? Just use the right stick. Want to switch between different Cable shots to compare, or to have multiple angles on the same scene? Just set your Cables and then switch between them in “saved shots.”

Prep. It’s good practice to prep for every shot you take, but with Cable cam prep has even more value. Before you get up in the air, pre-visualize exactly the shot you want to set up and capture. This saves battery life when it’s time to shoot, and it allows you to plan and coordinate a scene before you shoot it.

Because Cable cam saves every shot, it offers another specific and valuable way to prep your shot. If you want to shoot a scene at a particular time of day (for instance, in the “golden hour” before twilight), you can take your time getting your cable set and perfectly tuned ahead of time. Then when it’s go time you can maximize your shooting time with minimal setup: just load that saved Cable and press play. In fact, you can perfect any shot ahead of time and open it up when you’re on set and ready to roll. You’ll spend that much less time setting up the camera, and you’ll know exactly when and where it will be so you can coordinate the scene.

The post Multipoint Cable cam: How-to + Tips and Tricks appeared first on 3DR | Drone & UAV Technology.


via Multipoint Cable cam: How-to + Tips and Tricks

Digital Baby Project's Aim: Computers That See Like Humans



A huge study on how humans and computers see objects could inspire better artificial intelligence
via Digital Baby Project's Aim: Computers That See Like Humans

Checking in with Andrew Ng at Baidu’s Blooming Silicon Valley Research Lab



The Coursera founder discusses his new gig as head of Baidu Research, Baidu’s autonomous vehicle project, and hiring plans
via Checking in with Andrew Ng at Baidu’s Blooming Silicon Valley Research Lab

AAAI Video Highlights: Drones Navigating Forests and Robot Boat Swarms



We take you through two of the most impressive robot videos submitted to the AAAI Video Competition
via AAAI Video Highlights: Drones Navigating Forests and Robot Boat Swarms

A Google Car Can Qualify As A Legal Driver



A robotic car still can't vote or order a drink, though. Not yet
via A Google Car Can Qualify As A Legal Driver

Exoskeleton Makes Robotic Roach Flexibly Squishy



A cockroach-inspired shell and a flexible spine helps this legged robot squeeze through gaps
via Exoskeleton Makes Robotic Roach Flexibly Squishy

Earthbound Robots Today Need to Take Flight



Drones with manipulators will be able to tackle many real-world applications that current robots can't
via Earthbound Robots Today Need to Take Flight

The superhero of artificial intelligence: can this genius keep it in check?

With his company DeepMind, Londoner Demis Hassabis is leading Google’s project to build software more powerful than the human brain. But what will this mean for the future of humankind?

Demis Hassabis has a modest demeanour and an unassuming countenance, but he is deadly serious when he tells me he is on a mission to “solve intelligence, and then use that to solve everything else”. Coming from almost anyone else, the statement would be laughable; from him, not so much. Hassabis is the 39-year-old former chess master and video-games designer whose artificial intelligence research start-up, DeepMind, was bought by Google in 2014 for a reported $625 million. He is the son of immigrants, attended a state comprehensive in Finchley and holds degrees from Cambridge and UCL in computer science and cognitive neuroscience. A “visionary” manager, according to those who work with him, Hassabis also reckons he has found a way to “make science research efficient” and says he is leading an “Apollo programme for the 21st century”. He’s the sort of normal-looking bloke you wouldn’t look twice at on the street, but Tim Berners-Lee once described him to me as one of the smartest human beings on the planet.

Artificial intelligence is already all around us, of course, every time we interrogate Siri or get a recommendation on Android. And in the short term, Google products will surely benefit from Hassabis’s research, even if improvements in personalisation, search, YouTube, and speech and facial recognition are not presented as “AI” as such. (“Then it’s just software, right?” he grins. “It’s just stuff that works.”) In the longer term, though, the technology he is developing is about more than emotional robots and smarter phones. It’s about more than Google. More than Facebook, Microsoft, Apple, and the other giant corporations currently hoovering up AI PhDs and sinking billions into this latest technological arms race. It’s about everything we could possibly imagine; and much that we can’t.

Related: Google buys UK artificial intelligence startup Deepmind for £400m

Climate modelling, complex disease analysis – it's very exciting to start imagining what it might tackle next

Related: Elon Musk says he invested in DeepMind over 'Terminator' fears

If there is a digital intelligence that exceeds human intelligence, ‘assistance’ is not the correct description

Related: Demis Hassabis: 15 facts about the DeepMind Technologies founder

Related: Elon Musk: artificial intelligence is our biggest existential threat

Related: Musk, Wozniak and Hawking urge ban on warfare AI and autonomous weapons

Continue reading...
via The superhero of artificial intelligence: can this genius keep it in check?

Releasing Solo 2.0, Our Biggest Update Yet: Multipoint cable cam; Free Look; airspace information

At CES we said we’d be dropping the release of Solo 2.0 this spring. We’re excited to say we’ve beaten that deadline by a month and nine days. Right? It’s a leap year, so, I don’t know. How does math work again?

Solo 2.0 offers a suite of new software that makes Solo more useful, safer, more powerful, and overall more advanced. The update means Solo has grown beyond the drone category and established itself as the first consumer drone platform.

This release introduces powerful new Smart Shot functionality that not only makes getting cinematic video easier than ever but expands the storytelling possibilities for a single user with a creative mind. Check out the details below, and happy flying!

 

Multipoint Cable cam

Multipoint gives you even more confidence and creative control over your shots — control you simply can’t get with any other drone.

MPCC revamps Cable cam’s original design. Instead of a two-frame, beginning-to-end narrative, now set up cables with an unlimited number of keyframes. Just like the original release, take your time to set each frame perfectly, at any point in space, looking any direction you want. Press A to set the keyframe, then fly on to where you want to create the next one and repeat as many times as you need. The onboard computer is like a photographic memory that takes a snapshot of each keyframe, connecting the dots with a virtual cable. Press play and Solo guides the copter and the camera from frame to frame, automatically adding curves to each point so your video is professional and polished. Or fly Solo along the cable yourself, controlling direction and speed — or control the camera as Solo flies the cable.

Importantly, Solo also saves every cable you create, so you can return to fly your favorites or quickly and easily switch between shots at the same location. Just open your “saved shots,” select the one you want, and you’ll see a blue circle. This is your target for the first frame. Fly into the circle and the app guides you to the right altitude; Solo will lock into the first frame from there, and you’re good to go.

Lastly, you can also toggle a new time lapse function, which lets you tell Solo to fly cables at incredibly slow speeds. Flying manually at precise and invariable slow speeds — with cinematic grace — is nearly impossible, but with Solo it’s as easy as a tap.

 

Follow with Free Look

3DR developed the first-ever “follow-me” drone, and our Follow technology remains the most advanced of any drone on the market. For Solo 2.0 we took the Free Look feature — already available in Cable cam and Orbit — and applied it to Follow.

Free Look turns Follow into an interactive filming experience: When you enter Free Look, Solo still follows your subject, but you now have full control of the camera. Pan and tilt freely to look anywhere you want while remaining completely confident in Solo’s position and directional heading. It’s similar to the Hollywood motion control of a camera boom on the back of a truck: Virtually leash Solo to one vehicle, and swivel and tilt the camera manually to track the movements of even the most spontaneous subject.

Plus, if Solo isn’t exactly where you need it to be, Free Look also allows you to adjust the copter’s position in space with a nudge of the controls. When you know exactly where the camera will be, you can plan shots with confidence and also react in the moment.

For a great example of Free Look in action, check out the video above at about 3:26.

 

Flight Zone Safety Information

We collaborated with airspace safety information provider Airmap.io to incorporate real-time flight zone information into the Solo app. The app shows basic airspace information all around the world, alerting you if you’re about to fly in or near restricted airspace. Tap the alert to bring up a map of the area with all restricted airspace clearly marked. Head to a clear area or, if you’re near an airport, the app will show you the phone number for the control tower so you can call them up and request clearance.
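Airmap’s actual data and API aren’t described here, but the core check (is this position inside or near a restricted zone?) can be sketched with a haversine distance against circular zones. This is a toy model; real airspace geometry is polygonal and altitude-dependent.

```python
import math

def near_restricted(lat, lon, zones, margin_km=0.0):
    """Return True if (lat, lon) falls inside, or within margin_km
    of, any circular restricted zone. zones: (lat, lon, radius_km).
    A toy geofence model, not Airmap's data or API."""
    R = 6371.0  # mean Earth radius, km
    for zlat, zlon, radius_km in zones:
        dphi = math.radians(zlat - lat)
        dlmb = math.radians(zlon - lon)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(math.radians(lat)) * math.cos(math.radians(zlat))
             * math.sin(dlmb / 2) ** 2)
        dist_km = 2 * R * math.asin(math.sqrt(a))  # haversine distance
        if dist_km <= radius_km + margin_km:
            return True
    return False
```

An app would run a check like this on every position update, raising the alert as soon as the copter crosses into the margin around a zone.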

Welcome to Solo, season two.

 

The post Releasing Solo 2.0, Our Biggest Update Yet: Multipoint cable cam; Free Look; airspace information appeared first on 3DR | Drone & UAV Technology.


via Releasing Solo 2.0, Our Biggest Update Yet: Multipoint cable cam; Free Look; airspace information

Science and superheroes: how close are we to creating real superpowers?

As Marvel’s Deadpool hits screens we ask: with three out of five fictional superheroes owing their powers to science, will we ever have real superpowers?

There are, according to the Marvel Super Heroes role-playing game (a source I am choosing to accept as 100% canonical), five general origins for all superheroic powers: Altered Humans (Spider-Man, Fantastic Four), High-Tech Wonders (Iron Man, Batman), Mutants (X-Men), Robots (The Vision) and Aliens (Superman and gods like Thor).

Until quite recently all five of the general origins of super powers seemed entirely beyond reach. But is the high speed advance of science in the 21st century bringing those superpowers based upon it - Altered Humans, High Tech Wonders and Robots - any closer?

Related: Andy Miah: The pleasures and pitfalls of body enhancement

Related: Humanoid robots – in pictures

Continue reading...
via Science and superheroes: how close are we to creating real superpowers?

Solo Crushing in The Media; The Media Crushing on Solo

How would one describe the Solo media coverage this past month? One might say,

“🔥🔥🔥 💪💪🚀🚀🚀, 💪💰💰💰🚀🚀💪💪🔥🔥🔥 💰💰🚀🚀🚀 💪🔥🔥💪💰🔥🔥💰 — 🚀💰💰💪🚀💰🔥 🔥🔥!”

This year we’ve announced new Smart Shots — including the shot to rule all shots, multipoint cable cam — along with in-app flight zone information and a suite of Made for Solo accessories. Check out these fresh first-rate reviews, interviews and articles about the world’s smartest drone and the world’s smartest drone company.

🔑

TIME – What it’s Like to Go To a ‘Drone Rodeo’

“Packing a pair of Linux computers powering a high-flying navigation system the company calls Smart Shots, this [Solo] rig held steady in the wind and rain, flying routes so easy to program that an eleven-year-old could become a master videographer.”

 

The Verge – Solo drone extends its capabilities with a parachute and 360 degree camera

“On the software side, Solo already has the best autonomous flight capabilities among the consumer drones we’ve reviewed. At CES it announced the addition of a powerful new feature, multi-point cable cam. This allows you to sketch out complex shots with multiple camera moves and points of focus. 3D Robotics also added the ability to save these shots so you don’t have to recreate them from scratch after you close the app.”

 

CNET – 3DR Solo drone flies and films for you: First Look

 

TechCrunch [Chris Anderson interview; video] – Drones, Drones Everywhere with 3D Robotics

Chris Anderson, CEO of 3D Robotics, sits down with Matt Burns at CES 2016 to discuss the future of the drone landscape, the next technical challenges to overcome, and complying with the regulatory environment.

 

Engadget [Chris Anderson interview; video] – 3D Robotics: The future of drones needs to be smart yet simple

Quadcopters, drones, UAVs. Whatever you want to call them, they are an unavoidable part of our future, according to 3DR CEO Chris Anderson. He should know: his company is the largest drone maker in America.

 

Claudia Cruz, CNET en Espanol – Now you can choreograph your flights, only with Solo from 3DR

“3D Robotics, one of the largest drone manufacturers in the world and the best known for its open developer platform, announced [new software] this week at the International Consumer Electronics Show, CES 2016, which adds three new functions to Solo’s software. This is its largest update to date.”

 

IGN – This Drone Flies Itself So You Don’t Have To

 

NBC News – 3DR’s Super Bowl Jumbotron PSA

 

CNET en Espanol – Five drones that left us speechless

 

Videomaker – CES 2016: 3DR ADDS MORE WAYS TO FLY SOLO

“The creative minds at 3DR released new software for their Solo drone. The Solo can now Follow with Freelook and pass through multiple points in the Cable Cam setting. Each flight path created will be saved in the app and will be categorized by date and location or a name you specify. The Follow with Freelook feature is next to hiring a motion control setup. You’ll depend on the drone to position the camera, and you can pan and tilt to control the framing of the image.”

 

TechCrunch – “3DR Makes It Easier To Take Exactly The Drone Video You Want”

 

Engadget – “3DR’s Made for Solo program does 360 degree video on a budget”

 

Tech Times – “3DR’s eyes in the sky just improved.”

 

Drone Blog – 3DR Announces New Solo Features at CES 2016

“This latest software release is a big one, and it does what we want all of our releases to do: It makes every Solo out there a better Solo.”

The post Solo Crushing in The Media; The Media Crushing on Solo appeared first on 3DR | Drone & UAV Technology.


via Solo Crushing in The Media; The Media Crushing on Solo