High Tech Comes to the High School Basketball Court

Sensors that track the movement of athletes and the basketball were recently used at a high school championship tournament for the first time.

#basketball, #ddsports-inc-shottracker, #draft-and-recruitment-sports, #interscholastic-athletics, #national-federation-of-state-high-school-assns, #new-mexico, #playfly-sports-llc, #sensors


These Materials Could Make Science Fiction a Reality

Metamaterials, which could improve smartphones and change how we use other technology, allow scientists to control light waves in new ways.

#computer-chips, #driverless-and-semiautonomous-vehicles, #robots-and-robotics, #sensors, #smartphones


The googly eyes of the mantis shrimp inspire new optical sensors

Scientists have developed a new type of light sensor inspired by the eyes of the mantis shrimp. (credit: Brent Durand/Getty Images)

Smartphone cameras have improved dramatically since the first camera-equipped cell phone was introduced in 1999, but they are still subject to tiny errors in the alignment of different wavelengths of light in the final image. That’s not a concern for your average Instagram selfie, but it’s far from ideal when it comes to scientific image analysis, for example.

Nature seems to have provided a solution in the eyes of the mantis shrimp, which inspired researchers at North Carolina State University (NCSU) to develop a new type of optical sensor. The sensor is small enough for smartphone applications, but it’s also capable of breaking down visible light wavelengths into narrower bands than current smartphone cameras can manage, as well as capturing polarized light, according to a recent paper published in the journal Science Advances.

Human eyes have three photoreceptors for detecting red, green, and blue light. Dogs have just two photoreceptors (green and blue), while birds have four, including one for detecting ultraviolet (UV) light. Octopuses, meanwhile, can detect polarized light. But mantis shrimp (aka stomatopods) have the most complex eyes of all: they can have between 12 and 16 individual photoreceptors and can thus detect visible, UV, and polarized light.


#animals, #biology, #biomimicry, #hyperspectral-imaging, #mantis-shrimp, #optical-sensors, #science, #sensors


The New Alliance Shaping the Middle East Is Against a Tiny Bug

Israeli sensors on the Persian Gulf, across the water from Iran, are part of an epic battle in the Middle East. Just not the one we’re used to.

#abu-dhabi-united-arab-emirates, #agriculture-and-farming, #beetles, #dates-fruit, #defense-and-military-forces, #dubai-united-arab-emirates, #fez-morocco, #international-relations, #israel, #jews-and-judaism, #peace-process, #pesticides, #politics-and-government, #sensors, #trees-and-shrubs, #united-states-international-relations, #west-bank


Metalenz reimagines the camera in 2D and raises $10M to ship it

As impressive as the cameras in our smartphones are, they’re fundamentally limited by the physical necessities of lenses and sensors. Metalenz skips over that part with a camera made of a single “metasurface” that could save precious space and battery life in phones and other devices… and they’re about to ship it.

The concept is similar to, but not descended from, the “metamaterials” that gave rise to flat beam-forming radar and lidar of Lumotive and Echodyne. The idea is to take a complex 3D structure and accomplish what it does using a precisely engineered “2D” surface — not actually two-dimensional, of course, but usually a plane with features measured in microns.

In the case of a camera, the main components are of course a lens (these days it’s usually several stacked), which corrals the light, and an image sensor, which senses and measures that light. The problem faced by cameras now, particularly in smartphones, is that the lenses can’t be made much smaller without seriously affecting the clarity of the image. Likewise sensors are nearly at the limit of how much light they can work with. Consequently most of the photography advancements of the last few years have been done on the computational side.

Using an engineered surface that does away with the need for complex optics and other camera systems has been a goal for years. Back in 2016 I wrote about a NASA project that took inspiration from moth eyes to create a 2D camera of sorts. It’s harder than it sounds, though — usable imagery has been generated in labs, but it’s not the kind of thing that you take to Apple or Samsung.

Metalenz aims to change that. The company’s tech is built on the work of Harvard’s Federico Capasso, who has been publishing on the science behind metasurfaces for years. He and Rob Devlin, who did his doctorate work in Capasso’s lab, co-founded the company to commercialize their efforts.

“Early demos were extremely inefficient,” said Devlin of the field’s first entrants. “You had light scattering all over the place, the materials and processes were non-standard, the designs weren’t able to handle the demands that a real world throws at you. Making one that works and publishing a paper on it is one thing, making 10 million and making sure they all do the same thing is another.”

Their breakthrough — if years of hard work and research can be called that — is the ability not just to make a metasurface camera that produces decent images, but to do it without exotic components or manufacturing processes.

“We’re really using all standard semiconductor processes and materials here, the exact same equipment — but with lenses instead of electronics,” said Devlin. “We can already make a million lenses a day with our foundry partners.”

Diagram comparing the multi-lens barrel of a conventional phone camera with Metalenz’s simpler “meta-optic.”

The component at the bottom is the chip where the image processor and logic would sit, though the meta-optic could also integrate with it; the element at the top is a pinhole.

The first challenge is more or less contained in the fact that incoming light, without lenses to bend and direct it, hits the metasurface in a much more chaotic way. Devlin’s own PhD work was concerned with taming this chaos.

“Light on a macro [i.e. conventional scale, not close-focusing] lens is controlled on the macro scale, you’re relying on the curvature to bend the light. There’s only so much you can do with it,” he explained. “But here you have features a thousand times smaller than a human hair, which gives us very fine control over the light that hits the lens.”

Those features, as you can see in this extreme close-up of the metasurface, are precisely tuned cylinders, “almost like little nano-scale Coke cans,” Devlin suggested. Like other metamaterials, these structures, far smaller than a visible or near-infrared light ray’s wavelength, manipulate the radiation by means that take a few years of study to understand.
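
For a sense of what that fine control buys you: a flat lens works by imposing, at each radius, the phase delay a curved lens would otherwise produce, so that light from across the aperture arrives at the focus in step. The sketch below plots the textbook target phase profile for such a lens; it is a generic illustration, not Metalenz’s proprietary design, and the 940 nm wavelength and 2 mm focal length are assumed example values typical of near-infrared 3D sensing.

```python
import numpy as np

def metalens_phase(r, wavelength, focal_length):
    """Ideal phase shift (radians) a flat lens must impart at radius r
    so light from across the aperture arrives at the focal point in phase."""
    return (2 * np.pi / wavelength) * (
        focal_length - np.sqrt(r**2 + focal_length**2)
    )

# Assumed example values: a 1 mm aperture focusing 940 nm light at 2 mm.
radii = np.linspace(0, 0.5e-3, 5)            # radial positions across the aperture (m)
phase = metalens_phase(radii, 940e-9, 2e-3)
print(np.mod(phase, 2 * np.pi))              # each nanopillar encodes its value modulo 2*pi
```

In practice each “Coke can” pillar is sized so that its local phase delay approximates this profile, which is how a planar surface stands in for a curved piece of glass.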

Diagram showing chips being manufactured, then an extreme close-up of their nano-scale features.

The result is a camera with extremely small proportions and vastly less complexity than the compact camera stacks found in consumer and industrial devices. To be clear, Metalenz isn’t looking to replace the main camera on your iPhone — for conventional photography purposes the conventional lens and sensor are still the way to go. But there are other applications that play to the chip-style lens’s strengths.

Something like the Face ID assembly, for instance, presents an opportunity. “That module is a very complex one for the cell phone world — it’s almost like a Rube Goldberg machine,” said Devlin. The same goes for the miniature lidar sensor.

At this scale, the priorities are different, and by subtracting the lens from the equation the amount of light that reaches the sensor is significantly increased. That means it can potentially be smaller in every dimension while performing better and drawing less power.

Image (of a very small test board) from a traditional camera, left, and metasurface camera, right. Beyond the vignetting it’s not really easy to tell what’s different, which is kind of the point.

Lest you think this is still a lab-bound “wouldn’t it be nice if” type device, Metalenz is well on its way to commercial availability. The $10M Series A round it just raised was led by 3M Ventures, Applied Ventures LLC, Intel Capital, M Ventures and TDK Ventures, along with Tsingyuan Ventures and Braemar Energy Ventures — a lot of suppliers in there.

Unlike many other hardware startups, Metalenz isn’t starting with a short run of boutique demo devices but going big out of the gate.

“Because we’re using traditional fabrication techniques, it allows us to scale really quickly. We’re not building factories or foundries, we don’t have to raise hundreds of millions; we can use what’s already there,” said Devlin. “But it means we have to look at applications that are high volume. We need the units to be in that tens of millions range for our foundry partners to see it making sense.”

Although Devlin declined to get specific, he did say that their first partner is “active in 3D sensing” and that a consumer device, though not a phone, would be shipping with Metalenz cameras in early 2022 — and later in 2022 will see a phone-based solution shipping as well.

In other words, while Metalenz is indeed a startup just coming out of stealth and raising its A round… it already has shipments planned on the order of tens of millions. The $10M isn’t a bridge to commercial viability but short-term cash to hire and cover the up-front costs associated with such a serious endeavor. It’s unlikely anyone on that list of investors harbors serious doubts about ROI.

The 3D sensing thing is Metalenz’s first major application, but the company is already working on others. The potential to reduce complex lab equipment to handheld electronics that can be fielded easily is one, and improving the benchtop versions of tools with more light-gathering ability or quicker operation is another.

Though a device you use may in a few years have a Metalenz component in it, it’s likely you won’t know — the phone manufacturer will probably take all the credit for the improved performance or slimmer form factor. Nevertheless, it may show up in teardowns and bills of material, at which point you’ll know this particular university spin-out has made it to the big leagues.

#3m, #braemar-energy-ventures, #funding, #fundings-exits, #gadgets, #hardware, #harvard, #harvard-university, #intel, #intel-capital, #metamaterials, #photography, #recent-funding, #sensors, #startups, #tc, #tdk


On Factory Floors, a Chime and Flashing Light to Maintain Distance

Businesses like Henkel, a big German chemical company, are trying wearable sensors to prevent virus outbreaks among workers.

#coronavirus-2019-ncov, #factories-and-manufacturing, #germany, #henkel-kgaa, #kinexon, #national-basketball-assn, #poland, #sensors, #wearable-computing


Amazon announces a bunch of products aimed at the industrial sector

One of the areas often left behind when it comes to cloud computing is the industrial sector. That’s because these facilities often have older equipment or proprietary systems that aren’t well suited to the cloud. Amazon wants to change that, and today the company announced a slew of new services at AWS re:Invent aimed at helping industrial companies understand their equipment and environments better.

For starters, the company announced Amazon Monitron, which is designed to monitor equipment and send signals to the engineering team when that equipment could be breaking down. If industrial companies can know when their equipment is about to fail, they can repair it on their own terms, rather than waiting for it to go down at what could be an inopportune time.

As AWS CEO Andy Jassy says, an experienced engineer will know when equipment is breaking down by a certain change in sound or a vibration, but if the machine could tell you even before it got that far, it would be a huge boost to these teams.

“…a lot of companies either don’t have sensors, they’re not modern powerful sensors, or they are not consistent and they don’t know how to take that data from the sensors and send it to the cloud, and they don’t know how to build machine learning models, and our manufacturing companies we work with are asking [us] just solve this [and] build an end-to-end solution. So I’m excited to announce today the launch of Amazon Monitron, which is an end-to-end solution for equipment monitoring,” Jassy said.

The company builds a machine learning model that understands what a normal state looks like, then uses that model to find anomalies and alert the team, via a mobile app, to equipment that needs maintenance based on the data it is seeing.
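
Amazon hasn’t published Monitron’s internals, but the general pattern it describes (learn a baseline from healthy-equipment readings, then flag departures from it) is standard anomaly detection. Below is a minimal sketch of that pattern using scikit-learn’s IsolationForest on made-up vibration and temperature data; it illustrates the approach, not Amazon’s implementation.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Made-up history of (vibration RMS, bearing temperature) readings recorded
# while the motor was known to be healthy.
rng = np.random.default_rng(0)
healthy = np.column_stack([
    rng.normal(0.30, 0.02, 1000),   # vibration RMS (mm/s)
    rng.normal(55.0, 1.5, 1000),    # bearing temperature (deg C)
])

# Learn what "normal" looks like for this machine.
model = IsolationForest(contamination=0.01, random_state=0).fit(healthy)

# New readings: the last one drifts the way a failing bearing might.
latest = np.array([[0.31, 54.8], [0.29, 56.1], [0.55, 71.3]])
for reading, flag in zip(latest, model.predict(latest)):  # +1 = normal, -1 = anomaly
    if flag == -1:
        print(f"maintenance alert: {reading} deviates from the learned baseline")
```

The value of an end-to-end product is everything around this core: consistent sensors, the pipeline to the cloud, and the mobile alerts, which is exactly the gap Jassy describes customers asking Amazon to fill.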

For companies that have a more modern setup and don’t need the complete package that Monitron offers, Amazon has an option as well. If you have modern sensors but no sophisticated machine learning model, Amazon can ingest that sensor data and apply its machine learning algorithms to find anomalies, just as it can with Monitron.

“So we have something for this group of customers as well to announce today, which is the launch of Amazon Lookout for Equipment, which does anomaly detection for industrial machinery,” he said.

In addition, the company announced the Panorama Appliance for companies that are using cameras at the edge and want more sophisticated computer vision but might not have the most modern equipment for it. “I’m excited to announce today the launch of the AWS Panorama Appliance which is a new hardware appliance [that allows] organizations to add computer vision to existing on-premises smart cameras,” Jassy told the AWS re:Invent audience today.

The company also announced a Panorama SDK to help hardware vendors build smarter cameras based on Panorama.

All of these services are designed to give industrial companies access to sophisticated cloud and machine learning technology at whatever level they may require depending on where they are on the technology journey.

#amazon, #aws-reinvent-2020, #cloud, #edge-computing, #enterprise, #hardware, #industrial-iot, #industrial-sector, #sensors, #tc


Loose Screws Help North Korean Defector Give Border Guards the Slip

South Korea said its sensors had malfunctioned, allowing a North Korean to defect undetected, in the military’s most embarrassing border security breach in years.

#defectors-political, #defense-and-military-forces, #korean-demilitarized-zone, #north-korea, #sensors, #south-korea


The Hot New Covid Tech Is Wearable and Constantly Tracks You

Sports leagues, large employers and colleges are turning to devices that could usher in more invasive forms of surveillance.

#athletics-and-sports, #bluetooth-wireless-technology, #colleges-and-universities, #computers-and-the-internet, #contact-tracing-public-health, #coronavirus-2019-ncov, #coronavirus-reopenings, #data-mining-and-database-marketing, #mobile-applications, #national-football-league, #privacy, #sensors, #wearable-computing, #workplace-environment


Verkada adds environmental sensors to cloud-based building operations toolkit

As we go deeper into the pandemic, many buildings sit empty or run at limited capacity. During times like these, having visibility into the state of a building can give operations teams peace of mind. Today, Verkada, a startup that helps operations teams manage buildings via the cloud, announced a new set of environmental sensors to give customers even greater insight into building conditions.

The company had previously developed cloud-based video cameras and access control systems. Verkada CEO and co-founder Filip Kaliszan says today’s announcement is about building on those two earlier products.

“What we do today is cameras and access control — cameras, of course, provide the eyes and the view into buildings and spaces, while access control controls how you get in and out of these spaces,” Kaliszan told TechCrunch. Operations teams can manage these devices from the cloud on any device.

The sensor pack the company is announcing today layers on a multi-function view into the state of the environment inside a building. “The first product that we’re launching along this environmental sensor line is the SV11, which is a very powerful unit with multiple sensors on board, all of which can be managed in the cloud through our Verkada Command platform. The sensors will give customers insight into things like air quality, temperature, humidity, motion and occupancy of the space, as well as the noise level,” he said.

There is a clear strategy behind the company’s product road map. The idea is to give building operations staff a growing picture of what’s going on inside the space. “You can think of all the data being combined with the other aspects of our platform, and then begin delivering a truly integrated building and setting the standard for enterprise building security,” Kaliszan said.

These tools, and the ability to access all the data about a building remotely in the cloud, obviously have even more utility during the pandemic. “I think we’re fortunate that our products can help customers mitigate some of the effects of the pandemic. So we’ve seen a lot of customers use our tools to help them manage through the pandemic, which is great. But when we were originally designing this environmental sensor, the rationale behind it were these core use cases like monitoring server rooms for environmental changes.”
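
Verkada’s Command platform handles that monitoring for customers, but the server-room use case Kaliszan mentions is easy to picture: compare each reading against configured bounds and raise an alert when something drifts out of range. The sketch below is a generic, hypothetical illustration with assumed thresholds and field names; it is not Verkada’s API.

```python
# Illustrative acceptable ranges for an IT space (assumed values, not Verkada defaults).
THRESHOLDS = {
    "temperature_c": (18.0, 27.0),
    "humidity_pct": (20.0, 80.0),
}

def check_reading(sensor_id: str, reading: dict) -> list[str]:
    """Return alert messages for any metric outside its allowed range."""
    alerts = []
    for metric, (low, high) in THRESHOLDS.items():
        value = reading.get(metric)
        if value is not None and not (low <= value <= high):
            alerts.append(f"{sensor_id}: {metric}={value} outside [{low}, {high}]")
    return alerts

# Example: an overheating server room trips the temperature bound.
print(check_reading("server-room-sv11", {"temperature_c": 31.5, "humidity_pct": 45.0}))
```

The appeal of a cloud-managed sensor line is that these checks, and the resulting alerts, live alongside the camera and access-control data rather than in a separate system.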

The company, which was founded in 2016, has been doing well. It has 4,200 customers and roughly 400 employees, is still growing and actively hiring, and expects to reach 500 employees by the end of the year. It has raised $138.9 million, the most recent round coming in January of this year, when it raised an $80 million Series C led by Felicis Ventures at a $1.6 billion valuation.

#cloud, #enterprise, #iot, #security, #sensors, #startups, #tc, #verkada


I Live in California. How Do I Know It’s Safe to Go Outside?

A high-tech sensor network brought me closer to the natural cycles of my environment.

#air-pollution, #california, #global-warming, #san-francisco-calif, #sensors, #wildfires


The Brain Implants That Could Change Humanity

Brains are talking to computers, and computers to brains. Are our daydreams safe?

#addiction-psychology, #artificial-intelligence, #brain, #chang-edward-f, #computers-and-the-internet, #data-mining-and-database-marketing, #gallant-jack-l, #halpern-casey-h, #implants, #kirsch-robert-f, #paralysis, #privacy, #sensors, #touch-sense, #voice-recognition-systems, #yuste-rafael


The Lesson We’re Learning From TikTok? It’s All About Our Data

We should minimize how much we share with all of our favorite and not-so-favorite apps. Here’s how.

#advertising-and-marketing, #android-operating-system, #apple-inc, #computers-and-the-internet, #content-type-service, #data-mining-and-database-marketing, #fyde-inc, #google-inc, #ios-operating-system, #mobile-applications, #politics-and-government, #privacy, #sensors, #social-media, #software, #tiktok-bytedance


Fighting the Coronavirus With Innovative Tech

Some of these devices have been around for years but are now being mustered to help keep us safe.

#coronavirus-2019-ncov, #coronavirus-aid-relief-and-economic-security-act-2020, #hospitals, #hygiene-and-cleanliness, #robots-and-robotics, #sensors, #small-business, #ultraviolet-light, #workplace-hazards-and-violations


Optimized sensors are key to future of automated vehicles

Sensors are critical components of the modern vehicle. They are the eyes of a car, enabling everything from existing ADAS (Advanced Driver-Assistance Systems) features such as automated braking and lane keeping to the potential removal of the driver altogether. The consequences of these “eyes” not pointing in the right direction or not seeing clearly could be catastrophic; your car could needlessly brake in the middle of the highway or suddenly swerve into another lane. Sufficient sensor accuracy is essential for safety, and calibration is critical to ensuring that a vehicle’s sensors are operating at the highest fidelity.

Sensors can be miscalibrated due to everything from daily normal use and changes in operating conditions (temperature or vibrations) to something more severe like accidents or part replacements. Unfortunately, very little emphasis has been placed on addressing the issue. This comes as no surprise; the automotive product cycle is incredibly long, and automated vehicles simply haven’t been tested long enough yet to thoroughly expose this issue.

Most standard perception sensors on the market today can perform intrinsic calibration (which concerns the internal parameters of a single sensor) autonomously. However, extrinsic calibration (which relates multiple sensors to one another) poses significant problems for fleets, given the ever-increasing reliance on multiple sensors to overcome the shortcomings of any individual one. Most calibration solutions today rely on picking functionally or economically inferior sensor configurations and/or simply hoping that the sensors never drift from their initial factory settings. While this is obviously unsafe, there are no common metrics for what it means for a sensor to be miscalibrated and no common standards that companies can hold their sensor calibrations up against. Every player in this space has its own unique sensor suite and an accompanying set of unique calibration practices, further complicating the matter.
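
To make “extrinsic” concrete: the extrinsic parameters are the rotation and translation that map one sensor’s coordinate frame into another’s, and a small error in them misplaces every downstream measurement. The sketch below uses assumed example values (a generic camera intrinsic matrix, a lidar mounted 10 cm above the camera) to show how even a one-degree rotation error visibly shifts where a lidar point lands in the image.

```python
import numpy as np

def project_lidar_to_image(points_lidar, R, t, K):
    """Map 3D lidar points into camera pixel coordinates.

    R (3x3) and t (3,) are the extrinsics relating the lidar frame to the
    camera frame; K (3x3) is the camera's intrinsic matrix.
    """
    points_cam = points_lidar @ R.T + t         # lidar frame -> camera frame
    pixels_h = points_cam @ K.T                 # camera frame -> homogeneous pixels
    return pixels_h[:, :2] / pixels_h[:, 2:3]   # perspective divide

# Illustrative (made-up) calibration values.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                                    # perfectly aligned sensors
t = np.array([0.0, -0.1, 0.0])                   # lidar mounted 10 cm above the camera

point = np.array([[2.0, 0.0, 20.0]])             # an object 20 m ahead, 2 m to the side
print(project_lidar_to_image(point, R, t, K))

# A one-degree rotation error about the vertical axis, plausible after a minor
# bump or part swap, shifts the projected pixel noticeably at this range,
# enough for the camera and lidar to disagree about where the object is.
angle = np.radians(1.0)
R_bad = np.array([[np.cos(angle), 0.0, np.sin(angle)],
                  [0.0, 1.0, 0.0],
                  [-np.sin(angle), 0.0, np.cos(angle)]])
print(project_lidar_to_image(point, R_bad, t, K))
```

This is why the lack of shared miscalibration metrics matters: without an agreed-upon error measure (reprojection error, for instance), there is no common bar for when a fleet vehicle’s sensors need recalibrating.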

Current aftermarket, maintenance, and return-to-service options are woefully underprepared to address the issue. Consider ADAS calibration at a typical maintenance shop. The procedure takes 15-120 minutes and requires expensive equipment (scanning tools, large and clear paved areas, alignment racks, etc.). The vehicle itself also needs to be prepared to meticulous standards; the fuel tank must be full, the tires must be properly inflated, the vehicle must be perfectly flat on a balanced floor, etc. Most garages and mechanics are underequipped and insufficiently trained to conduct what is an incredibly tedious and technically complex procedure. This ultimately causes improper calibration that endangers the vehicle’s passengers and those around them.

Innovations and opportunities in sensor calibration

#adas, #artificial-intelligence, #automotive, #bridgestone, #calibration, #column, #extra-crunch, #gps, #machine-learning, #market-analysis, #mems, #reilly-brennan, #robotics, #sensors, #standards, #startups


With an Internet of Animals, Scientists Aim to Track and Save Wildlife

Using tiny sensors and equipment aboard the space station, a project called ICARUS seeks to revolutionize animal tracking.

#animal-behavior, #animal-migration, #animals, #biodiversity, #computers-and-the-internet, #conservation-of-resources, #endangered-and-extinct-species, #international-space-station, #mobile-applications, #poaching-wildlife, #research, #sensors, #space-and-astronomy, #wikelski-martin, #your-feed-science


Did Mom Take Her Medicine? Keeping Eyes on Elders in Quarantine

Technology can help families monitor the health and safety of older people kept from their families by the coronavirus.

#bluetooth-wireless-technology, #coronavirus-2019-ncov, #elder-care, #elderly, #longevity, #mobile-applications, #quarantines, #retirement, #sensors, #wireless-communications


VergeSense grabs $9M for its people-counting sensor tech as offices eye COVID changes

Facilities management looks to be having a bit of a moment, amid the coronavirus pandemic.

VergeSense, a US startup which sells a ‘sensor as a system’ platform targeted at offices — supporting features such as real-time occupant counts and foot-traffic-triggered cleaning notifications — has closed a $9M strategic investment led by Allegion Ventures, a corporate VC fund of security giant Allegion.

JLL Spark, Metaprop, Y Combinator, Pathbreaker Ventures, and West Ventures also participated in the round, which brings the total funding raised by the 2017-founded startup to $10.6M including an earlier seed round.

VergeSense tells TechCrunch it’s seen accelerated demand in recent weeks as office owners and managers try to figure out how to make workspaces safe in the age of COVID-19 — claiming bookings are “on track” to be up 500% quarter over quarter. (Though it admits business did also take a hit earlier in the year, saying there was “aftershock” once the coronavirus hit.)

So while, prior to the pandemic, VergeSense customers likely wanted to encourage so-called ‘workplace collisions’ — i.e. close encounters between office staff in the hopes of encouraging idea sharing and collaboration — right now the opposite is the case, with social distancing and looming limits on room occupancy rates looking like a must-have for any reopening office.

Luckily for VergeSense, its machine learning platform and sensor-packed hardware can derive useful measurements just the same.

It’s worked with customers to come up with relevant features, such as a new Social Distancing Score and daily occupancy reports. It already had a Smart Cleaning Planner feature, which it reckons will now be in high demand. It also envisages customers being able to plug into its open API to power features in their own office apps that could help reassure staff it’s okay to come back in to work, such as indicating quiet zones or times when there are fewer occupants on site.
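
VergeSense hasn’t published how its Social Distancing Score is computed, so the following is purely a hypothetical illustration of the general idea: turn a people count and a floor area into a single number against an assumed one-person-per-113-square-feet guideline (the area of a six-foot-radius circle). All names and values are made up for the example.

```python
from dataclasses import dataclass

@dataclass
class RoomReading:
    """One hypothetical occupancy report for a space."""
    name: str
    occupants: int       # people counted by the sensor
    area_sqft: float     # usable floor area

# Assumed guideline: one person per 113 sq ft (a 6 ft-radius circle per person).
SQFT_PER_PERSON = 113.0

def distancing_score(reading: RoomReading) -> float:
    """Return 1.0 when occupancy is at or below the guideline,
    falling toward 0.0 as the space becomes overcrowded."""
    safe_capacity = max(1.0, reading.area_sqft / SQFT_PER_PERSON)
    return min(1.0, safe_capacity / max(1, reading.occupants))

rooms = [
    RoomReading("Conference A", occupants=3, area_sqft=400),
    RoomReading("Open floor 2", occupants=60, area_sqft=3000),
]
for room in rooms:
    print(f"{room.name}: distancing score {distancing_score(room):.2f}")
```

Whatever the real formula, the point of exposing it through an API is that an employer’s own office app can surface the number (or the quiet zones derived from it) directly to staff.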

Of course plenty of offices may remain closed for some considerable time or even for good — Twitter, for example, has told staff they can work remotely forever — with home working a viable option for much office work. But VergeSense and its investors believe the office will prevail in some form, with smart sensor tech that can (for example) detect the distance between people becoming a basic requirement.

“I think it’s going to [be] less overall office space,” says VergeSense co-founder Dan Ryan, discussing how he sees the office being changed by COVID-19. “A lot of customers are rethinking the need to have tonnes of smaller, regional offices. They’re thinking about still maintaining their big hubs but maybe what those hubs actually look like is different.

“Maybe post-COVID, instead of people coming into the office five days a week… for people that don’t necessarily need to be in the office to do their work everyday maybe three days a week or two days a week. And that probably means a different type of office, right. Different layout, different type of desks etc.”

“That trend was already in motion but a lot of companies were reluctant to experiment with remote work because they weren’t sure about the impact on productivity and that sort of thing, there was a lot of cultural friction associated with that. But now we all got thrust into that simultaneously and it’s happening all at once — and we think that’s going to stick,” he adds. “We’ve heard that feedback consistently from basically all of our customers.”

“A lot of our existing customers are pulling forward adoption of the product. Usually the way we roll out is customers will do a couple of buildings to get started and it’ll be [a] phased rollout plan from there. But now that the use-case for this data is more connected to safety and compliance, with COVID-19, around occupancy management — there’s CDC guidelines [related to building occupancy levels] — now to have a tool that can measure and report against that is viewed as more of a mission critical type thing.”

VergeSense is processing some 6 million sensor reports per day at this point for nearly 70 customers, including 40 FORTUNE 1000 companies. In total it says it provides its sensor hardware plus SaaS across 20 million sqft, 250 office buildings, and 15 countries.

“There’s an extreme bear case here — that the office is going to disappear,” Ryan adds. “That’s something that we don’t see happening because the office does have a purpose, rooted in — primarily — human social interaction and physical collaboration.

“As much as we love Zoom and the efficiency of that there is a lot that gets lost without that physical collaboration, connection, all the social elements that are built around work.”

VergeSense’s new funding will go toward scaling up to meet the increased demand it’s seeing due to COVID-19 and toward scaling its software analytics platform.

It’s also going to be spending on product development, per Ryan, with alternative sensor hardware form factors in the works — including “smaller, better, faster” sensor hardware and “some additional data feeds”.

“Right now it’s primarily people counting but there’s a lot of interest in other data about the built environment beyond that — more environmental types of stuff,” he says of the additional data feeds it’s looking to add. “We’re more interested in other types of ambient data about the environment. What’s the air quality on this floor? Temperature, humidity. General environmental data that’s getting even more interest frankly from customers now.

“There is a fair amount of interest in wellness of buildings. Historically that’s been more of a nice to have thing. But now there’s huge interest in what is the air quality of this space — are the environmental conditions appropriate? I think the expectations from employees are going to be much higher. When you walk into an office building you want the air to be good, you want it to look nicer — and that’s why I think the acceleration [of smart spaces]; that’s a trend that was already in motion but people are going to double down and want it to accelerate even faster.”

Commenting on the funding in a statement, Rob Martens, president of Allegion Ventures, added: “In the midst of a world crisis, [the VergeSense team] have quickly positioned themselves to help senior business leaders ensure safer workspaces through social distancing, while at the same time still driving productivity, engagement and cost efficiency. VergeSense is on the leading edge of creating data-driven workspaces when it matters most to the global business community and their employees.”

#artificial-intelligence, #coronavirus, #covid-19, #fundings-exits, #hardware, #machine-learning, #sensors, #tc, #united-states, #vergesense, #y-combinator


Helping the Environment, One Small Sensor at a Time

New York City nonprofits are using a cloud-based service from the start-up Temboo that helps monitor storm-water runoff and other environmental factors.

#ai-may-2020, #clean-water-act, #computers-and-the-internet, #infrastructure-public-works, #nonprofit-organizations, #sensors, #temboo-inc, #trees-and-shrubs, #water-pollution
