US Army-Funded Algorithm Decodes Brain Signals
https://www.defenseone.com/technology/2020/11/us-army-funded-algorithm-decodes-brain-signals/170102/

A new machine-learning algorithm can successfully determine which specific behaviors—like walking and breathing—belong to which specific brain signal, and it has the potential to help the military maintain a more ready force.
At any given time, people perform a myriad of tasks. All of the brain and behavioral signals associated with these tasks mix together to form a complicated web. Until now, this web has been difficult to untangle and translate.
But researchers funded by the U.S. Army developed a machine-learning algorithm that can model and decode these signals, according to a Nov. 12 press release. The research, which used standard brain datasets for analysis, was recently published in the journal Nature Neuroscience.
https://www.army.mil/article/240796

“Our algorithm can, for the first time, dissociate the dynamic patterns in brain signals that relate to specific behaviors and is much better at decoding these behaviors,” Dr. Maryam Shanechi, the engineering professor at the University of Southern California who led the research, said in a statement.
... “The algorithm has significant implications for basic science discoveries,” Krim said. “The algorithm can discover shared dynamic patterns between any signals beyond brain signals, which is widely applicable for the military and many other medical and commercial applications.”
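The preferential subspace identification (PSID) method behind these claims is far more sophisticated than a news summary can convey, but the underlying decoding problem can be sketched with a toy example: neural recordings mix behavior-relevant and behavior-irrelevant signals, and a decoder must recover only the behavior-relevant part. The sketch below uses a plain least-squares decoder on simulated signals; it is not PSID itself, and every signal in it is made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 5000  # number of time samples

# Two simulated latent signals: z1 drives behavior, z2 does not.
z1 = np.sin(0.05 * np.arange(T)) + 0.1 * rng.standard_normal(T)
z2 = rng.standard_normal(T)

# Simulated neural channels mix both latents; behavior depends only on z1.
neural = np.column_stack([
    z1 + 0.5 * z2,
    0.5 * z1 - z2,
    z2 + 0.3 * rng.standard_normal(T),
])
behavior = 2.0 * z1 + 0.1 * rng.standard_normal(T)

# Fit a linear decoder (ordinary least squares) from neural data to behavior.
X = neural - neural.mean(axis=0)
y = behavior - behavior.mean()
w, *_ = np.linalg.lstsq(X, y, rcond=None)
decoded = X @ w

# The decoder recovers the behavior-relevant dynamics (z1) while
# suppressing the behavior-irrelevant latent (z2).
corr_behavior = np.corrcoef(decoded, y)[0, 1]
corr_irrelevant = np.corrcoef(decoded, z2)[0, 1]
print(f"correlation with behavior: {corr_behavior:.2f}")
print(f"correlation with irrelevant latent: {abs(corr_irrelevant):.2f}")
```

The point of the toy is the dissociation itself: the decoded signal tracks the behavior closely while remaining nearly uncorrelated with the behavior-irrelevant latent mixed into the same recordings.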
... The research is part of an effort to establish a machine-brain interface. Eventually, Krim said, this research may contribute to the development of technology that can not only interpret signals from the brain but also send signals back to help individuals take automatic corrective action for certain behaviors.
The new algorithm could also enhance future brain-machine interfaces by decoding behaviors more accurately. For example, it could allow paralyzed patients to control prosthetics directly by thinking about the movement.
Imagination is the only limit when it comes to the potential of this technology, Krim said. Another futuristic application could enable soldiers to communicate with each other without ever opening their mouths.
“If you’re in the theater, and you can’t talk, you can’t even whisper, but you can still communicate,” Krim said. “If you can talk to your machine, and the machine talks to the other machine, and the machine talks to the other soldier, you have basically a full link without ever uttering a word.”

Omid G. Sani et al., “Modeling behaviorally relevant neural dynamics enabled by preferential subspace identification,” Nature Neuroscience (2020).
https://www.nature.com/articles/s41593-020-00733-0

------------------------------------------
... and then, one day, the machines will talk to each other and bypass the humans altogether ...

------------------------------------------
Pilot In A Real Aircraft Just Fought An AI-Driven Virtual Enemy Jet For The First Time
https://www.thedrive.com/the-war-zone/37647/pilot-in-a-real-aircraft-just-fought-an-ai-driven-virtual-enemy-jet-for-the-first-time

Two U.S. companies have recently completed what they say is the world’s first dogfight between a real aircraft and an artificial intelligence-driven virtual fighter jet. The experiment, run by Red 6 and EpiSci, is the first step toward similar technology being provided to U.S. military fighter pilots, which would allow them to battle virtual adversaries as part of augmented reality training.
The live-flight augmented reality dogfight involved a Freeflight Composites Berkut 560 experimental plane and a simulated, reactive adversary aircraft in the form of a computer-generated projection inside the Berkut pilot's augmented reality helmet-mounted display. The adversary was a representation of the Chinese J-20 stealth fighter, created in augmented reality by EpiSci’s Tactical AI technology. The unusual aerial contest took place out of Camarillo Airport, California.
EpiSci drew upon its previous work in the U.S. Defense Advanced Research Projects Agency’s (DARPA) AlphaDogfight Trials to create its Tactical AI technology, a hybrid AI system. In this way, the kind of AI-driven simulation that was previously found only in traditional ground-based simulators can be brought into the cockpit — in this case, presenting the pilot in the real aircraft with a simulated adversary flying a J-20 fighter.
The demonstration also used the Airborne Tactical Augmented Reality System (ATARS) developed by Red 6, which includes the display and control systems needed to inject augmented reality into the real world of the cockpit, and then for these virtual entities to interact with the surroundings as if they were a part of the real world.
There is also the potential to employ Tactical AI in scenarios beyond replicating enemies in air combat exercises. More generally, it could generate other training scenarios for military pilots at different levels of instruction, for example, simulating flight in a larger formation, including alongside unmanned loyal wingmen.
---------------------------------
Tyndall Air Force Base to Receive Military’s First Robot Dogs
https://warisboring.com/tyndall-air-force-base-to-receive-militarys-first-robot-dogs/
https://www.af.mil/News/Article-Display/Article/2413766/computerized-canines-to-join-tyndall-afb/

TYNDALL AIR FORCE BASE, Fla. (AFNS) --
Over the last year, Tyndall Air Force Base and the 325th Security Forces Squadron have been working with Ghost Robotics to develop a system to enhance security and safety for the base population.
Tyndall AFB will be one of the first Air Force bases to incorporate semi-autonomous robot dogs into its patrolling regimen. Ghost Robotics held a demonstration Tuesday morning next to Maxwell Flag Park to show a few dozen airmen and civilians how the robots work. The almost 100-pound robots, which look somewhat like dogs, can be controlled with a remote but will patrol the base autonomously as a security measure.
Hurricane Michael in 2018 significantly damaged static cameras, sensor platforms and fence lines in Tyndall’s integrated defense operation. Maj. Jordan Criss, the 325th Security Forces Squadron commander, has been working with Ghost Robotics for several years to get the robot dogs to Tyndall.
... “These robot dogs will be used as a force multiplier for enhanced situational awareness by patrolling areas that aren’t desirable for human beings and vehicles,” Criss said. “Rather than using a person, we can now leverage technology. We can use these robotic sentries to go out and sweep massive areas.”
Criss explained that the robot dogs will be given a patrol path which will be set and monitored by the Security Forces Electronic Security Sensor System noncommissioned officer in charge.
According to Ghost Robotics CEO Jiren Parikh, the robot dogs can travel at 7.5 feet per second, covering the length of a football field in about 40 seconds.
Parikh said his company plans in the near future to have the robot dogs moving up to 10 feet per second. [A Rottweiler can do 30 feet per second in a sprint. It can reach the fence in 2.8 seconds; can you?]
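The speed figures quoted here check out arithmetically (a football field is 100 yards, i.e. 300 feet):

```python
robot_speed_fps = 7.5      # feet per second, per Ghost Robotics
football_field_ft = 300.0  # 100 yards

# Time for the robot dog to cover a football field: 300 / 7.5 = 40 seconds.
robot_time_s = football_field_ft / robot_speed_fps
print(robot_time_s)  # 40.0

# Distance a sprinting Rottweiler (30 ft/s) covers in 2.8 seconds: about 84 feet.
rottweiler_speed_fps = 30.0
rottweiler_distance_ft = rottweiler_speed_fps * 2.8
```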
“We will be able to drive them via a virtual-reality headset within our Base Defense Operations Center,” Criss said. “We will be able to see exactly what the robot dog is detecting through its mobile camera and sensor platform. If desired, we will also be able to issue verbal commands to a person or people through a radio attached to the dogs.”
This technology has the potential to replace and exceed the capabilities of certain static defense equipment, especially in a contingency, disaster, or deployed environment. This makes Tyndall AFB, post-Hurricane Michael, the perfect home for the Air Force’s newest computerized canines.

Fahrenheit 451 - The Hound

-----------------------------------
Boston Dynamics Dog Robot 'Spot' Learns New Tricks On BP Oil Rig
https://www.reuters.com/article/us-bp-boston-dynamics-robot-oil-rig-idUSKBN27T2SB

On its Mad Dog oil rig, nearly 190 miles (305 km) offshore in the Gulf of Mexico, BP Plc is programming Spot to read gauges, look for corrosion, map out the facility and even sniff out methane.
... “Several hours a day, several operators will walk the facility; read gauges; listen for noise that doesn’t sound right; look out at the horizon for anomalies, boats that may not be caught on radar; look for sheens,” Ballard said.
“What we’re doing with Spot is really trying to replicate that observation piece,” Ballard said, adding that an operator could then review the information from a central location.
“We’ve got multispectral imaging that basically you can see many bands across that spectrum... to be able to see things that the human eye can’t see,” said Ballard.
Spot also has an integrated gas sensor that is programmed to shut the robot down if it detects a methane leak.
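A safety interlock of this kind is simple in principle: compare the sensor reading against a limit and halt the robot when it is exceeded. The sketch below is purely illustrative; the threshold, sensor reading, and shutdown callback are hypothetical, not the actual Spot API or BP's configuration.

```python
# Assumed threshold for illustration only; the real limit is not published.
METHANE_LIMIT_PPM = 500.0

def methane_interlock(methane_ppm: float, shutdown) -> bool:
    """Call the shutdown routine if methane exceeds the limit.

    Returns True if the interlock tripped, False otherwise.
    """
    if methane_ppm >= METHANE_LIMIT_PPM:
        shutdown()
        return True
    return False

# Example: a reading above the limit trips the interlock and halts the robot.
events = []
methane_interlock(750.0, lambda: events.append("robot halted"))
print(events)  # ['robot halted']
```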
---------------------------------------
QinetiQ Delivers Armed Scout Robot To Army: RCV-L
https://breakingdefense.com/2020/11/qinetiq-delivers-armed-scout-robot-to-army-rcv-l/

WASHINGTON: Robot-builder QinetiQ formally delivered the first of four experimental Robotic Combat Vehicles (Light) to the Army on Nov. 5, the company has announced. They will be used alongside four Textron-built RCV-Mediums in field tests.
After their delivery, the Army plans to buy 16 more of each variant as it scales up to more complex experiments.
Those 2022 exercises will determine the feasibility of the service’s ambitious plans for a “forward line of robots” to precede human troops into battle.

The RCV-L also carries a mini-drone, the HoverFly Tethered Unmanned Aerial System, which it can launch to look over buildings, hills, and obstacles while the ground vehicle stays hidden. The drone is physically connected to the robot by a power and communications cable, even during flight – hence the term “tethered.” That does limit its range but effectively allows it unlimited flight time.