New Israeli Tank Features Xbox Controllers, AI Honed by ‘StarCraft II’ and ‘Doom’
https://www.washingtonpost.com/video-games/2020/07/28/new-israeli-tank-features-xbox-controllers-ai-honed-by-starcraft-ii-doom/

The Carmel armored fighting vehicle (AFV) prototype was created by Israel Aerospace Industries as part of the Israeli military’s three-year-old Carmel program, which seeks to develop a new concept for armor built around a two-person crew, a “fighter jet-like” cockpit, hybrid propulsion and autonomous capabilities such that “the soldiers are only required to make decisions that the mechanism cannot (yet) make by itself,” according to a statement from the Israeli Ministry of Defense.
The Israeli armor prototype was designed with a specific user experience in mind for the Israel Defense Forces’ active-duty element, which is typically made up of men and women ages 18 to 21. Any teen or 20-something who enters the hatch of the Carmel will likely find the environment familiar, thanks to video games.
It’s dark in the windowless Carmel, but you can see outside via a panoramic screen. It has tablet-like devices that allow operators to set the vehicle’s speed and change weapons. The side of the screen features up-to-date intelligence information. And controlling the steering, weapons systems and all manner of other operations is a friendly and familiar Microsoft Xbox controller.
The similarity is no accident. To develop their Carmel model, one of three models under consideration to become the IDF’s next armored fighting vehicle, Israel Aerospace Industries (IAI) engineers and managers worked with teenage gamers who critiqued the system, which originally came equipped with a state-of-the-art, fighter jet-like joystick. If approved, the new weapons platform would be operated by the dual thumbsticks, triggers, bumpers and buttons of a video game handset.
... IAI used design principles from video games as well, using familiar icons and layouts to display information. The screens contain much of the same info one would find when playing “Fortnite” or “Apex Legends,” including a map, a tally of ammo stores and a list of available weapons. It’s also displayed in a way one would expect to see while dropping into “Call of Duty Warzone,” with relevant information framing the crew’s view.
Of course, here, the screen displays the reality outside — with live targets and real weapons.... The video game functionality in the IAI model is not just focused on the user interface. The urban combat-focused vehicle has artificial intelligence that was trained mostly using the game “StarCraft II” and was integrated into the tank with the Unity game engine and VBS platform.
The AI system integrates weapon selections based on myriad circumstances, he said, giving the example that it would not present the operator with a missile option to respond to a single enemy shooting a rifle from a populated area. ... or maybe it would.

“StarCraft II,” a 10-year-old real-time strategy game, is highly regarded by engineers in the AI field, according to Google’s DeepMind Technologies, which last year created AI that defeated a pro StarCraft player. In a post, the company cited “StarCraft II” as an ideal AI trainer due to its constantly changing, real-time, competitive situations, the long duration of matches, incomplete information for players, and a vast playing space with hundreds of variables.
The AI system for Carmel was reinforced and trained using thousands of scenarios that the technology generates, according to Moshe Beutel, 44, who leads the algorithm group. One of the ways the team developed this technology was by grinding “StarCraft II,” mixed in with a few other games such as “Doom,” to teach the AI different strategies for navigation, target detection, weapon selection and other autonomous capabilities. He and his team also wrote what he called “reinforcement agents,” but which gamers call “bots,” to beat the games.
After only two weeks, Beutel said that his bots performed 20 to 30 percent better than humans in figuring out how to get from point A to point B while fighting several enemies. He defined “better” as “getting to the end point without getting hurt.”

IAI’s Carmel entry presents a vehicle capable of fully autonomous, semiautonomous, and manual driving modes. It can also automatically engage in target acquisition and weapon selection, though [currently] weapon deployment requires a human action.
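The article doesn't describe how Beutel's "reinforcement agents" actually work, but the point-A-to-point-B task he describes is a classic reinforcement learning setup. As a purely illustrative sketch (the gridworld, rewards, and hyperparameters here are invented for the example, not IAI's system), a tabular Q-learning agent can learn to reach a goal while avoiding a hazard, i.e. "getting to the end point without getting hurt":

```python
import random

SIZE = 4                                      # 4x4 grid world
START, GOAL = (0, 0), (3, 3)                  # "point A" and "point B"
HAZARD = (1, 2)                               # a cell where the agent "gets hurt"
ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]  # right, left, down, up

def step(state, move):
    """Apply a move; return (next_state, reward, episode_done)."""
    r = min(max(state[0] + move[0], 0), SIZE - 1)
    c = min(max(state[1] + move[1], 0), SIZE - 1)
    nxt = (r, c)
    if nxt == GOAL:
        return nxt, 10.0, True    # reached the end point
    if nxt == HAZARD:
        return nxt, -10.0, True   # "got hurt"
    return nxt, -0.1, False       # small step cost favors short routes

def train(episodes=2000, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning with epsilon-greedy exploration."""
    rng = random.Random(seed)
    q = {}                        # (state, action_index) -> estimated value
    for _ in range(episodes):
        s = START
        for _ in range(50):
            if rng.random() < eps:
                a = rng.randrange(len(ACTIONS))     # explore
            else:                                   # exploit best-known action
                a = max(range(len(ACTIONS)), key=lambda i: q.get((s, i), 0.0))
            nxt, reward, done = step(s, ACTIONS[a])
            best_next = max(q.get((nxt, i), 0.0) for i in range(len(ACTIONS)))
            old = q.get((s, a), 0.0)
            q[(s, a)] = old + alpha * (reward + gamma * best_next - old)
            s = nxt
            if done:
                break
    return q

def greedy_path(q):
    """Follow the learned policy from START until the episode ends."""
    s, path = START, [START]
    for _ in range(20):
        a = max(range(len(ACTIONS)), key=lambda i: q.get((s, i), 0.0))
        s, _, done = step(s, ACTIONS[a])
        path.append(s)
        if done:
            break
    return path

path = greedy_path(train())
```

The same loop scales up conceptually: replace the gridworld with a simulated combat scenario (the thousands of generated scenarios the article mentions) and the Q-table with a neural network, and you have the shape of agents trained in environments like "StarCraft II" or "Doom."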
Critics of the intersection of gaming and war have pointed to the potential desensitizing effect that games could introduce to an arena with lethal consequences.
... There has also been ethical scrutiny on the subject. United States drone pilots and other observers have likened piloting the unmanned craft to playing a video game, and a 2012 documentary by The Guardian noted military efforts to recruit pilots from gaming conventions. More recently, the U.S. Army’s official video gaming team began streaming on Twitch as part of a recruiting effort that resulted in criticism and claims of First Amendment violations when members of the chat were banned from the stream for referencing wartime atrocities. (Twitch is owned by Amazon, whose CEO, Jeff Bezos, owns The Washington Post.)
--------------------------------------
Alexa With Firepower: Army Research Enables Conversational AI Between Soldiers, Robots
https://www.army.mil/article/237580/army_research_enables_conversational_ai_between_soldiers_robots

Researchers from the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory, in collaboration with the University of Southern California’s Institute for Creative Technologies, developed the Joint Understanding and Dialogue Interface, or JUDI, capability, which enables bi-directional conversational interactions between Soldiers and autonomous systems.
This effort supports the Next Generation Combat Vehicle Army Modernization Priority and the Army Priority Research Area for Autonomy by reducing Soldier burden when teaming with autonomous systems and by allowing verbal command and control of those systems.
... “This technology enables a Soldier to interact with autonomous systems through bidirectional speech and dialogue in tactical operations where verbal task instructions can be used for command and control of a mobile robot. In turn, the technology gives the robot the ability to ask for clarification or provide status updates as tasks are completed. Instead of relying on pre-specified, and possibly outdated, information about a mission, dialogue enables these systems to supplement their understanding of the world by conversing with human teammates.”... The goal, he said, is to shift the paradigm of Soldier-robot interaction from today’s heads-down, hands-full joystick operation of robots to a heads-up, hands-free mode of interaction where a Soldier can team with one or more robots while maintaining situational awareness of their surroundings.
According to the researchers, JUDI is distinct from current similar research conducted in the commercial realm.
“Commercial industry has largely focused on intelligent personal assistants like Siri and Alexa – systems that can retrieve factual knowledge and perform specialized tasks like setting reminders, but do not reason over the immediate physical surroundings.”
In contrast, Marge said,
JUDI is designed for tasks that require reasoning in the physical world, where data is sparse because it requires previous human-robot interaction and there is little to no reliable cloud-connectivity. Current intelligent personal assistants may rely on thousands of training examples, while JUDI can be tailored to a task with only hundreds, an order of magnitude smaller.
Moreover, he said,
JUDI is a dialogue system adapted to autonomous systems like robots, allowing it to access multiple sources of context, like Soldier speech and the robot’s perception system, to help in collaborative decision-making.

JUDI will be integrated into the CCDC ARL Autonomy Stack, a suite of software algorithms, libraries and software components that perform functions required by intelligent systems, such as navigation, planning, perception, control and reasoning. The stack was developed under the decade-long Robotics Collaborative Technology Alliance.
Successful innovations in the stack are also rolled into the CCDC Ground Vehicle System Center’s Robotics Technology Kernel.
... “Once ARL develops a new capability that is built into the autonomy software stack, it is spiraled into GVSC’s Robotics Technology Kernel where it goes through extensive testing and hardening and is used in programs such as the Combat Vehicle Robotics, or CoVeR, program,” said Dr. John Fossaceca, AIMM ERP program manager. “Ultimately, this will end up as Army owned intellectual property that will be shared with industry partners as a common architecture to ensure that Next Generation Combat Vehicles are based on best of breed technologies with modular interfaces.”
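JUDI's internals aren't public, but the behavior described above, matching a Soldier's spoken task instruction against a small training set and asking for clarification when confidence is low, can be illustrated with a minimal sketch. Everything here (the example utterances, intent labels, similarity measure, and threshold) is an invented stand-in, not the actual JUDI pipeline:

```python
# Tiny per-task "training set" of utterance -> intent pairs. The article notes
# JUDI can be tailored with hundreds of examples rather than thousands; this
# toy set of five stands in for that idea.
EXAMPLES = {
    "move to the building ahead": "NAVIGATE",
    "drive forward fifty meters": "NAVIGATE",
    "scan the area for threats": "RECON",
    "report your status": "STATUS",
    "stop where you are": "HALT",
}

def jaccard(a: str, b: str) -> float:
    """Word-overlap similarity between two utterances (0.0 to 1.0)."""
    wa, wb = set(a.split()), set(b.split())
    return len(wa & wb) / len(wa | wb)

def interpret(utterance: str, threshold: float = 0.3):
    """Match an utterance to the closest known command.

    Returns (intent, confidence); a low-confidence match yields the
    CLARIFY intent, modeling the bi-directional dialogue in which the
    robot asks the Soldier to rephrase instead of guessing.
    """
    intent, score = max(
        ((i, jaccard(utterance.lower(), u)) for u, i in EXAMPLES.items()),
        key=lambda pair: pair[1],
    )
    if score < threshold:
        return ("CLARIFY", score)
    return (intent, score)
```

For example, `interpret("report your status")` matches the STATUS intent exactly, while an out-of-domain utterance like `interpret("sing a song")` falls below the threshold and triggers a clarification request. A real system would use a learned statistical model over speech and the robot's perception context rather than bag-of-words overlap, but the control flow, classify, score, and fall back to dialogue, is the same shape.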
-------------------------------
Mind-Controlled Drones and Robots: How Thought-Reading Tech Will Change the Face of Warfare
https://www.zdnet.com/article/mind-reading-particles-for-the-military-the-bcis-that-enable-soliders-to-fly-planes-with-their-thoughts-alone/

DARPA is funding a number of brain-computer interface projects to enable soldiers to control equipment at the speed of thought.
"DARPA is preparing for a future in which a combination of unmanned systems, artificial intelligence, and cyber operations may cause conflicts to play out on timelines that are too short for humans to effectively manage with current technology alone," said Al Emondi, the Next-Generation Nonsurgical Neurotechnology (N3) program manager last year, when funding for six projects was announced: "By creating a more accessible brain-machine interface that doesn't require surgery to use, DARPA could deliver tools that allow mission commanders to remain meaningfully involved in dynamic operations that unfold at rapid speed."
https://www.battelle.org/newsroom/press-releases/press-releases-detail/battelle-led-team-wins-darpa-award-to-develop-injectable-bi-directional-brain-computer-interface

------------------------------------