
How do we stop autonomous killer robots? A nightmare of progress: will killer robots be banned?

While Prime Minister Dmitry Medvedev and Arkady Volozh were riding an unmanned Yandex.Taxi around Skolkovo, military engineers were figuring out how to adapt unmanned vehicle technologies to create new weapons.

In reality, the technology is not quite what it seems. The problem with all technological evolution is that the line between commercial robots for everyday life and military killer robots is incredibly thin, and crossing it costs nothing. Today these machines choose a route; tomorrow they could choose which target to destroy.

This is not the first time in history that technological progress has called the very existence of humanity into question: first scientists created chemical, biological and nuclear weapons, and now come "autonomous weapons", that is, robots. The only difference is that until now weapons of "mass destruction" were considered inhumane precisely because they do not choose whom they kill. Today the perspective has changed: a weapon that kills with particular discrimination, choosing victims to its own taste, seems far more immoral. And while a warlike power might be deterred from using biological weapons by the knowledge that everyone around would suffer, with robots everything is more complicated: they can be programmed to destroy a specific group of targets.

In 1942, when American writer Isaac Asimov formulated the Three Laws of Robotics, it all seemed exciting but completely unrealistic. These laws state that a robot cannot harm or kill a human being, and must unquestioningly obey human will except where such orders would contradict the first imperative. Now that autonomous weapons have become a reality and may well fall into the hands of terrorists, it turns out that programmers somehow forgot to build Asimov's laws into their software. That means robots can pose a danger, and no humane laws or principles will stop them.

A Pentagon-developed missile detects targets on its own thanks to its software, artificial intelligence (AI) identifies targets for the British military, and Russia demonstrates unmanned tanks. Colossal sums are being spent in various countries on developing robotic and autonomous military equipment, although few want to see it in action. Just as most chemists and biologists have no interest in their discoveries being used to create chemical or biological weapons, most AI researchers have no interest in weapons built on their work, since a serious public outcry would harm their research programs.

In his speech at the opening of the United Nations General Assembly in New York on September 25, Secretary-General Antonio Guterres called AI technology a "global risk" alongside climate change and rising income inequality: "Let's call a spade a spade," he said. "The prospect of machines determining who lives is disgusting." Guterres is perhaps the only one who can urge the military departments to come to their senses: he previously dealt with conflicts in Libya, Yemen and Syria and served as High Commissioner for Refugees.

The problem is that with further development of technology, robots will be able to decide who to kill. And if some countries have such technologies and others do not, then uncompromising androids and drones will predetermine the outcome of a potential battle. All this contradicts all of Asimov's laws at the same time. Alarmists may be seriously worried that a self-learning neural network will get out of control and kill not only the enemy, but all people in general. However, the prospects for even completely obedient killer machines are not at all bright.

The most active work in artificial intelligence and machine learning today is carried out not in the military but in the civilian sphere, at universities and at companies like Google and Facebook. But much of this technology can be adapted for military use, which means a potential ban on research in this area would also affect civilian development.

In early October, the American non-governmental organization Campaign to Stop Killer Robots sent a letter to the United Nations demanding that the development of autonomous weapons be restricted at the international legislative level. The UN made clear that it supports the initiative, and in August 2017 Elon Musk and participants of the International Joint Conference on Artificial Intelligence (IJCAI) joined it. But in practice, the United States and Russia oppose such restrictions.

The most recent meeting of the 70 countries party to the Convention on Certain Conventional Weapons (on "inhumane" weapons) took place in Geneva in August. Diplomats were unable to reach consensus on how a global AI policy could be implemented. Some countries (Argentina, Austria, Brazil, Chile, China, Egypt and Mexico) supported a legislative ban on developing robotic weapons; France and Germany proposed a voluntary system of restrictions; but Russia, the USA, South Korea and Israel stated that they have no intention of limiting research and development in this area. In September, Federica Mogherini, the European Union's high representative for foreign affairs and security policy, said that such weapons "affect our collective security" and that decisions of life and death must in any case remain in human hands.

Cold War 2018

US defense officials believe autonomous weapons are necessary for the United States to maintain its military advantage over China and Russia, which are also pouring money into similar research. In February 2018, Donald Trump demanded $686 billion for the country's defense in the next fiscal year. These costs have always been quite high and fell only under the previous president, Barack Obama. Trump, unoriginally, justified the increase by technological competition with Russia and China. In 2016, the Pentagon budget allocated $18 billion for the development of autonomous weapons over three years. That is not much, but one very important factor must be taken into account here.

Most AI development in the US is carried out by commercial companies, so it is widely available and can be sold commercially to other countries. The Pentagon has no monopoly on advanced machine-learning technology. The American defense industry no longer conducts its own research the way it did during the Cold War, but draws on the work of startups from Silicon Valley, Europe and Asia. In Russia and China, by contrast, such research is under the strict control of defense departments, which on the one hand limits the influx of new ideas and the development of the technology, but on the other guarantees state funding and protection.

The New York Times estimates that military spending on autonomous military vehicles and unmanned aerial vehicles will exceed $120 billion over the next decade. This means that the debate ultimately comes down not to whether to create autonomous weapons, but to what degree of independence to give them.

Today, fully autonomous weapons do not yet exist, but Air Force General Paul J. Selva, Vice Chairman of the Joint Chiefs of Staff, said back in 2016 that within ten years the United States would have the technology to create weapons capable of deciding independently whom and when to kill. And while countries debate whether or not to restrict AI, it may already be too late.

A large group of scientists, industry leaders and NGOs has launched the Campaign to Stop Killer Robots, dedicated to preventing the development of autonomous combat weapons systems. Signatories include Stephen Hawking, Noam Chomsky, Elon Musk and Steve Wozniak.

These big names generate a great deal of attention and lend legitimacy to the warning that killer robots, once considered science fiction, are in fact fast approaching reality.

An interesting study published in the International Journal of Cultural Research takes a different approach to the idea of "killer robots" as a cultural concept. The researchers argue that even the most advanced robots are just machines, like everything else humanity has ever made.

"The thing is, the 'killer robot' as an idea didn't come out of thin air," said co-author Tero Karppi, an assistant professor of media theory at the University at Buffalo. "It was preceded by methods and technologies that made thinking about and developing these systems possible."

In other words, our worry about killer robots has a cultural history. The authors explore the theme in films such as The Terminator and I, Robot, which theorize that in the distant future robots will end up enslaving the human race.

"Over recent decades, the expanded use of unmanned weapons has dramatically changed warfare, bringing new humanitarian and legal challenges. Technology is now advancing rapidly through efforts to develop fully autonomous weapons. These robotic weapons will be able to select and fire on targets independently, without any human intervention."

The researchers respond that these alarmist dystopian scenarios reflect a “techno-deterministic” worldview, where technological systems are given too much autonomy, which could be destructive not only to society, but to the entire human race.

But what if we code machine intelligence in such a way that robots will not even be able to tell the difference between a person and a machine? It's an intriguing idea: if there is no "us" and "them" there can be no "us versus them."

Indeed, Karppi suggested that we may be able to control how future machines will think about people on a fundamental level.

If we want to change how these systems develop, now is the time: ban lethal autonomous weapons and address the root causes of the dilemma in order to truly avoid the development of autonomous killing machines.

Elon Musk recently expressed his strong opposition to AI being used to create killer robots. We are not yet talking about Terminators, but about robotic systems capable of performing tasks usually handled by soldiers. The military's interest in the topic is understandable, but their far-reaching plans frighten many.

But modern soldiers are not the only ones who have dreamed of machines that could each replace ten or even a hundred men. Such thoughts occurred to figures of many different eras, and some of their ideas were realized, looking rather impressive.

Da Vinci's Robot Knight


Leonardo was a genius in almost every field, achieving success in nearly every area that interested him. In the 15th century he created a robot knight (though, of course, the word "robot" was not yet in use).

The machine could sit, stand, walk, and move its head and arms. Its creator achieved all this with a system of levers, pulleys and gears.

The knight was recreated in our era: a working prototype, built "based on" Da Vinci's designs by Mark Rosheim, appeared in 2002.

Tesla's Radio-Controlled Boat


In 1898, the inventor Nikola Tesla showed the world the first device of its kind: a remotely controlled vehicle (a small boat). The demonstration was held in New York. Tesla controlled the boat, and it maneuvered and performed various actions as if by magic.

Tesla later tried to sell another invention to the US military, something like a radio-controlled torpedo, but for some reason the military refused. Notably, he described his creation not as a torpedo but as a robot, a mechanical man capable of performing complex work in place of its creators.

Radio-Controlled Tanks of the USSR



Soviet engineers were no slouches either. In 1940 they created radio-controlled combat vehicles based on the T-26 light tank, with a control range of more than a kilometer.

The operators of these early military "terminators" could fire machine guns, a cannon and a flamethrower. The drawback of the technology was the lack of feedback: the operator could only observe the tank's actions directly, from a distance, so the effectiveness of operator control was relatively low.

This is the first example of a military robot in action.

Goliath


The Nazis created something similar, but instead of fitting conventional tanks with radio control they built miniature tracked vehicles that could be operated remotely. The Goliaths were packed with explosives. The idea was this: the nimble little machine crept up to a full-sized enemy tank and, on the operator's command, destroyed everything nearby in an explosion. The Germans built both an electric version and a mini-tank with an internal combustion engine. In total, about 7,000 such vehicles were produced.

Semi-automatic anti-aircraft guns


These systems were also developed during World War II, and Norbert Wiener, the founder of cybernetics, had a hand in their creation. He and his team built anti-aircraft systems that adjusted their fire on their own: they were equipped with technology that made it possible to predict where an enemy aircraft would appear next.
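The core idea of such a predictor can be illustrated with a toy calculation (a deliberate simplification; Wiener's actual work used statistical filtering of noisy tracking data): estimate the target's velocity from recent position fixes and aim where it will be when the shell arrives, not where it is now.

```python
# Toy lead-prediction sketch (not Wiener's actual filter): estimate an
# aircraft's velocity from two position fixes taken dt_obs seconds
# apart, then extrapolate its position over the shell's time of flight.

def predict_position(p0, p1, dt_obs, time_of_flight):
    """p0, p1: (x, y) position fixes; returns the predicted aim point."""
    vx = (p1[0] - p0[0]) / dt_obs
    vy = (p1[1] - p0[1]) / dt_obs
    # Aim where the target WILL be, not where it is now.
    return (p1[0] + vx * time_of_flight,
            p1[1] + vy * time_of_flight)

# Aircraft moving 50 m/s along x at 1000 m altitude; shell flies 3 s.
aim = predict_position((0, 1000), (50, 1000), dt_obs=1.0, time_of_flight=3.0)
print(aim)  # (200.0, 1000.0)
```

Real fire-control systems also had to account for measurement noise and target maneuvering, which is exactly where Wiener's statistical methods came in.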

Smart weapons of our time


During the Vietnam War era, the US military pioneered laser-guided weapons as well as autonomous aircraft, essentially early drones.

True, they still required human help in selecting targets, but this was already close to what exists now.

Predator


Probably everyone has heard of these drones. The MQ-1 Predator carried out its first strikes for the US military a month after the events of 9/11. Predators are now the most common military drones in the world, and they have a larger successor, the MQ-9 Reaper UAV.

Sappers


Yes, alongside killer robots there are also bomb-disposal robots. They are now very common and have been in use for years in Afghanistan and other hot spots. Notably, these robots were developed by iRobot, the company behind the world's most popular cleaning robots, the Roomba and Scooba. In 2004, 150 of these robots (the sappers, not the vacuum cleaners) were produced; four years later the figure was already 12,000.

Now the military has gone all out. Artificial intelligence, in its weak form, promises great opportunities, and the US intends to take full advantage of them, building a new generation of killer robots with cameras, radars, lidars and weapons.

It is they who scare Elon Musk, and with him many other bright minds from various fields of activity.

wikiHow works like a wiki, which means many of its articles are written by multiple authors. Fourteen people, some of them anonymous, worked to create, edit and improve this article.

Have you ever wanted to build a combat robot? You probably thought it would be too expensive and dangerous. However, most robot-fighting competitions, including Robot Wars, have a 150-gram weight class, called "antweight" in most countries and "fairyweight" in the USA. These robots are much cheaper than large combat robots and not as dangerous, making them ideal for beginners. This article explains how to design and build an antweight-class combat robot.


NOTE: This article assumes that you have already read about and built a simple RC robot. If not, go back and do that first. Note also that this article does not recommend any specific part for your robot; that is deliberate, to encourage creativity and diversity among robots.

Steps

    Understand the rules. Before designing a robot for competition, you must understand all the rules. The most important build rules to watch out for are the size and weight requirements (4"x4"x4", 150 grams) and the metal-armor rule, which states that you cannot have metal armor more than 1 mm thick.

    What weapon will you use? An important part of a combat robot is its weapon. Come up with a weapon idea, but make sure you stay within the rules. For your first antweight bot, a "flipper" or even a "pusher" (a robot that simply pushes) is highly recommended. Flipping weapons, if designed correctly, can be the most effective in the antweight class. Pushers are the simplest, since they have no moving weapon: the entire robot acts as the weapon and pushes other robots around. This is effective because the rules state that half of the arena must have no walls, so you can push another robot out of the arena.

    Select your parts. Yes, you need to choose your parts before designing. However, don't buy them yet. Simply pick the parts and design around them: if something doesn't fit or doesn't work while you're designing, you save money because you can still swap parts. Once again: do not buy the parts yet!

    • Select a servo. For beginners in the antweight class it is usually recommended to use a servo instead of a motor for the drive, since with a servo you won't need a speed controller, which saves money and some weight. Look for "micro" servos, as they save a great deal of weight, and make sure the servo can be modified for continuous 360° rotation. For combat robots a high-torque servo is recommended over a high-speed one, to make pushing other robots easier even if you have a different weapon.
      • If you can't find a servo that perfectly suits your needs, look at Futaba servos as well. Futaba is another brand that makes servos, and they sometimes come in different sizes than HiTec servos.
    • Select a motor for your weapon. If you have an active weapon (that is, not a pusher), you will probably need a motor to drive it. If the weapon needs to move really fast (like a spinner), you'll want a DC motor (brushless usually works better, but brushed will do) with a speed controller. Spinning weapons are not recommended for a first antweight robot, as they are difficult to build and balance correctly. For a flipping weapon you will need a servo; a micro servo with particularly high torque is recommended so that it can easily flip another robot. Also pay attention to the gear type when choosing a weapon servo: nylon gears may strip over time under heavy load, so prefer more durable metal gears.
    • Select wheels. When choosing wheels, remember the rule that the robot must fit in a 4"x4"x4" cube, so use wheels of smaller diameter; 2" wheels are recommended. Make sure the wheels can be easily mounted to the servos and protected. Another great trick used by combat robots of all sizes is the ability to drive upside down. Yes, the controls will be somewhat reversed, but it can save you from losing by immobilization. To achieve this, make the robot's body shorter than its wheel diameter so it can drive either way up.
    • Select a receiver and transmitter. When purchasing a receiver, make sure it has a failsafe; this is mandatory in most competitions and a matter of safety. The AR500 receiver does not have this feature, so you would need the receiver for the BR6000 bot or another receiver with failsafe support. The Spektrum DX5e is recommended as a transmitter. If you built the remote-controlled robot from the previous wikiHow article, you can reuse that transmitter, but you will have to buy a new receiver.
    • Select a battery. A LiPo battery is highly recommended over a NiMH battery: LiPo batteries are lighter, though they are also more dangerous and expensive and require a special charger. Invest in a LiPo battery and charger to save weight.
    • Select a material. The material of your robot's chassis and armor is very important, since it protects your electrical components from being punctured by enemy weapons. There are three main options (more exist, but these three suit this weight class best): aluminum, titanium and polycarbonate. Aluminum is light and strong but can be expensive and difficult to cut, and as a metal it may be no more than 1 mm thick. Titanium is light and very strong but difficult to cut and very expensive, and the 1 mm rule applies to it as well. Polycarbonate, or Lexan, is a lightweight, inexpensive, easy-to-cut, shatter-resistant, durable plastic sometimes used in bulletproof glazing; as a plastic it can be any thickness, although about 1 mm is recommended. Polycarbonate is highly recommended: it is as durable as the plastic used for the walls of antweight arenas. When shopping, buy a little extra in case you miscalculate.
  1. Record the specifications. Now that you have selected all the parts, note down their dimensions and weights; they should be listed on the website you chose them from. Convert all values given in inches to millimeters and all weights (ounces, pounds) to grams, and write the specifications of every part down on paper.
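The conversions in this step are simple multiplications; a short script can do them for a whole parts list at once. The part names and figures below are made-up examples, not recommendations.

```python
# Convert a parts list from imperial specs to metric, as the step
# describes. All part names and values here are hypothetical examples.

IN_TO_MM = 25.4      # millimeters per inch
OZ_TO_G = 28.3495    # grams per ounce

parts = {            # name: (length_in, width_in, weight_oz)
    "micro servo":  (0.9, 0.45, 0.32),
    "receiver":     (1.2, 0.80, 0.25),
    "LiPo battery": (1.5, 0.70, 0.70),
}

for name, (length, width, oz) in parts.items():
    print(f"{name}: {length * IN_TO_MM:.1f} x {width * IN_TO_MM:.1f} mm, "
          f"{oz * OZ_TO_G:.1f} g")
```

Keeping the converted table in one place makes the later weight-budget check and the cube-fit check much easier.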

    Design. You want the design to be as accurate as possible, which means making a 3D model on a computer rather than a 2D drawing on paper. The 3D model doesn't have to look complicated; a simple model built from prisms and cylinders will do.

    1. Add up the weight of all the parts (in grams) and make sure the total is less than 150 grams.
    2. If you don't have CAD, download the free version of Sketchup.
    3. Learn the basics of Sketchup with free lessons.
    4. Create all the parts you will use in Sketchup with the dimensions you recorded earlier.
    5. Develop your chassis and armor. Make sure to make it smaller than 4X4X4 inches.
    6. Place all components into a 3D model of the chassis/armor to see if they fit. This will help you decide where the components will be located.
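Sub-step 1 above, summing part weights against the 150-gram limit, is easy to automate so you catch an overrun before ordering anything. The part list and gram values below are hypothetical.

```python
# Check the 150 g antweight budget before ordering parts.
# All part names and weights are hypothetical examples.

WEIGHT_LIMIT_G = 150

weights_g = {
    "chassis/armor (polycarbonate)": 35,
    "2 drive servos": 2 * 10,
    "weapon servo": 12,
    "receiver": 7,
    "LiPo battery": 20,
    "wheels + fasteners": 25,
}

total = sum(weights_g.values())
margin = WEIGHT_LIMIT_G - total
print(f"total: {total} g, margin: {margin} g")  # total: 119 g, margin: 31 g
assert total <= WEIGHT_LIMIT_G, "over the 150 g limit - pick lighter parts"
```

Leaving a margin of 20 to 30 grams is wise: glue, wiring and Velcro all add weight that rarely appears on a parts website.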
  2. Order your parts. If all your components match your design flawlessly, order the parts. If not, select new parts.

    Assemble it. Now build your chassis and armor, place all components in the locations specified in your design, connect everything and test it. Try to assemble everything so that components can be removed easily if they need replacing; and they will need replacing more often than on an ordinary robot, since this one will fight and opposing robots can damage it. Velcro tape is recommended for securing parts.

    Practice driving. No matter how good your robot is, if you fall out of the arena, you lose. Before you even think about competing, practice driving. Use upside-down cups as cones and drive around them. Use foam blocks as targets and attack them (try this on a small table to practice pushing, and try not to fall off yourself). You could even buy a cheap RC car (on a different frequency from your robot), have another person drive it, and try to push or flip the car without falling over yourself. If you know someone else with an antweight robot, have a friendly fight (if possible, replace spinning weapons with less destructive plastic ones).

  3. Compete. Find competitions in your area and have fun destroying other robots! Remember that if you are going to compete in the US, you should look for Fairyweight competitions, not Antweights.

    • If you want your robot to be able to punch, it is advisable to attach an "arm" to a servo, with the arm set at a 90-degree angle so it can deliver uppercuts.
    • Will your robot be more defensive or offensive? Since weight is limited, you may want to use most of it on weapons or armor. Try to balance these characteristics on your first robot.
    • Any robot can be improved. If your first robot doesn't work, don't throw it away entirely; you may only need to replace the motor. Even a fully functioning robot can be improved. Look for motors that better suit your purposes, and if an old motor is no longer used in the project, keep it: you will be able to build another robot with it. Try upgrading some armor panels (usually the front, the back and around the weapon) to aluminum, or even titanium, for extra "spinner protection".
    • Remember that you can place your robot diagonally into the cube.
    • Order spare parts for your robot. Since this is a combat robot, your parts may get damaged in battle. If you have spares on hand, you can replace parts faster.

    The rules state that the robot must fit into a 4x4x4-inch cube, but it may expand afterwards under remote control, and you can benefit from this. For example, if your flipping weapon sticks out too much, design it so that the flipper points straight up and stays under four inches tall for the cube check; once the flipper is lowered after the check, the robot's length can exceed four inches.
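The diagonal-placement tip above is worth quantifying. A quick calculation (assuming the usual 4-inch cube rule from this article) shows how much extra length the face and space diagonals buy:

```python
# How much extra length does diagonal placement in a 4x4x4-inch
# sizing cube allow? (Assumes the 4-inch cube rule described above.)

import math

side = 4.0                        # cube side, inches
face_diag = side * math.sqrt(2)   # diagonal across one face
space_diag = side * math.sqrt(3)  # corner to corner through the cube

print(f"face diagonal:  {face_diag:.2f} in")   # ~5.66 in
print(f"space diagonal: {space_diag:.2f} in")  # ~6.93 in
```

So a long, thin robot laid across a face diagonal can be roughly 5.6 inches long, and one tilted corner-to-corner nearly 7 inches, though in practice wheels and weapons rarely allow the full space diagonal.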

    • After building your first robot and having a clear understanding of combat robots, try to build another one. But this time, be unique. Try to make it different from the robots of other people in this weight category. If you're really ambitious, you can try making a flying robot! Flying robots are allowed by the rules, but they are rarely built.
    • If you use SketchUp, you can find the perfect models of servos and other components on Warehouse. Just search for the name of the servo (or component you want) and see if anything matches. Not everything is there, but what you find usually looks better and will give you a neater model. Make sure the model you find is the same size as the actual part.
    • If you are skilled in mechanics and fighting robots, you can try to build a walking robot. If you make a combat robot that walks, you'll have extra weight to work with.

    Warnings

    • LiPo batteries are very dangerous. Never charge them with a NiMH or NiCad charger.
    • Even micropneumatics are dangerous. If you use air guns, follow safety precautions.
    • Combat robots even of this size can be dangerous. If you are using a spinning weapon, move away when operating it. Turn it off when working on weapons.
    • Always wear safety glasses when cutting material or operating the robot.
    • Some arenas are considered unsafe for spinning weapons. Don't try to use spinning weapons in these arenas.
    • LiPo batteries can catch fire if punctured. When designing your robot, try to place the battery where it won't be punctured. If the battery does catch fire, the rules state that you may not touch the robot while it is burning, so you won't be able to pull it out and all the other components may be destroyed. Protect your battery as if it were the robot's heart!