Practice 3.7 Robots and autonomous technologies with authentic IB Digital Society (DS) exam questions for both SL and HL students. This question bank mirrors the structure of Papers 1, 2 and 3, covering key topics such as systems and structures, human behavior and interaction, and digital technologies in society. Get instant solutions and detailed explanations, and build exam confidence with questions in the style of IB examiners.
The UN Secretary-General has called for states to conclude a new international treaty by 2026 to prohibit weapons systems that function without human control or oversight, or that cannot be used in compliance with international humanitarian law, and to regulate all other types of autonomous weapons systems. The report reflects 58 submissions received from more than 73 countries, along with 33 submissions from the International Committee of the Red Cross and civil society groups. The UN General Assembly is seen as a venue for inclusive discussion of autonomous weapons systems, given the concerns they raise for international peace and security.
On 27 March 2020, the Prime Minister of Libya, Faiez Serraj, announced the commencement of Operation PEACE STORM, which moved GNA-AF to the offensive along the coastal littoral. The combination of the Gabya-class frigates and Korkut short-range air defence systems provided a capability to place a mobile air defence bubble around GNA-AF ground units, which took Hafter Affiliated Forces (HAF) air assets out of the military equation. Libya classifies HAF as a terrorist rebel organization. The enhanced operational intelligence capability included Turkish-operated signal intelligence and the intelligence, surveillance and reconnaissance provided by Bayraktar TB-2 and probably TAI Anka S unmanned combat aerial vehicles. This allowed for the development of an asymmetrical war of attrition designed to degrade HAF ground unit capability. The GNA-AF breakout of Tripoli was supported with Firtina T155 155 mm self-propelled guns and T-122 Sakarya multi-launch rocket systems firing extended range precision munitions against the mid-twentieth century main battle tanks and heavy artillery used by HAF.
Logistics convoys and retreating HAF were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems such as the STM Kargu-2 (see annex 30) and other loitering munitions. The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability. The unmanned combat aerial vehicles and the small drone intelligence, surveillance and reconnaissance capability of HAF were neutralized by electronic jamming from the Koral electronic warfare system.
In the autumn of 2001, however, the United States was unwilling to launch a full-scale land invasion in a region 7,000 miles from home. Instead, a plan evolved to send into Afghanistan a small number of CIA agents and Special Forces in support of anti-Taliban militias, with the aid of the US Air Force. That first October night was a powerful display of coordination involving laser-guided munitions dropped from the air and Tomahawk cruise missiles launched from the sea. General Tommy Franks, who then led the US Central Command (CENTCOM), the military command overseeing operations in Afghanistan, wrote in his memoir American Soldier that the assault involved in total some 40,000 personnel, 393 aircraft and 32 ships. But one aircraft did not feature at all in the Air Force’s complex planning: a tiny, CIA-controlled, propeller-driven spy drone, Predator tailfin number 3034, which had crept into Afghanistan some hours earlier. It now hangs suspended in the Smithsonian Air and Space Museum in Washington, D.C., its place in history assured. Yet its actions that first night of the war – in which numerous agencies in the vast US military-intelligence machine each played sharply contradictory roles – remain steeped in controversy.
Human Rights Watch released a report stating that representatives from around 50 countries will meet at the UN in the summer of 2021 to discuss worldwide policy alignment on ‘killer robots’, or ‘lethal autonomous weapons systems’. In the report, Human Rights Watch objects to delegating lethal force to machines without meaningful human control. Bonnie Docherty, senior arms researcher at Human Rights Watch, said: ‘The fundamental moral, legal and security concerns raised by autonomous weapons systems warrant a strong and urgent response in the form of a new international treaty ... International law needs to be expanded to create new rules that ensure human control and accountability in the use of force.’ Human Rights Watch proposes a treaty covering all weapons that operate autonomously, including limitations and restrictions such as a ban on killer robots, and argues that meaningful human control must be involved in the selection and engagement of targets. The report goes on to define the scope of ‘meaningful human control’, ensuring that humans have access to the data, risks and potential impacts before authorizing an attack.
With reference to Source A, identify two different or unexpected impacts of ‘killer robots’.
With reference to Source D, explain why it may be difficult to reach global agreement on ‘killer robot’ policy.
Compare and contrast how Sources B and C present events involving unmanned combat aerial vehicles.
With reference to the sources and your own knowledge, evaluate the decision to ban automated military technology.
Drones are widely used for surveillance in law enforcement and border control. While they enhance monitoring capabilities and can improve public safety, drones also raise concerns about privacy, consent, and the potential misuse of surveillance technology in public and private spaces.
Discuss the impact of drone technology on public surveillance and privacy, considering both the benefits for security and the ethical implications for individual privacy rights.
Define the term ‘autonomous vehicle’.
Identify and explain the function of two sensors on an autonomous vehicle.
Explain how sensors would be used by autonomous vehicles to avoid obstacles in the road.
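As a study aid (not part of the exam paper), the kind of obstacle-avoidance logic this question asks about can be sketched in a few lines of Python. The sensor inputs, distance threshold and function name below are hypothetical, chosen purely for illustration:

```python
# Hypothetical sketch: how an autonomous vehicle might combine sensor
# readings to decide on a manoeuvre. Distances are in metres.

def plan_action(front_lidar_m: float, left_clear: bool, right_clear: bool) -> str:
    """Choose a manoeuvre from simplified sensor input."""
    SAFE_DISTANCE_M = 30.0            # assumed braking/steering threshold
    if front_lidar_m >= SAFE_DISTANCE_M:
        return "continue"             # path ahead is clear
    if right_clear:
        return "steer_right"          # move around the obstacle
    if left_clear:
        return "steer_left"
    return "brake"                    # no clear lane: slow to a stop

print(plan_action(50.0, False, False))  # continue
print(plan_action(10.0, True, False))   # steer_left
print(plan_action(10.0, False, False))  # brake
```

In a real vehicle this decision would fuse many sensors (LiDAR, radar, cameras, ultrasonic) and run continuously, but the same principle applies: sensor readings are compared against safety thresholds to trigger braking or steering.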
Machine learning (ML) allows systems to learn from data, enabling applications like image recognition in social media and fraud detection in finance. These applications rely on different types of machine learning, such as supervised learning, where algorithms are trained on labeled data, and unsupervised learning, where systems find patterns without labels.
Identify two types of machine learning and describe their uses.
Outline how supervised learning is applied in image recognition.
Explain how unsupervised learning helps in detecting fraud in financial transactions.
Evaluate the challenges of using machine learning for high-stakes decisions, such as in financial fraud detection, considering both accuracy and accountability.
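To make the fraud-detection idea above concrete (again as a study aid, not a model answer), a minimal unsupervised approach flags transactions that deviate sharply from a customer's typical spending, with no labels required. The data, threshold and function name here are assumptions for the sketch:

```python
# Hypothetical sketch: unsupervised anomaly detection for fraud screening.
# A transaction far from the typical amount is flagged for human review;
# the z-score threshold and data are illustrative only.
from statistics import mean, stdev

def flag_anomalies(amounts: list[float], z_threshold: float = 2.5) -> list[float]:
    """Return amounts more than z_threshold standard deviations from the mean."""
    mu, sigma = mean(amounts), stdev(amounts)
    return [a for a in amounts if abs(a - mu) / sigma > z_threshold]

history = [12.5, 40.0, 33.2, 25.0, 18.9, 29.4, 22.1, 31.0, 27.5, 950.0]
print(flag_anomalies(history))  # only the 950.0 outlier is flagged
```

Note how this connects to the evaluation question: the system finds the outlier without ever being told what fraud looks like, but a human must still judge whether the flagged transaction is actually fraudulent, which is where accountability concerns arise.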
Robots and autonomous technologies are being deployed in diverse contexts, from industrial robots in factories to virtual assistants in homes. Each type of robot serves a distinct purpose, helping with productivity, service, or social interaction in different sectors.
Identify two types of robots and provide an example of each.
Outline one benefit of using service robots in retail environments.
Explain how virtual assistants, such as Alexa or Siri, enhance user experience in home environments.
Evaluate the impact of productivity robots on manufacturing efficiency, considering both advantages and potential risks for the workforce.
The integration of autonomous vehicles (AVs) has the potential to transform urban infrastructure by reducing traffic congestion, improving road safety, and optimizing parking. However, AVs also require infrastructure modifications, and their presence raises questions about road-sharing laws, pedestrian safety, and ethical decision-making in complex traffic situations.
Evaluate the extent to which autonomous vehicles should influence the redesign of urban infrastructure, considering both the potential benefits for traffic efficiency and the challenges of ensuring public safety.
Discuss how citizens could be impacted by the use of smart city digital technologies.
Source A
Source B Through this article, we share the main challenges facing the Internet of Things. Security challenges: first and foremost on the list is security. Because the backbone of IoT is data storage and sharing, the biggest question concerns the security of that data; enabling every small physical object to share information is bound to raise eyebrows. Lack of encryption: encryption sounds like the ultimate answer to security issues, but hackers may manipulate these algorithms and turn a protective system into a serious loophole. Lack of sufficient testing and updating: as the IoT market grows, production has to be faster, and to compete in this race manufacturers skimp on tests and updates. The main focus of IoT manufacturers now seems to be production, not security, so products lack proper and regular testing and updates. This leaves IoT devices prone to attack by hackers.
Source C A smart home is a household with internet-connected appliances you can remotely control using a tablet or smartphone. It uses smart devices such as smart TVs, smart thermostats, air conditioners, and even robot vacuums. These are then connected in a single network, through either hardwired or wireless systems such as Zigbee, Wi-Fi, Bluetooth and NFC. Using Internet of Things (IoT) technology, your smart appliances can communicate and share real-time data with each other, which allows the devices to perform scheduled and automated tasks. IoT home gadgets bounce data back and forth using sensors, learning and processing your patterns to adjust themselves automatically to your comfort. Some smart home Internet of Things applications are automatic light switches, burglar alarms, and voice-activated sound systems.
Source D A tired business person returns to their certified IoT smart home after a long working week. The smart security system senses they are alone and initiates the “Friday Night In” sequence. An intercom with a thoughtful, comforting voice suggests they might want to order in tonight. The business person unloads their things in the kitchen where the smart stove displays a selection of take-outs, rather than its default recipe guide. After the food arrives they retreat to the living room to watch some TV. The smart TV prepares a selection of Netflix marathons categorized by mood. They choose: “Looking to be cheered up? Comedy Playlist.” Before starting the program, they review a set of graphs displaying the data from their activity and diet throughout the day. A list of tips for smart living is generated, one of which reads that based on the number of consecutive nights spent alone, they might consider exploring a selection of popular dating sites instead of watching TV. With an inadvertent slip of their thumb the request is OK’d and instantly a set of profiles are displayed, each chosen from a generated list of their tracked preferences. A flurry of pings and messages from other stay-at-home hopefuls fills the screen. The smart home intercom exclaims, “You’ve got mail!” The confused and beleaguered business person fumbles for the remote and… uh-oh, the TV snaps a selfie in response to the flood of pings. Their image, sitting in their underwear eating noodles appears briefly on the screen before being whisked off into the ether. The flood of messages doubles, the system freezes causing the smart home to reboot. The house goes dark.
With reference to Source A, identify two positive impacts of IoT.
With reference to Source B and one other real-world example you have studied, explain why it may be difficult to implement IoT in smart homes.
Compare what Source C and Source D reveal about the impacts of IoT smart homes on homeowners.
With reference to the sources and your own knowledge, discuss whether the use of IoT in smart homes is an incremental or transformational change.
Identify two examples of autonomous technologies.
Outline one use a social robot may have in a health care environment.
Distinguish between social robots and service robots.
With reference to a real-world example, discuss whether social robots should be used as ‘companion robots’ for elderly people.
Autonomous vehicles rely on AI for navigation and decision-making. In cases of accidents, assigning accountability can be challenging, as decisions are made by the vehicle’s AI system rather than a human driver. This issue raises legal and ethical questions about responsibility in AI-driven systems.
Evaluate the challenges of ensuring accountability in autonomous vehicles, considering both the potential for safer roads and the complexities of AI decision-making.