Australia targets Big Tech with tougher competition rules

Australia has proposed a law to curb anti-competitive practices by major tech companies, including fines of up to A$50 million ($33 million) for suppressing competition or preventing consumers from switching services. The move builds on recent efforts by the Labor government to regulate Big Tech, including a ban on social media use for children under 16 passed last week.

Assistant Treasurer Stephen Jones highlighted the dominance of platforms like Apple, Google, and Meta, warning that their practices stifle innovation, limit consumer choice, and inflate costs. The proposed law, inspired by the European Union’s Digital Markets Act, aims to make it easier for users to switch between services such as social media platforms, internet browsers, and app stores.

The law would empower Australia’s competition regulator to enforce compliance, investigate digital market practices, and impose fines. It prioritises oversight of app stores and ad tech services, targeting practices like promoting low-rated apps and favouring in-house services over competitors. Consultation on the legislation will run until February 14, with further discussions to refine the draft.

Big Tech companies, which dominate Australia’s digital market, have yet to comment on the proposal. Government reports reveal Google controls up to 95% of online search, Apple’s App Store handles 60% of app downloads, and Facebook and Instagram account for 79% of social media services in the country.

YouTuber partners with Google for space selfie stunt

Popular YouTuber and former NASA engineer Mark Rober has unveiled a unique project: a satellite designed to take selfies with Earth as the backdrop. Partnering with Google and T-Mobile, Rober plans to launch the satellite aboard SpaceX’s Transporter 12 mission in January 2025. Users can upload their selfies to be displayed on a Google Pixel aboard the satellite, which will snap their portrait alongside the planet when it passes over their location.

Accessing the ‘Earth selfie’ is free for subscribers of CrunchLabs, Rober’s educational engineering kit program, as well as T-Mobile customers and Google Pixel users with special codes. Codes can be redeemed starting 3 December at spaceselfie.com, and participants will receive updates on when their photo will be taken.

The satellite itself features innovative engineering, including solar arrays and dual Google Pixel phones for redundancy. To overcome challenges in orientation, Rober’s team equipped the satellite with a flywheel to shift its position for tasks like photo capture and data transmission. Rober’s latest YouTube video provides a deeper dive into the satellite’s mechanics.

Alibaba’s QwQ-32B AI model challenges OpenAI’s dominance

Alibaba has unveiled QwQ-32B-Preview, a new reasoning AI model designed to rival OpenAI’s o1 series. With 32.5 billion parameters and support for prompts up to 32,000 words, the model surpasses competitors in specific benchmarks, including logic puzzles and maths tests. Available for download under a permissive Apache 2.0 licence, it introduces robust reasoning capabilities but also exhibits limitations like language switching and occasional lapses in common sense.

The model uses test-time compute, planning its steps before answering to enable more thorough problem-solving, though this reasoning process can slow its responses. Like other AI models developed by companies in China, QwQ-32B complies with local regulatory requirements, including constraints on politically sensitive topics that keep its outputs aligned with national ideology.

Reasoning models like QwQ-32B mark a shift in AI development as traditional scaling laws show diminishing returns. Major firms, including Google, are exploring similar approaches, highlighting the race to innovate AI capabilities globally.

Australia begins trial of teen social media ban

Australia’s government is conducting a world-first trial to enforce its national social media ban for children under 16, focusing on age-checking technology. The trial, set to begin in January and run through March, will involve around 1,200 randomly selected Australians. It will help guide the development of effective age verification methods, as platforms like Meta, X (formerly Twitter), TikTok, and Snapchat must prove they are taking ‘reasonable steps’ to keep minors off their services or face fines of up to A$49.5 million ($32 million).

The trial is overseen by the Age Check Certification Scheme and will test several age-checking techniques, such as video selfies, document uploads for verification, and email cross-checking. Although platforms like YouTube are exempt, the trial is seen as a crucial step for setting a global precedent for online age restrictions, which many countries are now considering due to concerns about youth mental health and privacy.

The trial’s outcomes could influence how other nations approach enforcing age restrictions, despite concerns from some lawmakers and tech companies about privacy violations and free speech. The government has responded by assuring users that they will not be required to hand over personal data without being offered alternative verification methods. The age-check process could significantly shape global efforts to regulate social media access for children in the coming years.

Transparency issues plague UK mobile games

A recent investigation revealed that most top-selling mobile games in the UK fail to disclose the presence of loot boxes in their advertisements, despite regulations mandating transparency. Loot boxes, which provide randomised in-game items often obtained through payments, have drawn criticism for fostering addictive behaviours and targeting vulnerable groups, including children. Of the top 45 highest-grossing games analysed on Google Play, only two clearly mentioned loot boxes in their advertisements.

The UK Advertising Standards Authority, which oversees compliance, acknowledges the issue and promises further action but has faced criticism for its slow and limited enforcement. Critics argue that lax self-regulation within the gaming industry enables companies to prioritise profits over player well-being, particularly as loot boxes reportedly generate $15B annually.

Advocacy groups and researchers have voiced alarm over these findings, warning of long-term consequences. Zoë Osmond of GambleAware emphasised the risks of exposing children to gambling-like features in games, which could lead to harmful habits later in life. The gaming industry has so far resisted stricter government intervention, despite mounting evidence of non-compliance and harm.

Australian social media ban sparked by politician’s wife’s call to action

Australia has passed a landmark law banning children under 16 from using social media, following a fast-moving push led by South Australian Premier Peter Malinauskas. The law, which takes effect in November 2025, aims to protect young people from the harmful effects of social media, including mental health issues linked to cyberbullying and body image problems. The bill drew widespread support, with a government survey showing 77% of Australians backing the measure. However, it has sparked significant opposition from tech companies and privacy advocates, who argue that the law is rushed and could push young users to more dangerous parts of the internet.

The push for the national ban gained momentum after Malinauskas’s state-level initiative in September to restrict social media access for children under 14. This led to a broader federal response, with Prime Minister Anthony Albanese’s government introducing a nationwide version of the policy. The legislation eliminates parental discretion: no child under 16 will be permitted to use social media, and platforms that fail to enforce the rules face fines. This contrasts with policies in France and the US state of Florida, where minors can access social media with parental permission.

While the law has garnered support from most of Australia’s political leaders, it has faced strong criticism from social media companies like Meta and TikTok. These platforms warn that the law could drive teens to hidden corners of the internet and that the rushed process leaves many questions unanswered. Despite the backlash, the law passed with bipartisan support, and a trial of age-verification technology will begin in January to prepare for its full implementation.

The debate over the law highlights growing concerns worldwide about the impact of social media on young people. Although some critics argue that the law is an overreach, others believe it is a necessary step to protect children from online harm. With the law now in place, Australia has set a precedent that could inspire other countries grappling with similar issues.

India introduces new rules for critical telecom infrastructure

The government of India introduced the Telecommunications (Critical Telecommunication Infrastructure) Rules, 2024, on 22 November, which require telecom entities designated as Critical Telecommunication Infrastructure (CTI) to grant government-authorised personnel access to inspect hardware, software, and data. These rules are part of the Telecommunications Act, 2023, empowering the government to designate telecom networks as CTI if their disruption could severely impact national security, the economy, public health, or safety.

The rules mandate that telecom entities appoint a Chief Telecom Security Officer (CTSO) to oversee cybersecurity efforts and report incidents within six hours, a revised deadline from the original two hours proposed in the draft rules. This brings the telecom sector in India in line with existing Telecom Cyber Security Rules and CERT-In directions, though experts argue that the six-hour window does not meet global standards and may contribute to over-regulation.

Telecom networks are already governed under the Information Technology Act, creating potential overlaps with other regulatory frameworks such as the National Critical Information Infrastructure Protection Centre (NCIIPC). The rules also raise concerns about inspection protocols and data access, as they lack clarity on when inspections can be triggered or what limitations should be placed on government personnel accessing sensitive information.

Experts have also questioned the accountability measures in case of abuse of power and the potential for government officials to access the personal data of telecom subscribers during these inspections. To implement these rules, telecom entities must provide detailed documentation to the government, including network architecture, access lists, cybersecurity plans, and security audit reports. They must also maintain logs and documentation for at least two years to assist in detecting anomalies.

Additionally, remote maintenance or repairs from outside India require government approval, and upgrades to hardware or software must be reviewed within 14 days. Immediate upgrades are allowed during cybersecurity incidents, with notification to the government within 24 hours. A digital portal will be established to manage these rules, but concerns about the lack of transparency in communications have been raised. Finally, all CTI hardware, software, and spares must meet Indian Telecommunication Security Assurance Requirements.

AI brings change to US farming

Artificial intelligence is reshaping agriculture in the United States, offering solutions to longstanding challenges like labor shortages and rising costs. With US farms dwindling from 6.8 million in the 1930s to just 1.9 million in 2023, AI-powered technologies are stepping in to improve crop yields, resource efficiency, and food production. Experts emphasise AI’s ability to analyse massive amounts of data, guiding decisions on irrigation, fertilisation, and pest control to maximise productivity.

Despite its potential, adoption remains limited, with only 27% of US farmers currently using emerging technologies like AI. However, investment is projected to grow significantly, from $2B in 2024 to over $5B by 2028. Researchers at institutions like the AI Institute for Next Generation Food Systems are exploring applications from robotics to controlled indoor environments, which enable year-round farming and climate adaptability for crops like grapes.

While high upfront costs and accessibility remain hurdles, proponents believe AI can accelerate agricultural innovation and foster collaboration among farmers. By combining advanced tools and shared data, AI could help build a more sustainable food system and support the delivery of fresh, nutritious produce to underserved areas.

Mixed reactions as Australia bans social media for minors

Australia’s recent approval of a social media ban for children under 16 has sparked mixed reactions nationwide. While the government argues that the law sets a global benchmark for protecting youth from harmful online content, critics, including tech giants like TikTok, warn that it could push minors to darker corners of the internet. The law, which will fine platforms such as Meta’s Facebook and Instagram, and TikTok, up to A$49.5 million if they fail to enforce it, takes effect one year after a trial period begins in January.

Prime Minister Anthony Albanese emphasised the importance of protecting children’s physical and mental health, citing the harmful impact of social media on body image and misogynistic content. Despite widespread support—77% of Australians back the measure—many are divided. Some, like Sydney resident Francesca Sambas, approve of the ban, citing concerns over inappropriate content, while others, like Shon Klose, view it as an overreach that undermines democracy. Young people, however, expressed their intent to bypass the restrictions, with 11-year-old Emma Wakefield saying she would find ways to access social media secretly.

This ban positions Australia as the first country to impose such a strict regulation, going further than France and several US states, whose restrictions are based on parental consent. The swift passage of the law, which was fast-tracked through parliament, has drawn criticism from social media companies, which argue the law was rushed and lacked proper scrutiny. TikTok, in particular, warned that the law could worsen risks to children rather than protect them.

The move has also raised concerns about Australia’s relationship with the United States, as figures like Elon Musk have criticised the law as a potential overreach. However, Albanese defended the law, drawing parallels to age-based restrictions on alcohol, and reassured parents that while enforcement may not be perfect, it’s a necessary step to protect children online.

Starlink operations halted in Namibia for lacking licence

Namibia’s communications regulator has ordered Starlink, operated by SpaceX, to cease its operations in the country. The Communications Regulatory Authority of Namibia (CRAN) stated that the company was running a telecommunications network without the required licence.

A cease-and-desist order was issued on 26 November, demanding that Starlink immediately halt all activities. CRAN has also advised the public against purchasing or subscribing to Starlink services, warning that these actions are illegal under Namibian law.

Investigators have already confiscated unlicensed terminals from consumers and have opened criminal cases with the police. The regulator emphasised its commitment to enforcing compliance with national telecommunications regulations.

Earlier this year, Cameroon faced a similar situation, seizing equipment at ports due to licence violations. SpaceX has yet to comment on the developments in Namibia.