During my TNA visit at KTH Royal Institute of Technology, we pursued the task of designing efficient algorithms for combinatorial optimisation problems with algorithmic fairness in mind. It has been observed that algorithms designed solely to optimise cost or maximise profit can exhibit inherent bias in their decisions, particularly against certain groups in society such as women and people of colour [1]. Using software systems with inherent bias is questionable, and many jurisdictions prohibit deploying systems known to exhibit bias.
Sheffield is a city of tailor-made tools. Renowned for its steel craftsmanship, it was apparently once hailed as the steel-making capital of the world, and its cutlery-making even earned a mention in the Canterbury Tales. No pair of scissors will top the ones you can still, to this day, get in Sheffield, although customers are warned that the waiting time for delivery can be up to ten weeks.
Hi, my name is Maxime and I had the pleasure of doing an internship at the Scuola Normale Superiore in Pisa on the modelling of decentralised exchanges based on stochastic processes. I am French, and for two months I lived in Pisa to gain research experience abroad.
LEGISLATIVE BACKGROUND
The emergence of artificial intelligence in the educational context presents great opportunities, but also poses unprecedented challenges. The European Union introduced the AI Act (EU Reg. 2024/1689) to establish a common regulatory framework, ensuring the safe and reliable use of AI systems. The key points relating to education are:
Children interact with AI technologies through a variety of devices, such as smart toys, virtual assistants, video games, and adaptive learning platforms. While these innovations offer significant advantages, they also present substantial risks to children's privacy, safety, and overall well-being due to their inherent vulnerability, which stems from their "lack of psychophysical maturity and corresponding legal incapacity" [1].
The rise of restrictive data access policies, commonly referred to as the APIcalypse, has significantly impacted researchers' ability to collect and analyse data from social media platforms.
Recommendation systems and assistants (in short, recommenders) have become integral to our daily interactions on online platforms. These algorithms suggest items or provide solutions based on users' preferences or requests, influencing almost every aspect of our digital experience. From guiding our social connections on platforms like Facebook and Instagram to recommending products on Amazon and mapping routes on Google Maps, recommenders shape our decisions and interactions instantaneously and profoundly.
Starting on 11 June 2024, Rennes hosted TNC24 (https://tnc24.geant.org/), the largest and most prestigious research and education networking conference. SoBigData, represented by Sara Lelli and Valerio Grossi, was there to present training initiatives related to the research infrastructure.
As the "Empowering Data for Social Good" summer school comes to a close, we reflect on an incredible week filled with learning, collaboration, and groundbreaking ideas. The final session, where students presented their innovative research projects, was a fitting conclusion to an inspiring event.