Axon Lawyers

Law Practice

We do life sciences law and regulation - medicines, medical devices and technology, biotech and food

About us

Axon Lawyers is an Amsterdam-based law firm with an international orientation. Our focus is on the legal and regulatory aspects of the life sciences sector. We offer over 30 years of experience, gained as attorneys-at-law and in-house counsel. We are familiar with the needs of small and large companies, both privately and publicly funded. We look forward to getting to know you and to providing you with the services you need.

Industry
Law Practice
Company size
2-10 employees
Headquarters
Amsterdam
Type
Privately held
Founded
2011
Specialties
Life Sciences, Intellectual Property, Financing of Start-ups and Spin-outs, Pharmaceutical contracts, Medical Devices, and Regulatory aspects

Locations

Employees of Axon Lawyers

Updates

  • Axon Lawyers

    744 followers

    Join our colleague Judith de Wilde for an in-depth presentation on product liability throughout the medical device supply chain in October! #MDR #IVDR #Medicaldevice #lifecycle #EUMDR #IVDR #MedTech #FemTech

    GBA Key2Compliance

    5,606 followers

    🌟 Exciting Announcement! Meet three of our speakers for the Medical Device Product Life Cycle Conference 🌟

    Join us on October 8-9 in Stockholm, Sweden for our international conference dedicated to navigating medical device regulations and standards. Our expert speakers will share invaluable insights and tools to simplify complexities in product safety and regulatory compliance.

    🔹 Judith de Wilde, Axon Lawyers - "Product liability throughout the medical device supply chain: a changing landscape"
    🔹 Sandra Larsson, Technia - "Product Lifecycle Management: a recipe to maintain regulatory compliance"
    🔹 Erik Hansson - "Harmonization of Medical Device Regulations – current initiatives by IMDRF and the Global Harmonization Working Party"

    Don't miss out on this must-attend event for professionals involved in product safety, Quality Assurance, Regulatory Affairs, and more. Secure your spot today! 👉🏻 https://lnkd.in/dbznGMcZ

  • Axon Lawyers

    744 followers

    Read the update from our colleague Cécile van der Heijden about human oversight under the AI Act. (An illustrative 'oversight by design' sketch follows after the post below.)

    Cécile van der Heijden

    Attorney-at-law at Axon Lawyers | life sciences | MedTech | data protection | medicines | healthcare law

    It is always interesting when two entirely separate parts of one's life collide. What you may not know about me is that I am not only a lawyer but also a theology student. It was therefore interesting to read that Pope Francis (very validly) called for human oversight with respect to the use of AI (in the context of technological weapons) during the G7 summit last week.

    The need for human oversight has been recognized by the EU regulator in Article 14 of the EU AI Act for high-risk AI systems. Human oversight must aim to prevent or minimise the risks to health, safety, or fundamental rights that might arise from use of the high-risk AI system in accordance with its intended purpose or under conditions of reasonably foreseeable misuse. The latter is defined in the AI Act as: “the use of an AI system in a way that is not in accordance with its intended purpose, but which may result from reasonably foreseeable human behaviour or interaction with other systems, including other AI systems”.

    Misuse concerns both incorrect use of the AI system and off-label use. For MedTech companies, such an obligation goes beyond the normal development processes under the MDR / IVDR. An initial issue arises with identifying the expected misuse. It is impossible to await the clinical use of the device, because the provider must identify measures permitting human oversight before the AI system is placed on the market or put into service. Measures must be proportionate to the risk, the level of autonomy and the context in which the AI system is used. The AI Act remains silent about how such off-label use should be determined and when off-label use can be considered “reasonably foreseeable”. What if the AI system is deployed off-label in an entirely different manner than the provider anticipated? Should the provider have made a different assumption with respect to what uses outside of the intended purpose of the device would be reasonably foreseeable?

    The obligation regarding human oversight comes with its own design requirements. The provider should enable the people who provide the human oversight to actually understand the relevant capacities and limitations of the high-risk AI system, and to actually monitor its operation. These deployers will likely be ordinary healthcare providers without a technical background. They must be able to interpret and correct the output of the AI system, or to override, decline to use or reverse that output. This should be considered in the design process.

    This requires a new way of thinking from MedTech companies developing AI systems: 'oversight by design'. MedTech companies should incorporate oversight by design into their development processes as soon as possible.

    #AI #artificialintelligence #medtech #MDR #IVDR #IVD #medicaldevice #AIAct #AIA #EUAIAct #EUMDR #healthcare #SaMD #connecteddevice

    Pope Francis tells G7 that humans must not lose control of AI

    reuters.com
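
The 'oversight by design' point in the post above is a design requirement as much as a legal one. Purely as an illustration (a minimal sketch under stated assumptions: the review workflow, class and function names below are hypothetical and come neither from the post, the AI Act, nor any real product), the idea could look like this in code: the AI system only ever produces a proposal, and a human reviewer must explicitly accept, correct, or decline it before it takes effect, with the decision recorded.

# Illustrative sketch only: not part of the original post and not any real
# product's API. The AI output is only a proposal; a named human reviewer
# must accept, correct, or decline it before anything takes effect.
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum
from typing import Optional


class ReviewDecision(Enum):
    ACCEPT = "accept"    # use the AI proposal as-is
    CORRECT = "correct"  # replace the proposal with a human-corrected value
    DECLINE = "decline"  # do not use the AI output at all


@dataclass
class AiProposal:
    # Shown to the reviewer together with confidence and known limitations,
    # so a non-technical user can interpret the output before acting on it.
    value: str
    confidence: float
    known_limitations: str


@dataclass
class ReviewedResult:
    proposal: AiProposal
    decision: ReviewDecision
    final_value: Optional[str]
    reviewer: str
    reviewed_at: datetime


def apply_with_human_oversight(
    proposal: AiProposal,
    decision: ReviewDecision,
    reviewer: str,
    corrected_value: Optional[str] = None,
) -> ReviewedResult:
    # Nothing is applied automatically: every outcome is the result of an
    # explicit, logged human decision, so the reviewer can always override,
    # decline to use, or reverse the AI output.
    if decision is ReviewDecision.ACCEPT:
        final = proposal.value
    elif decision is ReviewDecision.CORRECT:
        if corrected_value is None:
            raise ValueError("a corrected value is required when correcting")
        final = corrected_value
    else:  # DECLINE: the output is discarded
        final = None
    return ReviewedResult(
        proposal=proposal,
        decision=decision,
        final_value=final,
        reviewer=reviewer,
        reviewed_at=datetime.now(timezone.utc),
    )

The point of the sketch is structural: because apply_with_human_oversight is the only path from an AI proposal to an applied result, the ability to interpret, correct, override, decline to use or reverse the output is built into the design rather than bolted on afterwards.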

Similar pages