Postdoc position in Responsible AI (f/m/d)
Universität Duisburg-Essen

Fixed-term
Full-time, part-time
Application deadline: 05.06.2025
Published on: 09.05.2025
Duisburg, Essen
The Ruhr area, one of Europe's largest metropolitan regions, offers attractive career opportunities for excellent scientists and scholars from around the world. In 2021, Ruhr-Universität Bochum, TU Dortmund University and the University of Duisburg-Essen established the Research Alliance Ruhr to bundle their cutting-edge international research on the most urgent challenges facing humankind. There are four research centres and a college. This is just the latest chapter in our long-standing collaboration as the University Alliance Ruhr (UA Ruhr), a community of 14,000 researchers and 120,000 students in the heart of Germany.

As part of the Research Alliance Ruhr, the Research Centre for Trustworthy Data Science and Security unites the expertise of its member universities to bridge the gaps between Psychology & Social Sciences, Artificial Intelligence & Machine Learning, Data Science & Statistical Learning, Law, Cybersecurity & Privacy, and more.

Postdoc position in Responsible AI (f/m/d)
(salary scale 14 TV-L, 100 %)


Appointment date: as soon as possible
Contract duration: 3 years
Position type: 100 percent of a full-time position (part-time is possible)
Application deadline: 05.06.2025

The Compliant & Accountable Systems Group seeks a post-doctoral researcher to work on issues relating to responsible AI. Our group takes an interdisciplinary, socio-technical approach, exploring the technical, legal, policy, social, and user dimensions of emerging technologies to improve their governance and help ensure they are appropriate, secure, safe, accountable, and aligned with the public good.

About the Role

We are looking for a post-doctoral researcher to investigate how AI and other emerging technologies can be more appropriately designed, deployed, used, governed, and challenged. This could involve analysing and measuring how systems operate in practice, conducting empirical research into their use and impact, or developing tools, technical mechanisms, and governance/regulatory strategies that support real-world and effective accountability, transparency, and oversight.

You will join a supportive interdisciplinary research environment and help address real-world challenges in understanding and shaping how emerging technologies are approached, developed, used, and governed, in ways that align with the public good and respond to individual and broader societal needs.

Research Areas and Directions

We welcome applicants with a wide range of interests and backgrounds that are relevant to responsible AI. Potential topics include (but are not limited to):
  • Transparency, oversight, and contestability in AI systems
  • Accountability in algorithmic supply chains and infrastructures
  • Technical tools for logging, tracing, and system scrutiny
  • Governance of general-purpose and open-source AI models
  • Data rights, privacy, and control in AI-driven environments
  • Participatory and human-centred methods for responsible technology
  • Legal and policy frameworks for AI accountability and risk mitigation
The research will be shaped according to the selected candidate’s strengths and interests and may involve self-initiated directions aligned with the group’s broader themes.

Who We Are Looking For

We welcome applicants from a wide range of disciplines, including computer science (e.g. systems, AI/ML, HCI, security, privacy), law (e.g. data protection, technology regulation, liability, digital rights), and public policy, to name a few.

You should have:
  • A PhD degree (completed, near-completion, or equivalent experience) in a relevant discipline
  • A demonstrated interest in responsible technology and/or AI governance
  • Excellent analytical, writing, and communication skills in English
  • The ability to work independently and collaboratively in an interdisciplinary setting
Whether your strengths lie in technical development, empirical investigation, legal analysis, socio-technical system design, or somewhere else, we encourage you to apply.

Responsibilities

  • Lead a defined research direction aligned with the group's themes, working with a high degree of independence, initiative, and leadership.
  • Conduct research using methods appropriate to the project and your background — including empirical, qualitative, quantitative, or systems-based approaches — and develop tools, prototypes, or other artefacts to explore and demonstrate key ideas.
  • Publish and present research at top-tier venues and contribute to the academic community.
  • Support the supervision, instruction, mentoring, and development of students/junior researchers.
  • Collaborate with researchers across disciplines and institutions, contributing to a supportive and engaged research environment.
  • Contribute to shaping new research agendas, funding proposals, and collaborative projects within and beyond the group.
  • Drive broader impact activities, including engagement with policy, industry, or the public where appropriate.

Opportunities and Environment

Successful candidates will join the Compliant & Accountable Systems Group, which is part of the broader RC-Trust: Research Centre for Trustworthy Data Science and Security. RC-Trust is a multi-university collaboration focusing on developing trustworthy intelligent systems through a human-centred, interdisciplinary approach. Work will also involve collaboration with the University of Cambridge.

Candidates will have access to:
  • A highly interdisciplinary and collaborative research environment.
  • Opportunities to engage with external stakeholders and contribute to policy discussions.
  • Travel opportunities for conferences, workshops, and research collaborations.
  • A network of researchers and industry partners working on AI trust and governance.

What we offer

This position is embedded in a creative, dynamic, and internationally renowned research environment. Your research will play a crucial role in developing our new Research Centre and promoting trustworthy technology to the general public. Our extensive international network of researchers and industry partners ensures a seamless transition into your next career step, whether in academia or international research institutions.

We prioritize a balanced, family-friendly work-life relationship, offering flexible working hours and options for partial remote work from home.

How to Apply

Applicants must submit the following documents (in separate PDF files):
  • A CV detailing academic qualifications, relevant experience, and publications (if applicable).
  • A one-page cover letter outlining their motivation for applying and how their background aligns with the position.
  • A one-page research statement outlining their research interests within the themes of the position.
Incomplete applications will not be reviewed. The appointment will be made on the basis of academic merit, experience, and overall alignment with the group’s research goals. Fit with the group’s needs and culture, and the potential to contribute meaningfully to ongoing or emerging research directions, will therefore be important factors in the final selection.

Please send your application by email, stating the reference number (221-25), to recruit-compacctsys@rc-trust.ai by 05.06.2025.

For informal inquiries about research specifics, please contact Prof. Jat Singh at jat@rc-trust.ai; for questions about the application process, please contact recruit-compacctsys@rc-trust.ai.

The University of Duisburg-Essen aims to promote the diversity of its members (see https://www.uni-due.de/diversity).
It seeks to increase the proportion of women on academic staff and, therefore, strongly encourages qualified women to apply.
Women will be given preferential treatment according to the NRW State Equality Act if they have equal qualifications.
Applications from suitable severely disabled persons and those of equal status according to § 2 Abs. 3 SGB IX are welcome.

