<h3>Researcher in Multimodal Large Language Models (LLMs)</h3>
<p>Fondazione Bruno Kessler (FBK) is seeking a Researcher in Multimodal Large Language Models (LLMs).</p>
<p>The Machine Translation (MT) Unit pursues research in automatic translation, from both text and speech, to enable multilingual communication across modalities. Its core activities include the creation of advanced models and their adaptation to diverse contexts, domains, and applications (e.g., simultaneous speech translation and subtitling), leveraging the latest technology, including LLMs and foundation models. The unit also conducts fundamental and applied research on the ethical and societal impact of translation technology, with a long history in the design of evaluation methods, resources, and studies involving human participants (e.g., gender-inclusive and human-centered AI).</p>
<h3>Job Description</h3>
<p>The successful candidate is expected to research, design, and develop advanced models for multimodal (speech and video) processing, with a focus on novel architectures, fine-tuning strategies, integration of background knowledge (e.g., documents from knowledge bases), integration of speech and video modalities into LLMs, and optimization for computational efficiency. The candidate will also contribute to the evaluation and integration of these solutions in real-world use cases.</p>
<p>The successful candidate will contribute to the following tasks:</p>
<ul>
<li>Conduct research in one or more of the following areas: multimodal (speech, video) integration, speech translation, and low-resource fine-tuning;</li>
<li>Design, develop, and deploy advanced machine learning algorithms, with a focus on efficiency;</li>
<li>Collaborate with team members to integrate research outcomes into practical solutions;</li>
<li>Contribute to publications and present research outcomes.</li>
</ul>
<h3>Job Requirements</h3>
<p>The ideal candidate should have:</p>
<ul>
<li>Master’s Degree in areas related to Computer Science;</li>
<li>Solid background in deep learning;</li>
<li>Strong theoretical knowledge of the foundational concepts behind multimodal LLMs (e.g., the Transformer architecture);</li>
<li>Excellent programming skills (Python);</li>
<li>Proficiency in English;</li>
<li>Team-oriented attitude with excellent communication and interpersonal skills;</li>
<li>Self-driven and proactive approach, contributing actively to common research goals.</li>
</ul>
<p>Furthermore, the following elements will be positively evaluated:</p>
<ul>
<li>PhD Degree in areas related to Computer Science;</li>
<li>Expertise in working on large-scale projects and in distributed environments;</li>
<li>Ability to develop and manage a research agenda, including contributing to research proposals and mentoring students;</li>
<li>Strong publication record (major conferences and top-ranked journals) in one or more of the following fields: machine translation, speech translation.</li>
</ul>
<h3>Employment</h3>
<p>Type of contract: fixed-term contract</p>
<p>Working hours: full time (38 h per week)</p>
<p>Start date: February 2026</p>
<p>Duration: 24 months, with the possibility of extension depending on funding</p>
<p>Gross annual salary: from €38,539.67 to about €44,087.26, depending on background and expertise in the field, plus a bonus linked to the achievement of objectives</p>
<p>Benefits: flexi-time; company-subsidized cafeteria or meal vouchers; internal car park; welcome office support for visa formalities and accommodation search; supplementary pension and health fund; social security (SANIFONDS); family-work balance measures; free training courses; support with opening a bank account; discounts on public transport, sport, and language course fees; counseling and psychological support service.</p>
<p>More info at</p>
<h3>Application</h3>
<p>Interested candidates are requested to submit their application by completing the online form. Please make sure that your application contains the following attachments (in PDF format):</p>
<ul>
<li>Detailed CV including relevant past experiences (as an attached document in PDF format);</li>
<li>Cover Letter (explaining your motivation for this specific position).</li>
</ul>