🎓 Roberta-Fanshawe — Fine-Tuned QA Model for Fanshawe College

This is a fine-tuned version of deepset/roberta-base-squad2, trained on a custom Q&A dataset about Fanshawe College programs and admission requirements. It performs extractive question answering: given a question and a relevant passage of context, it returns the span of that passage that answers the question about Fanshawe’s educational offerings.

Disclaimer: This model is not affiliated with or officially endorsed by Fanshawe College. It was built using publicly available information for educational purposes.


Use Cases

  • ❓ Ask program-specific questions (e.g., admission requirements, program descriptions)
  • 🏫 Build a smart chatbot for academic assistance
  • 🤖 Plug into a retrieval-augmented generation (RAG) system (see the sketch after this list)
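
For the RAG use case, the model acts as the "reader" on top of whatever retriever you already have. Below is a minimal retrieve-then-read sketch, not taken from the original card: the pipeline call is standard transformers API, while passages, retrieve, and rag_answer are hypothetical stand-ins for a real document store and retriever.

from transformers import pipeline

# Load the fine-tuned model as an extractive QA ("reader") pipeline.
qa = pipeline(
    "question-answering",
    model="lamtdse61743/Roberta-Fanshawe",
    tokenizer="lamtdse61743/Roberta-Fanshawe",
)

# Hypothetical document store: in a real setup these passages would come from
# your own index of Fanshawe pages (BM25, embeddings, etc.).
passages = [
    "Fanshawe College requires a minimum GPA of 2.5 for admission into its "
    "Construction Project Management program.",
    "Placeholder passage: replace with text retrieved from your own documents.",
]

def retrieve(question, passages, top_k=1):
    # Toy retriever: rank passages by word overlap with the question.
    q_words = set(question.lower().split())
    ranked = sorted(passages, key=lambda p: len(q_words & set(p.lower().split())), reverse=True)
    return ranked[:top_k]

def rag_answer(question):
    # Retrieve-then-read: fetch the best passage, then extract the answer span.
    best_context = retrieve(question, passages)[0]
    return qa(question=question, context=best_context)

print(rag_answer("What is the GPA requirement for Construction Project Management?"))

Any retriever can be swapped in for retrieve(); the model itself only needs a question plus a single passage of context per call.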

❤️ Inspired by Education + AI

I created RoBERTa-Fanshawe, a domain-specific question-answering model fine-tuned on content from Fanshawe College, to contribute my AI knowledge toward improving access to academic information. This project reflects my belief that education should be more accessible, and showcases how fine-tuned language models can empower students with instant, accurate answers tailored to their institution. It’s my way of giving back to the learning community that helped shape my journey.

🏷️ Model Details

  • Base Model: deepset/roberta-base-squad2
  • Fine-tuned on: 3,500 QA pairs about Fanshawe College
  • Model size: 124M parameters (F32, Safetensors)
  • License: MIT (free for any use with attribution)
  • Author: @lamtdse61743
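
The fine-tuning dataset itself is not published with this card, so the record below is only an assumed illustration of what one of the 3,500 QA pairs might look like if it follows the SQuAD-style schema used by the deepset/roberta-base-squad2 base model (context, question, and character-offset answers). It reuses the example context from this card.

# Hypothetical training record in SQuAD-style form (assumed schema, not the
# published dataset format). For extractive QA, "answer_start" is the character
# offset of the answer text within the context.
context = (
    "Fanshawe College requires a minimum GPA of 2.5 for admission into its "
    "Construction Project Management program."
)
qa_pair = {
    "context": context,
    "question": "What is the GPA requirement for admission?",
    "answers": {"text": ["2.5"], "answer_start": [context.index("2.5")]},
}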


📢 Disclaimer

This model may not reflect the most current information from Fanshawe College. Always refer to the official Fanshawe College website for accurate and up-to-date program information.


Example Usage

from transformers import AutoTokenizer, AutoModelForQuestionAnswering
import torch

tokenizer = AutoTokenizer.from_pretrained("lamtdse61743/Roberta-Fanshawe")
model = AutoModelForQuestionAnswering.from_pretrained("lamtdse61743/Roberta-Fanshawe")

def ask_question(question, context):
    # Encode the question and context together as a single sequence pair.
    inputs = tokenizer(question, context, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # Take the most likely start and end positions of the answer span.
    start = torch.argmax(outputs.start_logits)
    end = torch.argmax(outputs.end_logits) + 1
    # Decode the predicted token span back into text.
    return tokenizer.decode(inputs["input_ids"][0][start:end], skip_special_tokens=True)

context = "Fanshawe College requires a minimum GPA of 2.5 for admission into its Construction Project Management program."
question = "What is the GPA requirement for admission?"
print(ask_question(question, context))
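
As a shorter alternative to the manual decoding above (not shown on the original card), the standard transformers question-answering pipeline handles tokenization, long-context chunking, and span decoding, and also returns a confidence score:

from transformers import pipeline

qa = pipeline("question-answering", model="lamtdse61743/Roberta-Fanshawe")

result = qa(
    question="What is the GPA requirement for admission?",
    context=(
        "Fanshawe College requires a minimum GPA of 2.5 for admission into its "
        "Construction Project Management program."
    ),
    # Because the base model was trained on SQuAD 2.0, handle_impossible_answer=True
    # lets the pipeline return an empty answer when the context contains no answer.
    handle_impossible_answer=True,
)
print(result)  # a dict with "answer", "score", "start", and "end"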