First, download bert-base-chinese; it is available from Hugging Face, ModelScope, and GitHub.
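
For example, the files can be pulled down with the huggingface_hub client (a minimal sketch; it requires the huggingface_hub package, and the target directory here is an arbitrary example):

from huggingface_hub import snapshot_download

# Download the model files into a local directory (the path is an example)
snapshot_download(repo_id="bert-base-chinese", local_dir="./models/bert-base-chinese")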

Install the dependencies:

pip install gradio torch transformers

import gradio as gr
import torch
from transformers import BertTokenizer, BertForQuestionAnswering

# Load the bert-base-chinese model and tokenizer from a local directory
model_name = "D:/dev/php/magook/trunk/server/learn-python/models/bert-base-chinese"
tokenizer = BertTokenizer.from_pretrained(model_name)
model = BertForQuestionAnswering.from_pretrained(model_name)


def question_answering(context, question):
    # Tokenize the question/context pair
    inputs = tokenizer(question, context, return_tensors="pt")
    # Run the model; no gradients are needed for inference
    with torch.no_grad():
        outputs = model(**inputs)
    # Per-token scores for being the start/end of the answer span
    start_scores = outputs.start_logits
    end_scores = outputs.end_logits
    # Pick the most likely start and end positions
    answer_start = torch.argmax(start_scores)
    answer_end = torch.argmax(end_scores) + 1
    # Decode the selected token span back into text
    answer = tokenizer.decode(inputs["input_ids"][0][answer_start:answer_end])
    return answer


# Build the Gradio interface
interface = gr.Interface(
    fn=question_answering,
    inputs=["text", "text"],  # two text boxes: context and question
    outputs="text",  # the answer as text
)

interface.launch()
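
The function can also be sanity-checked directly before launching the UI (the context and question strings below are made-up examples, not part of the original script):

# Quick manual test with example strings
context = "北京是中华人民共和国的首都。"
question = "中国的首都是哪里?"
print(question_answering(context, question))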

Run it:

> python llm_and_transformer/bert/use_bert-base-chinese4.py
Some weights of BertForQuestionAnswering were not initialized from the model checkpoint at D:/dev/php/magook/trunk/server/learn-python/models/bert-base-chinese and are newly initialized: ['qa_outputs.bias', 'qa_outputs.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
Running on local URL:  http://127.0.0.1:7860

To create a public link, set `share=True` in `launch()`.
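
Note the warning above: the bert-base-chinese checkpoint ships without a question-answering head, so the qa_outputs layer is randomly initialized and the demo's answers will be essentially random until the model is fine-tuned on a QA dataset. As a sketch of one alternative, assuming a publicly available fine-tuned Chinese extractive-QA checkpoint such as uer/roberta-base-chinese-extractive-qa on Hugging Face, the transformers pipeline can be used instead:

from transformers import pipeline

# Assumed checkpoint: a Chinese extractive-QA model that has already been fine-tuned
qa = pipeline("question-answering", model="uer/roberta-base-chinese-extractive-qa")
result = qa(question="中国的首都是哪里?", context="北京是中华人民共和国的首都。")
print(result["answer"])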

Visit http://127.0.0.1:7860 in a browser to try the demo.