React + the Spark large model: building an extensible, contextual AI Q&A page

Foreword

The core features of the open source project I have been working on are running smoothly, and a couple of days ago I had an idea: could the project integrate a large model to help users work with the platform? So I went to study the recently popular domestic large model, iFlytek Spark.

Getting the large model API

Console login

Address: https://console.xfyun.cn/app/myapp
After creating a new application, click on it:
(screenshot: the application list in the console)

Get the API address and keys

(screenshot: the service overview page, with the model version on the left and the APPID / APISecret / APIKey on the right)
Select the model version on the left; the circled area on the right contains the credentials our API calls will use.

If no token is shown, or the keys cannot be used, your account may not have completed real-name authentication; complete that first!

Below is the web calling interface; this is the API address we will call later:
(screenshot: the WebSocket API address for web calls)

Technology stack

React hooks + TypeScript + semi-ui (component library, optional)

Install the dependencies:

npm i crypto-js base-64
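
Since the project is in TypeScript, the corresponding type packages are also worth installing (an optional suggestion; skip it if your setup already resolves these types):

npm i -D @types/crypto-js @types/base-64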

Implementation

With the API address and keys in hand, let's get straight into the code!

Directory

(screenshot: directory layout of the ai module: config, utils, server/AiTool, Chat.tsx)

utils (utility module)

The getWebsocketUrl method in the utility module below is responsible for constructing the signed API URL. For the details of why it is built this way, refer to the official documentation.

import * as base64 from 'base-64';
import CryptoJs from 'crypto-js';
import { requestObj } from '../config';

// Builds the signed WebSocket URL required by the Spark API
export const getWebsocketUrl = () => {
  return new Promise<string>((resolve) => {
    let url = 'wss://spark-api.xf-yun.com/v1.1/chat';
    const host = 'spark-api.xf-yun.com';
    const apiKeyName = 'api_key';
    const date = new Date().toUTCString();
    const algorithm = 'hmac-sha256';
    const headers = 'host date request-line';
    // The signature origin joins host, date and request line with "\n"
    const signatureOrigin = `host: ${host}\ndate: ${date}\nGET /v1.1/chat HTTP/1.1`;
    const signatureSha = CryptoJs.HmacSHA256(signatureOrigin, requestObj.APISecret);
    const signature = CryptoJs.enc.Base64.stringify(signatureSha);

    const authorizationOrigin = `${apiKeyName}="${requestObj.APIKey}", algorithm="${algorithm}", headers="${headers}", signature="${signature}"`;

    const authorization = base64.encode(authorizationOrigin);

    // Assemble the final URL; the date must be URI-encoded
    url = `${url}?authorization=${authorization}&date=${encodeURI(date)}&host=${host}`;

    resolve(url);
  });
};

config (configuration module)

Fill in the APPID, APIKey and APISecret obtained above here; the Uid can be any value.
(screenshot: the config file with the keys filled in)
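
For reference, here is a minimal sketch of what the config module could look like, inferred from how requestObj is used in the code below; the exact file in my project is not reproduced, so treat the field values as placeholders and fill in your own credentials from the console:

// config.ts: a minimal sketch, not the exact file from the project.
// APPID / APISecret / APIKey come from the console; Uid can be any string.
export const requestObj = {
  APPID: 'your-app-id',
  APISecret: 'your-api-secret',
  APIKey: 'your-api-key',
  Uid: 'your-uid'
};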

server/AiTool (WebSocket service tool)

This tool component receives questions from the parent component and returns the corresponding response data.

  • **forwardRef, useImperativeHandle, props** implement parent-child communication
  • **getWebsocketUrl** returns the signed URL used to open the WebSocket connection
  • The **websocket** streams back the AI response data and reports the loading status in real time
import { forwardRef, useImperativeHandle, useState } from 'react';
import { requestObj } from '../config';
import { getWebsocketUrl } from '../utils';

interface AiToolProps {
  isText?: boolean;
  respondHoodle: (result: string) => void; // response data
  loadHoodle?: (isLoading: boolean) => void; // loading status
  errorHoodle?: (isLoading: boolean) => void; // failed call
}

interface CropperRef {
  submitHoodle: (v: any) => void; // called by the parent component
}

const AiTool = forwardRef<CropperRef, AiToolProps>(function AiTool(
  { isText, respondHoodle, loadHoodle, errorHoodle },
  ref
) {
  let result: string = '';
  const [historyMessage, setHistoryMessage] = useState<any[]>([
    { role: 'user', content: 'Who are you' }, // the user's historical questions
    { role: 'assistant', content: 'I am an AI assistant' }
  ]);

  useImperativeHandle(ref, () => ({
    submitHoodle: sendMsg
  }));

  const sendMsg = async (questionText: string) => {
    result = '';
    // Get the request address
    const myUrl = await getWebsocketUrl();
    // Every question opens a new websocket connection
    const socket = new WebSocket(myUrl);
    // Listen to each stage of the websocket lifecycle and handle it accordingly
    socket.addEventListener('open', () => {
      if (loadHoodle) loadHoodle(true);
      // Send the question
      const params = {
        header: {
          app_id: requestObj.APPID,
          uid: 'wzz'
        },
        parameter: {
          chat: {
            domain: 'general',
            temperature: 0.5,
            max_tokens: 1024
          }
        },
        payload: {
          message: {
            // For contextual answers, pass the historical Q&A to the server on every request, as below.
            // Note: the total number of tokens in text must stay within 8192; for longer conversations,
            // trim the history appropriately.
            text: [
              ...historyMessage,
              // ...omitted historical dialogue
              { role: 'user', content: questionText } // the latest question; if no context is needed, pass only this
            ]
          }
        }
      };
      socket.send(JSON.stringify(params));
    });
    socket.addEventListener('message', (event) => {
      const data = JSON.parse(event.data);
      if (!data.payload) {
        socket.close();
        return;
      }
      result += data.payload.choices.text[0].content;
      respondHoodle(result);
      if (data.header.code !== 0) {
        console.log('An error occurred', data.header.code, ':', data.header.message);
        // An error occurred, close the connection manually
        socket.close();
      }
      if (data.header.code === 0) {
        // status === 2 means the answer is complete
        if (data.payload.choices.text && data.header.status === 2) {
          setTimeout(() => {
            // The conversation is finished, close the connection manually
            socket.close();
          }, 1000);
        }
      }
    });
    socket.addEventListener('close', () => {
      // Once the socket closes, append this round to the chat history
      setHistoryMessage([
        ...historyMessage,
        { role: 'user', content: questionText },
        { role: 'assistant', content: result }
      ]);
      if (loadHoodle) loadHoodle(false);
    });
    socket.addEventListener('error', (event) => {
      if (errorHoodle) errorHoodle(true);
      console.log('Connection error!', event);
    });
  };
  // return result;
  return null;
});
export default AiTool;
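
The payload comment above notes that the total content in text has to stay within 8192 tokens. A minimal sketch of one way to bound the history, assuming simple turn-based trimming is acceptable rather than exact token counting, would be:

// Rough sketch: keep only the last MAX_TURNS question/answer pairs before building payload.message.text.
// This bounds the history size but does not count tokens exactly.
const MAX_TURNS = 4;
const trimmedHistory = historyMessage.slice(-MAX_TURNS * 2);
// ...then spread trimmedHistory instead of historyMessage into text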

Chat.tsx (the chat component)

The concrete implementation of the chat component

  • **messageList** stores the message history
  • **submit** sends a question
  • **overRespond** handles the end of a response: it appends the finished answer and clears the loading state
  • **moveY** scrolls back to the bottom
import { memo, useRef, useState } from 'react';
// types
import type { FC } from 'react';
import styles from './index.module.scss';
import { Button, Spin } from '@douyinfe/semi-ui';
import AiTool from '@/ai/server/AiTool';

interface IProps {
  datas?: any[];
}

// user: true means the message was sent by the user, otherwise by the AI
interface messageInfo {
  text: string;
  user: boolean;
}

const Chat: FC<IProps> = () => {
  const [question, setQuestion] = useState<string>('');
  // const [result, setResult] = useState<string>('');
  let result = '';
  const [isLoading, setIsLoading] = useState<boolean>(false);
  const [messageList, setMessageList] = useState<messageInfo[]>([]);
  const ref = useRef<any>(null);
  const messageContainerRef = useRef<any>(null);
  const loadingRef = useRef<any>(null);

  // Append the question to the message list and forward it to AiTool
  const submit = () => {
    setQuestion('');
    setMessageList((prevList) => [...prevList, { user: true, text: question }]);
    moveY();
    if (ref.current) {
      ref.current.submitHoodle(question);
    }
  };

  // Streamed response: update the text of the message currently being loaded
  const respondHoodle = (respond: string) => {
    result = respond;
    loadingRef.current.innerText = result;
    moveY();
  };

  // When loading ends, append the finished answer to the message list
  const overRespond = (v: boolean) => {
    if (!v) {
      setMessageList((prevList) => [...prevList, { user: false, text: result }]);
      moveY();
    }
    setIsLoading(v);
  };

  const handleSendMessage = (e: any) => {
    e.preventDefault();
  };

  const handleKeyPress = (e: any) => {
    if (e.key === 'Enter') {
      submit();
    }
  };

  // Scroll back to the bottom
  const moveY = () => {
    const h = messageContainerRef.current.scrollHeight;
    messageContainerRef.current.scrollTop = h + 20;
  };

  return (
    <div className={styles.chat}>
      <div className={styles.chat__main}>
        <header className={styles.chat__mainHeader}>
          <p>Welcome to Qingyou AI Assistant!</p>
          <div>
            <Button onClick={moveY} style={{ marginRight: 4 }}>
              Return to bottom
            </Button>
            <Button type="danger" theme="solid" onClick={() => setMessageList([])}>
              Clear chat history
            </Button>
          </div>
        </header>
        {/* Display the messages that have been sent */}
        <div className={styles.message__container} ref={messageContainerRef}>
          {messageList.map((item, index) => {
            return item.user ? (
              <div key={item.user.toString() + index} className={styles.message__chats}>
                <p className={styles.sender__name}>You</p>
                <div className={styles.message__sender}>
                  <p>{item.text}</p>
                </div>
              </div>
            ) : (
              <div key={item.user.toString() + index} className={styles.message__chats}>
                <p>Ai</p>
                <div className={styles.message__recipient}>
                  <p>{item.text}</p>
                </div>
              </div>
            );
          })}
          {isLoading ? (
            <div className={styles.message__chats}>
              <p>Ai</p>
              <div className={styles.message__recipient}>
                <p ref={loadingRef}>{result}</p>
                <Spin />
              </div>
            </div>
          ) : (
            ''
          )}
        </div>
        <div className={styles.chat__footer}>
          <form className="form" onSubmit={handleSendMessage}>
            <input
              type="text"
              placeholder="Write a message"
              className={styles.message}
              value={question}
              onChange={(e) => setQuestion(e.target.value)}
              onKeyUp={handleKeyPress}
            />
            <Button onClick={submit} type="primary" theme="solid" className={styles.sendBtn}>
              Send
            </Button>
          </form>
        </div>
        <AiTool loadHoodle={overRespond} respondHoodle={respondHoodle} ref={ref} />
      </div>
    </div>
  );
};

export default memo(Chat);

The scss is not included here. If you need it, leave a comment and I will share it.

Results

Renderings

(screenshot: the finished chat page in action)

Afterword

There is a lot more that can be built on top of the server-side AiTool.tsx tool. I will dig into that next. Keep going!