rename prj to ai-css and add cloud model provider support

This commit is contained in:
goder 2026-01-28 22:58:38 +08:00
parent 2a97a6a98b
commit 771ca207bb
56 changed files with 1798 additions and 311 deletions

BIN
.DS_Store vendored

Binary file not shown.

201
LICENSE
View File

@@ -1,201 +0,0 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "[]"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright [yyyy] [name of copyright owner]
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

View File

@@ -1,12 +1,13 @@
 package cmd
 import (
-	"github.com/spf13/cobra"
-	"goflylivechat/models"
-	"goflylivechat/tools"
+	"ai-css/models"
+	"ai-css/tools"
 	"log"
 	"os"
 	"strings"
+	"github.com/spf13/cobra"
 )
 var installCmd = &cobra.Command{

View File

@@ -1,16 +1,17 @@
 package cmd
 import (
+	"ai-css/middleware"
+	"ai-css/router"
+	"ai-css/tools"
+	"ai-css/ws"
 	"fmt"
+	"log"
+	"os"
 	"github.com/gin-gonic/gin"
 	"github.com/spf13/cobra"
 	"github.com/zh-five/xdaemon"
-	"goflylivechat/middleware"
-	"goflylivechat/router"
-	"goflylivechat/tools"
-	"goflylivechat/ws"
-	"log"
-	"os"
 )
 var (

View File

@@ -1,8 +1,8 @@
 package common
 import (
+	"ai-css/tools"
 	"encoding/json"
-	"goflylivechat/tools"
 	"io/ioutil"
 )

View File

@@ -1,6 +1,6 @@
 {
-	"Server":"localhost",
-	"Port":"3306",
+	"Server":"192.168.1.81",
+	"Port":"33306",
 	"Database":"goflychat",
 	"Username":"goflychat",
 	"Password":"goflychat"

View File

@@ -1,8 +1,8 @@
 package controller
 import (
+	"ai-css/models"
 	"github.com/gin-gonic/gin"
-	"goflylivechat/models"
 )
 func GetAbout(c *gin.Context) {

View File

@@ -1,9 +1,9 @@
 package controller
 import (
+	"ai-css/models"
+	"ai-css/tools"
 	"github.com/gin-gonic/gin"
-	"goflylivechat/models"
-	"goflylivechat/tools"
 	"time"
 )

View File

@@ -1,8 +1,8 @@
 package controller
 import (
+	"ai-css/models"
 	"github.com/gin-gonic/gin"
-	"goflylivechat/models"
 )
 func Index(c *gin.Context) {

View File

@@ -1,9 +1,9 @@
 package controller
 import (
+	"ai-css/common"
+	"ai-css/models"
 	"github.com/gin-gonic/gin"
-	"goflylivechat/common"
-	"goflylivechat/models"
 	"strconv"
 )

View File

@@ -1,10 +1,10 @@
 package controller
 import (
+	"ai-css/models"
+	"ai-css/tools"
+	"ai-css/ws"
 	"github.com/gin-gonic/gin"
-	"goflylivechat/models"
-	"goflylivechat/tools"
-	"goflylivechat/ws"
 	"net/http"
 )

View File

@@ -1,9 +1,9 @@
 package controller
 import (
+	"ai-css/models"
+	"ai-css/tools"
 	"github.com/gin-gonic/gin"
-	"goflylivechat/models"
-	"goflylivechat/tools"
 	"time"
 )

View File

@@ -1,18 +1,19 @@
 package controller
 import (
+	"ai-css/common"
+	"ai-css/models"
+	"ai-css/tools"
+	"ai-css/ws"
 	"errors"
 	"fmt"
-	"github.com/gin-gonic/gin"
-	"github.com/jinzhu/gorm"
-	"goflylivechat/common"
-	"goflylivechat/models"
-	"goflylivechat/tools"
-	"goflylivechat/ws"
 	"io/ioutil"
 	"log"
 	"os"
 	"strings"
+	"github.com/gin-gonic/gin"
+	"github.com/jinzhu/gorm"
 )
 func PostInstall(c *gin.Context) {

View File

@@ -1,19 +1,25 @@
 package controller
 import (
+	"ai-css/common"
+	"ai-css/library/logger"
+	"ai-css/library/modelprovider"
+	"ai-css/library/modelprovider/bootstrap"
+	"ai-css/library/modelprovider/consts"
+	"ai-css/models"
+	"ai-css/tools"
+	"ai-css/ws"
+	"context"
 	"encoding/json"
 	"fmt"
-	"github.com/gin-gonic/gin"
-	"github.com/gorilla/websocket"
-	"goflylivechat/common"
-	"goflylivechat/models"
-	"goflylivechat/tools"
-	"goflylivechat/ws"
 	"os"
 	"path"
 	"strconv"
 	"strings"
 	"time"
+	"github.com/gin-gonic/gin"
+	"github.com/gorilla/websocket"
 )
 func SendMessageV2(c *gin.Context) {
@@ -87,14 +93,20 @@ func SendMessageV2(c *gin.Context) {
 	if ok && guest != nil {
 		guest.UpdateTime = time.Now()
 	}
-	//kefuConns, ok := ws.KefuList[kefuInfo.Name]
-	//if kefuConns == nil || !ok {
-	//	c.JSON(200, gin.H{
-	//		"code": 200,
-	//		"msg":  "ok",
-	//	})
-	//	return
-	//}
+	if ws.AIAnswerAvailable(guest) {
+		// AI answer
+		var err error
+		if err = AIChat(vistorInfo.VisitorId, content, guest.Conn); err == nil {
+			c.JSON(200, gin.H{
+				"code": 200,
+				"msg":  "ok",
+			})
+			return
+		}
+		logger.Errorf("ai chat failed err:%v,visitorID:%s,content:%s", err, vistorInfo.VisitorId, content)
+	}
 	msg := ws.TypeMessage{
 		Type: "message",
 		Data: ws.ClientMessage{
@@ -351,3 +363,30 @@ func GetMessagespages(c *gin.Context) {
 		},
 	})
 }
+func AIChat(visitorID string, question string, conn *websocket.Conn) error {
+	var ctx = context.Background()
+	msgs, err := models.FindLatestMessageByVisitorId(visitorID, 3)
+	if err != nil {
+		logger.Errorf("find latest message err: %v", err)
+		return err
+	}
+	cli, err := bootstrap.DefaultAIManager.NewClient(consts.ProviderOpenAI, bootstrap.WithDefaultModel("gpt5-mini"))
+	if err != nil {
+		logger.Errorf("init gpt cli fail err:%v", err)
+		return err
+	}
+	history := MakeAIMsg(msgs, question)
+	_ = history // TODO: attach the history to the ChatRequest and write replies to conn
+	err = cli.StreamChat(ctx, modelprovider.ChatRequest{}, func(modelprovider.StreamEvent) error {
+		return nil
+	})
+	if err != nil {
+		logger.Errorf("stream chat fail err:%v", err)
+		return err
+	}
+	return nil
+}
+func MakeAIMsg(msg []models.Message, curcontent string) []modelprovider.Message {
+	out := make([]modelprovider.Message, 0, len(msg)+1)
+	// TODO: map the history records and curcontent into provider messages
+	return out
+}

View File

@@ -1,8 +1,8 @@
 package controller
 import (
+	"ai-css/models"
 	"github.com/gin-gonic/gin"
-	"goflylivechat/models"
 )
 func GetNotice(c *gin.Context) {

View File

@@ -1,8 +1,8 @@
 package controller
 import (
+	"ai-css/models"
 	"github.com/gin-gonic/gin"
-	"goflylivechat/models"
 	"log"
 )

View File

@@ -1,8 +1,8 @@
 package controller
 import (
+	"ai-css/models"
 	"github.com/gin-gonic/gin"
-	"goflylivechat/models"
 )
 func GetRoleList(c *gin.Context) {

View File

@@ -1,8 +1,9 @@
 package controller
 import (
+	"ai-css/models"
 	"github.com/gin-gonic/gin"
-	"goflylivechat/models"
 )
 func GetConfigs(c *gin.Context) {

View File

@@ -1,11 +1,11 @@
 package controller
 import (
+	"ai-css/models"
+	"ai-css/tools"
+	"ai-css/ws"
 	"encoding/json"
 	"fmt"
-	"goflylivechat/models"
-	"goflylivechat/tools"
-	"goflylivechat/ws"
 	"log"
 	"strconv"
 	"time"

View File

@@ -1,12 +1,12 @@
 package controller
 import (
+	"ai-css/common"
+	"ai-css/models"
+	"ai-css/tools"
+	"ai-css/ws"
 	"encoding/json"
 	"github.com/gin-gonic/gin"
-	"goflylivechat/common"
-	"goflylivechat/models"
-	"goflylivechat/tools"
-	"goflylivechat/ws"
 	"strconv"
 )

View File

@@ -1,10 +1,10 @@
 package controller
 import (
+	"ai-css/models"
 	"crypto/sha1"
 	"encoding/hex"
 	"github.com/gin-gonic/gin"
-	"goflylivechat/models"
 	"log"
 	"sort"
 )

55
go.mod
View File

@@ -1,6 +1,8 @@
-module goflylivechat
+module ai-css
-go 1.16
+go 1.22
+toolchain go1.22.4
 require (
 	github.com/dchest/captcha v0.0.0-20200903113550-03f5f0333e1f
@@ -14,10 +16,55 @@
 	github.com/gorilla/websocket v1.4.2
 	github.com/ipipdotnet/ipdb-go v1.3.0
 	github.com/jinzhu/gorm v1.9.14
-	github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd // indirect
-	github.com/modern-go/reflect2 v1.0.1 // indirect
+	github.com/openai/openai-go/v3 v3.17.0
 	github.com/satori/go.uuid v1.2.0
 	github.com/sirupsen/logrus v1.4.2
 	github.com/spf13/cobra v0.0.5
+	github.com/stretchr/testify v1.10.0
 	github.com/zh-five/xdaemon v0.1.1
+	go.uber.org/zap v1.27.1
+	gopkg.in/natefinch/lumberjack.v2 v2.2.1
 )
+require (
+	github.com/davecgh/go-spew v1.1.1 // indirect
+	github.com/gin-contrib/sse v0.1.0 // indirect
+	github.com/go-playground/locales v0.13.0 // indirect
+	github.com/go-playground/universal-translator v0.17.0 // indirect
+	github.com/go-playground/validator/v10 v10.4.1 // indirect
+	github.com/gobuffalo/envy v1.7.0 // indirect
+	github.com/gobuffalo/logger v1.0.0 // indirect
+	github.com/gobuffalo/packd v0.3.0 // indirect
+	github.com/golang/protobuf v1.3.3 // indirect
+	github.com/gorilla/context v1.1.1 // indirect
+	github.com/gorilla/securecookie v1.1.1 // indirect
+	github.com/gorilla/sessions v1.1.3 // indirect
+	github.com/inconshreveable/mousetrap v1.0.0 // indirect
+	github.com/jinzhu/inflection v1.0.0 // indirect
+	github.com/joho/godotenv v1.3.0 // indirect
+	github.com/json-iterator/go v1.1.9 // indirect
+	github.com/karrick/godirwalk v1.10.12 // indirect
+	github.com/konsorten/go-windows-terminal-sequences v1.0.2 // indirect
+	github.com/kr/pretty v0.3.1 // indirect
+	github.com/leodido/go-urn v1.2.0 // indirect
+	github.com/mattn/go-isatty v0.0.12 // indirect
+	github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd // indirect
+	github.com/modern-go/reflect2 v1.0.1 // indirect
+	github.com/pmezard/go-difflib v1.0.0 // indirect
+	github.com/rogpeppe/go-internal v1.12.0 // indirect
+	github.com/spf13/pflag v1.0.3 // indirect
+	github.com/tidwall/gjson v1.18.0 // indirect
+	github.com/tidwall/match v1.1.1 // indirect
+	github.com/tidwall/pretty v1.2.1 // indirect
+	github.com/tidwall/sjson v1.2.5 // indirect
+	github.com/ugorji/go/codec v1.1.7 // indirect
+	go.uber.org/multierr v1.10.0 // indirect
+	golang.org/x/crypto v0.32.0 // indirect
+	golang.org/x/mod v0.17.0 // indirect
+	golang.org/x/sync v0.10.0 // indirect
+	golang.org/x/sys v0.29.0 // indirect
+	golang.org/x/term v0.28.0 // indirect
+	gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c // indirect
+	gopkg.in/yaml.v2 v2.4.0 // indirect
+	gopkg.in/yaml.v3 v3.0.1 // indirect
+)

58
go.sum
View File

@@ -9,6 +9,7 @@ github.com/coreos/etcd v3.3.10+incompatible/go.mod h1:uF7uidLiAD3TWHmW31ZFd/JWoc
github.com/coreos/go-etcd v2.0.0+incompatible/go.mod h1:Jez6KQU2B/sWsbdaef3ED8NzMklzPG4d5KIOhIy30Tk=
github.com/coreos/go-semver v0.2.0/go.mod h1:nnelYz7RCh+5ahJtPPxZlU+153eP4D4r3EedlOD2RNk=
github.com/cpuguy83/go-md2man v1.0.10/go.mod h1:SmD6nW6nTyfqj6ABTjUi3V3JVMnlJmwcJI5acqYI6dE=
github.com/creack/pty v1.1.9/go.mod h1:oKZEueFk5CKHvIhNR5MUki03XCEU+Q6VDXinZuGJ33E=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
@@ -91,11 +92,14 @@ github.com/kidstuff/mongostore v0.0.0-20181113001930-e650cd85ee4b/go.mod h1:g2nV
github.com/konsorten/go-windows-terminal-sequences v1.0.1/go.mod h1:T0+1ngSBFLxvqU3pZ+m/2kptfBszLMUkC4ZK/EgS/cQ=
github.com/konsorten/go-windows-terminal-sequences v1.0.2 h1:DB17ag19krx9CFsz4o3enTrPXyIXCl+2iCXH/aMAp9s=
github.com/konsorten/go-windows-terminal-sequences v1.0.2/go.mod h1:T0+1ngSBFLxvqU3pZ+m/2kptfBszLMUkC4ZK/EgS/cQ=
github.com/kr/pretty v0.1.0 h1:L/CwN0zerZDmRFUapSPitk6f+Q3+0za1rQkzVuMiMFI=
github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
github.com/kr/pretty v0.2.1/go.mod h1:ipq/a2n7PKx3OHsz4KJII5eveXtPO4qwEXGdVfWzfnI=
github.com/kr/pretty v0.3.1 h1:flRD4NNwYAUpkphVc1HcthR4KEIFJ65n8Mw5qdRn3LE=
github.com/kr/pretty v0.3.1/go.mod h1:hoEshYVHaxMs3cyo3Yncou5ZscifuDolrwPKZanG3xk=
github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=
github.com/kr/text v0.1.0 h1:45sCR5RtlFHMR4UwH9sdQ5TC8v0qDQCHnXt+kaKSTVE=
github.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI=
github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
github.com/leodido/go-urn v1.1.0/go.mod h1:+cyI34gQWZcE1eQU7NVgKkkzdXDQHr1dBMtdAPozLkw=
github.com/leodido/go-urn v1.2.0 h1:hpXL4XnriNwQ/ABnpepYM/1vCLWNDfUNts8dX3xTG6Y=
github.com/leodido/go-urn v1.2.0/go.mod h1:+8+nEpDfqqsY+g338gtMEUOtuK+4dEMhiQEgxpxOKII=
@@ -116,13 +120,18 @@ github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd/go.mod h1:6dJ
github.com/modern-go/reflect2 v0.0.0-20180701023420-4b7aa43c6742/go.mod h1:bx2lNnkwVCuqBIxFjflWJWanXIb3RllmbCylyMrvgv0=
github.com/modern-go/reflect2 v1.0.1 h1:9f412s+6RmYXLWZSEzVVgPGK7C2PphHj5RJrvfx9AWI=
github.com/modern-go/reflect2 v1.0.1/go.mod h1:bx2lNnkwVCuqBIxFjflWJWanXIb3RllmbCylyMrvgv0=
github.com/openai/openai-go/v3 v3.17.0 h1:CfTkmQoItolSyW+bHOUF190KuX5+1Zv6MC0Gb4wAwy8=
github.com/openai/openai-go/v3 v3.17.0/go.mod h1:cdufnVK14cWcT9qA1rRtrXx4FTRsgbDPW7Ia7SS5cZo=
github.com/pelletier/go-toml v1.2.0/go.mod h1:5z9KED0ma1S8pY6P1sdut58dfprrGBbd/94hg7ilaic=
github.com/pkg/diff v0.0.0-20210226163009-20ebb0f2a09e/go.mod h1:pJLUxLENpZxwdsKMEsNbx1VGcRFpLqf3715MtcvvzbA=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/quasoft/memstore v0.0.0-20180925164028-84a050167438/go.mod h1:wTPjTepVu7uJBYgZ0SdWHQlIas582j6cn2jgk4DDdlg=
github.com/rogpeppe/go-internal v1.1.0/go.mod h1:M8bDsm7K2OlrFYOpmOWEs/qY81heoFRclV5y23lUDJ4=
github.com/rogpeppe/go-internal v1.3.0 h1:RR9dF3JtopPvtkroDZuVD7qquD0bnHlKSqaQhgwt8yk=
github.com/rogpeppe/go-internal v1.3.0/go.mod h1:M8bDsm7K2OlrFYOpmOWEs/qY81heoFRclV5y23lUDJ4=
github.com/rogpeppe/go-internal v1.9.0/go.mod h1:WtVeX8xhTBvf0smdhujwtBcq4Qrzq/fJaraNFVN+nFs=
github.com/rogpeppe/go-internal v1.12.0 h1:exVL4IDcn6na9z1rAb56Vxr+CgyK3nn3O+epU5NdKM8=
github.com/rogpeppe/go-internal v1.12.0/go.mod h1:E+RYuTGaKKdloAfM02xzb0FW3Paa99yedzYV+kq4uf4=
github.com/russross/blackfriday v1.5.2/go.mod h1:JO/DiYxRf+HjHt06OyowR9PTA263kcR/rfWxYHBV53g=
github.com/satori/go.uuid v1.2.0 h1:0uYX9dsZ2yD7q2RtLRtPSdGDWzjeM3TbMJP9utgA0ww=
github.com/satori/go.uuid v1.2.0/go.mod h1:dA0hQrYB0VpLJoorglMZABFdXlWrHn1NEOzdhQKdks0=
@@ -140,9 +149,19 @@ github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+
github.com/stretchr/objx v0.1.1/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
github.com/stretchr/testify v1.4.0 h1:2E4SXV/wtOkTonXsotYi4li6zVWxYlZuYNCXe9XRJyk=
github.com/stretchr/testify v1.4.0/go.mod h1:j7eGeouHqKxXV5pUuKE4zz7dFj8WfuZ+81PSLYec5m4=
github.com/ugorji/go v1.1.7 h1:/68gy2h+1mWMrwZFeD1kQialdSzAb432dtpeJ42ovdo=
github.com/stretchr/testify v1.10.0 h1:Xv5erBjTwe/5IxqUQTdXv5kgmIvbHo3QQyRwhJsOfJA=
github.com/stretchr/testify v1.10.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY=
github.com/tidwall/gjson v1.14.2/go.mod h1:/wbyibRr2FHMks5tjHJ5F8dMZh3AcwJEMf5vlfC0lxk=
github.com/tidwall/gjson v1.18.0 h1:FIDeeyB800efLX89e5a8Y0BNH+LOngJyGrIWxG2FKQY=
github.com/tidwall/gjson v1.18.0/go.mod h1:/wbyibRr2FHMks5tjHJ5F8dMZh3AcwJEMf5vlfC0lxk=
github.com/tidwall/match v1.1.1 h1:+Ho715JplO36QYgwN9PGYNhgZvoUSc9X2c80KVTi+GA=
github.com/tidwall/match v1.1.1/go.mod h1:eRSPERbgtNPcGhD8UCthc6PmLEQXEWd3PRB5JTxsfmM=
github.com/tidwall/pretty v1.2.0/go.mod h1:ITEVvHYasfjBbM0u2Pg8T2nJnzm8xPwvNhhsoaGGjNU=
github.com/tidwall/pretty v1.2.1 h1:qjsOFOWWQl+N3RsoF5/ssm1pHmJJwhjlSbZ51I6wMl4=
github.com/tidwall/pretty v1.2.1/go.mod h1:ITEVvHYasfjBbM0u2Pg8T2nJnzm8xPwvNhhsoaGGjNU=
github.com/tidwall/sjson v1.2.5 h1:kLy8mja+1c9jlljvWTlSazM7cKDRfJuR/bOJhcY5NcY=
github.com/tidwall/sjson v1.2.5/go.mod h1:Fvgq9kS/6ociJEDnK0Fk1cpYF4FIW6ZF7LAe+6jwd28=
github.com/ugorji/go v1.1.7/go.mod h1:kZn38zHttfInRq0xu/PH0az30d+z6vm202qpg1oXVMw=
github.com/ugorji/go/codec v0.0.0-20181204163529-d75b2dcb6bc8/go.mod h1:VFNgLljTbGfSG7qAOspJ7OScBnGdDN/yBr0sguwnwf0=
github.com/ugorji/go/codec v1.1.7 h1:2SvQaVZ1ouYrrKKwoSk2pzd4A9evlKJb9oTL+OaLUSs=
@@ -150,20 +169,30 @@ github.com/ugorji/go/codec v1.1.7/go.mod h1:Ax+UKWsSmolVDwsd+7N3ZtXu+yMGCf907BLY
github.com/xordataexchange/crypt v0.0.3-0.20170626215501-b2862e3d0a77/go.mod h1:aYKd//L2LvnjZzWKhF00oedf4jCCReLcmhLdhm1A27Q=
github.com/zh-five/xdaemon v0.1.1 h1:W5VyJ+5ROjjcb9vNcF/SgWPwTzIRYIsW2yZBAomqMW8=
github.com/zh-five/xdaemon v0.1.1/go.mod h1:i3cluMVOPp/UcX2KDU2qzRv25f8u4y14tHzBPQhD8lI=
go.uber.org/goleak v1.3.0 h1:2K3zAYmnTNqV73imy9J1T3WC+gmCePx2hEGkimedGto=
go.uber.org/goleak v1.3.0/go.mod h1:CoHD4mav9JJNrW/WLlf7HGZPjdw8EucARQHekz1X6bE=
go.uber.org/multierr v1.10.0 h1:S0h4aNzvfcFsC3dRF1jLoaov7oRaKqRGC/pUEJ2yvPQ=
go.uber.org/multierr v1.10.0/go.mod h1:20+QtiLqy0Nd6FdQB9TLXag12DsQkrbs3htMFfDN80Y=
go.uber.org/zap v1.27.1 h1:08RqriUEv8+ArZRYSTXy1LeBScaMpVSTBhCeaZYfMYc=
go.uber.org/zap v1.27.1/go.mod h1:GB2qFLM7cTU87MWRP2mPIjqfIDnGu+VIO4V/SdhGo2E=
golang.org/x/crypto v0.0.0-20181203042331-505ab145d0a9/go.mod h1:6SG95UA2DQfeDnfUPMdvaQW0Q7yPrPDi9nlGo2tz2b4=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/crypto v0.0.0-20190325154230-a5d413f7728c/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/crypto v0.0.0-20190621222207-cc06ce4a13d4/go.mod h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI=
golang.org/x/crypto v0.0.0-20191205180655-e7c4368fe9dd/go.mod h1:LzIPMQfyMNhhGPhUkYOs5KpL4U8rLKemX1yGLhDgUto=
golang.org/x/crypto v0.0.0-20200622213623-75b288015ac9 h1:psW17arqaxU48Z5kZ0CQnkZWQJsqcURM6tKiBApRjXI=
golang.org/x/crypto v0.0.0-20200622213623-75b288015ac9/go.mod h1:LzIPMQfyMNhhGPhUkYOs5KpL4U8rLKemX1yGLhDgUto=
golang.org/x/crypto v0.32.0 h1:euUpcYgM8WcP71gNpTqQCn6rC2t6ULUPiOzfWaXVVfc=
golang.org/x/crypto v0.32.0/go.mod h1:ZnnJkOaASj8g0AjIduWNlq2NRxL0PlBrbKVyZ6V/Ugc=
golang.org/x/mod v0.17.0 h1:zY54UmvipHiNd+pm+m0x9KhZ9hl1/7QNMyxXbc6ICqA=
golang.org/x/mod v0.17.0/go.mod h1:hTbmBsO62+eylJbnUtE2MGJUyE7QWk4xUqPFrRgJ+7c=
golang.org/x/net v0.0.0-20180218175443-cbe0f9307d01/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20190311183353-d8887717615a/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=
golang.org/x/net v0.0.0-20190404232315-eb5bcb51f2a3/go.mod h1:t9HGtf8HONx5eT2rtn7q6eTqICYqUVnKs3thJo3Qplg=
golang.org/x/net v0.0.0-20200202094626-16171245cfb2/go.mod h1:z5CRVTTTmAJ677TzLLGU+0bjPO0LkuOLi4/5GtJWs/s=
golang.org/x/net v0.0.0-20200324143707-d3edc9973b7e/go.mod h1:qpuaurCH72eLCgpAm/N6yyVIVM9cpaDIP3A8BGJEC5A=
golang.org/x/sync v0.0.0-20190423024810-112230192c58 h1:8gQV6CLnAEikrhgkHFbMAEhagSSnXWGV915qUMm9mrU=
golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.10.0 h1:3NQrjDixjgGwUOCaF8w2+VYHv0Ve/vGYSbdkTa98gmQ=
golang.org/x/sync v0.10.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
golang.org/x/sys v0.0.0-20181205085412-a5c9d58dba9a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190412213103-97732733099d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
@@ -171,18 +200,27 @@ golang.org/x/sys v0.0.0-20190422165155-953cdadca894/go.mod h1:h1NjWce9XRLGQEsW7w
golang.org/x/sys v0.0.0-20190515120540-06a5c4944438/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20190813064441-fde4db37ae7a/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200116001909-b77594299b42/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200323222414-85ca7c5b95cd h1:xhmwyvizuTgC2qz7ZlMluP20uW+C3Rm0FD/WLDX8884=
golang.org/x/sys v0.0.0-20200323222414-85ca7c5b95cd/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.29.0 h1:TPYlXGxvx1MGTn2GiZDhnjPA9wZzZeGKHHmKhHYvgaU=
golang.org/x/sys v0.29.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/term v0.28.0 h1:/Ts8HFuMR2E6IP/jlo7QVLZHggjKQbhu/7H0LJFr3Gg=
golang.org/x/term v0.28.0/go.mod h1:Sw/lC2IAUZ92udQNf3WodGtn4k/XoLyZoh8v/8uiwek=
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
golang.org/x/text v0.3.2/go.mod h1:bEr9sfX3Q8Zfm5fL9x+3itogRgK3+ptLWKqgva+5dAk=
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/tools v0.0.0-20190624180213-70d37148ca0c/go.mod h1:/rFqwRUd4F7ZHNgwSSTFct+R/Kf4OFW1sUzUTQQTgfc=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127 h1:qIbj1fsPNlZgppZ+VLlY7N33q108Sa+fhmuc+sWQYwY=
gopkg.in/check.v1 v1.0.0-20180628173108-788fd7840127/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c h1:Hei/4ADfdWqJk1ZMxUNpqntNwaWcugrBjAiHlqqRiVk=
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c/go.mod h1:JHkPIbrfpd72SG/EVd6muEfDQjcINNoR0C8j2r3qZ4Q=
gopkg.in/errgo.v2 v2.1.0/go.mod h1:hNsd1EY+bozCKY1Ytp96fpM3vjJbqLJn88ws8XvfDNI=
gopkg.in/go-playground/assert.v1 v1.2.1/go.mod h1:9RXL0bg/zibRAgZUYszZSwO/z8Y/a8bDuhia5mkpMnE=
gopkg.in/go-playground/validator.v9 v9.29.1/go.mod h1:+c9/zcJMFNgbLvly1L1V+PpxWdVbfP1avr/N00E2vyQ=
gopkg.in/natefinch/lumberjack.v2 v2.2.1 h1:bBRl1b0OH9s/DuPhuXpNl+VtCaJXFZ5/uEFST95x9zc=
gopkg.in/natefinch/lumberjack.v2 v2.2.1/go.mod h1:YD8tP3GAjkrDg1eZH7EGmyESg/lsYskCTPBJVb9jqSc=
gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.2.8 h1:obN1ZagJSUGI0Ek/LBmuj4SNLPfIny3KsKFopxRdj10=
gopkg.in/yaml.v2 v2.2.8/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
gopkg.in/yaml.v2 v2.4.0 h1:D8xgwECY7CYvx+Y2n4sBz93Jn9JRvxdiyyo8CTfuKaY=
gopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=

View File

@@ -1,7 +1,7 @@
 package main
 import (
-	"goflylivechat/cmd"
+	"ai-css/cmd"
 )
 func main() {

204
library/logger/logger.go Normal file
View File

@@ -0,0 +1,204 @@
package logger
import (
"fmt"
"os"
"runtime"
"strings"
"time"
"go.uber.org/zap"
"go.uber.org/zap/zapcore"
"gopkg.in/natefinch/lumberjack.v2"
)
var (
zlog *zap.SugaredLogger
)
type LoggerConfig struct {
Filename string `mapstructure:"filename" json:"filename" yaml:"filename"`
Level string `mapstructure:"level" json:"level" yaml:"level"`
Format string `mapstructure:"format" json:"format" yaml:"format"`
Prefix string `mapstructure:"prefix" json:"prefix" yaml:"prefix"`
Director string `mapstructure:"director" json:"director" yaml:"director"`
ShowLine bool `mapstructure:"show-line" json:"show-line" yaml:"show-line"`
EncodeLevel string `mapstructure:"encode-level" json:"encode-level" yaml:"encode-level"`
StacktraceKey string `mapstructure:"stacktrace-key" json:"stacktrace-key" yaml:"stacktrace-key"`
LogInConsole bool `mapstructure:"log-in-console" json:"log-in-console" yaml:"log-in-console"`
}
func Init(conf *LoggerConfig) {
if conf.Filename == "/dev/stdout" {
ecf := zap.NewProductionEncoderConfig()
ecf.FunctionKey = "func"
ecf.EncodeTime = zapcore.ISO8601TimeEncoder
ecf.ConsoleSeparator = " "
ecf.EncodeCaller = zapcore.ShortCallerEncoder
core := zapcore.NewCore(
zapcore.NewConsoleEncoder(ecf),
zapcore.AddSync(os.Stdout),
zap.DebugLevel,
)
zl := zap.New(core, zap.AddCallerSkip(1), zap.AddCaller())
zlog = zl.Sugar()
return
}
// Filename is treated as a log directory here; create it if missing.
_, err := os.Stat(conf.Filename)
if err != nil {
if os.IsNotExist(err) {
err = os.MkdirAll(conf.Filename, os.ModePerm)
if err != nil {
panic(err)
}
} else {
fmt.Println("logger init error:", err)
}
}
if !strings.HasSuffix(conf.Filename, "/") {
conf.Filename = conf.Filename + "/"
}
encoder := zapcore.NewConsoleEncoder(zapcore.EncoderConfig{
MessageKey: "msg",
LevelKey: "level",
EncodeLevel: zapcore.CapitalLevelEncoder,
TimeKey: "ts",
EncodeTime: func(t time.Time, enc zapcore.PrimitiveArrayEncoder) {
enc.AppendString(t.Format("2006-01-02 15:04:05"))
},
CallerKey: "file",
EncodeCaller: zapcore.ShortCallerEncoder,
EncodeDuration: func(d time.Duration, enc zapcore.PrimitiveArrayEncoder) {
enc.AppendInt64(int64(d) / 1000000)
},
})
// level enablers: one core for info-and-above, one for error-and-above
infoLevel := zap.LevelEnablerFunc(func(lvl zapcore.Level) bool {
return lvl >= zapcore.InfoLevel
})
errorLevel := zap.LevelEnablerFunc(func(lvl zapcore.Level) bool {
return lvl >= zapcore.ErrorLevel
})
// io.Writers for the info and error log files; getWriter() is implemented below
now := time.Now()
fileTime := now.Format("20060102") + "-" + now.Format("150405")
fileFormat := "%s%s-%s.log"
infoWriter := getWriter(fmt.Sprintf(fileFormat, conf.Filename, "info", fileTime))
errorWriter := getWriter(fmt.Sprintf(fileFormat, conf.Filename, "error", fileTime))
// finally build the concrete logger
core := zapcore.NewTee(
zapcore.NewCore(encoder, zapcore.AddSync(os.Stdout), infoLevel), // also print to console
zapcore.NewCore(encoder, infoWriter, infoLevel),
zapcore.NewCore(encoder, errorWriter, errorLevel),
)
zl := zap.New(core, zap.AddCallerSkip(1), zap.AddCaller())
zlog = zl.Sugar()
}
func GetDefault() *zap.SugaredLogger {
return zlog
}
func InitDefault() {
Init(&LoggerConfig{
Filename: "/dev/stdout",
})
}
func Sync() {
_ = zlog.Sync()
}
func getWriter(filename string) zapcore.WriteSyncer {
lumberJackLogger := &lumberjack.Logger{
Filename: filename, // log file path
MaxSize: 100, // max size in MB before rotation
MaxAge: 10, // max days to retain old files
MaxBackups: 3, // max number of old files to keep
Compress: false, // whether to compress/archive rotated files
}
// AddSync converts an io.Writer to a WriteSyncer. It tries to be smart:
// if the io.Writer's concrete type already implements WriteSyncer, the
// existing Sync method is used; otherwise a no-op Sync is added.
return zapcore.AddSync(lumberJackLogger)
}
func Debug(args ...interface{}) {
zlog.Debug(args...)
}
func Debugf(template string, args ...interface{}) {
zlog.Debugf(template, args...)
}
func Info(args ...interface{}) {
zlog.Info(args...)
}
func Infof(template string, args ...interface{}) {
zlog.Infof(template, args...)
}
func Warn(args ...interface{}) {
zlog.Warn(args...)
}
func Warnf(template string, args ...interface{}) {
zlog.Warnf(template, args...)
}
func Error(args ...interface{}) {
zlog.Error(args...)
}
func Errorf(template string, args ...interface{}) {
zlog.Errorf(template, args...)
}
func DPanic(args ...interface{}) {
zlog.DPanic(args...)
}
func DPanicf(template string, args ...interface{}) {
zlog.DPanicf(template, args...)
}
func Panic(args ...interface{}) {
zlog.Panic(args...)
}
func Panicf(template string, args ...interface{}) {
zlog.Panicf(template, args...)
}
func Fatal(args ...interface{}) {
zlog.Fatal(args...)
}
func Fatalf(template string, args ...interface{}) {
zlog.Fatalf(template, args...)
}
func SafeGoroutine(fn func()) {
go func() {
defer func() {
if r := recover(); r != nil {
buf := make([]byte, 1<<16) // 64KB
stackSize := runtime.Stack(buf, false)
msg := fmt.Sprintf("panic: %v\n%s\n", r, buf[:stackSize])
Error(msg)
}
}()
fn()
}()
}

View File

@ -0,0 +1,117 @@
package bootstrap
import (
"ai-css/library/modelprovider"
"ai-css/library/modelprovider/config"
"ai-css/library/modelprovider/consts"
"ai-css/library/modelprovider/providers"
"context"
"fmt"
"log"
)
type AIManager struct {
CfgMgr *config.Manager
Registry *providers.Registry
}
var DefaultAIManager *AIManager
func init() {
var err error
DefaultAIManager, err = Init(context.TODO(), &config.Manager{})
if err != nil {
log.Fatalf("init ai manager failed err:%v", err)
}
}
func Init(ctx context.Context, cfgMgr *config.Manager) (*AIManager, error) {
if err := cfgMgr.LoadConfigs(ctx); err != nil {
return nil, err
}
return &AIManager{
CfgMgr: cfgMgr,
Registry: providers.BuildRegistry(),
}, nil
}
func (a *AIManager) NewClient(providerName consts.ProviderName, opts ...ClientOption) (*modelprovider.Client, error) {
provider, finalOpts, err := a.resolveProvider(providerName, opts...)
if err != nil {
return nil, err
}
// model precedence: opts > provider default
model := finalOpts.DefaultModel
if model == "" {
model = provider.GetDefaultModel()
}
return modelprovider.NewClient(provider, model), nil
}
func (a *AIManager) NewProvider(providerName consts.ProviderName, opts ...ClientOption) (modelprovider.Provider, error) {
provider, _, err := a.resolveProvider(providerName, opts...)
return provider, err
}
func (a *AIManager) resolveProvider(providerName consts.ProviderName, opts ...ClientOption) (modelprovider.Provider, *Options, error) {
// initialize options
o := &Options{
ProviderName: providerName,
}
for _, opt := range opts {
opt(o)
}
// Step 1: if a Provider was passed in directly, use it as-is
if o.Provider != nil {
return o.Provider, o, nil
}
// Step 2: validate the ProviderName
if o.ProviderName == "" {
return nil, nil, fmt.Errorf("invalid provider name: %s", o.ProviderName)
}
// Step 3: resolve the ProviderConfig (option > DB)
conf, err := a.resolveProviderConfig(o)
if err != nil {
return nil, nil, fmt.Errorf("resolve provider config failed: %w", err)
}
// Step 4: create the provider via a registry lookup
provider, err := a.createProvider(o.ProviderName, conf)
if err != nil {
return nil, nil, fmt.Errorf("create provider failed: %w", err)
}
return provider, o, nil
}
func (a *AIManager) resolveProviderConfig(o *Options) (*config.ProviderConfig, error) {
if o.ProviderConfig != nil {
return o.ProviderConfig, nil
}
cfg, ok := a.CfgMgr.GetConfigByProviderName(o.ProviderName)
if !ok {
return nil, fmt.Errorf("config not found for provider: %s", o.ProviderName)
}
return &cfg, nil
}
func (a *AIManager) createProvider(providerName consts.ProviderName, conf *config.ProviderConfig) (modelprovider.Provider, error) {
creator := a.Registry.Providers[providerName]
if creator == nil {
return nil, fmt.Errorf("provider not supported: %s", providerName)
}
provider, err := creator(conf)
if err != nil {
return nil, fmt.Errorf("create provider instance failed: %w", err)
}
return provider, nil
}

View File

@ -0,0 +1,38 @@
package bootstrap
import (
"ai-css/library/modelprovider"
"ai-css/library/modelprovider/config"
"ai-css/library/modelprovider/consts"
)
type ClientOption func(*Options)
type Options struct {
ProviderName consts.ProviderName
Provider modelprovider.Provider
ProviderConfig *config.ProviderConfig
DefaultModel string
}
func WithProviderName(name consts.ProviderName) ClientOption {
return func(o *Options) {
o.ProviderName = name
}
}
func WithProvider(p modelprovider.Provider) ClientOption {
return func(o *Options) {
o.Provider = p
}
}
func WithProviderConfig(cfg *config.ProviderConfig) ClientOption {
return func(o *Options) {
o.ProviderConfig = cfg
}
}
func WithDefaultModel(model string) ClientOption {
return func(o *Options) { o.DefaultModel = model }
}

View File

@ -0,0 +1,27 @@
package bootstrap
import "ai-css/library/modelprovider/consts"
var Providers = []consts.ProviderMeta{
{ID: consts.ProviderIDOpenAI, Name: consts.ProviderOpenAI, Display: "OpenAI", Official: true},
}
var providerByName = make(map[consts.ProviderName]consts.ProviderMeta)
var providerByID = make(map[consts.ProviderID]consts.ProviderMeta)
func init() {
for _, p := range Providers {
providerByName[p.Name] = p
providerByID[p.ID] = p
}
}
func GetProviderByName(name consts.ProviderName) (consts.ProviderMeta, bool) {
p, ok := providerByName[name]
return p, ok
}
func GetProviderByID(id consts.ProviderID) (consts.ProviderMeta, bool) {
p, ok := providerByID[id]
return p, ok
}

View File

@ -0,0 +1,56 @@
package modelprovider
import (
"context"
"errors"
"time"
)
type Client struct {
provider Provider
defaultModel string
}
func NewClient(p Provider, defaultModel string) *Client {
return &Client{provider: p, defaultModel: defaultModel}
}
// Chat performs a synchronous chat completion.
func (c *Client) Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error) {
if len(req.Messages) == 0 {
return nil, errors.New("empty messages")
}
if req.Model == "" {
req.Model = c.defaultModel
}
resp, err := c.provider.InvokeCompletion(ctx, &req)
if err != nil {
return nil, err
}
if resp != nil {
if resp.Meta.Vendor == "" {
resp.Meta.Vendor = c.provider.Capabilities().Vendor
}
if resp.Meta.CreatedAt.IsZero() {
resp.Meta.CreatedAt = time.Now()
}
if resp.Model == "" {
resp.Model = req.Model
}
}
return resp, nil
}
// StreamChat performs a streaming chat completion, delivering events to handler.
func (c *Client) StreamChat(ctx context.Context, req ChatRequest, handler StreamChatCallback) error {
if len(req.Messages) == 0 {
return errors.New("empty messages")
}
if req.Model == "" {
req.Model = c.defaultModel
}
if !c.provider.Capabilities().SupportsStreaming {
return errors.New("provider does not support streaming")
}
return c.provider.StreamCompletion(ctx, &req, handler)
}

View File

@ -0,0 +1,54 @@
package config
import (
ai "ai-css/library/modelprovider/consts"
"context"
"fmt"
"sync"
)
type Manager struct {
mu sync.RWMutex
providerConfigs map[ai.ProviderName]ProviderConfig
configRepo Repo
}
func NewManager(confRepo Repo) *Manager {
return &Manager{providerConfigs: make(map[ai.ProviderName]ProviderConfig), configRepo: confRepo}
}
type Repo interface {
GetAllConfig() (map[ai.ProviderName]ProviderConfig, error) // providerName: apikeys
}
func (m *Manager) LoadConfigs(ctx context.Context) error {
if m.configRepo == nil {
return fmt.Errorf("load from database failed: repo is nil")
}
allconfigs, err := m.configRepo.GetAllConfig()
if err != nil {
return fmt.Errorf("get all configs from database failed: %w", err)
}
m.mu.Lock()
defer m.mu.Unlock()
m.providerConfigs = make(map[ai.ProviderName]ProviderConfig) // reset
for name, conf := range allconfigs {
m.providerConfigs[name] = conf
}
return nil
}
func (m *Manager) SetConfigByProviderName(name ai.ProviderName, config ProviderConfig) {
m.mu.Lock()
defer m.mu.Unlock()
m.providerConfigs[name] = config
}
func (m *Manager) GetConfigByProviderName(name ai.ProviderName) (ProviderConfig, bool) {
m.mu.RLock()
defer m.mu.RUnlock()
conf, ok := m.providerConfigs[name]
return conf, ok
}

View File

@ -0,0 +1,90 @@
package config
import (
"sync"
"time"
"ai-css/library/logger"
)
type ProviderConfig struct {
baseURL string
apiKeys []string
BlackApiKeys map[string]struct{}
apiKeysLock *sync.Mutex
blackLock *sync.Mutex
}
func NewProviderConfig(burl string, apiKeys []string) ProviderConfig {
return ProviderConfig{baseURL: burl, apiKeys: apiKeys, BlackApiKeys: make(map[string]struct{}), apiKeysLock: new(sync.Mutex), blackLock: new(sync.Mutex)}
}
func (p *ProviderConfig) ApikeyIsBlack(apikey string) bool {
p.blackLock.Lock()
defer p.blackLock.Unlock()
_, found := p.BlackApiKeys[apikey]
return found
}
func (p *ProviderConfig) AddBlackKey(apikey string) {
p.blackLock.Lock()
defer p.blackLock.Unlock()
p.BlackApiKeys[apikey] = struct{}{}
}
func (p *ProviderConfig) GetBaseUrl() string {
return p.baseURL
}
func (p *ProviderConfig) GetApiKeys() []string {
p.apiKeysLock.Lock()
defer p.apiKeysLock.Unlock()
return p.apiKeys
}
func (p *ProviderConfig) SetApiKeys(keys []string) {
p.apiKeysLock.Lock()
defer p.apiKeysLock.Unlock()
p.apiKeys = keys
}
func (p *ProviderConfig) SetRetryPullConfig(retryFunc func(*ProviderConfig) bool, interval time.Duration) {
if interval <= 0 {
logger.Warnf("SetRetryPullConfig interval is invalid %d", interval)
return
}
go startRetryPullConfig(func() bool {
return retryFunc(p)
}, interval)
}
func startRetryPullConfig(retryFunc func() bool, interval time.Duration) {
timer := time.NewTimer(interval)
defer timer.Stop()
for {
<-timer.C
isFinish := false
func() {
defer func() {
if r := recover(); r != nil {
logger.Errorf("retry panic: %v", r)
}
}()
isFinish = retryFunc()
}()
if isFinish {
return
}
timer.Reset(interval)
}
}

View File

@ -0,0 +1,22 @@
package consts
type (
ProviderID int64
ProviderName string
)
type ProviderMeta struct {
ID ProviderID
Name ProviderName
Display string // display name for UIs, e.g. "OpenAI"
Official bool // whether officially supported
Icon string // icon URL or resource identifier
}
const (
ProviderOpenAI ProviderName = "openai"
)
const (
ProviderIDOpenAI ProviderID = iota + 1
)

View File

@ -0,0 +1,133 @@
package modelprovider
import (
"time"
)
type Role string
const (
RoleUser Role = "user"
RoleAssistant Role = "assistant"
RoleSystem Role = "system"
)
const (
PartText PartType = "text"
PartImage PartType = "image"
)
type Message struct {
Role Role `json:"role"`
Parts []Part `json:"parts,omitempty"` // multimodal content parts
}
type PartType string
type Part struct {
Type PartType `json:"type"`
// Text content
Text string `json:"text,omitempty"`
// Image (choose one: URL, inline bytes, or an existing file ID)
ImageURL string `json:"image_url,omitempty"`
ImageBytes []byte `json:"image_bytes,omitempty"`
MIMEType string `json:"mime_type,omitempty"` // e.g. "image/png"
}
type ChatRequest struct {
Model string `json:"model"`
Messages []Message `json:"messages"`
Temperature *float64 `json:"temperature,omitempty"`
TopP *float64 `json:"top_p,omitempty"`
MaxTokens *int `json:"max_tokens,omitempty"`
VendorExtras map[string]any `json:"vendor_extras,omitempty"`
RequestID string `json:"request_id,omitempty"`
IsStream bool `json:"is_stream,omitempty"`
}
type Usage struct {
PromptTokens int `json:"prompt_tokens"`
CompletionTokens int `json:"completion_tokens"`
TotalTokens int `json:"total_tokens"`
}
type AIError struct {
Code string `json:"code"`
Message string `json:"message"`
}
type Meta struct {
CreatedAt time.Time `json:"created_at"`
Vendor string `json:"vendor"`
ModelID string `json:"model_id,omitempty"`
Extras map[string]string `json:"extras,omitempty"`
}
type ChatResponse struct {
ID string `json:"id"`
Model string `json:"model"`
Content string `json:"content"`
Usage *Usage `json:"usage,omitempty"`
Err *AIError `json:"err,omitempty"`
Raw any `json:"raw,omitempty"`
Meta Meta `json:"meta"`
}
// ModelInfo describes a model as seen by the unified layer.
type ModelInfo struct {
// Logical ID, only backfilled when aggregated by a RouterProvider, e.g. "openai/gpt-4o-2024-08-06"
LogicalID string `json:"logical_id,omitempty"`
// The vendor's real model ID, e.g. "gpt-4o-2024-08-06"
RealID string `json:"real_id"`
// Vendor identifier (e.g. "openai")
Vendor string `json:"vendor"`
// Display name (optional)
DisplayName string `json:"display_name,omitempty"`
// Capability info (trim or extend as needed)
ContextWindow int `json:"context_window,omitempty"` // max context window
SupportsStream bool `json:"supports_stream,omitempty"`
InputModalities []string `json:"input_modalities,omitempty"` // e.g. ["text","image","audio"]
OutputModalities []string `json:"output_modalities,omitempty"` // e.g. ["text","image"]
// Pricing / region / version, etc. (optional)
Region string `json:"region,omitempty"`
Version string `json:"version,omitempty"`
Metadata map[string]string `json:"metadata,omitempty"`
// Raw vendor payload (for debugging/troubleshooting)
Raw any `json:"raw,omitempty"`
}
// Convenience constructors for direct use by the business layer.
func NewPartText(s string) Part { return Part{Type: PartText, Text: s} }
func NewPartImageURL(u string) Part { return Part{Type: PartImage, ImageURL: u} }
func NewPartImageBytes(b []byte, mt string) Part {
return Part{Type: PartImage, ImageBytes: b, MIMEType: mt}
}
func MakeUserMsg(p []Part) Message {
return Message{
Role: RoleUser,
Parts: p,
}
}
func MakeAssistantMsg(p []Part) Message {
return Message{
Role: RoleAssistant,
Parts: p,
}
}
func MakeSystemMsg(p []Part) Message {
return Message{
Role: RoleSystem,
Parts: p,
}
}

View File

@ -0,0 +1,43 @@
package errorswrap
import (
"errors"
"fmt"
)
type Errors struct {
Code ErrorCode `json:"code"`
Msg string `json:"msg"`
}
func (e *Errors) Error() string {
return fmt.Sprintf("error code: %s, msg: %s", e.Code, e.Msg)
}
func NewError(code ErrorCode) error {
return &Errors{Code: code}
}
type ErrorCode string
const (
ErrorUnknown ErrorCode = "provider_unknown"
ErrorProviderApiUrlInvalid ErrorCode = "provider_api_url_invalid"
ErrorProviderApiKeyInvalid ErrorCode = "provider_api_key_invalid"
)
func ErrorIsCode(err error, code ErrorCode) bool {
var e *Errors
if errors.As(err, &e) {
return e.Code == code
}
return false
}
func GetErrorCode(err error) ErrorCode {
var e *Errors
if errors.As(err, &e) && e != nil {
return e.Code
}
return ErrorUnknown
}

View File

@ -0,0 +1,13 @@
package errorswrap
import (
"github.com/stretchr/testify/require"
"testing"
)
func TestCode(t *testing.T) {
e := NewError(ErrorProviderApiUrlInvalid)
require.Equal(t, ErrorProviderApiUrlInvalid, GetErrorCode(e))
}

View File

@ -0,0 +1,20 @@
package modelprovider
import "context"
type Capability struct {
Vendor string
SupportsStreaming bool
MaxContextTokens int
}
// Provider将统一 DTO ↔ 各家云 API适配器接口
type Provider interface {
InvokeCompletion(ctx context.Context, req *ChatRequest) (*ChatResponse, error)
StreamCompletion(ctx context.Context, req *ChatRequest, h StreamChatCallback) error
Capabilities() Capability
// ListModels lists the models available from this provider (returns the vendor's real model IDs and their capabilities).
ListModels(ctx context.Context) ([]ModelInfo, error)
// GetDefaultModel returns the default model.
GetDefaultModel() string
}

View File

@ -0,0 +1,377 @@
package openai
import (
"github.com/openai/openai-go/v3/responses"
"ai-css/library/modelprovider/errorswrap"
"bufio"
"bytes"
"context"
"encoding/json"
"errors"
"fmt"
"io"
"net/http"
"ai-css/library/logger"
)
type EventType string
const (
StreamRespondError EventType = "response.error"
StreamRespondFailed EventType = "response.failed"
StreamRespondOutputTextDelta EventType = "response.output_text.delta"
StreamRespondComplete EventType = "response.completed"
)
var NetworkError = errors.New("network unreachable")
// OpenAIResponsesRequest models POST /v1/responses request body.
type OpenAIResponsesRequest struct {
Background *bool `json:"background,omitempty"`
Conversation json.RawMessage `json:"conversation,omitempty"` // string or {id: "..."}; RawMessage keeps it flexible
Include []string `json:"include,omitempty"`
Input interface{} `json:"input,omitempty"` // for chat we put []OpenAIChatMessage here; other scenarios can customize
Instructions string `json:"instructions,omitempty"`
MaxOutputTokens *int `json:"max_output_tokens,omitempty"`
MaxToolCalls *int `json:"max_tool_calls,omitempty"`
Metadata map[string]string `json:"metadata,omitempty"`
Model string `json:"model,omitempty"`
ParallelToolCalls *bool `json:"parallel_tool_calls,omitempty"`
PreviousResponseID string `json:"previous_response_id,omitempty"`
Prompt json.RawMessage `json:"prompt,omitempty"` // prompt template reference; the shape varies, so RawMessage
PromptCacheKey string `json:"prompt_cache_key,omitempty"`
Reasoning json.RawMessage `json:"reasoning,omitempty"` // e.g. {effort: "..."}
Summary string `json:"summary,omitempty"`
SafetyIdentifier string `json:"safety_identifier,omitempty"`
ServiceTier string `json:"service_tier,omitempty"`
Store *bool `json:"store,omitempty"`
Stream bool `json:"stream,omitempty"`
StreamOptions json.RawMessage `json:"stream_options,omitempty"` // e.g. {"include_usage": true}
Temperature *float32 `json:"temperature,omitempty"`
Text json.RawMessage `json:"text,omitempty"` // structured-output configuration, etc.
ToolChoice json.RawMessage `json:"tool_choice,omitempty"`
Tools json.RawMessage `json:"tools,omitempty"` // tool / function / MCP definitions
TopLogprobs *int `json:"top_logprobs,omitempty"`
TopP *float32 `json:"top_p,omitempty"`
Truncation string `json:"truncation,omitempty"`
}
type OpenAIResponsesResponse struct {
ID string `json:"id"`
Object string `json:"object"`
CreatedAt int64 `json:"created_at"`
Status string `json:"status"`
Error OpenAIErrorMessage `json:"error,omitempty"` // may be null or an object
IncompleteDetails any `json:"incomplete_details,omitempty"` // may be null or an object
Instructions *string `json:"instructions,omitempty"`
MaxOutputTokens *int `json:"max_output_tokens,omitempty"`
Model string `json:"model"`
Output []OutputItem `json:"output"`
ParallelToolCalls bool `json:"parallel_tool_calls"`
PreviousResponseID *string `json:"previous_response_id,omitempty"`
Reasoning Reasoning `json:"reasoning"`
Store bool `json:"store"`
Temperature float64 `json:"temperature"`
Text TextSpec `json:"text"`
ToolChoice string `json:"tool_choice"` // "auto" | others
Tools []json.RawMessage `json:"tools"` // reserved for future extension (function/tool schemas, etc.)
TopP float64 `json:"top_p"`
Truncation string `json:"truncation"`
Usage Usage `json:"usage"`
User *string `json:"user,omitempty"`
Metadata map[string]any `json:"metadata"`
}
type OpenAIErrorMessage struct {
Msg string `json:"message"`
Type string `json:"type"`
Param string `json:"param"`
Code string `json:"code"`
}
// ResponsesStreamEvent is the generic shape of a stream event.
type ResponsesStreamEvent struct {
Type string `json:"type"` // e.g. "response.output_text.delta"
Delta string `json:"delta,omitempty"` // text delta (only present on output_text.delta events)
ItemID string `json:"item_id,omitempty"` // remaining fields available as needed
OutputIndex int `json:"output_index,omitempty"` // unused for now
ContentIndex int `json:"content_index,omitempty"`
// error events: type = "response.error" / "response.failed"
Error *struct {
Code string `json:"code"`
Message string `json:"message"`
} `json:"error,omitempty"`
Response responses.Response `json:"response"`
}
type OutputItem struct {
Type string `json:"type"` // "message", etc.
ID string `json:"id"`
Status string `json:"status"` // "completed", etc.
Role string `json:"role"` // "assistant", etc.
Content []ContentBlock `json:"content"`
}
type ContentBlock struct {
Type string `json:"type"` // "output_text", etc.
Text string `json:"text,omitempty"` // present when type == "output_text"
Annotations []any `json:"annotations,omitempty"` // empty or an array
// Future fields (e.g. tool_calls) could be kept via RawMessage for forward compatibility:
// Raw json.RawMessage `json:"-"`
}
type Reasoning struct {
Effort *string `json:"effort,omitempty"`
Summary *string `json:"summary,omitempty"`
}
type TextSpec struct {
Format TextFormat `json:"format"`
}
type TextFormat struct {
Type string `json:"type"` // "text"
}
type Usage struct {
InputTokens int `json:"input_tokens"`
InputTokensDetails InputTokensDetails `json:"input_tokens_details"`
OutputTokens int `json:"output_tokens"`
OutputTokensDetails OutputTokensDetail `json:"output_tokens_details"`
TotalTokens int `json:"total_tokens"`
}
type InputTokensDetails struct {
CachedTokens int `json:"cached_tokens"`
}
type OutputTokensDetail struct {
ReasoningTokens int `json:"reasoning_tokens"`
}
type OpenAIChatMessage struct {
Role string `json:"role"` // "system" / "user" / "assistant"
Content []interface{} `json:"content"` // multimodal messages carry multiple parts; only text is shown here
}
// A single content part (text only for now).
type OpenAIContentPart struct {
Type string `json:"type"` // "text"
Text string `json:"text,omitempty"` // text content
}
// Text input.
type TextInput struct {
Type string `json:"type"` // always "input_text"
Text string `json:"text"`
}
// Image input.
type ImageInput struct {
Type string `json:"type"` // always "input_image"
ImageURL string `json:"image_url,omitempty"` // URL or Base64
Detail string `json:"detail,omitempty"` // high / low / auto
FileID string `json:"file_id,omitempty"` // when the image comes from the Files API
}
// File input.
type FileInput struct {
Type string `json:"type"` // always "input_file"
FileID string `json:"file_id,omitempty"` // ID returned by the Files API upload
FileData string `json:"file_data,omitempty"` // Base64 file content
FileURL string `json:"file_url,omitempty"` // file URL
Filename string `json:"filename,omitempty"` // filename (optional)
}
// Model represents a single model object.
type Model struct {
ID string `json:"id"`
Object string `json:"object"`
Created int64 `json:"created"`
OwnedBy string `json:"owned_by"`
}
// ModelsResponse represents the /v1/models response.
type ModelsResponse struct {
Object string `json:"object"` // always "list"
Data []Model `json:"data"`
Error RespError `json:"error"`
}
type RespError struct {
Msg string `json:"message"`
Type string `json:"type"`
Code string `json:"code"`
}
type OpenAIClient struct {
apiKey string
baseURL string
httpClient *http.Client
}
func NewOpenaiClient(apikey, apiUrl string, httpC *http.Client) OpenAIClient {
return OpenAIClient{apikey, apiUrl, httpC}
}
// callResponses calls the OpenAI Responses API.
func (o *OpenAIClient) callResponses(
ctx context.Context, req *OpenAIResponsesRequest, callback func(evt *ResponsesStreamEvent) error,
) (resp *OpenAIResponsesResponse, err error) {
// 1. Serialize the request body
reqBody, err := json.Marshal(req)
if err != nil {
err = fmt.Errorf("failed to serialize request: %w", err)
return
}
// 2. Send POST to /v1/responses
httpReq, err := http.NewRequestWithContext(
ctx,
http.MethodPost,
o.baseURL+"/v1/responses",
bytes.NewBuffer(reqBody),
)
if err != nil {
logger.Errorf("new request failed err:%v", err)
err = fmt.Errorf("failed to create HTTP request: %w", err)
return
}
httpReq.Header.Set("Content-Type", "application/json")
httpReq.Header.Set("Authorization", "Bearer "+o.apiKey)
httpReq.Header.Set("Accept", "text/event-stream")
logger.Debugf("openai callResponses req:%s", string(reqBody))
respond, err := o.httpClient.Do(httpReq)
if err != nil {
logger.Errorf("call responses api failed err:%v", err)
err = NetworkError
return
}
defer respond.Body.Close()
if respond.StatusCode != http.StatusOK {
body, _ := io.ReadAll(respond.Body)
err = fmt.Errorf("OpenAI API returned error [%d]: %s", respond.StatusCode, string(body))
return
}
// 3. Parse SSE stream
reader := bufio.NewReader(respond.Body)
for {
select {
case <-ctx.Done():
// ctx.Err() is never io.EOF; report the cancellation directly
err = ctx.Err()
logger.Errorf("listen stream canceled err:%v", err)
return
default:
}
var line []byte
line, err = reader.ReadBytes('\n')
if err != nil {
if err == io.EOF {
return
}
logger.Errorf("read body failed err:%v", err)
err = NetworkError
return
}
line = bytes.TrimSpace(line)
if len(line) == 0 {
continue
}
if !bytes.HasPrefix(line, []byte("data: ")) {
continue
}
data := bytes.TrimPrefix(line, []byte("data: "))
var event = new(ResponsesStreamEvent)
if err = json.Unmarshal(data, event); err != nil {
continue
}
if err = callback(event); err != nil {
err = fmt.Errorf("callback execution failed: %w", err)
return
}
}
}
func (o *OpenAIClient) getModels(ctx context.Context) (*ModelsResponse, error) {
req, err := http.NewRequestWithContext(
ctx,
http.MethodGet,
o.baseURL+"/v1/models",
nil,
)
if err != nil {
logger.Errorf("new request failed err:%v", err)
return nil, errorswrap.NewError(errorswrap.ErrorProviderApiUrlInvalid)
}
req.Header.Set("Authorization", "Bearer "+o.apiKey)
resp, err := o.httpClient.Do(req)
if err != nil {
logger.Infof("call openai api failed err:%v, baseURL:%s", err, o.baseURL)
return nil, errorswrap.NewError(errorswrap.ErrorProviderApiUrlInvalid)
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
logger.Errorf("status code not ok code:%d", resp.StatusCode)
var body []byte
body, err = io.ReadAll(resp.Body)
if err != nil {
logger.Errorf("read response body failed: %v", err)
return nil, errorswrap.NewError(errorswrap.ErrorProviderApiUrlInvalid)
}
logger.Errorf("status code not ok body:%s", string(body))
return nil, errorswrap.NewError(errorswrap.ErrorProviderApiKeyInvalid)
}
var result ModelsResponse
if err = json.NewDecoder(resp.Body).Decode(&result); err != nil {
return nil, err
}
if result.Error.Msg != "" {
return nil, errorswrap.NewError(errorswrap.ErrorProviderApiKeyInvalid)
}
return &result, nil
}
func NewTextPart(isInput bool, text string) TextInput {
var prefix = "output"
if isInput {
prefix = "input"
}
return TextInput{
Type: fmt.Sprintf("%s_text", prefix),
Text: text,
}
}
func NewImagePart(isInput bool, ImageURL string) ImageInput {
var prefix = "output"
if isInput {
prefix = "input"
}
return ImageInput{
Type: fmt.Sprintf("%s_image", prefix),
ImageURL: ImageURL,
}
}

View File

@ -0,0 +1,32 @@
package openai
import "sync"
var (
maxBlackApikeySize = 5000
)
type BlackkeyMgr struct {
blackApikey map[string]struct{}
lock *sync.Mutex
}
var blackKeyMgr = &BlackkeyMgr{blackApikey: make(map[string]struct{}), lock: new(sync.Mutex)}
func (b *BlackkeyMgr) AddBlackKey(k string) {
b.lock.Lock()
defer b.lock.Unlock()
if len(b.blackApikey) >= maxBlackApikeySize {
b.blackApikey = make(map[string]struct{})
}
b.blackApikey[k] = struct{}{}
}
func (b *BlackkeyMgr) IsBlack(k string) bool {
b.lock.Lock()
defer b.lock.Unlock()
_, ok := b.blackApikey[k]
return ok
}

View File

@ -0,0 +1,205 @@
package openai
import (
modelprovider2 "ai-css/library/modelprovider"
"ai-css/library/modelprovider/config"
"context"
"errors"
"fmt"
"io"
"math/rand"
"net/http"
"strings"
"ai-css/library/logger"
)
type Provider struct {
httpClient *http.Client
conf *config.ProviderConfig
blackApikey map[string]struct{}
}
func New(conf *config.ProviderConfig, httpc *http.Client) *Provider {
if httpc == nil {
httpc = http.DefaultClient
}
return &Provider{conf: conf, httpClient: httpc, blackApikey: make(map[string]struct{})}
}
func (p *Provider) Capabilities() modelprovider2.Capability {
return modelprovider2.Capability{
Vendor: "openai",
SupportsStreaming: true,
MaxContextTokens: 128000,
}
}
func (p *Provider) InvokeCompletion(ctx context.Context, req *modelprovider2.ChatRequest) (*modelprovider2.ChatResponse, error) {
// TODO: map req onto the OpenAI Responses/Chat API, issue the HTTP request, and parse the response
return &modelprovider2.ChatResponse{
ID: "mock-openai-id",
Model: req.Model,
Content: "hello from openai (mock)",
Meta: modelprovider2.Meta{Vendor: "openai"},
}, nil
}
func (p *Provider) StreamCompletion(ctx context.Context, req *modelprovider2.ChatRequest, h modelprovider2.StreamChatCallback) (err error) {
var (
temp = float32(0.7)
store = false
inputMessages []OpenAIChatMessage
)
for _, msg := range req.Messages {
var (
item OpenAIChatMessage
isInput bool
)
switch msg.Role {
case modelprovider2.RoleSystem:
item.Role = "system"
isInput = true
case modelprovider2.RoleAssistant:
item.Role = "assistant"
case modelprovider2.RoleUser:
item.Role = "user"
isInput = true
}
for _, part := range msg.Parts {
var data interface{}
switch part.Type {
case modelprovider2.PartText:
data = NewTextPart(isInput, part.Text)
case modelprovider2.PartImage:
data = NewImagePart(isInput, part.ImageURL)
}
item.Content = append(item.Content, data)
}
inputMessages = append(inputMessages, item)
}
var (
callreq = &OpenAIResponsesRequest{
Model: req.Model,
Input: inputMessages, // chat content
Stream: req.IsStream, // streaming flag matters here
Store: &store, // do not persist this conversation
}
apikeys []string
)
if IsGPT4Model(req.Model) {
callreq.Temperature = &temp
}
for _, item := range p.conf.GetApiKeys() {
if ok := blackKeyMgr.IsBlack(item); ok {
continue
}
apikeys = append(apikeys, item)
}
rand.Shuffle(len(apikeys), func(i, j int) {
apikeys[i], apikeys[j] = apikeys[j], apikeys[i]
})
logger.Debugf("call openai with %d api keys", len(apikeys))
for _, ak := range apikeys {
c := NewOpenaiClient(ak, p.conf.GetBaseUrl(), p.httpClient)
_, err = c.callResponses(ctx, callreq, p.WrapStreamCallback(h))
if err != nil {
logger.Errorf("do callResponses api failed err:%v", err)
if isApikeyInvalid(err) {
blackKeyMgr.AddBlackKey(ak)
}
if errors.Is(err, NetworkError) {
break
}
if !errors.Is(err, io.EOF) {
continue
}
}
return
}
if err != nil {
logger.Errorf("call cloud model failed err:%v", err)
err = fmt.Errorf("cloud model server internal error")
}
return
}
func (p *Provider) WrapStreamCallback(h modelprovider2.StreamChatCallback) func(*ResponsesStreamEvent) error {
return func(event *ResponsesStreamEvent) error {
switch EventType(event.Type) {
case StreamRespondError, StreamRespondFailed:
if event.Error != nil {
return fmt.Errorf("OpenAI streaming error: %s (%s)", event.Error.Message, event.Error.Code)
}
return fmt.Errorf("unknown OpenAI streaming error: %v", event)
case StreamRespondOutputTextDelta:
if event.Delta != "" {
if err := h(modelprovider2.StreamEvent{
Kind: modelprovider2.StreamDelta,
Text: event.Delta,
}); err != nil {
return fmt.Errorf("callback execution failed: %w", err)
}
}
case StreamRespondComplete:
if err := h(modelprovider2.StreamEvent{
Kind: modelprovider2.StreamEnd,
OutputTokens: event.Response.Usage.OutputTokens,
}); err != nil {
return fmt.Errorf("callback execution failed: %w", err)
}
}
return nil
}
}
func (p *Provider) ListModels(ctx context.Context) (result []modelprovider2.ModelInfo, err error) {
var models *ModelsResponse
for _, ak := range p.conf.GetApiKeys() {
c := NewOpenaiClient(ak, p.conf.GetBaseUrl(), p.httpClient)
models, err = c.getModels(ctx)
if err != nil {
logger.Errorf("call models api failed err:%v", err)
continue
}
break
}
if models == nil {
return
}
for _, model := range models.Data {
//if !FilterModel(model) {
// continue
//}
result = append(result, modelprovider2.ModelInfo{
RealID: model.ID,
Raw: model,
Vendor: model.OwnedBy,
DisplayName: model.ID,
})
}
return
}
func (p *Provider) GetDefaultModel() string {
return "gpt-4o"
}
func IsGPT4Model(model string) bool {
return strings.Contains(model, "gpt-4")
}
func isApikeyInvalid(err error) bool {
	logger.Debugf("check apikey invalid err:%v", err)
return strings.Contains(err.Error(), "Incorrect API key provided")
}


@ -0,0 +1,31 @@
package providers
import (
"ai-css/library/logger"
"net/url"
"os"
)
const (
PROXY_ENV_NAME = "ai-css_CLOUD_MODEL_PROXY"
)
var envProxyUrl string
func init() {
envProxyUrl = os.Getenv(PROXY_ENV_NAME)
}
// GetRemoteProxy returns an optional forward proxy for cloud model calls; used when testing from mainland China, to simplify acceptance testing.
func GetRemoteProxy() *url.URL {
if envProxyUrl == "" {
return nil
}
proxyURL, err := url.Parse(envProxyUrl)
if err != nil {
logger.Errorf("cloud model get remote proxy failed url:%s err:%v", envProxyUrl, err)
return nil
}
return proxyURL
}


@ -0,0 +1,55 @@
package providers
import (
"ai-css/library/modelprovider"
"ai-css/library/modelprovider/config"
"ai-css/library/modelprovider/consts"
"ai-css/library/modelprovider/providers/openai"
"net"
"net/http"
"time"
)
var (
HttpClientTimeout = 5 * 60 * time.Second
)
type ProviderFactory func(conf *config.ProviderConfig) (modelprovider.Provider, error)
type Registry struct {
Providers map[consts.ProviderName]ProviderFactory // provider name -> factory
}
func BuildRegistry() *Registry {
var providers = map[consts.ProviderName]ProviderFactory{
consts.ProviderOpenAI: func(providerConfig *config.ProviderConfig) (modelprovider.Provider, error) {
return openai.New(providerConfig, NewHttpClient()), nil
},
}
return &Registry{Providers: providers}
}
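The registry maps a provider name to a factory so that providers are constructed lazily from configuration. A simplified, self-contained sketch of how such a registry is consumed (the types here are stand-ins for the real `modelprovider` interfaces):

```go
package main

import (
	"errors"
	"fmt"
)

// Provider and Factory are simplified stand-ins for the registry types.
type Provider interface{ Name() string }

type Factory func(cfg string) (Provider, error)

type openaiProvider struct{}

func (openaiProvider) Name() string { return "openai" }

type Registry struct{ Providers map[string]Factory }

// Build looks up the factory by name and constructs the provider on
// demand, failing loudly for unregistered names.
func (r *Registry) Build(name, cfg string) (Provider, error) {
	f, ok := r.Providers[name]
	if !ok {
		return nil, errors.New("unknown provider: " + name)
	}
	return f(cfg)
}

func main() {
	reg := &Registry{Providers: map[string]Factory{
		"openai": func(cfg string) (Provider, error) { return openaiProvider{}, nil },
	}}
	p, err := reg.Build("openai", "")
	fmt.Println(p.Name(), err)
}
```

Keeping factories rather than instances in the map means each provider can receive its own config and HTTP client at construction time.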
func NewHttpClient() *http.Client {
var proxyURL = GetRemoteProxy()
transport := &http.Transport{
DialContext: (&net.Dialer{
Timeout: 90 * time.Second,
KeepAlive: 30 * time.Second,
}).DialContext,
ForceAttemptHTTP2: true,
MaxIdleConns: 100,
IdleConnTimeout: 90 * time.Second,
TLSHandshakeTimeout: 90 * time.Second,
ExpectContinueTimeout: 1 * time.Second,
}
if proxyURL != nil {
transport.Proxy = http.ProxyURL(proxyURL)
}
return &http.Client{
Transport: transport,
Timeout: HttpClientTimeout,
}
}


@ -0,0 +1,21 @@
package modelprovider
type StreamEventKind int
const (
StreamStart StreamEventKind = iota
StreamDelta
StreamTool
StreamError
StreamEnd
)
type StreamEvent struct {
Kind StreamEventKind
Text string
Err error
Raw any
OutputTokens int64
}
type StreamChatCallback func(StreamEvent) error


@ -1,8 +1,8 @@
package middleware
import (
"ai-css/models"
"github.com/gin-gonic/gin"
"goflylivechat/models"
)
func Ipblack(c *gin.Context) {


@ -1,8 +1,8 @@
package middleware
import (
"ai-css/tools"
"github.com/gin-gonic/gin"
"goflylivechat/tools"
"time"
)


@ -1,9 +1,10 @@
package middleware
import (
"github.com/gin-gonic/gin"
"goflylivechat/tools"
"ai-css/tools"
"time"
"github.com/gin-gonic/gin"
)
func NewMidLogger() gin.HandlerFunc {


@ -45,7 +45,15 @@ func FindMessageByVisitorId(visitor_id string) []Message {
return messages
}
//modify message status
// FindLatestMessageByVisitorId queries the most recent messages for a visitor
func FindLatestMessageByVisitorId(visitor_id string, limit int) ([]Message, error) {
var messages []Message
tx := DB.Where("visitor_id=?", visitor_id).Order("id DESC").Limit(limit).Find(&messages)
return messages, tx.Error
}
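Because the query above orders by `id DESC`, `FindLatestMessageByVisitorId` returns messages newest-first. A caller that feeds this history to a model usually wants chronological order, so reversing the slice is typically needed. A sketch with a stand-in `Message` type:

```go
package main

import "fmt"

// Message is a stand-in for the model's Message type.
type Message struct {
	ID      int
	Content string
}

// reverseMessages restores chronological (oldest-first) order from a
// newest-first query result, reversing in place.
func reverseMessages(msgs []Message) {
	for i, j := 0, len(msgs)-1; i < j; i, j = i+1, j-1 {
		msgs[i], msgs[j] = msgs[j], msgs[i]
	}
}

func main() {
	// Simulated result of a LIMIT-ed id DESC query: newest first.
	msgs := []Message{{3, "c"}, {2, "b"}, {1, "a"}}
	reverseMessages(msgs)
	fmt.Println(msgs[0].ID, msgs[2].ID)
}
```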
// modify message status
func ReadMessageByVisitorId(visitor_id string) {
message := &Message{
Status: "read",
@ -53,14 +61,14 @@ func ReadMessageByVisitorId(visitor_id string) {
DB.Model(&message).Where("visitor_id=?", visitor_id).Update(message)
}
//get unread count
// get unread count
func FindUnreadMessageNumByVisitorId(visitor_id string) uint {
var count uint
DB.Where("visitor_id=? and status=?", visitor_id, "unread").Count(&count)
return count
}
//query the last message
// query the last message
func FindLastMessage(visitorIds []string) []Message {
var messages []Message
if len(visitorIds) <= 0 {
@ -87,7 +95,7 @@ func FindLastMessage(visitorIds []string) []Message {
return messages
}
//query the last message
// query the last message
func FindLastMessageByVisitorId(visitorId string) Message {
var m Message
DB.Select("content").Where("visitor_id=?", visitorId).Order("id desc").First(&m)
@ -99,13 +107,14 @@ func FindMessageByWhere(query interface{}, args ...interface{}) []MessageKefu {
return messages
}
//query message count
// query message count
func CountMessage(query interface{}, args ...interface{}) uint {
var count uint
DB.Model(&Message{}).Where(query, args...).Count(&count)
return count
}
//paginated query
// paginated query
func FindMessageByPage(page uint, pagesize uint, query interface{}, args ...interface{}) []*MessageKefu {
offset := (page - 1) * pagesize
if offset < 0 {


@ -1,9 +1,9 @@
package models
import (
"ai-css/common"
"fmt"
"github.com/jinzhu/gorm"
"goflylivechat/common"
"log"
"time"
)


@ -1,10 +1,10 @@
package router
import (
"ai-css/controller"
"ai-css/middleware"
"ai-css/ws"
"github.com/gin-gonic/gin"
"goflylivechat/controller"
"goflylivechat/middleware"
"goflylivechat/ws"
)
func InitApiRouter(engine *gin.Engine) {


@ -1,9 +1,9 @@
package router
import (
"ai-css/middleware"
"ai-css/tmpl"
"github.com/gin-gonic/gin"
"goflylivechat/middleware"
"goflylivechat/tmpl"
)
func InitViewRouter(engine *gin.Engine) {

BIN
static/.DS_Store vendored

Binary file not shown.


@ -1,8 +1,8 @@
package tmpl
import (
"ai-css/tools"
"github.com/gin-gonic/gin"
"goflylivechat/tools"
"html/template"
"net/http"
)


@ -1,8 +1,8 @@
package tmpl
import (
"ai-css/models"
"github.com/gin-gonic/gin"
"goflylivechat/models"
"html"
"html/template"
"net/http"


@ -1,8 +1,8 @@
package tmpl
import (
"ai-css/tools"
"github.com/gin-gonic/gin"
"goflylivechat/tools"
"net/http"
)


@ -1,13 +1,14 @@
package ws
import (
"ai-css/models"
"ai-css/tools"
"encoding/json"
"github.com/gin-gonic/gin"
"github.com/gorilla/websocket"
"goflylivechat/models"
"goflylivechat/tools"
"log"
"time"
"github.com/gin-gonic/gin"
"github.com/gorilla/websocket"
)
func NewKefuServer(c *gin.Context) {


@ -1,13 +1,14 @@
package ws
import (
"ai-css/common"
"ai-css/models"
"encoding/json"
"github.com/gin-gonic/gin"
"github.com/gorilla/websocket"
"goflylivechat/common"
"goflylivechat/models"
"log"
"time"
"github.com/gin-gonic/gin"
"github.com/gorilla/websocket"
)
func NewVisitorServer(c *gin.Context) {


@ -1,28 +1,34 @@
package ws
import (
"ai-css/models"
"ai-css/tools"
"encoding/json"
"fmt"
"github.com/gin-gonic/gin"
"github.com/gorilla/websocket"
"goflylivechat/models"
"goflylivechat/tools"
"log"
"net/http"
"strconv"
"sync"
"time"
"github.com/gin-gonic/gin"
"github.com/gorilla/websocket"
)
const (
MaxAIAnswerCycleTimes int = 3
)
type User struct {
Conn *websocket.Conn
Name string
Id string
Avator string
To_id string
Role_id string
Mux sync.Mutex
UpdateTime time.Time
Conn *websocket.Conn
Name string
Id string
Avator string
To_id string
Role_id string
Mux sync.Mutex
UpdateTime time.Time
AIAnswerCycle int
}
type Message struct {
conn *websocket.Conn
@ -144,3 +150,7 @@ func UpdateVisitorUser(visitorId string, toId string) {
guest.To_id = toId
}
}
func AIAnswerAvailable(u *User) bool {
return u.AIAnswerCycle < MaxAIAnswerCycleTimes
}