Laravel × OpenAI: Unlock AI Superpowers in 5 Minutes! A Complete Guide to the Official Integration Package
Introduction
With the rapid progress of artificial intelligence, integrating AI capabilities into web applications has become an important way to improve the user experience. OpenAI's GPT family of models, with their strong natural language processing abilities, has become a go-to tool for developers. This article explains in detail how to quickly integrate OpenAI features into a Laravel project using the openai-php/laravel package (the dedicated Laravel integration of the openai-php client). Compared with the general-purpose openai-php/client, this Laravel-optimized package fits the framework's ecosystem much more naturally, offering a config file, service container integration, and facade support. We will go step by step from installation and configuration to a working AI feature.
Requirements
Before you begin, make sure your development environment meets the following requirements:
- PHP >= 8.2
- Laravel >= 11
- A valid OpenAI API key (obtainable from the OpenAI platform)
Installation and Configuration
Install the openai-php/laravel package
composer require openai-php/laravel
php artisan openai:install
Configure the API key: open the .env file and add your OpenAI API key along with any optional settings:
# .env file
OPENAI_API_KEY=your_actual_api_key_here
# Optional:
OPENAI_ORGANIZATION=your_organization_id
OPENAI_PROJECT=your_project_id
Note: replace your_actual_api_key_here with the real API key you obtained from the OpenAI platform, and make sure this key is never committed to version control.
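To confirm that Laravel is actually reading the key from .env (for example after editing the file), a quick sanity check in php artisan tinker is enough; run php artisan config:clear first if you have previously cached your configuration:

// Inside `php artisan tinker`
config('openai.api_key');      // should return your key, not null
config('openai.organization'); // null is fine if you left it unset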
Configuration file in detail: running the openai:install command publishes an openai.php config file into the config directory, with contents similar to the following (the exact keys can vary between package versions):
<?php

return [
    /*
     * The API key used to authenticate with the OpenAI API.
     */
    'api_key' => env('OPENAI_API_KEY'),

    /*
     * The organization used for the OpenAI API.
     */
    'organization' => env('OPENAI_ORGANIZATION'),

    /*
     * The project used for the OpenAI API.
     */
    'project' => env('OPENAI_PROJECT'),

    /*
     * The base URI of the OpenAI API.
     */
    'base_uri' => env('OPENAI_BASE_URI', 'https://api.openai.com/v1'),

    /*
     * Any additional options that should be passed to the Guzzle client.
     */
    'client_options' => [
        // 'timeout' => 60,
        // 'proxy' => 'http://localhost:8080',
    ],

    /*
     * The HTTP client factory used to create the Guzzle client.
     */
    'http_client_factory' => OpenAI\Laravel\Http\Client\Factory::class,
];
Usage Options in Detail
The openai-php/laravel package supports several usage styles to fit different development scenarios:
Option 1: Using the Facade (recommended)
The package ships an OpenAI facade that you can use directly in your code:
use OpenAI\Laravel\Facades\OpenAI;

// Send a chat request
$response = OpenAI::chat()->create([
    'model' => 'gpt-4o-mini',
    'messages' => [
        ['role' => 'user', 'content' => 'Hello, OpenAI!'],
    ],
]);

echo $response->choices[0]->message->content;
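The facade also makes testing convenient: the package can fake responses so your test suite never hits the real API. The sketch below follows the package's fake/assert helpers; the exact fake payload shape can vary slightly between versions:

use OpenAI\Laravel\Facades\OpenAI;
use OpenAI\Responses\Chat\CreateResponse;

// In a test: queue a fake chat response instead of calling the API
OpenAI::fake([
    CreateResponse::fake([
        'choices' => [
            ['message' => ['content' => 'Hello from the fake!']],
        ],
    ]),
]);

$response = OpenAI::chat()->create([
    'model' => 'gpt-4o-mini',
    'messages' => [['role' => 'user', 'content' => 'Hello, OpenAI!']],
]);

// Verify that a chat request was "sent"
OpenAI::assertSent(\OpenAI\Resources\Chat::class);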
Option 2: Dependency Injection
Inject an OpenAI\Client instance through the constructor:
use OpenAI\Client;

class AIController extends Controller
{
    protected $openai;

    public function __construct(Client $openai)
    {
        $this->openai = $openai;
    }

    public function generateText()
    {
        $response = $this->openai->chat()->create([
            'model' => 'gpt-4o-mini',
            'messages' => [
                ['role' => 'user', 'content' => 'Write a short poem about Laravel'],
            ],
        ]);

        return $response->choices[0]->message->content;
    }
}
Option 3: Resolving from the Service Container
$client = app(OpenAI\Client::class);

$response = $client->completions()->create([
    'model' => 'gpt-3.5-turbo-instruct',
    'prompt' => 'Generate a list of 5 programming languages',
    'max_tokens' => 100,
]);
Implementing the Core Features
Create an AI service class: to keep the code organized, we can create a service class that encapsulates the AI-related business logic. Stock Laravel has no make:service Artisan command, so simply create the file yourself at app/Services/AIService.php:
<?php

namespace App\Services;

use OpenAI\Laravel\Facades\OpenAI;
use OpenAI\Responses\Chat\CreateResponse;
use OpenAI\Responses\Completions\CreateResponse as CompletionsCreateResponse;
use OpenAI\Responses\Models\ListResponse;

class AIService
{
    /**
     * Send a chat message to OpenAI.
     */
    public function chat(string $message, string $model = 'gpt-4o-mini'): string
    {
        try {
            $response = OpenAI::chat()->create([
                'model' => $model,
                'messages' => [
                    ['role' => 'system', 'content' => 'You are a helpful assistant integrated with a Laravel application.'],
                    ['role' => 'user', 'content' => $message],
                ],
            ]);

            return $response->choices[0]->message->content ?? 'No response received';
        } catch (\Exception $e) {
            \Log::error('OpenAI chat error: ' . $e->getMessage());
            throw new \RuntimeException('Failed to process your request. Please try again later.');
        }
    }

    /**
     * Generate a text completion.
     */
    public function complete(string $prompt, string $model = 'gpt-3.5-turbo-instruct'): string
    {
        try {
            $response = OpenAI::completions()->create([
                'model' => $model,
                'prompt' => $prompt,
                'max_tokens' => 200,
            ]);

            return $response->choices[0]->text ?? 'No completion received';
        } catch (\Exception $e) {
            \Log::error('OpenAI completion error: ' . $e->getMessage());
            throw new \RuntimeException('Failed to generate completion. Please try again later.');
        }
    }

    /**
     * List the available models.
     */
    public function listModels(): array
    {
        try {
            $response = OpenAI::models()->list();

            return $response->data;
        } catch (\Exception $e) {
            \Log::error('OpenAI models list error: ' . $e->getMessage());

            return [];
        }
    }

    /**
     * Generate a streamed chat response.
     */
    public function streamChat(string $message, string $model = 'gpt-4o-mini'): \Generator
    {
        try {
            $stream = OpenAI::chat()->createStreamed([
                'model' => $model,
                'messages' => [
                    ['role' => 'system', 'content' => 'You are a helpful assistant.'],
                    ['role' => 'user', 'content' => $message],
                ],
            ]);

            foreach ($stream as $response) {
                yield $response->choices[0]->delta->content ?? '';
            }
        } catch (\Exception $e) {
            \Log::error('OpenAI stream error: ' . $e->getMessage());
            yield 'An error occurred while processing your request.';
        }
    }
}
Building the Chatbot Controller
php artisan make:controller ChatbotController
<?php

namespace App\Http\Controllers;

use App\Services\AIService;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Log;

class ChatbotController extends Controller
{
    protected $aiService;

    public function __construct(AIService $aiService)
    {
        $this->aiService = $aiService;
    }

    // Show the chat interface
    public function index()
    {
        return view('chatbot');
    }

    // Handle a chat request
    public function sendMessage(Request $request)
    {
        $request->validate([
            'message' => 'required|string|max:2000',
            'model' => 'nullable|string|in:gpt-4o-mini,gpt-4o,gpt-3.5-turbo',
        ]);

        $message = $request->input('message');
        $model = $request->input('model', 'gpt-4o-mini');

        try {
            $response = $this->aiService->chat($message, $model);

            return response()->json([
                'success' => true,
                'response' => $response,
            ]);
        } catch (\Exception $e) {
            Log::error('Chatbot error: ' . $e->getMessage());

            return response()->json([
                'success' => false,
                'error' => $e->getMessage(),
            ], 500);
        }
    }

    // Handle a streamed chat request (Server-Sent Events)
    public function streamMessage(Request $request)
    {
        $request->validate([
            'message' => 'required|string|max:2000',
            'model' => 'nullable|string|in:gpt-4o-mini,gpt-4o,gpt-3.5-turbo',
        ]);

        $message = $request->input('message');
        $model = $request->input('model', 'gpt-4o-mini');

        return response()->stream(function () use ($message, $model) {
            foreach ($this->aiService->streamChat($message, $model) as $chunk) {
                echo "data: " . json_encode(['chunk' => $chunk]) . "\n\n";

                // Push each chunk to the client immediately
                if (ob_get_level() > 0) {
                    ob_flush();
                }
                flush();

                // Stop streaming if the client has disconnected
                if (connection_aborted()) {
                    break;
                }
            }
        }, 200, [
            'Content-Type' => 'text/event-stream',
            'Cache-Control' => 'no-cache',
            'X-Accel-Buffering' => 'no',
        ]);
    }
}
Create the chat view: create chatbot.blade.php in the resources/views directory:
<!DOCTYPE html>
<html lang="{{ str_replace('_', '-', app()->getLocale()) }}">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Laravel OpenAI Chatbot</title>
<script src="https://cdn.tailwindcss.com"></script>
</head>
<body class="bg-gray-100">
<div class="max-w-3xl mx-auto p-4">
<h1 class="text-2xl font-bold mb-4">Laravel AI Chatbot</h1>
<div id="chat-messages" class="bg-white rounded-lg shadow-md p-4 h-96 overflow-y-auto mb-4"></div>
<div class="flex gap-2">
<select id="model" class="border rounded p-2 flex-1 max-w-[150px]">
<option value="gpt-4o-mini">GPT-4o Mini</option>
<option value="gpt-3.5-turbo">GPT-3.5 Turbo</option>
<option value="gpt-4o">GPT-4o</option>
</select>
<input type="text" id="message-input"
class="border rounded p-2 flex-1"
placeholder="Type your message...">
<button id="send-button" class="bg-blue-500 text-white p-2 rounded">
Send
</button>
<button id="stream-button" class="bg-green-500 text-white p-2 rounded">
Stream
</button>
</div>
</div>
<script>
document.addEventListener('DOMContentLoaded', () => {
const chatMessages = document.getElementById('chat-messages');
const messageInput = document.getElementById('message-input');
const sendButton = document.getElementById('send-button');
const streamButton = document.getElementById('stream-button');
const modelSelect = document.getElementById('model');
// Append a message to the chat window
const addMessage = (content, isUser = false) => {
const messageDiv = document.createElement('div');
messageDiv.className = `mb-2 p-2 rounded ${isUser ? 'bg-blue-100 text-right' : 'bg-gray-100'}`;
messageDiv.textContent = content;
chatMessages.appendChild(messageDiv);
chatMessages.scrollTop = chatMessages.scrollHeight;
};
// Send a regular (non-streaming) request
const sendMessage = async () => {
const message = messageInput.value.trim();
if (!message) return;
const model = modelSelect.value;
addMessage(message, true);
messageInput.value = '';
addMessage('Thinking...');
try {
const response = await fetch('{{ route('chatbot.send') }}', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'X-CSRF-TOKEN': '{{ csrf_token() }}'
},
body: JSON.stringify({ message, model })
});
const data = await response.json();
// Remove the "Thinking..." placeholder message
chatMessages.removeChild(chatMessages.lastChild);
if (data.success) {
addMessage(data.response);
} else {
addMessage(`Error: ${data.error}`, false);
}
} catch (error) {
chatMessages.removeChild(chatMessages.lastChild);
addMessage(`Error: ${error.message}`, false);
}
};
// Send a streaming request over SSE
const streamMessage = () => {
const message = messageInput.value.trim();
if (!message) return;
const model = modelSelect.value;
addMessage(message, true);
messageInput.value = '';
const responseDiv = document.createElement('div');
responseDiv.className = 'mb-2 p-2 rounded bg-gray-100';
chatMessages.appendChild(responseDiv);
chatMessages.scrollTop = chatMessages.scrollHeight;
const eventSource = new EventSource(`{{ route('chatbot.stream') }}?message=${encodeURIComponent(message)}&model=${model}&_token={{ csrf_token() }}`);
eventSource.onmessage = (event) => {
const data = JSON.parse(event.data);
responseDiv.textContent += data.chunk;
chatMessages.scrollTop = chatMessages.scrollHeight;
};
eventSource.onerror = () => {
eventSource.close();
};
};
sendButton.addEventListener('click', sendMessage);
streamButton.addEventListener('click', streamMessage);
messageInput.addEventListener('keypress', (e) => {
if (e.key === 'Enter') {
sendMessage();
}
});
});
</script>
</body>
</html>
Add routes: add the following routes in routes/web.php:
<?php
use Illuminate\Support\Facades\Route;
use App\Http\Controllers\ChatbotController;
Route::get('/chatbot', [ChatbotController::class, 'index'])->name('chatbot.index');
Route::post('/chatbot/send', [ChatbotController::class, 'sendMessage'])->name('chatbot.send');
Route::get('/chatbot/stream', [ChatbotController::class, 'streamMessage'])->name('chatbot.stream');
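Because every chat message results in a billable API call, it is usually worth protecting these endpoints with Laravel's standard throttle middleware. As a sketch, the send and stream routes could be declared like this instead (the 30-requests-per-minute limit is an arbitrary example):

Route::post('/chatbot/send', [ChatbotController::class, 'sendMessage'])
    ->middleware('throttle:30,1')
    ->name('chatbot.send');
Route::get('/chatbot/stream', [ChatbotController::class, 'streamMessage'])
    ->middleware('throttle:30,1')
    ->name('chatbot.stream');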
Advanced Features
Model Management
Extend the AIService class with methods for retrieving model details and deleting fine-tuned models:
/**
 * Retrieve the details of a model.
 */
public function getModel(string $modelId): ?object
{
    try {
        return OpenAI::models()->retrieve($modelId);
    } catch (\Exception $e) {
        \Log::error("OpenAI get model {$modelId} error: " . $e->getMessage());

        return null;
    }
}

/**
 * Delete a fine-tuned model.
 */
public function deleteFineTunedModel(string $modelId): bool
{
    try {
        $response = OpenAI::models()->delete($modelId);

        return $response->deleted ?? false;
    } catch (\Exception $e) {
        \Log::error("OpenAI delete model {$modelId} error: " . $e->getMessage());

        return false;
    }
}
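If you want the model dropdown in the chat view to reflect what your API key can actually access, a small route in routes/web.php can expose listModels(). This is only an illustrative sketch; the chatbot.models route name and the pluck('id') shaping are additions of this example, not part of the code above:

use App\Services\AIService;

Route::get('/chatbot/models', function (AIService $aiService) {
    // Return only the model IDs, e.g. ["gpt-4o", "gpt-4o-mini", ...]
    return response()->json(
        collect($aiService->listModels())->pluck('id')->values()
    );
})->name('chatbot.models');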
Image Generation
Add DALL·E image generation:
/**
 * Generate images.
 */
public function generateImage(string $prompt, int $n = 1, string $size = '1024x1024'): array
{
    try {
        $response = OpenAI::images()->create([
            'prompt' => $prompt,
            'n' => $n,
            'size' => $size,
            'response_format' => 'url',
        ]);

        return $response->data;
    } catch (\Exception $e) {
        \Log::error('OpenAI image generation error: ' . $e->getMessage());
        throw new \RuntimeException('Failed to generate image. Please try again later.');
    }
}

/**
 * Edit an image using a mask.
 */
public function editImage(string $imagePath, string $maskPath, string $prompt, int $n = 1, string $size = '1024x1024'): array
{
    try {
        $response = OpenAI::images()->edit([
            'image' => fopen($imagePath, 'r'),
            'mask' => fopen($maskPath, 'r'),
            'prompt' => $prompt,
            'n' => $n,
            'size' => $size,
            'response_format' => 'url',
        ]);

        return $response->data;
    } catch (\Exception $e) {
        \Log::error('OpenAI image edit error: ' . $e->getMessage());
        throw new \RuntimeException('Failed to edit image. Please try again later.');
    }
}
Audio to Text
Add Whisper speech-to-text support:
/**
 * Transcribe audio to text.
 */
public function transcribeAudio(string $audioPath, string $model = 'whisper-1'): string
{
    try {
        $response = OpenAI::audio()->transcribe([
            'model' => $model,
            'file' => fopen($audioPath, 'r'),
        ]);

        return $response->text;
    } catch (\Exception $e) {
        \Log::error('OpenAI audio transcription error: ' . $e->getMessage());
        throw new \RuntimeException('Failed to transcribe audio. Please try again later.');
    }
}

/**
 * Translate audio into English text.
 */
public function translateAudio(string $audioPath, string $model = 'whisper-1'): string
{
    try {
        $response = OpenAI::audio()->translate([
            'model' => $model,
            'file' => fopen($audioPath, 'r'),
        ]);

        return $response->text;
    } catch (\Exception $e) {
        \Log::error('OpenAI audio translation error: ' . $e->getMessage());
        throw new \RuntimeException('Failed to translate audio. Please try again later.');
    }
}
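To wire this up to a real upload, a controller action only needs to pass the uploaded file's temporary path to the service. The transcribe action below is a hypothetical example (it is not part of the ChatbotController shown earlier); the 25 MB size cap mirrors OpenAI's current limit for audio uploads:

use App\Services\AIService;
use Illuminate\Http\Request;

public function transcribe(Request $request, AIService $aiService)
{
    $request->validate([
        'audio' => 'required|file|mimes:mp3,mp4,m4a,wav,webm|max:25600',
    ]);

    // Hand the temporary upload path straight to the Whisper wrapper
    return response()->json([
        'text' => $aiService->transcribeAudio($request->file('audio')->getRealPath()),
    ]);
}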
Advanced Configuration and Optimization
Customizing the HTTP Client
If you need to customize the HTTP client (for example to set timeouts or a proxy), adjust the client_options entry in config/openai.php:
'client_options' => [
    'timeout' => 60,
    'connect_timeout' => 10,
    // 'proxy' => 'http://localhost:8080',
],
Caching Responses (optional)
For frequently repeated, identical requests, you can add a caching layer to cut down on API calls and cost:
use Illuminate\Support\Facades\Cache;

public function chat(string $message, string $model = 'gpt-4o-mini'): string
{
    // Build the cache key
    $cacheKey = 'openai_' . md5($message . $model);

    // Try the cache first
    if (Cache::has($cacheKey)) {
        return Cache::get($cacheKey);
    }

    // Actual API call...
    $response = OpenAI::chat()->create([
        // ...parameters
    ]);

    $result = $response->choices[0]->message->content ?? '';

    // Cache the result (for example, for 10 minutes)
    Cache::put($cacheKey, $result, 600);

    return $result;
}
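The same logic reads more tightly with Cache::remember, which combines the has/get/put steps into a single call. This is simply an equivalent rewrite of the block above:

use Illuminate\Support\Facades\Cache;
use OpenAI\Laravel\Facades\OpenAI;

public function chat(string $message, string $model = 'gpt-4o-mini'): string
{
    // Cache identical prompts for 10 minutes to avoid repeated API calls
    return Cache::remember('openai_' . md5($message . $model), 600, function () use ($message, $model) {
        $response = OpenAI::chat()->create([
            'model' => $model,
            'messages' => [['role' => 'user', 'content' => $message]],
        ]);

        return $response->choices[0]->message->content ?? '';
    });
}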
Handling Rate Limits
The OpenAI API enforces rate limits, so it is worth adding a retry mechanism:
use GuzzleHttp\Exception\ClientException;

public function chat(string $message, string $model = 'gpt-4o-mini', $retries = 3): string
{
    try {
        // API call...
    } catch (ClientException $e) {
        // Check for a rate-limit error (HTTP 429).
        // Note: depending on the openai-php client version, API errors may instead be wrapped
        // in OpenAI\Exceptions\ErrorException; adjust the caught exception type accordingly.
        if ($e->getResponse()->getStatusCode() === 429 && $retries > 0) {
            // Wait with exponential backoff, then retry
            $delay = (2 ** (3 - $retries)) * 1000; // 1s, 2s, 4s
            usleep($delay * 1000);

            return $this->chat($message, $model, $retries - 1);
        }

        throw $e;
    }
}
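If you prefer not to hand-roll the recursion, Laravel's built-in retry() helper achieves the same effect. The sketch below retries up to three times with an increasing delay; note that it retries on any exception unless you pass a fourth "when" callback to narrow it down:

use OpenAI\Laravel\Facades\OpenAI;

$response = retry(3, function () use ($message, $model) {
    return OpenAI::chat()->create([
        'model' => $model,
        'messages' => [['role' => 'user', 'content' => $message]],
    ]);
}, fn (int $attempt) => $attempt * 1000); // wait 1s, 2s, 3s between attempts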
Conclusion
With the openai-php/laravel package, integrating OpenAI features into a Laravel project becomes remarkably straightforward. Designed specifically for Laravel, the package offers elegant configuration, service container integration, and facade support, letting us focus on business logic rather than low-level plumbing. Whether you are building a smart customer-service assistant, a content generation tool, or any other AI-driven feature, openai-php/laravel is a natural choice for Laravel developers and puts powerful AI capabilities within easy reach. As OpenAI keeps shipping new models and features, the package continues to evolve alongside them, opening up even more possibilities.