diff --git a/fern/chat/web-widget.mdx b/fern/chat/web-widget.mdx
index 8b1665df5..a23f748a0 100644
--- a/fern/chat/web-widget.mdx
+++ b/fern/chat/web-widget.mdx
@@ -471,7 +471,7 @@ Enhance your widget integration:
* **[Assistant customization](/assistants)** - Fine-tune your assistant's behavior
-The widget automatically handles microphone permissions, audio processing, and cross-browser compatibility. For custom implementations, consider using the [Web SDK](/sdk/web) directly.
+The widget automatically handles microphone permissions, audio processing, and cross-browser compatibility. For custom implementations, consider using the [Web SDK](/quickstart/web) directly.
diff --git a/fern/docs.yml b/fern/docs.yml
index 929f7ddc0..b24839045 100644
--- a/fern/docs.yml
+++ b/fern/docs.yml
@@ -1046,9 +1046,9 @@ redirects:
- source: /welcome
destination: /quickstart/introduction
- source: /sdks
- destination: /sdk/web
+ destination: /quickstart/web
- source: /server-sdks
- destination: /sdk/web
+ destination: /quickstart/web
- source: /overview
destination: /quickstart
- source: /assistants
diff --git a/fern/overview.mdx b/fern/overview.mdx
index 1ee54c954..9a57a219c 100644
--- a/fern/overview.mdx
+++ b/fern/overview.mdx
@@ -142,7 +142,7 @@ Our SDKs are open source, and available on [our GitHub](https://github.com/VapiA
icon="window"
iconType="duotone"
color="#ffdd03"
- href="/sdk/web"
+ href="/quickstart/web"
>
Add a Vapi assistant to your web application.
diff --git a/fern/sdk/web.mdx b/fern/sdk/web.mdx
deleted file mode 100644
index 3ab7e0603..000000000
--- a/fern/sdk/web.mdx
+++ /dev/null
@@ -1,269 +0,0 @@
----
-title: Web SDK
-subtitle: Integrate Vapi into your web application.
-slug: sdk/web
----
-
-The Vapi Web SDK provides web developers with a simple API for interacting with the realtime call functionality of Vapi.
-
-### Installation
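-
-A minimal sketch, assuming the SDK is distributed on npm under the `@vapi-ai/web` package name:
-
-```bash
-npm install @vapi-ai/web
-```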
-
-
-
-### Importing
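-
-A minimal sketch, assuming the package's default export is the `Vapi` class and that it is constructed with your public API key (both are assumptions to verify against the package README):
-
-```javascript
-// Import the Vapi class and create a client instance.
-import Vapi from "@vapi-ai/web";
-
-// The public API key is available in your Vapi dashboard.
-const vapi = new Vapi("your-public-api-key");
-```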
-
-
-
-
-
----
-
-## Usage
-
-### `.start()`
-
-You can start a web call by calling the `.start()` function. The `start` function accepts either:
-
-1. **a string**, representing an assistant ID
-2. **an object**, representing a set of assistant configs (see [Create Assistant](/api-reference/assistants/create-assistant))
-
-The `start` function returns a promise that resolves to a call object. For example:
-
-```javascript
-const call = await vapi.start(assistantId);
-// { "id": "bd2184a1-bdea-4d4f-9503-b09ca8b185e6", "orgId": "6da6841c-0fca-4604-8941-3d5d65f43a17", "createdAt": "2024-11-13T19:20:24.606Z", "updatedAt": "2024-11-13T19:20:24.606Z", "type": "webCall", ... }
-```
-
-#### Passing an Assistant ID
-
-If you already have an assistant that you created (either via [the Dashboard](/quickstart/phone) or [the API](/api-reference/assistants/create-assistant)), you can start the call with the assistant's ID:
-
-```javascript
-vapi.start("79f3XXXX-XXXX-XXXX-XXXX-XXXXXXXXce48");
-```
-
-#### Passing Assistant Configuration Inline
-
-You can also specify configuration for your assistant inline.
-
-This does not create a [persistent assistant](/assistants/persistent-assistants) saved to your account; instead, it creates an ephemeral assistant used only for this specific call.
-
-You can pass the assistant's configuration in an object (see [Create Assistant](/api-reference/assistants/create-assistant) for a list of acceptable fields):
-
-```javascript
-vapi.start({
- transcriber: {
- provider: "deepgram",
- model: "nova-2",
- language: "en-US",
- },
- model: {
- provider: "openai",
- model: "gpt-4o",
- messages: [
- {
- role: "system",
- content: "You are a helpful assistant.",
- },
- ],
- },
- voice: {
- provider: "playht",
- voiceId: "jennifer",
- },
- name: "My Inline Assistant",
- ...
-});
-```
-
-#### Overriding Assistant Configurations
-
-To override assistant settings or set template variables, you can pass `assistantOverrides` as the second argument.
-
-For example, if the first message is "Hello `{{name}}`", set `assistantOverrides` to the following to replace `{{name}}` with `Alice`:
-
-```javascript
-const assistantOverrides = {
- transcriber: {
- provider: "deepgram",
- model: "nova-2",
- language: "en-US",
- },
- recordingEnabled: false,
- variableValues: {
- name: "Alice",
- },
-};
-
-vapi.start("79f3XXXX-XXXX-XXXX-XXXX-XXXXXXXXce48", assistantOverrides);
-```
-
-### `.send()`
-
-During the call, you can send intermediate messages to the assistant (like [background messages](/assistants/background-messages)).
-
-- `type` will always be `"add-message"`
-- the `message` field has two properties: `role` and `content`
-
-```javascript
-vapi.send({
- type: "add-message",
- message: {
- role: "system",
- content: "The user has pressed the button, say peanuts",
- },
-});
-```
-
-
-Possible values for `role` are `system`, `user`, `assistant`, `tool`, or `function`.
-
-
-### `.stop()`
-
-You can stop the call session by calling the `stop` method:
-
-```javascript
-vapi.stop();
-```
-
-This will stop the recording and close the connection.
-
-### `.isMuted()`
-
-Check if the user's microphone is muted:
-
-```javascript
-vapi.isMuted();
-```
-
-### `.setMuted(muted: boolean)`
-
-You can mute & unmute the user's microphone with `setMuted`:
-
-```javascript
-vapi.isMuted(); // false
-vapi.setMuted(true);
-vapi.isMuted(); // true
-```
-
-### `.say(message: string, endCallAfterSpoken?: boolean)`
-
-The `say` method can be used to invoke speech and gracefully terminate the call if needed.
-
-```javascript
-vapi.say("Our time's up, goodbye!", true);
-```
-
-## Events
-
-You can listen on the `vapi` instance for events. These events allow you to react to changes in the state of the call or user speech.
-
-#### `speech-start`
-
-Occurs when your AI assistant has started speaking.
-
-```javascript
-vapi.on("speech-start", () => {
- console.log("Assistant speech has started.");
-});
-```
-
-#### `speech-end`
-
-Occurs when your AI assistant has finished speaking.
-
-```javascript
-vapi.on("speech-end", () => {
- console.log("Assistant speech has ended.");
-});
-```
-
-#### `call-start`
-
-Occurs when the call has connected & begins.
-
-```javascript
-vapi.on("call-start", () => {
- console.log("Call has started.");
-});
-```
-
-#### `call-end`
-
-Occurs when the call has disconnected & ended.
-
-```javascript
-vapi.on("call-end", () => {
- console.log("Call has ended.");
-});
-```
-
-#### `volume-level`
-
-Realtime volume level updates for the assistant. A floating-point number between `0` & `1`.
-
-```javascript
-vapi.on("volume-level", (volume) => {
- console.log(`Assistant volume level: ${volume}`);
-});
-```
-
-#### `message`
-
-Various assistant messages can be sent back to the client during the call. These are the same messages that your [server](/server-url) would receive.
-
-At [assistant creation time](/api-reference/assistants/create-assistant), you can use the `clientMessages` field to specify the set of messages you'd like the assistant to send back to the client.
-
-Those messages will come back via the `message` event:
-
-```javascript
-// Various assistant messages can come back (like function calls, transcripts, etc)
-vapi.on("message", (message) => {
- console.log(message);
-});
-```
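-
-As a sketch, filtering which messages reach the client is done on the assistant configuration itself; the exact `clientMessages` values are listed in the [Create Assistant](/api-reference/assistants/create-assistant) reference, and the ones below are illustrative assumptions:
-
-```javascript
-// Hypothetical assistant config: only send transcripts and tool calls
-// back to the client over the `message` event.
-const assistant = {
-  clientMessages: ["transcript", "tool-calls"],
-  // ...other assistant configuration
-};
-```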
-
-#### `error`
-
-Handle errors that occur during the call.
-
-```javascript
-vapi.on("error", (e) => {
- console.error(e);
-});
-```
-
----
-
-## Resources
-
-- View the package on NPM.
-- View the package on GitHub.
-- Get up and running quickly with our SDKs.
-
diff --git a/fern/sdks.mdx b/fern/sdks.mdx
index 79bc79776..96b1860c8 100644
--- a/fern/sdks.mdx
+++ b/fern/sdks.mdx
@@ -9,7 +9,7 @@ The Vapi Client SDKs automatically configure audio streaming to and from the cli
The SDKs are open source, and available on GitHub:
-  href="/sdk/web"
+  href="/quickstart/web"
Add a Vapi assistant to your web application.
-  href="/sdk/web"
+  href="/quickstart/web"
Add a Vapi assistant to your web application.