
# Getting Started

Welcome to the BWS X SDK documentation! This guide will help you get started quickly.

## What is BWS X SDK?

BWS X SDK is a Node.js library for interacting with X (formerly Twitter). It prioritizes cost optimization by using Playwright-based web scraping for read operations, while retaining official API support for write operations and fallback scenarios.

## Key Features

- 💰 **Cost-Optimized**: 90%+ cost reduction vs. the official X API
- 🔄 **Hybrid Mode**: Automatic fallback between scraping and API
- 🔐 **Multi-Account Support**: Account rotation with cooldown management
- 🎯 **KOL Identification**: AI-powered influencer discovery
- 📊 **Real-time Webhooks**: HMAC-signed notifications
- 🌐 **Proxy Support**: Built-in support for Oxylabs and BrightData
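To make the multi-account rotation concrete, here is a minimal sketch of rotation with a cooldown window, assuming a least-recently-used policy. The `AccountRotator` class and its fields are illustrative only; the SDK manages rotation internally:

```typescript
// Illustrative sketch of account rotation with cooldowns (not SDK internals).
interface CrawlerAccount {
  id: string;
  lastUsedAt: number; // epoch ms; 0 = never used
}

class AccountRotator {
  constructor(
    private accounts: CrawlerAccount[],
    private cooldownMs: number
  ) {}

  // Return the next account whose cooldown has elapsed, or null if all are cooling down.
  next(now: number = Date.now()): CrawlerAccount | null {
    const ready = this.accounts
      .filter(a => now - a.lastUsedAt >= this.cooldownMs)
      .sort((a, b) => a.lastUsedAt - b.lastUsedAt); // least recently used first
    if (ready.length === 0) return null;
    ready[0].lastUsedAt = now;
    return ready[0];
  }
}
```

Cooldowns like this are what let a pool of crawler accounts absorb sustained read traffic without any single account tripping rate limits.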

## Installation

Install the package using your preferred package manager:

```bash
npm install @blockchain-web-services/bws-x-sdk-node
```

```bash
yarn add @blockchain-web-services/bws-x-sdk-node
```

```bash
pnpm add @blockchain-web-services/bws-x-sdk-node
```

## Quick Start

### 1. Set Up Environment Variables

Create a `.env` file in your project root:

```bash
# Crawler Accounts (Primary - for cost optimization)
X_ACCOUNTS='[{
  "id": "account1",
  "username": "@myaccount",
  "cookies": {
    "auth_token": "your-auth-token-here",
    "ct0": "your-ct0-token-here"
  }
}]'

# Twitter API (For posting and fallback)
TWITTER_ACCOUNTS='[{
  "name": "main",
  "apiKey": "your-api-key",
  "apiSecret": "your-api-secret",
  "accessToken": "your-access-token",
  "accessSecret": "your-access-secret"
}]'

# Proxy (Required for production scraping)
PROXY_CONFIG='{"enabled":true,"provider":"oxylabs","username":"your-proxy-user","password":"your-proxy-pass"}'

# SDK Options
SDK_OPTIONS='{"mode":"hybrid","preferredMode":"crawler"}'
```
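Because these variables hold JSON, a malformed value is a common setup error. As a sketch, you can validate them before initializing the client; the `parseJsonEnv` helper below is our illustration, not an SDK export (the client parses these itself on startup):

```typescript
// Shape of one entry in X_ACCOUNTS, per the .env example above.
interface CrawlerAccountConfig {
  id: string;
  username: string;
  cookies: { auth_token: string; ct0: string };
}

// Hypothetical helper: parse a JSON-encoded env var, failing fast with a clear error.
function parseJsonEnv<T>(name: string): T {
  const raw = process.env[name];
  if (!raw) throw new Error(`Missing environment variable: ${name}`);
  try {
    return JSON.parse(raw) as T;
  } catch {
    throw new Error(`${name} is not valid JSON`);
  }
}

// Usage (after loading .env):
// const accounts = parseJsonEnv<CrawlerAccountConfig[]>('X_ACCOUNTS');
```

Failing fast here surfaces a bad quote or a missing bracket at startup, rather than as an opaque error during the first request.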

### 2. Initialize the Client

```typescript
import { XTwitterClient } from '@blockchain-web-services/bws-x-sdk-node';
import 'dotenv/config';

// Initialize client (auto-loads from environment variables)
const client = new XTwitterClient();
```

### 3. Start Using the SDK

```typescript
// Get a tweet
const tweet = await client.getTweet('1234567890123456789');
console.log(tweet.text);
console.log('Likes:', tweet.metrics.likeCount);

// Get user profile
const profile = await client.getProfile('vitalikbuterin');
console.log(profile.name);
console.log('Followers:', profile.metrics.followersCount);

// Search tweets
const results = await client.searchTweets('#bitcoin', { maxResults: 10 });
results.forEach(tweet => {
  console.log(`@${tweet.authorUsername}: ${tweet.text}`);
});

// Post a reply
const reply = await client.postReply(
  '1234567890123456789',
  'Great insight! 🚀'
);
console.log('Reply posted:', reply.id);
```

## Getting Credentials

### Crawler Account Cookies

To use web scraping, you need X/Twitter authentication cookies.

#### Option 1: Automated Extraction (Recommended)

Use our automated cookie extraction tool:

```bash
npx bws-x-setup-cookies
```

This interactive wizard will:

- ✅ Open X/Twitter in a real browser
- ✅ Wait for you to log in
- ✅ Automatically extract the required cookies (`auth_token`, `ct0`, `guest_id`)
- ✅ Validate that the cookies work
- ✅ Warn about expiration (~30 days)
- ✅ Support multiple accounts
- ✅ Save to your `.env` file
- ✅ Export to multiple formats (JSON, YAML, TypeScript)
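The expiration warning can be illustrated with a small sketch of the date math involved. The 30-day lifetime comes from the typical `auth_token` validity noted above; the helper names and the 5-day warning threshold are our own assumptions, not the wizard's internals:

```typescript
// Sketch of cookie expiration tracking (hypothetical names and threshold).
const COOKIE_LIFETIME_DAYS = 30; // typical auth_token validity
const WARN_BEFORE_DAYS = 5;      // assumed warning window

function daysUntilExpiry(extractedAt: Date, now: Date = new Date()): number {
  const ageMs = now.getTime() - extractedAt.getTime();
  return COOKIE_LIFETIME_DAYS - ageMs / (1000 * 60 * 60 * 24);
}

function shouldWarn(extractedAt: Date, now: Date = new Date()): boolean {
  return daysUntilExpiry(extractedAt, now) <= WARN_BEFORE_DAYS;
}
```

Recording when cookies were extracted lets you re-run the wizard proactively instead of discovering expired sessions through failed requests.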

Security features:

- Automated validation
- Expiration tracking
- No manual copy-paste errors

For detailed usage, see the Cookie Setup Guide.

#### Option 2: Manual Extraction

If the automated tool doesn't work, you can extract cookies manually:

1. Log in to X/Twitter in your browser
2. Open Developer Tools (F12 or right-click → Inspect)
3. Go to the Application/Storage tab → Cookies → `https://x.com`
4. Copy these cookies:
   - `auth_token`
   - `ct0`
5. Add them to your `.env` file (see the format above)

### Twitter API Credentials

For API operations (posting, API fallback):

1. Go to the Twitter Developer Portal
2. Create an app or use an existing one
3. Copy the credentials:
   - API Key (Consumer Key)
   - API Secret (Consumer Secret)
   - Access Token
   - Access Secret

### Proxy Credentials

For production web scraping:

1. Sign up at Oxylabs or BrightData
2. Get your credentials (username and password)
3. Add them to your `.env` file

## Operating Modes

The SDK supports three operating modes:

### Hybrid Mode

Uses web scraping first, with the API as a fallback:

```typescript
const client = new XTwitterClient({
  mode: 'hybrid',
  preferredMode: 'crawler' // Try crawler first
});
```

**Best for:** Cost optimization with reliability
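Conceptually, hybrid mode behaves like the following sketch; the `hybridFetch` function and its parameters are hypothetical, as the SDK applies this pattern internally:

```typescript
// Sketch of the hybrid fallback pattern: try the cheap crawler path first,
// fall back to the API if scraping fails (e.g. blocked page, expired cookies).
async function hybridFetch<T>(
  viaCrawler: () => Promise<T>,
  viaApi: () => Promise<T>
): Promise<T> {
  try {
    return await viaCrawler();
  } catch (err) {
    console.warn('Crawler failed, falling back to API:', err);
    return viaApi();
  }
}
```

The cost savings come from the happy path: as long as scraping succeeds, the metered API is never called.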

### Crawler-Only Mode

Uses only web scraping:

```typescript
const client = new XTwitterClient({
  mode: 'crawler'
});
```

**Best for:** Maximum cost savings, read-only operations

### API-Only Mode

Uses only Twitter API:

```typescript
const client = new XTwitterClient({
  mode: 'api'
});
```

**Best for:** Simplicity, when cost is not a concern

## Next Steps

## Need Help?

Released under the MIT License.