This project is a full-stack web application that ingests LAS (Log ASCII Standard) files, stores the original files securely, and enables interactive visualization of depth-indexed well-log curves. Users can upload files, select curves, filter depth ranges, and explore multi-track plots with zoom and pan support. The system performs deterministic statistical analysis on selected intervals, generates AI-assisted interpretations, and provides a contextual chatbot for data-driven queries, all within a cleanly separated frontend–backend architecture.
The application follows a modular full-stack architecture with clear separation of concerns.
- Frontend (React + Vite): handles user interaction, curve selection, depth filtering, Plotly-based visualization, and rendering of AI interpretations and chat responses.
- Backend (Node.js + Express): exposes REST APIs for file upload, metadata retrieval, depth-based queries, statistical analysis, AI interpretation, and chat interaction.
- Database (PostgreSQL, Supabase-hosted): stores structured depth-indexed well-log data, using JSONB for flexible curve storage and indexed depth columns for efficient filtering.
- Cloud Storage (AWS S3): stores original LAS files securely outside the database.
- AI Layer (Groq LLM API): generates interval-based interpretations and contextual chat responses. Deterministic statistical computations are performed server-side before the LLM is invoked.
User -> Frontend -> Backend API
Backend -> PostgreSQL (log data)
Backend -> AWS S3 (original LAS files)
Backend -> Groq LLM (interpretation & chat)
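The last hop above (Backend -> Groq LLM) can be sketched as a small helper that folds the server-side statistics into a deterministic prompt before the LLM is called. The function and field names here are illustrative assumptions, not the project's actual code:

```javascript
// Sketch: build a deterministic prompt from precomputed interval statistics
// before sending it to the Groq LLM. buildInterpretationPrompt and the
// stats field names are illustrative, not the project's actual API.
function buildInterpretationPrompt(wellName, interval, stats) {
  const lines = Object.entries(stats).map(
    ([curve, s]) =>
      `${curve}: mean=${s.mean.toFixed(2)}, stdDev=${s.stdDev.toFixed(2)}, trend=${s.trend}`
  );
  return [
    `Well: ${wellName}`,
    `Interval: ${interval.top}-${interval.bottom} m`,
    `Curve statistics:`,
    ...lines,
    `Interpret the interval above in plain language.`,
  ].join("\n");
}

const prompt = buildInterpretationPrompt(
  "WELL-A",
  { top: 1500, bottom: 1550 },
  { GR: { mean: 85.2, stdDev: 10.1, trend: "increasing" } }
);
console.log(prompt);
```

Because the numbers are computed before the call, the LLM only phrases an interpretation; it never does the arithmetic itself.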
- React (Vite)
- JavaScript (ES6+)
- Zustand (State Management)
- Plotly.js / react-plotly.js
- Tailwind CSS
- Axios
- Node.js
- Express.js
- PostgreSQL (Supabase Hosted)
- AWS S3
- Multer
- Custom LAS Parser
- Groq LLM API (openai/gpt-oss-20b)
- Custom Statistical Engine (Mean, Std Dev, Trend, Correlation)
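A minimal sketch of the kinds of deterministic statistics such an engine computes is shown below. The function names are illustrative assumptions, not the project's actual module:

```javascript
// Sketch of deterministic interval statistics: mean, standard deviation,
// least-squares trend slope, and Pearson correlation. Names are illustrative.
const mean = (xs) => xs.reduce((a, b) => a + b, 0) / xs.length;

const stdDev = (xs) => {
  const m = mean(xs);
  return Math.sqrt(xs.reduce((a, x) => a + (x - m) ** 2, 0) / xs.length);
};

// Least-squares slope of curve values vs. depth; positive = increasing with depth.
const trendSlope = (depths, values) => {
  const md = mean(depths), mv = mean(values);
  const num = depths.reduce((a, d, i) => a + (d - md) * (values[i] - mv), 0);
  const den = depths.reduce((a, d) => a + (d - md) ** 2, 0);
  return num / den;
};

// Pearson correlation between two curves sampled at the same depths.
const correlation = (xs, ys) => {
  const mx = mean(xs), my = mean(ys);
  const num = xs.reduce((a, x, i) => a + (x - mx) * (ys[i] - my), 0);
  const den = Math.sqrt(
    xs.reduce((a, x) => a + (x - mx) ** 2, 0) *
    ys.reduce((a, y) => a + (y - my) ** 2, 0)
  );
  return num / den;
};

console.log(mean([1, 2, 3]));                     // 2
console.log(trendSlope([0, 1, 2], [10, 12, 14])); // 2
console.log(correlation([1, 2, 3], [2, 4, 6]));   // 1
```

Computing these server-side keeps the numeric results reproducible regardless of the LLM.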
- Original LAS files are stored in AWS S3 to avoid bloating the database and to ensure secure, scalable object storage.
- Parsed well-log data is stored in PostgreSQL (Supabase-hosted). Each depth-indexed row stores its curve values in a JSONB column, allowing flexible handling of varying curve sets across different wells.
- Depth-based indexing is applied to enable efficient range queries for visualization and interval analysis.
- Statistical computations are performed deterministically on the backend, while the LLM is used only for natural-language interpretation.
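The mapping from LAS data lines to these depth-indexed JSONB rows can be sketched as follows. This is a simplified illustration, not the project's actual parser: a real LAS parser must also handle section headers (~Version, ~Curve, ~ASCII), wrapped lines, and null values such as -999.25.

```javascript
// Simplified sketch: turn LAS data lines into depth-indexed rows with a
// JSONB-style curves object. Assumes curve names were already read from
// the ~Curve section; the mnemonics below (GR, RHOB) are illustrative.
function toRows(curveNames, dataLines) {
  return dataLines.map((line) => {
    const [depth, ...values] = line.trim().split(/\s+/).map(Number);
    const curves = {};
    curveNames.forEach((name, i) => { curves[name] = values[i]; });
    return { depth, curves }; // maps directly onto the well_logs schema
  });
}

const rows = toRows(
  ["GR", "RHOB"],
  ["1500.0  85.2  2.45", "1500.5  90.1  2.47"]
);
console.log(rows[0]); // { depth: 1500, curves: { GR: 85.2, RHOB: 2.45 } }
```

Because each row carries its own curves object, wells with different curve sets fit the same table without schema changes.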
For detailed backend architecture and internal structure, refer to:
--> /backend/README.md
git clone https://github.com/Jenil1105/LAS-file-visualizer.git
cd backend
Create a .env file inside the backend directory and add:
DATABASE_URL=
PORT=
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_REGION=
S3_BUCKET_NAME=
GROQ_API_KEY=
- Create a project at https://supabase.com
- Navigate to Project Settings → Database
- Copy the PostgreSQL connection string and paste it into:
DATABASE_URL=
- Create required tables:
CREATE TABLE wells (
id SERIAL PRIMARY KEY,
name TEXT NOT NULL,
s3_url TEXT,
created_at TIMESTAMP DEFAULT NOW()
);
CREATE TABLE well_logs (
id SERIAL PRIMARY KEY,
well_id INTEGER REFERENCES wells(id) ON DELETE CASCADE,
depth DOUBLE PRECISION,
curves JSONB
);
CREATE INDEX idx_depth ON well_logs(depth);
CREATE INDEX idx_well_id ON well_logs(well_id);
- Install dependencies and start the server:
npm install
node app.js
- The backend will run on:
http://localhost:<PORT>
For detailed frontend architecture and component structure, refer to:
--> /frontend/README.md
cd frontend
Open the following file:
src/api/axios.js
You will see:
import axios from "axios";
// deployed backend
const url = `${import.meta.env.VITE_API_URL}`;
// to run locally
// const url = "http://localhost:<backend-port>/api/wells"
const api = axios.create({
baseURL: url
});
export default api;
To run locally:
- Comment the deployed backend line
- Uncomment the local backend line
- Replace <backend-port> with your backend PORT value
Example:
const url = "http://localhost:5000/api/wells"
- Install dependencies and start the frontend:
npm install
npm run dev
- The frontend will run on:
http://localhost:5173