Well Log Visualization & AI Interpretation System

.

=> Overview

This project is a full-stack web application that ingests LAS (Log ASCII Standard) files, stores the original files securely, and enables interactive visualization of depth-indexed well-log curves. Users can upload files, select curves, filter depth ranges, and explore multi-track plots with zoom and pan support. The system performs deterministic statistical analysis on selected intervals and generates AI-assisted interpretations, and a contextual chatbot answers data-driven queries, all within a clean, securely separated frontend–backend architecture.

Live Deployment

las-file-visualizer

⚠️ Note: Since the application is hosted on Render's free tier, the server may take a few seconds to wake up on the first request. If it does not load immediately, please wait briefly or refresh once.


.


=> System Architecture

The application follows a modular full-stack architecture with clear separation of concerns.

  • Frontend (React + Vite)
    Handles user interaction, curve selection, depth filtering, visualization using Plotly, and rendering AI interpretations and chat responses.

  • Backend (Node.js + Express)
    Exposes REST APIs for file upload, metadata retrieval, depth-based queries, statistical analysis, AI interpretation, and chat interaction.

  • Database (PostgreSQL – Supabase Hosted)
    Stores structured depth-indexed well-log data using JSONB for flexible curve storage and indexed depth queries for efficient filtering.

  • Cloud Storage (AWS S3)
    Stores original LAS files securely outside the database.

  • AI Layer (Groq LLM API)
    Generates interval-based interpretations and contextual chat responses. Deterministic statistical computations are performed server-side before invoking the LLM.

High-Level Flow

User -> Frontend -> Backend API
Backend -> PostgreSQL (log data)
Backend -> AWS S3 (original LAS files)
Backend -> Groq LLM (interpretation & chat)


.


=> Tech Stack

Frontend

  • React (Vite)
  • JavaScript (ES6+)
  • Zustand (State Management)
  • Plotly.js / react-plotly.js
  • Tailwind CSS
  • Axios

Backend

  • Node.js
  • Express.js
  • PostgreSQL (Supabase Hosted)
  • AWS S3
  • Multer
  • Custom LAS Parser
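The custom LAS parser itself is not shown in this README. As a rough illustration of what such a parser does, here is a minimal sketch that reads curve mnemonics from the `~C` section and numeric rows from the `~A` section; the names and simplified section handling are assumptions for illustration, not the repository's actual implementation.

```javascript
// Minimal LAS parsing sketch (illustrative, not the repository's parser).
// Collects curve mnemonics from the ~C section and numeric rows from ~A.
function parseLas(text) {
  const lines = text.split(/\r?\n/);
  let section = "";
  const curves = [];
  const rows = [];
  for (const raw of lines) {
    const line = raw.trim();
    if (!line || line.startsWith("#")) continue;  // skip blanks and comments
    if (line.startsWith("~")) {                   // section header, e.g. ~C, ~A
      section = line[1].toUpperCase();
      continue;
    }
    if (section === "C") {
      // Curve info line: "MNEM.UNIT  CODE : description" -> keep the mnemonic
      curves.push(line.split(".")[0].trim());
    } else if (section === "A") {
      // Data line: whitespace-separated values; first column is depth by convention
      const values = line.split(/\s+/).map(Number);
      const row = { depth: values[0], curves: {} };
      curves.forEach((name, i) => { row.curves[name] = values[i]; });
      rows.push(row);
    }
  }
  return { curves, rows };
}
```

A real parser also handles the `~V`/`~W` header sections, wrapped data lines, and null values (e.g. `-999.25`), which this sketch omits.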

AI & Analytics

  • Groq LLM API (openai/gpt-oss-20b)
  • Custom Statistical Engine (Mean, Std Dev, Trend, Correlation)
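The measures computed by the statistical engine are standard. A minimal sketch of how they can be computed deterministically in plain JavaScript (function names are illustrative, not the repository's API):

```javascript
// Deterministic interval statistics (illustrative sketch).
function mean(xs) {
  return xs.reduce((a, b) => a + b, 0) / xs.length;
}

// Population standard deviation of a curve slice
function stdDev(xs) {
  const m = mean(xs);
  return Math.sqrt(xs.reduce((a, x) => a + (x - m) ** 2, 0) / xs.length);
}

// Pearson correlation between two equal-length curve slices
function correlation(xs, ys) {
  const mx = mean(xs), my = mean(ys);
  let num = 0, dx = 0, dy = 0;
  for (let i = 0; i < xs.length; i++) {
    num += (xs[i] - mx) * (ys[i] - my);
    dx += (xs[i] - mx) ** 2;
    dy += (ys[i] - my) ** 2;
  }
  return num / Math.sqrt(dx * dy);
}

// Least-squares slope of a curve against depth: a simple trend indicator
function trendSlope(depths, values) {
  return correlation(depths, values) * (stdDev(values) / stdDev(depths));
}
```

Computing these server-side and passing only the results to the LLM keeps the numbers reproducible; the model is used solely to phrase the interpretation.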

.


=> Data & Storage Strategy

  • Original LAS files are stored in AWS S3 to avoid bloating the database and to ensure secure, scalable object storage.

  • Parsed well-log data is stored in PostgreSQL (Supabase hosted).
    Each depth-indexed row is stored with curve values in a JSONB column, allowing flexible handling of varying curve sets across different wells.

  • Depth-based indexing is applied to enable efficient range queries for visualization and interval analysis.

  • Statistical computations are performed deterministically on the backend, while the LLM is used only for natural language interpretation.
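To illustrate the row shape this strategy produces (a depth plus a JSONB object of curve values), here is a sketch of reducing a depth-range query result to a single curve series for plotting or interval statistics. The helper is hypothetical; in the actual system the depth filter is pushed into SQL so the depth index can be used.

```javascript
// Illustrative helper (not the repository's code): rows are shaped like the
// well_logs table, e.g. { depth: 1500.0, curves: { GR: 45.2, RHOB: 2.31 } }.
function extractCurve(rows, curveName, minDepth, maxDepth) {
  return rows
    .filter(r => r.depth >= minDepth && r.depth <= maxDepth)
    .map(r => ({ depth: r.depth, value: r.curves[curveName] }))
    // JSONB allows curve sets to vary between wells, so a curve may be absent
    .filter(p => p.value !== undefined && p.value !== null);
}
```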


.


=> Running the Backend Locally

For detailed backend architecture and internal structure, refer to:
--> /backend/README.md


1. Clone the Repository

git clone https://github.com/Jenil1105/LAS-file-visualizer.git
cd LAS-file-visualizer/backend

2. Configure Environment Variables

Create a .env file inside the backend directory and add:

DATABASE_URL=
PORT=
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_REGION=
S3_BUCKET_NAME=
GROQ_API_KEY=
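A startup check like the following (hypothetical, not part of the repository) can fail fast when any of these variables is missing, instead of surfacing confusing errors later:

```javascript
// Hypothetical startup check: verify the environment variables listed above.
const REQUIRED_ENV = [
  "DATABASE_URL", "PORT",
  "AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_REGION",
  "S3_BUCKET_NAME", "GROQ_API_KEY",
];

function missingEnv(env = process.env) {
  return REQUIRED_ENV.filter(key => !env[key]);
}

// Example use near the top of app.js, before creating the Express app:
// const missing = missingEnv();
// if (missing.length) {
//   console.error(`Missing environment variables: ${missing.join(", ")}`);
//   process.exit(1);
// }
```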

3. Setup PostgreSQL (Supabase)

  1. Create a project at https://supabase.com
  2. Navigate to Project Settings → Database
  3. Copy the PostgreSQL connection string and paste it into:
DATABASE_URL=
  4. Create required tables:
CREATE TABLE wells (
    id SERIAL PRIMARY KEY,
    name TEXT NOT NULL,
    s3_url TEXT,
    created_at TIMESTAMP DEFAULT NOW()
);

CREATE TABLE well_logs (
    id SERIAL PRIMARY KEY,
    well_id INTEGER REFERENCES wells(id) ON DELETE CASCADE,
    depth DOUBLE PRECISION,
    curves JSONB
);

CREATE INDEX idx_depth ON well_logs(depth);
CREATE INDEX idx_well_id ON well_logs(well_id);

4. Install Dependencies

npm install

5. Start the Server

node app.js

The backend will run on:

http://localhost:<PORT>

=> Running the Frontend Locally

For detailed frontend architecture and component structure, refer to:
--> /frontend/README.md


1. Navigate to Frontend Directory

cd frontend

2. Configure Backend API URL

Open the following file:

src/api/axios.js

You will see:

import axios from "axios";

// deployed backend
const url = `${import.meta.env.VITE_API_URL}`;

// to run locally
// const url = "http://localhost:<backend-port>/api/wells"

const api = axios.create({
  baseURL: url
});

export default api;

To run locally:

  • Comment out the deployed backend line
  • Uncomment the local backend line
  • Replace <backend-port> with your backend PORT value

Example:

const url = "http://localhost:5000/api/wells"

3. Install Dependencies

npm install

4. Start the Frontend

npm run dev

Frontend will run on:

http://localhost:5173
