My Challenging Work

Challenging work I have done in my previous project


✅ 1️⃣ One-minute interview explanation (you can speak this)

“We were reading millions of records from one system and inserting them into another database. Initially, we faced performance issues and an OutOfMemoryError because the entire dataset was loaded into memory.
To solve this, I implemented batch processing with multithreading. I processed records in chunks of 1,000, performed the transformation, inserted them into the database, cleared memory, and then processed the next batch. This made the system memory-efficient and scalable.”

    // Requires: java.util.*, java.util.concurrent.*
    private static final int BATCH_SIZE = 1000;
    private final ExecutorService executorService = Executors.newFixedThreadPool(4);

    public void processLargeData(List<String> records) {
        for (int i = 0; i < records.size(); i += BATCH_SIZE) {
            int end = Math.min(i + BATCH_SIZE, records.size());
            // Copy the slice: subList() is only a view of the original list,
            // so each worker thread needs its own independent batch
            List<String> batch = new ArrayList<>(records.subList(i, end));
            executorService.submit(() -> processBatch(batch));
        }
        executorService.shutdown();
    }

    private void processBatch(List<String> batch) {
        // Data manipulation
        for (String record : batch) {
            // transform record
        }
        // Insert batch into DB
        saveToDatabase(batch);
        // Help GC: release references once the batch is persisted
        batch.clear();
    }
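The chunk boundaries can be checked with a small self-contained sketch (the class name, demo data, and synchronous split are illustrative assumptions; the real code submits each batch to an executor):

```java
import java.util.ArrayList;
import java.util.List;

public class BatchSplitDemo {
    static final int BATCH_SIZE = 1000;

    // Split a list into independent chunks of at most BATCH_SIZE records
    static List<List<Integer>> split(List<Integer> records) {
        List<List<Integer>> batches = new ArrayList<>();
        for (int i = 0; i < records.size(); i += BATCH_SIZE) {
            int end = Math.min(i + BATCH_SIZE, records.size());
            // Copy so each batch is detached from the source list
            batches.add(new ArrayList<>(records.subList(i, end)));
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Integer> records = new ArrayList<>();
        for (int i = 0; i < 2500; i++) records.add(i);
        List<List<Integer>> batches = split(records);
        System.out.println(batches.size());        // 3
        System.out.println(batches.get(2).size()); // last batch holds the remainder: 500
    }
}
```

Copying each sublist matters: subList() is a live view, and sharing it across threads while the source list changes leads to subtle bugs.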

✅ One-line opening (start with this)

“I am comfortable working without AI, but I use AI tools like ChatGPT and Copilot to work faster and smarter.”

🧠 How I Use AI Tools in My Development (Point-wise)

1️⃣ Strong Foundation Before AI

I started my career before AI tools existed

I am comfortable with manual coding and problem solving

AI is an assistant, not a dependency

“I already had strong Java and Spring Boot experience before AI tools came along.”


✅ One-line opening

“Apart from development, I actively contribute by giving KT sessions and mentoring junior developers.”

🧠 Point-wise Explanation (Simple English)

1️⃣ Knowledge Transfer (KT)

  • I regularly give KT sessions to new joiners

  • I explain:

    • Project architecture

    • Business flow

    • Coding standards

  • This helps new team members become productive faster

“I ensure smooth onboarding by giving proper KT sessions.”


✅ One-line opening (start with this)

“In my last project, we designed a three-component architecture to receive, process, store, and stream financial data.”


🧩 Overall Project Design (High Level)

  • The project is divided into three independent components

  • Each component has a clear responsibility

  • This makes the system scalable, maintainable, and loosely coupled

“The system follows a clear separation of concerns.”


🔹 Component 1: Connect Component

What it does

  • Connects to the SWIFT network

  • Receives incoming financial files

  • Pushes received data to Azure Queue

Why it exists

  • Handles only connectivity and ingestion

  • Keeps external communication separate

“The connect component is responsible only for receiving data and pushing it to Azure Queue.”
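The hand-off from the Connect component can be sketched with the JDK's BlockingQueue standing in for Azure Queue (the class and method names are hypothetical; the real project uses the Azure Storage Queue client instead):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ConnectComponentDemo {
    // Stand-in for Azure Queue; in production this would be an Azure Storage Queue client
    private final BlockingQueue<String> azureQueue;

    public ConnectComponentDemo(BlockingQueue<String> azureQueue) {
        this.azureQueue = azureQueue;
    }

    // Receive a raw file payload and hand it off to the queue.
    // The component does no parsing: connectivity and ingestion only.
    public void onFileReceived(String rawPayload) throws InterruptedException {
        azureQueue.put(rawPayload);
    }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(100);
        ConnectComponentDemo connect = new ConnectComponentDemo(queue);
        connect.onFileReceived(":20:REF001 ..."); // incoming MT940 fragment
        System.out.println(queue.size());         // 1
    }
}
```

Keeping ingestion this thin is what lets the Processor component evolve independently of the SWIFT connectivity layer.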


🔹 Component 2: Processor Component (Core Business Logic)

What it does

  • Reads messages from Azure Queue

  • Processes two types of data:

    1. CAMT records (XML format)

    2. MT940 / MT942 records (text format)

Processing logic

  • Parses the incoming file

  • Extracts required fields

  • Applies business validation

  • Inserts data into multiple database tables

Tables involved

  • Statement table

  • Balance table

  • Transaction table

“The processor component handles all business logic, parsing, and database insertion.”
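The CAMT parsing step can be illustrated with the JDK's built-in XML parser. The snippet below uses a simplified CAMT-style fragment, not the full ISO 20022 camt.053 schema, and the extract method is a hypothetical helper:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class CamtParseDemo {
    // Parse a CAMT-style XML fragment and pull out the statement id, currency and amount.
    // In the real processor these fields would be validated and inserted into
    // the statement and balance tables.
    static String extract(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        String stmtId = doc.getElementsByTagName("Id").item(0).getTextContent();
        Element amt = (Element) doc.getElementsByTagName("Amt").item(0);
        return stmtId + " " + amt.getAttribute("Ccy") + " " + amt.getTextContent();
    }

    public static void main(String[] args) throws Exception {
        // Simplified CAMT-like fragment for illustration only
        String xml = "<Stmt><Id>STMT-001</Id><Bal><Amt Ccy=\"EUR\">2500.75</Amt></Bal></Stmt>";
        System.out.println(extract(xml)); // STMT-001 EUR 2500.75
    }
}
```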


🔹 Component 3: Streamer Component

What it does

  • Reads processed data from database

  • Publishes data to upstream (Dial) system

  • Works based on status codes

Status handling

  • READY_FOR_STREAMING

  • WAIT_FOR_STREAMING

  • FAILED or HOLD

“The streamer component publishes processed data to the upstream system based on status.”
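The status-driven selection can be sketched with an in-memory map standing in for the database's status column (the STREAMED target status and the method name are assumptions for illustration; the real component reads and updates the tables via SQL):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class StreamerDemo {
    // One streaming cycle: publish every READY_FOR_STREAMING record upstream,
    // then advance its status; all other statuses are skipped this cycle.
    static void streamReadyRecords(Map<String, String> statusByRecordId) {
        for (Map.Entry<String, String> e : statusByRecordId.entrySet()) {
            if ("READY_FOR_STREAMING".equals(e.getValue())) {
                // publish to the upstream (Dial) system here, then mark as done
                e.setValue("STREAMED");
            }
            // WAIT_FOR_STREAMING, FAILED and HOLD records are left untouched
        }
    }

    public static void main(String[] args) {
        Map<String, String> statuses = new LinkedHashMap<>();
        statuses.put("TXN-1", "READY_FOR_STREAMING");
        statuses.put("TXN-2", "WAIT_FOR_STREAMING");
        statuses.put("TXN-3", "HOLD");
        streamReadyRecords(statuses);
        System.out.println(statuses.get("TXN-1")); // STREAMED
        System.out.println(statuses.get("TXN-2")); // WAIT_FOR_STREAMING
    }
}
```

Driving the streamer purely off status codes means failed or held records can be retried later without touching the publish logic.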


🔄 End-to-End Flow (Very Important – Explain Like This)

“Data comes from the SWIFT network to the Connect component, then moves to Azure Queue.
The Processor component consumes the data, processes CAMT or MT files, and stores them in the database.
Finally, the Streamer component publishes the data to the upstream Dial system based on streaming status.”


90-Second Polished Interview Speech (You can memorize this)

“In my last project, we had a three-component architecture. The first was a Connect component that connected to the SWIFT network and pushed incoming data to Azure Queue. The second was a Processor component, which handled business logic and processed two types of files: CAMT XML and MT940/MT942 text files. It parsed the data and stored it into multiple database tables like statement, balance, and transaction tables. The third was a Streamer component, which published the processed data to an upstream Dial system based on status codes like ready or waiting for streaming. This separation helped us build a scalable and maintainable system.”




