19 Commits

Author SHA1 Message Date
34b5dd928a Split the function 2026-02-09 20:12:22 +09:00
d0a6b88eba Split the function 2026-02-09 18:29:35 +09:00
48369486a3 test 2026-02-09 17:36:40 +09:00
f55e29f0cf Merge pull request 'modified_date -> updated_dttm 수정' (#1) from feat/modify-createdateyn-260209 into main
Reviewed-on: #1
2026-02-09 17:13:35 +09:00
2a3fa7e895 kamco-make-dataset-generation
TrainingDataReviewJobRepository.updateLearnDataGeomFileCreateYn

modified_date -> updated_dttm fix
2026-02-09 17:11:22 +09:00
731dbb4170 jenkinsfile 2026-02-09 11:20:15 +09:00
e50295e929 jenkinsfile 2026-02-09 10:56:16 +09:00
84d66b57ea jenkinsfile 2026-02-09 10:55:10 +09:00
038a4fabc6 jenkinsfile 2026-02-09 10:52:55 +09:00
4836d09320 jenkinsfile 2026-02-09 10:50:04 +09:00
1ce49fb793 jenkinsfile 2026-02-09 10:43:00 +09:00
dee09ad16a jenkinsfile 2026-02-09 10:37:35 +09:00
596b6d4d84 jenkinsfile 2026-02-09 10:27:17 +09:00
135de03c21 .gitignore 2026-02-08 20:54:20 +09:00
cd284d94ae add make dataset 2026-02-08 20:53:45 +09:00
bf5537c384 add make dataset 2026-02-08 20:29:19 +09:00
045e3da923 add make dataset 2026-02-08 20:21:57 +09:00
5bfde5798f add make dataset 2026-02-08 20:21:35 +09:00
d29c7cf816 add make dataset 2026-02-08 20:19:09 +09:00
47 changed files with 7142 additions and 0 deletions

imagery-make-dataset/dev.backup Executable file → Normal file
@@ -0,0 +1,32 @@
# Gradle
.gradle/
build/
!build/libs/
!build/libs/*.jar
out/
# IntelliJ IDEA
.idea/
*.iml
*.iws
*.ipr
# macOS
.DS_Store
# Compiled class files
*.class
# Log files
*.log
# Package files (gradle wrapper jar is intentionally tracked)
*.war
*.nar
*.ear
*.zip
*.tar.gz
*.rar
# Kotlin
*.kotlin_module


@@ -0,0 +1,117 @@
pipeline {
    agent any
    parameters {
        string(name: 'SPRING_PROFILES_ACTIVE', defaultValue: 'prod', description: 'Spring Profile (dev/prod)')
        string(name: 'BATCH_DATE', defaultValue: '', description: 'Batch Date (YYYY-MM-DD, empty = today)')
        string(name: 'ADDITIONAL_PARAMS', defaultValue: '', description: 'Additional Parameters (e.g., limit=100)')
        choice(name: 'ACTION', choices: ['RUN', 'VERIFY_ONLY'], description: 'Action to perform')
    }
    tools {
        jdk 'jdk21'
    }
    environment {
        BRANCH   = 'main'
        GIT_REPO = 'https://kamco.git.gs.dabeeo.com/MVPTeam/kamco-cd-cron.git'
        JAR_NAME = 'generator-dataset-for-training.jar'
        TODAY    = sh(script: "date +%Y-%m-%d", returnStdout: true).trim()
    }
    // NOTE: Pre-built JAR is included in the repository
    // To update the JAR:
    //   1. On a machine with internet, run: ./gradlew clean bootJar
    //   2. Commit the updated build/libs/generator-dataset-for-training.jar
    //   3. Push to repository
    stages {
        stage('Checkout') {
            steps {
                checkout([
                    $class           : 'GitSCM',
                    branches         : [[name: "${env.BRANCH}"]],
                    userRemoteConfigs: [[
                        url          : "${env.GIT_REPO}",
                        credentialsId: 'jenkins-dev-token'
                    ]]
                ])
            }
        }
        stage('Get Commit Hash') {
            steps {
                script {
                    env.COMMIT_HASH = sh(script: "git rev-parse --short HEAD", returnStdout: true).trim()
                    echo "Current commit hash: ${env.COMMIT_HASH}"
                }
            }
        }
        stage('Verify JAR') {
            steps {
                dir("kamco-make-dataset-generation") {
                    script {
                        def jarPath = "build/libs/${env.JAR_NAME}"
                        if (!fileExists(jarPath)) {
                            error("JAR file not found: ${jarPath}")
                        }
                        echo "JAR file verified: ${jarPath}"
                        // Display JAR info
                        sh "ls -lh ${jarPath}"
                    }
                }
            }
        }
        stage('Run JAR') {
            when {
                expression { params.ACTION == 'RUN' }
            }
            steps {
                dir("kamco-make-dataset-generation") {
                    script {
                        def jarPath = "build/libs/${env.JAR_NAME}"
                        // Determine batch date: use parameter if provided, otherwise use today
                        def batchDate = params.BATCH_DATE ?: env.TODAY
                        echo "========================================="
                        echo "Running JAR: ${jarPath}"
                        echo "Profile: ${params.SPRING_PROFILES_ACTIVE}"
                        echo "Batch Date: ${batchDate}"
                        echo "Additional Params: ${params.ADDITIONAL_PARAMS}"
                        echo "========================================="
                        // Build Java command
                        def javaCmd = "java -jar ${jarPath}"
                        // Add Spring profile
                        if (params.SPRING_PROFILES_ACTIVE) {
                            javaCmd += " --spring.profiles.active=${params.SPRING_PROFILES_ACTIVE}"
                        }
                        // Add batch date parameter
                        javaCmd += " date=${batchDate}"
                        // Add additional parameters
                        if (params.ADDITIONAL_PARAMS) {
                            javaCmd += " ${params.ADDITIONAL_PARAMS}"
                        }
                        echo "Executing: ${javaCmd}"
                        // Execute JAR
                        sh "${javaCmd}"
                        echo "JAR execution completed successfully"
                    }
                }
            }
        }
    }
}


@@ -0,0 +1,687 @@
# KAMCO Dataset Generation Batch System
A Spring Batch system for generating and processing KAMCO training data.
## Table of Contents
- [System Overview](#system-overview)
- [Batch Job Structure](#batch-job-structure)
- [Database Schema](#database-schema)
- [Execution Flow](#execution-flow)
- [Configuration](#configuration)
- [Monitoring and Logs](#monitoring-and-logs)
- [Troubleshooting](#troubleshooting)
---
## System Overview
### Key Features
- Converts review-completed labeling data into GeoJSON format
- Runs the training-data generation pipeline via a Docker container
- Compresses the generated output into a ZIP file
- Automatically records success/failure history for each processing step in the DB
### Tech Stack
- **Java 17+**
- **Spring Boot 3.x**
- **Spring Batch 5.x**
- **PostgreSQL**
- **Docker**
---
## Batch Job Structure
### 1. Parent Job: `exportGeoJsonJob`
The parent job queries all in-progress analysis rounds and launches a child job for each one.
```
exportGeoJsonJob (Parent Job)
└─ Step: launchChildJobsStep
   └─ Tasklet: LaunchChildJobsTasklet
      ├─ Query the AnalCntInfo list
      └─ Launch a child job for each AnalCntInfo
```
**Trigger conditions:**
- `anal_state = 'ING'` (in progress) in the `tb_map_sheet_anal_inference` table
- At least one review-completed (`COMPLETE`) record exists
- `all_cnt != file_cnt` (file generation is not yet complete)
---
### 2. Child Job: `processAnalCntInfoJob`
A sub-job that runs independently for each AnalCntInfo (analysis round).
```
processAnalCntInfoJob (Child Job)
├─ Step 1: makeGeoJsonStep
│  └─ Tasklet: MakeGeoJsonTasklet
│     └─ Generates GeoJSON files from review-completed labeling data
│        → /dataset/request/{resultUid}/*.geojson
├─ Step 2: dockerRunStep
│  └─ Tasklet: DockerRunTasklet
│     └─ Runs the Docker container (training-data generation pipeline)
│        → /dataset/response/{resultUid}/*
└─ Step 3: zipResponseStep
   └─ Tasklet: ZipResponseTasklet
      └─ Compresses the generated output into a ZIP file
         → /dataset/response/{resultUid}.zip
```
**JobParameters:**
- `analUid` (Long): analysis round UID
- `resultUid` (String): unique output ID (UUID)
- `timestamp` (Long): timestamp that guarantees uniqueness
---
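The `resultUid` values shown throughout this document (e.g. `ED80D700A0F5482BB0EC11A366DEA8DE`) look like UUIDs with the dashes stripped and upper-cased. The helper below is only a hedged illustration of that format; the actual generation code is not shown in this README, so this is an assumption:

```java
import java.util.UUID;

// Illustrative only: produces a 32-character upper-case hex ID in the same
// shape as the resultUid examples in this README. Whether the real code
// generates resultUid this way is an assumption.
public class ResultUidSketch {
    public static String newResultUid() {
        return UUID.randomUUID().toString().replace("-", "").toUpperCase();
    }
}
```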
## Database Schema
### 1. `batch_history` table
Records the execution history of the overall batch run (parent job).
```sql
CREATE TABLE public.batch_history (
    uuid           UUID PRIMARY KEY,        -- unique batch-run ID
    job            VARCHAR(255) NOT NULL,   -- batch job name (exportGeoJsonJob)
    id             VARCHAR(255) NOT NULL,   -- business ID
    created_dttm   TIMESTAMP NOT NULL,      -- creation time
    updated_dttm   TIMESTAMP NOT NULL,      -- last-update time
    status         VARCHAR(50) NOT NULL,    -- status (STARTED/COMPLETED/FAILED)
    completed_dttm TIMESTAMP                -- completion time
);
```
**Indexes:**
- `idx_batch_history_job` (job)
- `idx_batch_history_status` (status)
- `idx_batch_history_created` (created_dttm DESC)
---
### 2. `batch_step_history` table
Records per-step execution history for each AnalCntInfo.
```sql
CREATE TABLE public.batch_step_history (
    id             BIGSERIAL PRIMARY KEY,   -- unique step-history ID
    anal_uid       BIGINT NOT NULL,         -- analysis UID
    result_uid     VARCHAR(255) NOT NULL,   -- result UID
    step_name      VARCHAR(100) NOT NULL,   -- step name
    status         VARCHAR(50) NOT NULL,    -- status (STARTED/SUCCESS/FAILED)
    error_message  TEXT,                    -- error message (up to 1000 chars)
    started_dttm   TIMESTAMP NOT NULL,      -- step start time
    completed_dttm TIMESTAMP,               -- step completion time
    created_dttm   TIMESTAMP NOT NULL,      -- creation time
    updated_dttm   TIMESTAMP NOT NULL       -- last-update time
);
```
**Step names:**
- `makeGeoJsonStep`: GeoJSON file generation
- `dockerRunStep`: Docker container execution
- `zipResponseStep`: ZIP compression of the output
**Indexes:**
- `idx_batch_step_history_anal_uid` (anal_uid)
- `idx_batch_step_history_result_uid` (result_uid)
- `idx_batch_step_history_status` (status)
- `idx_batch_step_history_step_name` (step_name)
---
## Execution Flow
### Overall Process
```
1. Parent job starts
2. Query the list of in-progress AnalCntInfo
3. For each AnalCntInfo:
   ├─ 3.1. Run the child job (processAnalCntInfoJob)
   │       ↓
   │       ├─ Step 1: makeGeoJsonStep
   │       │    - beforeStep: record STARTED in the DB
   │       │    - run tasklet: generate GeoJSON files
   │       │    - afterStep: record SUCCESS/FAILED in the DB
   │       │
   │       ├─ Step 2: dockerRunStep
   │       │    - beforeStep: record STARTED in the DB
   │       │    - run tasklet: run the Docker container
   │       │    - afterStep: record SUCCESS/FAILED in the DB
   │       │
   │       └─ Step 3: zipResponseStep
   │            - beforeStep: record STARTED in the DB
   │            - run tasklet: compress the output into a ZIP
   │            - afterStep: record SUCCESS/FAILED in the DB
   └─ 3.2. Process the next AnalCntInfo
4. Parent job finishes (partial success allowed)
```
---
### Step 1: makeGeoJsonStep
**Purpose:** convert review-completed labeling data into GeoJSON files
**Processing:**
1. `findCompletedAnalMapSheetList()`: query the list of review-completed map sheets
2. For each map sheet:
   - `findCompletedYesterdayLabelingList()`: query data whose review was completed up to yesterday
   - Build GeoJSON Features
   - Save to `/dataset/request/{resultUid}/(unknown).geojson`
   - `updateLearnDataGeomFileCreateYn()`: set the file-created flag in the DB
**Output filename format:**
```
{first 8 chars of resultUid}_{compareYyyy}_{targetYyyy}_{mapSheetNum}_D15.geojson
```
**Example:**
```
ED80D700_2022_2023_3724036_D15.geojson
```
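The template above can be sketched as a small formatting helper. This is a hypothetical illustration of the naming rule, not the actual implementation; method and parameter names are assumptions.

```java
// Hypothetical helper mirroring the filename template above; the class,
// method, and parameter names are illustrative, not taken from the codebase.
public class GeoJsonFileName {
    public static String build(String resultUid, int compareYyyy, int targetYyyy, String mapSheetNum) {
        String prefix = resultUid.substring(0, 8);  // first 8 chars of the result UID
        return String.format("%s_%d_%d_%s_D15.geojson", prefix, compareYyyy, targetYyyy, mapSheetNum);
    }
}
```

Applied to the example values, `build("ED80D700A0F5482BB0EC11A366DEA8DE", 2022, 2023, "3724036")` yields the filename shown above.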
---
### Step 2: dockerRunStep
**Purpose:** run the training-data generation pipeline via a Docker container
**Docker command:**
```bash
docker run --rm \
  --user {dockerUser} \
  -v {datasetVolume} \
  -v {imagesVolume} \
  --entrypoint python \
  {dockerImage} \
  code/kamco_full_pipeline.py \
  --labelling-folder request/{resultUid} \
  --output-folder response/{resultUid} \
  --input_root {inputRoot} \
  --output_root {outputRoot} \
  --patch_size {patchSize} \
  --overlap_pct {overlapPct} \
  --train_val_test_ratio {train} {val} {test} \
  --keep_empty_ratio {keepEmptyRatio}
```
**Error handling:**
- If the Docker process exits with `exitCode != 0`, a `RuntimeException` is thrown
- The step is marked as failed and recorded in the DB with status `FAILED`
- The error message and exit code are stored in the `error_message` column
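The error-handling rule can be sketched as follows. This is a minimal illustration of the described behavior, not the actual `DockerRunTasklet` source; the class name and message wording are assumptions.

```java
// Minimal sketch of the rule above: a non-zero exit code becomes a
// RuntimeException so Spring Batch marks the step FAILED. Not the actual
// DockerRunTasklet source; names and message format are assumptions.
public class DockerExitCheck {
    public static void verifyExit(int exitCode, String resultUid) {
        if (exitCode != 0) {
            // A message like this is what would land in batch_step_history.error_message
            throw new RuntimeException(
                "Docker process exited with code " + exitCode + " for resultUid: " + resultUid);
        }
    }
}
```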
---
### Step 3: zipResponseStep
**Purpose:** compress the generated training data into a ZIP file
**Processing:**
1. Validate the `/dataset/response/{resultUid}/` directory
2. Recursively compress all files and subdirectories in it
3. Create the `/dataset/response/{resultUid}.zip` file
**Compression settings:**
- Hidden files are excluded
- Directory structure is preserved
- Buffer size: 1024 bytes
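The settings above (skip hidden files, preserve structure, 1024-byte buffer) can be sketched with the standard `java.util.zip` API. This is a hedged sketch, not the actual `ZipResponseTasklet`; names are illustrative.

```java
import java.io.*;
import java.nio.file.*;
import java.util.zip.*;

// Sketch of the recursive ZIP step under the settings listed above.
// Illustrative only; not the actual ZipResponseTasklet implementation.
public class ZipSketch {
    public static void zipDirectory(Path sourceDir, Path zipFile) throws IOException {
        try (ZipOutputStream zos = new ZipOutputStream(Files.newOutputStream(zipFile));
             var paths = Files.walk(sourceDir)) {
            paths.filter(Files::isRegularFile)
                 .filter(p -> !p.getFileName().toString().startsWith("."))  // skip hidden files
                 .forEach(p -> {
                     try (InputStream in = Files.newInputStream(p)) {
                         // entry name relative to the source dir, so structure is preserved
                         zos.putNextEntry(new ZipEntry(sourceDir.relativize(p).toString()));
                         byte[] buf = new byte[1024];  // 1024-byte copy buffer
                         int n;
                         while ((n = in.read(buf)) > 0) zos.write(buf, 0, n);
                         zos.closeEntry();
                     } catch (IOException e) {
                         throw new UncheckedIOException(e);
                     }
                 });
        }
    }
}
```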
---
## Configuration
### application.yml
```yaml
# Training data directory path
training-data:
  geojson-dir: /kamco-nfs/dataset
# Docker settings
docker:
  user: "1000:1000"
  image: "kamco/dataset-generator:latest"
  dataset-volume: "/kamco-nfs/dataset:/dataset"
  images-volume: "/kamco-nfs/images:/images"
  input-root: "/dataset"
  output-root: "/dataset"
  patch-size: 512
  overlap-pct: 0.2
  train-val-test-ratio:
    - "0.7"
    - "0.2"
    - "0.1"
  keep-empty-ratio: 0.5
```
---
### Environment Variables
| Variable | Description | Default |
|----------|-------------|---------|
| `TRAINING_DATA_GEOJSON_DIR` | GeoJSON output path | `/kamco-nfs/dataset` |
| `DOCKER_USER` | User the Docker container runs as | `1000:1000` |
| `DOCKER_IMAGE` | Docker image name | `kamco/dataset-generator:latest` |
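The table above can be wired up from the shell before launching the JAR. This sketch only illustrates exporting overrides with fallbacks to the documented defaults; the variable-to-property mapping itself is handled by Spring Boot's relaxed binding.

```shell
#!/bin/sh
# Export overrides before starting the batch; unset variables fall back to
# the documented defaults via the ${VAR:-default} pattern.
export TRAINING_DATA_GEOJSON_DIR="${TRAINING_DATA_GEOJSON_DIR:-/kamco-nfs/dataset}"
export DOCKER_USER="${DOCKER_USER:-1000:1000}"
export DOCKER_IMAGE="${DOCKER_IMAGE:-kamco/dataset-generator:latest}"
echo "$TRAINING_DATA_GEOJSON_DIR $DOCKER_USER $DOCKER_IMAGE"
```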
---
## Monitoring and Logs
### Log Levels
```yaml
logging:
  level:
    com.kamco.cd.geojsonscheduler: INFO
    com.kamco.cd.geojsonscheduler.batch: DEBUG
    org.springframework.batch: INFO
```
---
### Key Log Points
#### Parent Job Logs
```log
[INFO] Parent job started: querying AnalCntInfo list and launching child jobs
[INFO] Querying in-progress rounds...
[INFO] In-progress round count: 3
[INFO] Examining round: AnalUid=100, ResultUid=ED80D700...
[INFO] Launching child job... (AnalUid=100, ResultUid=ED80D700...)
[INFO] Child job finished (AnalUid=100, ResultUid=ED80D700...)
[INFO] Parent job complete - succeeded: 2, skipped: 1, failed: 0
```
#### Step Logs
```log
[INFO] ========================================
[INFO] GeoJSON generation started (AnalUid=100, ResultUid=ED80D700...)
[INFO] Review-completed map sheets: 5
[INFO] Processing map sheet: MapSheetNum=3724036
[INFO] Completed labeling records: 150
[INFO] GeoJSON file saved: /dataset/request/ED80D700.../ED80D700_2022_2023_3724036_D15.geojson
[INFO] GeoJSON generation complete (ResultUid=ED80D700...) - map sheets processed: 5, files created: 5
[INFO] ========================================
```
#### Docker Execution Logs
```log
[INFO] ========================================
[INFO] Docker container run started (ResultUid=ED80D700...)
[INFO] Running docker command: docker run --rm --user 1000:1000...
[INFO] [docker] Loading configuration...
[INFO] [docker] Processing pipeline...
[INFO] [docker] Pipeline completed successfully
[INFO] Docker process completed successfully for resultUid: ED80D700...
[INFO] ========================================
```
---
### DB Queries
#### 1. All step execution history for a specific round
```sql
SELECT
step_name,
status,
started_dttm,
completed_dttm,
EXTRACT(EPOCH FROM (completed_dttm - started_dttm)) AS duration_seconds,
error_message
FROM batch_step_history
WHERE anal_uid = 100
AND result_uid = 'ED80D700A0F5482BB0EC11A366DEA8DE'
ORDER BY started_dttm;
```
#### 2. Recently failed steps
```sql
SELECT
anal_uid,
result_uid,
step_name,
error_message,
started_dttm,
completed_dttm
FROM batch_step_history
WHERE status = 'FAILED'
ORDER BY started_dttm DESC
LIMIT 10;
```
#### 3. Success rate per step
```sql
SELECT
step_name,
COUNT(*) AS total_executions,
SUM(CASE WHEN status = 'SUCCESS' THEN 1 ELSE 0 END) AS success_count,
SUM(CASE WHEN status = 'FAILED' THEN 1 ELSE 0 END) AS failed_count,
ROUND(
SUM(CASE WHEN status = 'SUCCESS' THEN 1 ELSE 0 END)::NUMERIC / COUNT(*) * 100,
2
) AS success_rate_pct
FROM batch_step_history
GROUP BY step_name;
```
#### 4. Average execution time per step
```sql
SELECT
step_name,
COUNT(*) AS total_executions,
AVG(EXTRACT(EPOCH FROM (completed_dttm - started_dttm))) AS avg_duration_seconds,
MIN(EXTRACT(EPOCH FROM (completed_dttm - started_dttm))) AS min_duration_seconds,
MAX(EXTRACT(EPOCH FROM (completed_dttm - started_dttm))) AS max_duration_seconds
FROM batch_step_history
WHERE status = 'SUCCESS'
AND completed_dttm IS NOT NULL
GROUP BY step_name;
```
#### 5. Rounds processed in a given period
```sql
SELECT
DATE(started_dttm) AS execution_date,
COUNT(DISTINCT result_uid) AS processed_count
FROM batch_step_history
WHERE step_name = 'makeGeoJsonStep'
AND started_dttm >= CURRENT_DATE - INTERVAL '7 days'
GROUP BY DATE(started_dttm)
ORDER BY execution_date DESC;
```
---
## Troubleshooting
### 1. Docker container execution fails
**Symptom:**
```log
[ERROR] Docker process exited with code 1 for resultUid: ED80D700...
FileNotFoundError: Missing training pairs root at /dataset/response/.../tifs/train
```
**Causes:**
- GeoJSON files were not generated, or have an invalid format
- Docker volume mount paths do not match
- Required files went missing during pipeline execution
**Resolution:**
1. Check the `makeGeoJsonStep` status in the `batch_step_history` table
```sql
SELECT * FROM batch_step_history
WHERE result_uid = 'ED80D700...'
AND step_name = 'makeGeoJsonStep';
```
2. Verify that the GeoJSON files exist
```bash
ls -la /kamco-nfs/dataset/request/ED80D700.../
```
3. Check the Docker volume configuration
```yaml
docker:
  dataset-volume: "/kamco-nfs/dataset:/dataset"  # host:container
```
---
### 2. GeoJSON files are not generated
**Symptom:**
```log
[WARN] No review-completed map sheets. Skipping.
```
**Causes:**
- No data is in review-completed (`COMPLETE`) state
- `inspect_stat_dttm` is today or later (only data up to yesterday is queried)
**Resolution:**
1. Check for review-completed data
```sql
SELECT COUNT(*)
FROM tb_labeling_assignment
WHERE anal_uid = 100
AND inspect_state = 'COMPLETE';
```
2. Check the review completion time
```sql
SELECT MAX(inspect_stat_dttm)
FROM tb_labeling_assignment
WHERE anal_uid = 100
AND inspect_state = 'COMPLETE';
```
---
### 3. ZIP creation fails
**Symptom:**
```log
[ERROR] Response directory does not exist: /dataset/response/ED80D700...
```
**Causes:**
- The Docker step produced no output
- The Docker container failed midway without being detected
**Resolution:**
1. Check the Docker step status
```sql
SELECT * FROM batch_step_history
WHERE result_uid = 'ED80D700...'
AND step_name = 'dockerRunStep';
```
2. Check the response directory
```bash
ls -la /kamco-nfs/dataset/response/ED80D700.../
```
3. Check the Docker container logs (the `error_message` column in the DB)
---
### 4. Child job does not run
**Symptom:**
```log
[INFO] All files already processed. Skipping.
```
**Causes:**
- `all_cnt == file_cnt` (all files have already been generated)
- The flag was not reset even though a re-run was needed
**Resolution:**
1. Reset the file-created flag
```sql
UPDATE tb_map_sheet_learn_data_geom
SET file_create_yn = false,
    updated_dttm = NOW()
WHERE geo_uid IN (
    SELECT inference_geom_uid
    FROM tb_labeling_assignment
    WHERE anal_uid = 100
);
```
2. Re-run the batch
---
### 5. The entire batch run fails
**Symptom:**
```log
[ERROR] Child job failed (AnalUid=100, ResultUid=ED80D700...): ...
[WARN] 3 child job executions failed.
```
**Causes:**
- Failures occurred in several rounds at once
- Partial failure is currently allowed by configuration
**Resolution:**
1. Identify the failed rounds
```sql
SELECT DISTINCT anal_uid, result_uid
FROM batch_step_history
WHERE status = 'FAILED'
AND started_dttm >= CURRENT_DATE;
```
2. Analyze the failure cause for each round
```sql
SELECT step_name, error_message
FROM batch_step_history
WHERE anal_uid = 100
AND status = 'FAILED';
```
3. Change the failure policy (if needed)
- Uncomment `LaunchChildJobsTasklet.java:87-89` to fail the parent job when any child job fails
---
## Developer Guide
### Adding a New Step
1. **Create a Tasklet**
```java
@Component
@StepScope // required so the #{jobParameters[...]} expressions bind at step time
@RequiredArgsConstructor
public class NewStepTasklet implements Tasklet {
    @Value("#{jobParameters['analUid']}")
    private Long analUid;
    @Value("#{jobParameters['resultUid']}")
    private String resultUid;
    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        // implement the step logic here
        return RepeatStatus.FINISHED;
    }
}
```
2. **Register the Step in the JobConfig**
```java
@Bean
public Step newStep() {
    return new StepBuilder("newStep", jobRepository)
        .tasklet(newStepTasklet, transactionManager)
        .listener(stepHistoryListener) // records step history
        .build();
}
```
3. **Add it to the Job Flow**
```java
@Bean
public Job processAnalCntInfoJob() {
    return new JobBuilder("processAnalCntInfoJob", jobRepository)
        .start(makeGeoJsonStep())
        .next(dockerRunStep())
        .next(zipResponseStep())
        .next(newStep()) // newly added step
        .build();
}
```
---
### Customizing Logging
**Changing the log level:**
```yaml
logging:
  level:
    com.kamco.cd.geojsonscheduler.batch.DockerRunTasklet: DEBUG
```
**Logging only in a specific step:**
```java
@Slf4j
public class CustomTasklet implements Tasklet {
    @Override
    public RepeatStatus execute(...) {
        if (log.isDebugEnabled()) {
            log.debug("Detailed debug info: {}", details);
        }
        return RepeatStatus.FINISHED;
    }
}
```
---
## Deployment and Operations
### Deployment Procedure
1. **Build**
```bash
./gradlew clean build
```
2. **Build the Docker image**
```bash
docker build -t kamco-batch:latest .
```
3. **Run**
```bash
java -jar build/libs/kamco-geojson-scheduler-1.0.0.jar
```
---
### Scheduling
Periodic execution with the Spring Scheduler:
```java
@Scheduled(cron = "0 0 2 * * *") // every day at 02:00
public void runBatch() {
    JobParameters jobParameters = new JobParametersBuilder()
        .addLong("timestamp", System.currentTimeMillis())
        .toJobParameters();
    jobLauncher.run(exportGeoJsonJob, jobParameters);
}
```
---
## License
Copyright (c) 2024 KAMCO. All rights reserved.
---
## Contact
Technical support: tech-support@kamco.co.kr


@@ -0,0 +1,45 @@
plugins {
    id 'java'
    id 'org.springframework.boot' version '3.5.7'
    id 'io.spring.dependency-management' version '1.1.7'
}
group = 'com.kamco.cd'
version = '0.0.1-SNAPSHOT'
java {
    toolchain {
        languageVersion = JavaLanguageVersion.of(21)
    }
}
configurations {
    compileOnly {
        extendsFrom annotationProcessor
    }
}
repositories {
    mavenCentral()
}
dependencies {
    implementation 'org.springframework.boot:spring-boot-starter-batch'
    implementation 'org.springframework.boot:spring-boot-starter-jdbc'
    compileOnly 'org.projectlombok:lombok'
    runtimeOnly 'org.postgresql:postgresql'
    annotationProcessor 'org.projectlombok:lombok'
    testImplementation 'org.springframework.boot:spring-boot-starter-test'
    testRuntimeOnly 'org.junit.platform:junit-platform-launcher'
    // JSON
    implementation 'com.fasterxml.jackson.core:jackson-databind'
}
tasks.named('test') {
    useJUnitPlatform()
}
bootJar {
    archiveFileName = 'generator-dataset-for-training.jar'
}


@@ -0,0 +1,257 @@
# Database Setup Guide
## Overview
This application uses PostgreSQL and requires the following tables:
1. Spring Batch metadata tables (created automatically)
2. The batch_history table (must be created manually)
## Required Tables
### 1. Spring Batch Metadata Tables
Created automatically by Spring Batch:
- BATCH_JOB_INSTANCE
- BATCH_JOB_EXECUTION
- BATCH_JOB_EXECUTION_PARAMS
- BATCH_STEP_EXECUTION
- BATCH_STEP_EXECUTION_CONTEXT
- BATCH_JOB_EXECUTION_CONTEXT
**Configuration**:
```yaml
spring:
  batch:
    jdbc:
      initialize-schema: always
```
### 2. The batch_history Table (custom)
A custom table that stores batch execution history.
**Purpose**:
- Record batch start/end times
- Track batch execution status (STARTED/COMPLETED/FAILED)
- Manage batch history per business ID
## Initial Database Setup for a New Environment
### Option 1: Run the SQL script (recommended)
**1. Create the batch_history table**:
```bash
# Connect to PostgreSQL
psql -h [host] -U [username] -d [database]
# Run schema.sql
\i src/main/resources/sql/schema.sql
```
**Or run the SQL directly**:
```sql
-- Create the batch_history table
CREATE TABLE IF NOT EXISTS public.batch_history (
    uuid           UUID PRIMARY KEY,
    job            VARCHAR(255) NOT NULL,
    id             VARCHAR(255) NOT NULL,
    created_dttm   TIMESTAMP NOT NULL,
    updated_dttm   TIMESTAMP NOT NULL,
    status         VARCHAR(50) NOT NULL,
    completed_dttm TIMESTAMP
);
-- Create indexes
CREATE INDEX IF NOT EXISTS idx_batch_history_job ON public.batch_history(job);
CREATE INDEX IF NOT EXISTS idx_batch_history_status ON public.batch_history(status);
CREATE INDEX IF NOT EXISTS idx_batch_history_created ON public.batch_history(created_dttm DESC);
```
**2. Grant permissions** (if needed):
```sql
-- Grant privileges to the application user
GRANT ALL PRIVILEGES ON TABLE public.batch_history TO [app_user];
GRANT USAGE, SELECT ON ALL SEQUENCES IN SCHEMA public TO [app_user];
```
### Option 2: Enable auto-initialization (development only)
**application-local.yml or application-dev.yml**:
```yaml
spring:
  sql:
    init:
      mode: always
      schema-locations: classpath:sql/schema.sql
```
**Caution**:
- Use `mode: never` in production
- If the table already exists, permission errors can occur
## Table Structure
### batch_history
| Column | Type | Nullable | Description |
|--------|------|----------|-------------|
| uuid | UUID | NOT NULL | Unique batch-run ID (primary key) |
| job | VARCHAR(255) | NOT NULL | Batch job name |
| id | VARCHAR(255) | NOT NULL | Business ID |
| created_dttm | TIMESTAMP | NOT NULL | Creation time |
| updated_dttm | TIMESTAMP | NOT NULL | Last-update time |
| status | VARCHAR(50) | NOT NULL | Status (STARTED/COMPLETED/FAILED) |
| completed_dttm | TIMESTAMP | NULL | Completion time |
**Indexes**:
- `idx_batch_history_job`: index on job
- `idx_batch_history_status`: index on status
- `idx_batch_history_created`: index on created_dttm (DESC)
## Per-Environment Configuration
### Local
```yaml
spring:
  datasource:
    url: jdbc:postgresql://localhost:5432/kamco_local
    username: dev_user
    password: dev_password
  sql:
    init:
      mode: always # enable auto-initialization
```
### Production
```yaml
spring:
  datasource:
    url: jdbc:postgresql://prod-db:5432/kamco_prod
    username: app_user
    password: ${DB_PASSWORD}
  sql:
    init:
      mode: never # disable auto-initialization
```
## Troubleshooting
### Error: "must be owner of table batch_history"
**Cause**: the table already exists, but the current DB user is not its owner
**Fix**:
1. Disable SQL auto-initialization:
```yaml
spring:
  sql:
    init:
      mode: never
```
2. Or change the table owner:
```sql
ALTER TABLE public.batch_history OWNER TO [app_user];
```
### Error: "relation batch_history does not exist"
**Cause**: the batch_history table has not been created
**Fix**:
1. Run the SQL script manually:
```bash
psql -h [host] -U [user] -d [database] -f src/main/resources/sql/schema.sql
```
2. Or enable auto-initialization (development only):
```yaml
spring:
  sql:
    init:
      mode: always
      schema-locations: classpath:sql/schema.sql
```
### Error: "permission denied for schema public"
**Cause**: the DB user has no privileges on the public schema
**Fix**:
```sql
GRANT ALL ON SCHEMA public TO [app_user];
GRANT ALL PRIVILEGES ON ALL TABLES IN SCHEMA public TO [app_user];
```
## Database Migration
### Checking tables in an existing environment
```sql
-- Check that the batch_history table exists
SELECT EXISTS (
    SELECT FROM information_schema.tables
    WHERE table_schema = 'public'
    AND table_name = 'batch_history'
);
-- Inspect the table structure
\d public.batch_history
-- Inspect the indexes
\di public.idx_batch_history_*
```
### Recreating the table (destroys data!)
```sql
-- Drop the existing table (warning: all data is lost!)
DROP TABLE IF EXISTS public.batch_history CASCADE;
-- Re-run schema.sql
\i src/main/resources/sql/schema.sql
```
## Backup and Restore
### Backing up the table
```bash
pg_dump -h [host] -U [user] -d [database] -t batch_history > batch_history_backup.sql
```
### Restoring the table
```bash
psql -h [host] -U [user] -d [database] < batch_history_backup.sql
```
## Monitoring Queries
### Recent batch runs
```sql
SELECT
    uuid,
    job,
    id,
    created_dttm,
    completed_dttm,
    status,
    EXTRACT(EPOCH FROM (completed_dttm - created_dttm)) AS duration_seconds
FROM public.batch_history
ORDER BY created_dttm DESC
LIMIT 10;
```
### Failed batches
```sql
SELECT *
FROM public.batch_history
WHERE status = 'FAILED'
ORDER BY created_dttm DESC;
```
### Success rate per batch job
```sql
SELECT
    job,
    COUNT(*) AS total_runs,
    SUM(CASE WHEN status = 'COMPLETED' THEN 1 ELSE 0 END) AS successful_runs,
    ROUND(100.0 * SUM(CASE WHEN status = 'COMPLETED' THEN 1 ELSE 0 END) / COUNT(*), 2) AS success_rate
FROM public.batch_history
GROUP BY job
ORDER BY total_runs DESC;
```


@@ -0,0 +1,347 @@
# Pre-built JAR Deployment Guide for Air-Gapped Environments
## Overview
Deploys a pre-built JAR file on a Jenkins server with no internet access, skipping the build step entirely.
## Architecture
### Old approach (failed)
```
Jenkins server → git clone → Gradle build (needs to download dependencies) → JAR created → deploy
❌ requires internet
```
### New approach (works)
```
Local PC       → Gradle build → JAR created → git commit
Jenkins server → git clone → use the JAR (no build) → deploy
✅ no internet needed
```
## Advantages
1. **No internet required**: works even if the Jenkins server is fully air-gapped
2. **Fast deployment**: no build time (~3 min → ~10 sec)
3. **Simple setup**: no complicated Gradle cache or local repository
4. **Consistency**: the exact JAR tested locally is what gets deployed
## Disadvantages
1. **Repository growth**: the JAR file (~19 MB) is version-controlled
2. **Local builds required**: every code change needs a local build before committing
3. **Binary commits**: a binary file lives in Git
## Usage
### 1. Local environment (a PC with internet access)
#### When the code changes
```bash
cd kamco-make-dataset-generation
# 1. Modify the code
# ... code changes ...
# 2. Build the JAR
./gradlew clean bootJar
# 3. Verify the JAR file
ls -lh build/libs/generator-dataset-for-training.jar
# 4. Commit to Git
git add .
git add -f build/libs/generator-dataset-for-training.jar
git commit -m "Update: feature changes and JAR refresh"
git push
```
### 2. Jenkins server (air-gapped)
#### Jenkins parameters
The following parameters can be set when running the pipeline:
| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| SPRING_PROFILES_ACTIVE | String | prod | Spring profile (dev/prod) |
| BATCH_DATE | String | (empty) | Batch date (YYYY-MM-DD, empty = today) |
| ADDITIONAL_PARAMS | String | (empty) | Extra parameters (e.g. limit=100) |
| ACTION | Choice | RUN | Execution mode (RUN / VERIFY_ONLY) |
#### Execution examples
**1. Default run (today's date)**:
```
ACTION: RUN
SPRING_PROFILES_ACTIVE: prod
BATCH_DATE: (empty) ← today's date is used automatically
ADDITIONAL_PARAMS: (empty)
→ java -jar generator-dataset-for-training.jar --spring.profiles.active=prod date=2026-02-09
```
**2. Run for a specific date**:
```
ACTION: RUN
SPRING_PROFILES_ACTIVE: prod
BATCH_DATE: 2024-01-15
ADDITIONAL_PARAMS: (empty)
→ java -jar generator-dataset-for-training.jar --spring.profiles.active=prod date=2024-01-15
```
**3. Today's date plus extra parameters**:
```
ACTION: RUN
SPRING_PROFILES_ACTIVE: prod
BATCH_DATE: (empty)
ADDITIONAL_PARAMS: limit=100 debug=true
→ java -jar generator-dataset-for-training.jar --spring.profiles.active=prod date=2026-02-09 limit=100 debug=true
```
**4. Development environment test**:
```
ACTION: RUN
SPRING_PROFILES_ACTIVE: dev
BATCH_DATE: 2024-02-01
ADDITIONAL_PARAMS: dryRun=true
→ java -jar generator-dataset-for-training.jar --spring.profiles.active=dev date=2024-02-01 dryRun=true
```
**5. Verify the JAR only (no run)**:
```
ACTION: VERIFY_ONLY
→ only checks that the JAR file exists, then exits
```
#### How the date parameter works
**BATCH_DATE parameter**:
- **Empty (default)**: the Jenkins server's current date is used automatically (YYYY-MM-DD)
- **Value provided**: the given date is used (e.g. 2024-01-15)
**Example**:
```bash
# When BATCH_DATE is empty
TODAY=$(date +%Y-%m-%d)  # 2026-02-09
java -jar app.jar date=2026-02-09
# When BATCH_DATE is 2024-01-15
java -jar app.jar date=2024-01-15
```
#### Jenkins pipeline stages
Jenkins automatically performs:
1. **Environment setup**: compute today's date (TODAY variable)
2. **Checkout**: check the code out of Git
3. **Verify JAR**: confirm the JAR file exists
4. **Run JAR** (only when ACTION=RUN):
   - uses TODAY when BATCH_DATE is empty
   - runs the JAR with the given parameters
## File Layout
```
kamco-make-dataset-generation/
├── build/
│   └── libs/
│       └── generator-dataset-for-training.jar ← included in Git
├── src/
├── build.gradle
├── Jenkinsfile
└── .gitignore ← build/libs/*.jar exception rules
```
## .gitignore Settings
```gitignore
# Gradle
.gradle/
build/
!build/libs/       # keep tracking the libs directory
!build/libs/*.jar  # keep tracking the jar files
out/
```
## Jenkinsfile Structure
```groovy
parameters {
    string(name: 'SPRING_PROFILES_ACTIVE', defaultValue: 'prod', ...)
    string(name: 'BATCH_DATE', defaultValue: '', ...)
    string(name: 'ADDITIONAL_PARAMS', defaultValue: '', ...)
    choice(name: 'ACTION', choices: ['RUN', 'VERIFY_ONLY'], ...)
}
environment {
    TODAY = sh(script: "date +%Y-%m-%d", returnStdout: true).trim()
}
stage('Verify JAR') {
    // check that the JAR file exists
    // print file info
}
stage('Run JAR') {
    when {
        expression { params.ACTION == 'RUN' }
    }
    // decide the date: use TODAY when BATCH_DATE is empty
    def batchDate = params.BATCH_DATE ?: env.TODAY
    // run the JAR
    // java -jar generator-dataset-for-training.jar --spring.profiles.active=prod date=${batchDate} ${ADDITIONAL_PARAMS}
}
```
## Deployment Configuration
Add the actual deployment commands to a Deploy stage:
```groovy
stage('Deploy') {
    steps {
        dir("kamco-make-dataset-generation") {
            script {
                def jarPath = "build/libs/${env.JAR_NAME}"
                // Example 1: copy to a server with SCP
                sh "scp ${jarPath} user@target-server:/app/lib/"
                // Example 2: restart the service over SSH
                sh "ssh user@target-server 'systemctl restart kamco-app'"
                // Example 3: build a Docker image
                sh "docker build -t kamco-app:${env.COMMIT_HASH} ."
                sh "docker push kamco-app:${env.COMMIT_HASH}"
                // Example 4: deploy to Kubernetes
                sh "kubectl set image deployment/kamco-app kamco-app=kamco-app:${env.COMMIT_HASH}"
            }
        }
    }
}
```
## Troubleshooting
### The JAR file is not added to Git
**Problem**: the JAR file does not show up in `git status`
**Fix**:
```bash
# Force-add it
git add -f build/libs/generator-dataset-for-training.jar
# Check .gitignore
cat .gitignore | grep build
```
### Jenkins cannot find the JAR file
**Problem**: "JAR file not found" error
**Fix**:
```bash
# Check locally that the JAR was committed
git log --all -- build/libs/generator-dataset-for-training.jar
# Confirm the JAR is in the tree
git ls-tree -r HEAD --name-only | grep jar
# If it was not committed, commit it again
git add -f build/libs/generator-dataset-for-training.jar
git commit -m "Add pre-built JAR file"
git push
```
### The JAR file is too large
**Problem**: concern about Git repository growth
**Fix**:
1. If this approach does not fit, use the Gradle cache approach instead
2. Consider Git LFS (Large File Storage)
3. Set up an artifact repository such as Artifactory/Nexus
## Updating Dependencies
When adding a new dependency:
```bash
# 1. Edit build.gradle
# 2. Build locally
./gradlew clean bootJar
# 3. Commit the new JAR
git add build.gradle
git add -f build/libs/generator-dataset-for-training.jar
git commit -m "Add new dependency: spring-boot-starter-security"
git push
```
## Versioning Strategy
### Commit message examples
```bash
# Feature
git commit -m "feat: add user authentication and refresh JAR"
# Bug fix
git commit -m "fix: fix data-processing bug and refresh JAR"
# JAR-only update
git commit -m "build: rebuild JAR (dependency update)"
```
### Using tags
```bash
# Release tag
git tag -a v1.0.0 -m "Release version 1.0.0"
git push origin v1.0.0
# Deploy a specific version from Jenkins
git checkout v1.0.0
```
## Performance Comparison
| Item | Old approach | New approach |
|------|--------------|--------------|
| Build time | ~3 min | none (~10 sec) |
| Internet required | yes | no |
| Disk usage | ~500 MB (Gradle cache) | ~19 MB (JAR) |
| Setup complexity | high | low |
| Consistency | varies with build environment | identical JAR |
## Caveats
1. **JAR size**: each commit adds ~19 MB (Git history grows)
2. **Build responsibility**: developers are responsible for building locally
3. **Testing matters**: test thoroughly locally before committing
4. **Consider Git LFS**: if the JAR changes often, evaluate Git LFS
## Alternatives
If this approach does not fit:
### 1. Gradle cache
- Copy the ~/.gradle directory to the Jenkins server
- One-time setup, offline builds afterwards
### 2. Artifact repository
- Set up Nexus or Artifactory
- Run a Maven repository inside the closed network
### 3. Docker image
- Bundle the JAR into a Docker image
- Use a Docker registry
## References
- Spring Boot Documentation: https://docs.spring.io/spring-boot/
- Gradle Documentation: https://docs.gradle.org/
- Git LFS: https://git-lfs.github.com/


@@ -0,0 +1,550 @@
#!/bin/bash
# pack_offline_bundle_airgap.sh
# ============================================================================
# Gradle Offline Bundle Packer
# ============================================================================
# Version: 4.0
#
# WORKFLOW:
# 1. [ONLINE] Build project (./gradlew bootJar) - downloads all deps
# 2. [ONLINE] Test run (./gradlew bootRun) - verify app works
# 3. [OFFLINE TEST] Verify offline build works
# 4. Create bundle with all cached dependencies
#
# REQUIREMENTS:
# - Internet connection (for initial build)
# - Project with gradlew
# ============================================================================
set -e
# ============================================================================
# Configuration
# ============================================================================
WRAPPER_SEED_PATH="wrapper_jar_seed"
OFFLINE_HOME_NAME="_offline_gradle_home"
BOOTRUN_TIMEOUT_SECONDS=60
# Color codes
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
CYAN='\033[0;36m'
GRAY='\033[0;90m'
WHITE='\033[1;37m'
NC='\033[0m' # No Color
echo ""
echo -e "${CYAN}============================================================${NC}"
echo -e "${CYAN} Gradle Offline Bundle Packer v4.0${NC}"
echo -e "${CYAN}============================================================${NC}"
echo ""
echo -e "${WHITE} This script will:${NC}"
echo -e "${GRAY} 1. Build project with internet (download dependencies)${NC}"
echo -e "${GRAY} 2. Test run application (verify it works)${NC}"
echo -e "${GRAY} 3. Test offline build (verify cache is complete)${NC}"
echo -e "${GRAY} 4. Create offline bundle for air-gapped environment${NC}"
echo ""
echo -e "${CYAN}============================================================${NC}"
echo ""
# ============================================================================
# [1/20] Check Current Directory
# ============================================================================
echo -e "${YELLOW}==[1/20] Check Current Directory ==${NC}"
ROOT="$(pwd)"
echo "ROOT_DIR: $ROOT"
echo ""
# ============================================================================
# [2/20] Check Required Files
# ============================================================================
echo -e "${YELLOW}==[2/20] Check Required Files ==${NC}"
if [ ! -f "./gradlew" ]; then
echo -e "${RED}ERROR: gradlew not found. Run from project root.${NC}"
exit 1
fi
chmod +x ./gradlew
echo -e "${GREEN}[OK] gradlew${NC}"
BUILD_FILE=""
if [ -f "./build.gradle" ]; then
BUILD_FILE="build.gradle"
elif [ -f "./build.gradle.kts" ]; then
BUILD_FILE="build.gradle.kts"
else
echo -e "${RED}ERROR: build.gradle(.kts) not found.${NC}"
exit 1
fi
echo -e "${GREEN}[OK] $BUILD_FILE${NC}"
SETTINGS_FILE=""
if [ -f "./settings.gradle" ]; then
SETTINGS_FILE="settings.gradle"
echo -e "${GREEN}[OK] $SETTINGS_FILE${NC}"
elif [ -f "./settings.gradle.kts" ]; then
SETTINGS_FILE="settings.gradle.kts"
echo -e "${GREEN}[OK] $SETTINGS_FILE${NC}"
fi
echo ""
# ============================================================================
# [3/20] Check Gradle Wrapper
# ============================================================================
echo -e "${YELLOW}==[3/20] Check Gradle Wrapper ==${NC}"
WRAPPER_DIR="$ROOT/gradle/wrapper"
WRAPPER_JAR="$WRAPPER_DIR/gradle-wrapper.jar"
WRAPPER_PROP="$WRAPPER_DIR/gradle-wrapper.properties"
mkdir -p "$WRAPPER_DIR"
if [ ! -f "$WRAPPER_PROP" ]; then
echo -e "${RED}ERROR: gradle-wrapper.properties not found.${NC}"
exit 1
fi
if [ ! -f "$WRAPPER_JAR" ]; then
SEED_JAR="$ROOT/$WRAPPER_SEED_PATH/gradle-wrapper.jar"
if [ -f "$SEED_JAR" ]; then
cp "$SEED_JAR" "$WRAPPER_JAR"
echo -e "${GREEN}[OK] Wrapper jar injected from seed${NC}"
else
echo -e "${RED}ERROR: gradle-wrapper.jar missing${NC}"
exit 1
fi
else
echo -e "${GREEN}[OK] gradle-wrapper.jar exists${NC}"
fi
# Create seed backup
SEED_DIR="$ROOT/$WRAPPER_SEED_PATH"
if [ ! -d "$SEED_DIR" ]; then
mkdir -p "$SEED_DIR"
cp "$WRAPPER_JAR" "$SEED_DIR/gradle-wrapper.jar"
fi
echo ""
# ============================================================================
# [4/20] Set GRADLE_USER_HOME (Project Local)
# ============================================================================
echo -e "${YELLOW}==[4/20] Set GRADLE_USER_HOME ==${NC}"
OFFLINE_HOME="$ROOT/$OFFLINE_HOME_NAME"
mkdir -p "$OFFLINE_HOME"
export GRADLE_USER_HOME="$OFFLINE_HOME"
echo -e "${CYAN}GRADLE_USER_HOME = $GRADLE_USER_HOME${NC}"
echo -e "${GRAY}[INFO] All dependencies will be cached in project folder${NC}"
echo ""
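# Typical cache layout after the online build, for orientation (exact
# subfolders vary by Gradle version; paths below are the standard defaults):
#   _offline_gradle_home/
#     caches/modules-2/files-2.1/   <- downloaded dependency jars/poms
#     wrapper/dists/                <- the Gradle distribution itself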
# ============================================================================
# [5/20] Check Internet Connection
# ============================================================================
echo -e "${YELLOW}==[5/20] Check Internet Connection ==${NC}"
HAS_INTERNET=false
TEST_HOSTS=("plugins.gradle.org" "repo.maven.apache.org" "repo1.maven.org")
# NOTE: ICMP may be blocked even when HTTPS works; the DNS fallback below covers that case.
for TEST_HOST in "${TEST_HOSTS[@]}"; do
if ping -c 1 -W 3 "$TEST_HOST" &>/dev/null; then
HAS_INTERNET=true
echo -e "${GREEN}[OK] Connected to $TEST_HOST${NC}"
break
fi
done
if [ "$HAS_INTERNET" = false ]; then
# Try DNS resolution as fallback
if nslookup google.com &>/dev/null || host google.com &>/dev/null; then
HAS_INTERNET=true
echo -e "${GREEN}[OK] Internet available (DNS)${NC}"
fi
fi
if [ "$HAS_INTERNET" = false ]; then
echo ""
echo -e "${RED}============================================================${NC}"
echo -e "${RED} ERROR: No Internet Connection!${NC}"
echo -e "${RED}============================================================${NC}"
echo ""
echo -e "${YELLOW}This script requires internet for initial build.${NC}"
echo -e "${YELLOW}Please connect to internet and run again.${NC}"
echo ""
exit 1
fi
echo ""
# ============================================================================
# [6/20] Initial Gradle Setup
# ============================================================================
echo -e "${YELLOW}==[6/20] Initial Gradle Setup ==${NC}"
echo -e "${GRAY}[INFO] Downloading Gradle distribution...${NC}"
if ./gradlew --version &>/dev/null; then
GRADLE_VERSION=$(./gradlew --version 2>&1 | grep "^Gradle" | awk '{print $2}')
echo -e "${GREEN}[OK] Gradle $GRADLE_VERSION${NC}"
else
echo -e "${RED}[ERROR] Gradle setup failed${NC}"
exit 1
fi
echo ""
# ============================================================================
# [7/20] ONLINE BUILD - bootJar (Download All Dependencies)
# ============================================================================
echo -e "${YELLOW}==[7/20] ONLINE BUILD - bootJar ==${NC}"
echo ""
echo -e "${CYAN}============================================================${NC}"
echo -e "${CYAN} ONLINE BUILD (with Internet)${NC}"
echo -e "${CYAN} Downloading all dependencies to local cache${NC}"
echo -e "${CYAN}============================================================${NC}"
echo ""
BUILD_SUCCESS=false
# Run inside an if-condition: under `set -e` a bare failing command would
# abort the script before the failure branch below could report it.
if ./gradlew clean bootJar --no-daemon; then
BUILD_SUCCESS=true
echo ""
echo -e "${GREEN}============================================================${NC}"
echo -e "${GREEN} ONLINE BUILD SUCCESS!${NC}"
echo -e "${GREEN}============================================================${NC}"
echo ""
if [ -d "./build/libs" ]; then
echo -e "${CYAN}JAR files:${NC}"
ls -lh ./build/libs/*.jar 2>/dev/null | awk '{print " " $9 " (" $5 ")"}'
fi
else
echo ""
echo -e "${RED}============================================================${NC}"
echo -e "${RED} BUILD FAILED!${NC}"
echo -e "${RED}============================================================${NC}"
echo ""
echo -e "${YELLOW}Build failed. Cannot continue.${NC}"
exit 1
fi
echo ""
# ============================================================================
# [8/20] Stop Daemons
# ============================================================================
echo -e "${YELLOW}==[8/20] Stop Daemons ==${NC}"
./gradlew --stop &>/dev/null || true
sleep 2
echo -e "${GREEN}[OK] Daemons stopped${NC}"
echo ""
# ============================================================================
# [9/20] ONLINE TEST - bootRun (Verify Application Works)
# ============================================================================
echo -e "${YELLOW}==[9/20] ONLINE TEST - bootRun ==${NC}"
echo ""
echo -e "${CYAN}============================================================${NC}"
echo -e "${CYAN} Testing application startup (timeout: ${BOOTRUN_TIMEOUT_SECONDS}s)${NC}"
echo -e "${CYAN} Will automatically stop after successful startup${NC}"
echo -e "${CYAN}============================================================${NC}"
echo ""
BOOTRUN_SUCCESS=false
timeout ${BOOTRUN_TIMEOUT_SECONDS}s ./gradlew bootRun --no-daemon &
BOOTRUN_PID=$!
sleep 10
if ps -p $BOOTRUN_PID &>/dev/null; then
BOOTRUN_SUCCESS=true
echo ""
echo -e "${GREEN}[OK] Application started successfully${NC}"
kill $BOOTRUN_PID &>/dev/null || true
sleep 2
else
echo ""
echo -e "${YELLOW}[WARN] Application may not have started properly${NC}"
fi
# Cleanup
pkill -f "gradle.*bootRun" &>/dev/null || true
sleep 2
echo ""
# ============================================================================
# [10/20] Stop Daemons Again
# ============================================================================
echo -e "${YELLOW}==[10/20] Stop Daemons Again ==${NC}"
./gradlew --stop &>/dev/null || true
sleep 2
echo -e "${GREEN}[OK] Daemons stopped${NC}"
echo ""
# ============================================================================
# [11/20] OFFLINE BUILD TEST (Verify Cache Completeness)
# ============================================================================
echo -e "${YELLOW}==[11/20] OFFLINE BUILD TEST ==${NC}"
echo ""
echo -e "${CYAN}============================================================${NC}"
echo -e "${CYAN} OFFLINE BUILD TEST (--offline flag)${NC}"
echo -e "${CYAN} Verifying all dependencies are cached${NC}"
echo -e "${CYAN}============================================================${NC}"
echo ""
OFFLINE_SUCCESS=false
# Run inside an if-condition: under `set -e` a bare failing command would
# exit immediately, skipping the "Continue anyway?" prompt below.
if ./gradlew clean bootJar --offline --no-daemon; then
OFFLINE_SUCCESS=true
echo ""
echo -e "${GREEN}============================================================${NC}"
echo -e "${GREEN} OFFLINE BUILD TEST PASSED!${NC}"
echo -e "${GREEN}============================================================${NC}"
echo ""
echo -e "${GREEN}[OK] All dependencies are cached${NC}"
else
echo ""
echo -e "${RED}============================================================${NC}"
echo -e "${RED} OFFLINE BUILD TEST FAILED!${NC}"
echo -e "${RED}============================================================${NC}"
echo ""
echo -e "${YELLOW}Some dependencies may be missing from cache.${NC}"
echo -e "${YELLOW}The bundle may not work in air-gapped environment.${NC}"
echo ""
read -p "Continue anyway? (y/N): " -n 1 -r
echo
if [[ ! $REPLY =~ ^[Yy]$ ]]; then
exit 1
fi
fi
echo ""
# ============================================================================
# [12/20] Stop Daemons Before Archive
# ============================================================================
echo -e "${YELLOW}==[12/20] Stop Daemons Before Archive ==${NC}"
./gradlew --stop &>/dev/null || true
sleep 2
echo -e "${GREEN}[OK] Daemons stopped${NC}"
echo ""
# ============================================================================
# [13/20] Verify settings.gradle for Offline
# ============================================================================
echo -e "${YELLOW}==[13/20] Verify settings.gradle ==${NC}"
if [ -n "$SETTINGS_FILE" ]; then
if grep -q "mavenLocal()" "$SETTINGS_FILE" && grep -q "pluginManagement" "$SETTINGS_FILE"; then
echo -e "${GREEN}[OK] settings.gradle configured for offline${NC}"
else
echo -e "${YELLOW}[WARN] settings.gradle may need offline configuration${NC}"
echo -e "${GRAY}[INFO] Consider adding mavenLocal() to pluginManagement and repositories${NC}"
fi
else
echo -e "${GRAY}[INFO] No settings.gradle found${NC}"
fi
echo ""
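# A minimal offline-friendly settings.gradle looks roughly like this
# (a sketch, not taken from this project -- repository choices may differ):
#
#   pluginManagement {
#       repositories {
#           mavenLocal()
#           gradlePluginPortal()
#       }
#   }
#   dependencyResolutionManagement {
#       repositories {
#           mavenLocal()
#           mavenCentral()
#       }
#   }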
# ============================================================================
# [14/20] Create Helper Scripts
# ============================================================================
echo -e "${YELLOW}==[14/20] Create Helper Scripts ==${NC}"
# run_offline_build.sh
cat > "$ROOT/run_offline_build.sh" << 'EOF'
#!/bin/bash
# run_offline_build.sh - Build JAR offline
export GRADLE_USER_HOME="$(pwd)/_offline_gradle_home"
echo "GRADLE_USER_HOME = $GRADLE_USER_HOME"
echo ""
./gradlew --offline bootJar --no-daemon
if [ $? -eq 0 ]; then
echo ""
echo "BUILD SUCCESS!"
echo ""
echo "JAR files:"
ls -lh ./build/libs/*.jar 2>/dev/null | awk '{print " " $9}'
else
echo "BUILD FAILED"
fi
EOF
chmod +x "$ROOT/run_offline_build.sh"
echo -e "${GREEN}[OK] run_offline_build.sh${NC}"
# run_offline_bootrun.sh
cat > "$ROOT/run_offline_bootrun.sh" << 'EOF'
#!/bin/bash
# run_offline_bootrun.sh - Run application offline
export GRADLE_USER_HOME="$(pwd)/_offline_gradle_home"
echo "GRADLE_USER_HOME = $GRADLE_USER_HOME"
echo ""
echo "Starting application (Ctrl+C to stop)..."
echo ""
./gradlew --offline bootRun --no-daemon
EOF
chmod +x "$ROOT/run_offline_bootrun.sh"
echo -e "${GREEN}[OK] run_offline_bootrun.sh${NC}"
echo ""
# ============================================================================
# [15/20] Final Daemon Cleanup
# ============================================================================
echo -e "${YELLOW}==[15/20] Final Daemon Cleanup ==${NC}"
./gradlew --stop &>/dev/null || true
sleep 2
echo -e "${GREEN}[OK] Daemons stopped${NC}"
echo ""
# ============================================================================
# [16/20] Clean Lock Files
# ============================================================================
echo -e "${YELLOW}==[16/20] Clean Lock Files ==${NC}"
DAEMON_DIR="$OFFLINE_HOME/daemon"
if [ -d "$DAEMON_DIR" ]; then
rm -rf "$DAEMON_DIR" 2>/dev/null || true
fi
find "$OFFLINE_HOME" -type f \( -name "*.lock" -o -name "*.log" -o -name "*.tmp" \) -delete 2>/dev/null || true
echo -e "${GREEN}[OK] Lock files cleaned${NC}"
echo ""
# ============================================================================
# [17/20] Calculate Cache Size
# ============================================================================
echo -e "${YELLOW}==[17/20] Cache Summary ==${NC}"
CACHES_DIR="$OFFLINE_HOME/caches"
WRAPPER_DISTS="$OFFLINE_HOME/wrapper/dists"
TOTAL_SIZE=0
if [ -d "$CACHES_DIR" ]; then
SIZE=$(du -sb "$CACHES_DIR" 2>/dev/null | cut -f1)
TOTAL_SIZE=$((TOTAL_SIZE + SIZE))
SIZE_MB=$(echo "scale=2; $SIZE / 1048576" | bc)
echo -e "${CYAN}[INFO] Dependencies: ${SIZE_MB} MB${NC}"
fi
if [ -d "$WRAPPER_DISTS" ]; then
SIZE=$(du -sb "$WRAPPER_DISTS" 2>/dev/null | cut -f1)
TOTAL_SIZE=$((TOTAL_SIZE + SIZE))
SIZE_MB=$(echo "scale=2; $SIZE / 1048576" | bc)
echo -e "${CYAN}[INFO] Gradle dist: ${SIZE_MB} MB${NC}"
fi
TOTAL_MB=$(echo "scale=2; $TOTAL_SIZE / 1048576" | bc)
echo -e "${CYAN}[INFO] Total cache: ${TOTAL_MB} MB${NC}"
echo ""
# ============================================================================
# [18/20] Create Archive
# ============================================================================
echo -e "${YELLOW}==[18/20] Create Archive ==${NC}"
BASE_NAME=$(basename "$ROOT")
TIMESTAMP=$(date +"%Y%m%d_%H%M%S")
PARENT=$(dirname "$ROOT")
ARCHIVE_PATH="${PARENT}/${BASE_NAME}_offline_bundle_${TIMESTAMP}.tar.gz"
echo "Archive: $ARCHIVE_PATH"
echo -e "${GRAY}[INFO] Creating archive (this may take several minutes)...${NC}"
# Wrap tar in an if-condition: under `set -e` a bare failing tar would
# abort the script before the error message below could print.
if ! tar -czf "$ARCHIVE_PATH" \
  --exclude=".git" \
  --exclude=".idea" \
  --exclude=".DS_Store" \
  --exclude="*.log" \
  --exclude="*.lock" \
  --exclude="_offline_gradle_home/daemon" \
  --exclude="_offline_gradle_home/native" \
  --exclude="_offline_gradle_home/jdks" \
  --exclude="build" \
  --exclude="out" \
  --exclude=".gradle" \
  -C "$ROOT" .; then
echo -e "${RED}ERROR: tar failed${NC}"
exit 1
fi
ARCHIVE_SIZE=$(stat -f%z "$ARCHIVE_PATH" 2>/dev/null || stat -c%s "$ARCHIVE_PATH" 2>/dev/null)
ARCHIVE_SIZE_MB=$(echo "scale=2; $ARCHIVE_SIZE / 1048576" | bc)
echo -e "${GREEN}[OK] Archive created: ${ARCHIVE_SIZE_MB} MB${NC}"
echo ""
# ============================================================================
# [19/20] Verify Archive
# ============================================================================
echo -e "${YELLOW}==[19/20] Verify Archive ==${NC}"
CHECKS=(
"gradle/wrapper/gradle-wrapper.jar"
"gradlew"
"_offline_gradle_home/caches"
"run_offline_build.sh"
)
# List the archive once instead of re-reading it for every check
ARCHIVE_LIST=$(tar -tzf "$ARCHIVE_PATH")
for CHECK in "${CHECKS[@]}"; do
if echo "$ARCHIVE_LIST" | grep -q "$CHECK"; then
echo -e " ${GREEN}[OK] $CHECK${NC}"
else
echo -e " ${YELLOW}[WARN] $CHECK${NC}"
fi
done
echo ""
# ============================================================================
# [20/20] Complete
# ============================================================================
echo -e "${GREEN}============================================================${NC}"
echo -e "${GREEN} BUNDLE CREATION COMPLETE!${NC}"
echo -e "${GREEN}============================================================${NC}"
echo ""
echo -e "${CYAN}Archive: $ARCHIVE_PATH${NC}"
echo -e "${CYAN}Size: ${ARCHIVE_SIZE_MB} MB${NC}"
echo ""
echo -e "${CYAN}============================================================${NC}"
echo -e "${CYAN} Test Results${NC}"
echo -e "${CYAN}============================================================${NC}"
if [ "$BUILD_SUCCESS" = true ]; then
echo -e " Online build (bootJar): ${GREEN}PASSED${NC}"
else
echo -e " Online build (bootJar): ${RED}FAILED${NC}"
fi
if [ "$BOOTRUN_SUCCESS" = true ]; then
echo -e " Online test (bootRun): ${GREEN}PASSED${NC}"
else
echo -e " Online test (bootRun): ${YELLOW}SKIPPED${NC}"
fi
if [ "$OFFLINE_SUCCESS" = true ]; then
echo -e " Offline build test: ${GREEN}PASSED${NC}"
else
echo -e " Offline build test: ${RED}FAILED${NC}"
fi
echo ""
echo -e "${YELLOW}============================================================${NC}"
echo -e "${YELLOW} Usage in Air-gapped Environment${NC}"
echo -e "${YELLOW}============================================================${NC}"
echo ""
echo -e "${WHITE}Option 1: Use unpack script${NC}"
echo -e "${GRAY} ./unpack_and_offline_build_airgap.sh${NC}"
echo ""
echo -e "${WHITE}Option 2: Manual extraction${NC}"
echo -e "${GRAY} tar -xzf <archive>.tar.gz${NC}"
echo -e "${GRAY} cd <project>${NC}"
echo -e "${GRAY} ./run_offline_build.sh${NC}"
echo ""
echo -e "${WHITE}Option 3: Direct commands${NC}"
echo -e "${GRAY} export GRADLE_USER_HOME=\"./_offline_gradle_home\"${NC}"
echo -e "${GRAY} ./gradlew --offline bootJar --no-daemon${NC}"
echo ""

View File

@@ -0,0 +1,347 @@
#!/bin/bash
# unpack_and_offline_build_airgap.sh
# ============================================================================
# Execution Environment: OFFLINE (Air-gapped, No Internet)
# Purpose: Extract bundle and run offline build
# ============================================================================
# Linux Bash Script
# Version: 3.1
#
# IMPORTANT: This script automatically:
# 1. Extracts the archive
# 2. Sets GRADLE_USER_HOME to project local cache
# 3. Configures settings.gradle for offline resolution
# 4. Runs build with --offline flag
# ============================================================================
set -e
# ============================================================================
# Configuration
# ============================================================================
WRAPPER_SEED_PATH="wrapper_jar_seed"
OFFLINE_HOME_NAME="_offline_gradle_home"
# Color codes
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
CYAN='\033[0;36m'
GRAY='\033[0;90m'
WHITE='\033[1;37m'
NC='\033[0m' # No Color
echo ""
echo -e "${CYAN}============================================================${NC}"
echo -e "${CYAN} Gradle Offline Build Runner${NC}"
echo -e "${CYAN} Environment: AIR-GAPPED (No Internet)${NC}"
echo -e "${CYAN} Mode: Fully Offline (--offline enforced)${NC}"
echo -e "${CYAN}============================================================${NC}"
echo ""
# ============================================================================
# [1/16] Check Current Directory
# ============================================================================
echo -e "${YELLOW}==[1/16] Check Current Directory ==${NC}"
START_DIR="$(pwd)"
echo "PWD: $START_DIR"
echo ""
# ============================================================================
# [2/16] Select Archive
# ============================================================================
echo -e "${YELLOW}==[2/16] Select Archive ==${NC}"
ARCHIVE=""
if [ $# -ge 1 ]; then
ARCHIVE="$1"
fi
if [ -z "$ARCHIVE" ]; then
# Auto-detect most recent .tar.gz file
ARCHIVE=$(find "$START_DIR" -maxdepth 1 -type f \( -name "*.tar.gz" -o -name "*.tgz" \) -printf '%T@ %p\n' 2>/dev/null | sort -rn | head -1 | cut -d' ' -f2-)
if [ -z "$ARCHIVE" ]; then
echo -e "${RED}[ERROR] No archive found${NC}"
ls -lh "$START_DIR"
exit 1
fi
echo -e "${CYAN}[AUTO] $(basename "$ARCHIVE")${NC}"
else
if [ ! -f "$ARCHIVE" ]; then
ARCHIVE="$START_DIR/$ARCHIVE"
fi
echo -e "${CYAN}[USER] $(basename "$ARCHIVE")${NC}"
fi
if [ ! -f "$ARCHIVE" ]; then
echo -e "${RED}ERROR: Archive not found: $ARCHIVE${NC}"
exit 1
fi
ARCHIVE_SIZE=$(stat -f%z "$ARCHIVE" 2>/dev/null || stat -c%s "$ARCHIVE" 2>/dev/null)
ARCHIVE_SIZE_MB=$(echo "scale=2; $ARCHIVE_SIZE / 1048576" | bc)
echo "Size: ${ARCHIVE_SIZE_MB} MB"
echo ""
# ============================================================================
# [3/16] Check tar
# ============================================================================
echo -e "${YELLOW}==[3/16] Check tar ==${NC}"
if ! command -v tar &>/dev/null; then
echo -e "${RED}ERROR: tar not found${NC}"
exit 1
fi
echo -e "${GREEN}[OK] tar found${NC}"
echo ""
# ============================================================================
# [4/16] Extract Archive
# ============================================================================
echo -e "${YELLOW}==[4/16] Extract Archive ==${NC}"
echo -e "${GRAY}[INFO] Extracting...${NC}"
# Wrap tar in an if-condition: under `set -e` a bare failing tar would
# abort the script before the error message below could print.
if ! tar -xzf "$ARCHIVE" -C "$START_DIR"; then
echo -e "${RED}ERROR: Extraction failed${NC}"
exit 1
fi
echo -e "${GREEN}[OK] Extracted${NC}"
echo ""
# ============================================================================
# [5/16] Set Permissions
# ============================================================================
echo -e "${YELLOW}==[5/16] Set Permissions ==${NC}"
chmod -R u+rw "$START_DIR" 2>/dev/null || true
echo -e "${GREEN}[OK] Permissions set${NC}"
echo ""
# ============================================================================
# [6/16] Find Project Root
# ============================================================================
echo -e "${YELLOW}==[6/16] Find Project Root ==${NC}"
GRADLEW=$(find "$START_DIR" -name "gradlew" -type f 2>/dev/null | sort | head -1)
if [ -z "$GRADLEW" ]; then
echo -e "${RED}ERROR: gradlew not found${NC}"
exit 1
fi
PROJECT_DIR=$(dirname "$GRADLEW")
echo -e "${CYAN}Project: $PROJECT_DIR${NC}"
cd "$PROJECT_DIR"
echo ""
# ============================================================================
# [7/16] Fix Permissions
# ============================================================================
echo -e "${YELLOW}==[7/16] Fix Permissions ==${NC}"
chmod +x ./gradlew
find . -name "*.sh" -type f -exec chmod +x {} \; 2>/dev/null || true
echo -e "${GREEN}[OK] Permissions fixed${NC}"
echo ""
# ============================================================================
# [8/16] Verify Wrapper
# ============================================================================
echo -e "${YELLOW}==[8/16] Verify Wrapper ==${NC}"
WRAPPER_DIR="$PROJECT_DIR/gradle/wrapper"
WRAPPER_JAR="$WRAPPER_DIR/gradle-wrapper.jar"
WRAPPER_PROP="$WRAPPER_DIR/gradle-wrapper.properties"
if [ ! -f "$WRAPPER_PROP" ]; then
echo -e "${RED}ERROR: gradle-wrapper.properties missing${NC}"
exit 1
fi
if [ ! -f "$WRAPPER_JAR" ]; then
SEED_JAR="$PROJECT_DIR/$WRAPPER_SEED_PATH/gradle-wrapper.jar"
if [ -f "$SEED_JAR" ]; then
mkdir -p "$WRAPPER_DIR"
cp "$SEED_JAR" "$WRAPPER_JAR"
echo -e "${GREEN}[OK] Injected from seed${NC}"
else
echo -e "${RED}ERROR: wrapper jar missing${NC}"
exit 1
fi
else
echo -e "${GREEN}[OK] Wrapper verified${NC}"
fi
echo ""
# ============================================================================
# [9/16] Set GRADLE_USER_HOME
# ============================================================================
echo -e "${YELLOW}==[9/16] Set GRADLE_USER_HOME ==${NC}"
OFFLINE_HOME="$PROJECT_DIR/$OFFLINE_HOME_NAME"
if [ ! -d "$OFFLINE_HOME" ]; then
echo -e "${RED}ERROR: _offline_gradle_home not found in archive${NC}"
exit 1
fi
export GRADLE_USER_HOME="$OFFLINE_HOME"
echo -e "${CYAN}GRADLE_USER_HOME = $GRADLE_USER_HOME${NC}"
# Check cache
CACHES_DIR="$OFFLINE_HOME/caches"
if [ -d "$CACHES_DIR" ]; then
CACHE_SIZE=$(du -sb "$CACHES_DIR" 2>/dev/null | cut -f1)
CACHE_SIZE_MB=$(echo "scale=2; $CACHE_SIZE / 1048576" | bc)
echo -e "${CYAN}[INFO] Cache size: ${CACHE_SIZE_MB} MB${NC}"
else
echo -e "${YELLOW}[WARN] No cache folder found${NC}"
fi
echo ""
# ============================================================================
# [10/16] Verify settings.gradle
# ============================================================================
echo -e "${YELLOW}==[10/16] Verify settings.gradle ==${NC}"
SETTINGS_FILE=""
if [ -f "./settings.gradle" ]; then
SETTINGS_FILE="settings.gradle"
elif [ -f "./settings.gradle.kts" ]; then
SETTINGS_FILE="settings.gradle.kts"
fi
if [ -n "$SETTINGS_FILE" ]; then
if grep -q "mavenLocal()" "$SETTINGS_FILE" && grep -q "pluginManagement" "$SETTINGS_FILE"; then
echo -e "${GREEN}[OK] settings.gradle configured for offline${NC}"
else
echo -e "${YELLOW}[WARN] settings.gradle may not be configured for offline${NC}"
echo -e "${GRAY}[INFO] Build may fail if plugins not cached${NC}"
fi
fi
echo ""
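# Optional spot-check: confirm the dependency cache travelled with the
# archive. The module cache path below is the Gradle default; adjust it
# if your Gradle version lays the cache out differently:
#   find "$OFFLINE_HOME/caches/modules-2/files-2.1" -name "*.jar" 2>/dev/null | head -5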
# ============================================================================
# [11/16] Test Gradle
# ============================================================================
echo -e "${YELLOW}==[11/16] Test Gradle ==${NC}"
GRADLE_WORKS=false
if ./gradlew --offline --version &>/dev/null; then
GRADLE_WORKS=true
echo -e "${GREEN}[OK] Gradle working in offline mode${NC}"
else
echo -e "${YELLOW}[WARN] Gradle --version failed${NC}"
fi
echo ""
# ============================================================================
# [12/16] Stop Daemon
# ============================================================================
echo -e "${YELLOW}==[12/16] Stop Daemon ==${NC}"
./gradlew --stop &>/dev/null || true
sleep 2
echo -e "${GREEN}[OK] Daemon stopped${NC}"
echo ""
# ============================================================================
# [13/16] Run Offline Build
# ============================================================================
echo -e "${YELLOW}==[13/16] Run Offline Build ==${NC}"
echo ""
echo -e "${CYAN}============================================================${NC}"
echo -e "${CYAN} Building with --offline flag${NC}"
echo -e "${CYAN} All dependencies from local cache${NC}"
echo -e "${CYAN}============================================================${NC}"
echo ""
BUILD_SUCCESS=false
BUILD_TASK=""
# Try bootJar
echo -e "${GRAY}[TRY] --offline bootJar...${NC}"
if ./gradlew --offline clean bootJar --no-daemon; then
BUILD_SUCCESS=true
BUILD_TASK="bootJar"
fi
# Try jar
if [ "$BUILD_SUCCESS" = false ]; then
echo -e "${GRAY}[TRY] --offline jar...${NC}"
if ./gradlew --offline clean jar --no-daemon; then
BUILD_SUCCESS=true
BUILD_TASK="jar"
fi
fi
# Try build
if [ "$BUILD_SUCCESS" = false ]; then
echo -e "${GRAY}[TRY] --offline build...${NC}"
if ./gradlew --offline build --no-daemon; then
BUILD_SUCCESS=true
BUILD_TASK="build"
fi
fi
echo ""
if [ "$BUILD_SUCCESS" = true ]; then
echo -e "${GREEN}============================================================${NC}"
echo -e "${GREEN} BUILD SUCCESS! (task: $BUILD_TASK)${NC}"
echo -e "${GREEN}============================================================${NC}"
else
echo -e "${RED}============================================================${NC}"
echo -e "${RED} BUILD FAILED!${NC}"
echo -e "${RED}============================================================${NC}"
echo ""
echo -e "${YELLOW}Possible causes:${NC}"
echo -e "${WHITE} - Dependencies not in cache${NC}"
echo -e "${WHITE} - Plugin resolution failed${NC}"
echo -e "${WHITE} - Need complete build in online env first${NC}"
exit 1
fi
echo ""
# ============================================================================
# [14/16] Show Build Output
# ============================================================================
echo -e "${YELLOW}==[14/16] Build Output ==${NC}"
LIBS_DIR="$PROJECT_DIR/build/libs"
if [ -d "$LIBS_DIR" ]; then
echo -e "${CYAN}build/libs contents:${NC}"
ls -lh "$LIBS_DIR"/*.jar 2>/dev/null | awk '{printf " %-40s %10s\n", $9, $5}'
MAIN_JAR=$(find "$LIBS_DIR" -name "*.jar" -type f ! -name "*-plain.jar" ! -name "*-sources.jar" ! -name "*-javadoc.jar" 2>/dev/null | head -1)
else
echo -e "${YELLOW}[WARN] build/libs not found${NC}"
fi
echo ""
# ============================================================================
# [15/16] Run Instructions
# ============================================================================
echo -e "${YELLOW}==[15/16] Run Instructions ==${NC}"
echo ""
if [ -n "$MAIN_JAR" ]; then
echo -e "${CYAN}To run the application:${NC}"
echo -e "${WHITE} java -jar $(basename "$MAIN_JAR")${NC}"
echo ""
fi
echo -e "${CYAN}To rebuild:${NC}"
echo -e "${WHITE} export GRADLE_USER_HOME=\"./_offline_gradle_home\"${NC}"
echo -e "${WHITE} ./gradlew --offline bootJar --no-daemon${NC}"
echo ""
# ============================================================================
# [16/16] Complete
# ============================================================================
echo -e "${GREEN}============================================================${NC}"
echo -e "${GREEN} Offline Build Complete!${NC}"
echo -e "${GREEN}============================================================${NC}"
echo ""
echo -e "${CYAN}Project: $PROJECT_DIR${NC}"
echo ""

View File

@@ -0,0 +1,571 @@
#!/bin/bash
# pack_offline_bundle_airgap_macos.sh
# ============================================================================
# Gradle Offline Bundle Packer (macOS)
# ============================================================================
# Version: 4.0
#
# WORKFLOW:
# 1. [ONLINE] Build project (./gradlew bootJar) - downloads all deps
# 2. [ONLINE] Test run (./gradlew bootRun) - verify app works
# 3. [OFFLINE TEST] Verify offline build works
# 4. Create bundle with all cached dependencies
#
# REQUIREMENTS:
# - Internet connection (for initial build)
# - Project with gradlew
# - macOS 10.13+ (High Sierra or later)
# ============================================================================
set -e
# ============================================================================
# Configuration
# ============================================================================
WRAPPER_SEED_PATH="wrapper_jar_seed"
OFFLINE_HOME_NAME="_offline_gradle_home"
BOOTRUN_TIMEOUT_SECONDS=60
# Color codes
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
CYAN='\033[0;36m'
GRAY='\033[0;90m'
WHITE='\033[1;37m'
NC='\033[0m' # No Color
echo ""
echo -e "${CYAN}============================================================${NC}"
echo -e "${CYAN} Gradle Offline Bundle Packer v4.0 (macOS)${NC}"
echo -e "${CYAN}============================================================${NC}"
echo ""
echo -e "${WHITE} This script will:${NC}"
echo -e "${GRAY} 1. Build project with internet (download dependencies)${NC}"
echo -e "${GRAY} 2. Test run application (verify it works)${NC}"
echo -e "${GRAY} 3. Test offline build (verify cache is complete)${NC}"
echo -e "${GRAY} 4. Create offline bundle for air-gapped environment${NC}"
echo ""
echo -e "${CYAN}============================================================${NC}"
echo ""
# ============================================================================
# [1/20] Check Current Directory
# ============================================================================
echo -e "${YELLOW}==[1/20] Check Current Directory ==${NC}"
ROOT="$(pwd)"
echo "ROOT_DIR: $ROOT"
echo ""
# ============================================================================
# [2/20] Check Required Files
# ============================================================================
echo -e "${YELLOW}==[2/20] Check Required Files ==${NC}"
if [ ! -f "./gradlew" ]; then
echo -e "${RED}ERROR: gradlew not found. Run from project root.${NC}"
exit 1
fi
chmod +x ./gradlew
echo -e "${GREEN}[OK] gradlew${NC}"
BUILD_FILE=""
if [ -f "./build.gradle" ]; then
BUILD_FILE="build.gradle"
elif [ -f "./build.gradle.kts" ]; then
BUILD_FILE="build.gradle.kts"
else
echo -e "${RED}ERROR: build.gradle(.kts) not found.${NC}"
exit 1
fi
echo -e "${GREEN}[OK] $BUILD_FILE${NC}"
SETTINGS_FILE=""
if [ -f "./settings.gradle" ]; then
SETTINGS_FILE="settings.gradle"
echo -e "${GREEN}[OK] $SETTINGS_FILE${NC}"
elif [ -f "./settings.gradle.kts" ]; then
SETTINGS_FILE="settings.gradle.kts"
echo -e "${GREEN}[OK] $SETTINGS_FILE${NC}"
fi
echo ""
# ============================================================================
# [3/20] Check Gradle Wrapper
# ============================================================================
echo -e "${YELLOW}==[3/20] Check Gradle Wrapper ==${NC}"
WRAPPER_DIR="$ROOT/gradle/wrapper"
WRAPPER_JAR="$WRAPPER_DIR/gradle-wrapper.jar"
WRAPPER_PROP="$WRAPPER_DIR/gradle-wrapper.properties"
mkdir -p "$WRAPPER_DIR"
if [ ! -f "$WRAPPER_PROP" ]; then
echo -e "${RED}ERROR: gradle-wrapper.properties not found.${NC}"
exit 1
fi
if [ ! -f "$WRAPPER_JAR" ]; then
SEED_JAR="$ROOT/$WRAPPER_SEED_PATH/gradle-wrapper.jar"
if [ -f "$SEED_JAR" ]; then
cp "$SEED_JAR" "$WRAPPER_JAR"
echo -e "${GREEN}[OK] Wrapper jar injected from seed${NC}"
else
echo -e "${RED}ERROR: gradle-wrapper.jar missing${NC}"
exit 1
fi
else
echo -e "${GREEN}[OK] gradle-wrapper.jar exists${NC}"
fi
# Create seed backup
SEED_DIR="$ROOT/$WRAPPER_SEED_PATH"
if [ ! -d "$SEED_DIR" ]; then
mkdir -p "$SEED_DIR"
cp "$WRAPPER_JAR" "$SEED_DIR/gradle-wrapper.jar"
fi
echo ""
# ============================================================================
# [4/20] Set GRADLE_USER_HOME (Project Local)
# ============================================================================
echo -e "${YELLOW}==[4/20] Set GRADLE_USER_HOME ==${NC}"
OFFLINE_HOME="$ROOT/$OFFLINE_HOME_NAME"
mkdir -p "$OFFLINE_HOME"
export GRADLE_USER_HOME="$OFFLINE_HOME"
echo -e "${CYAN}GRADLE_USER_HOME = $GRADLE_USER_HOME${NC}"
echo -e "${GRAY}[INFO] All dependencies will be cached in project folder${NC}"
echo ""
# ============================================================================
# [5/20] Check Internet Connection
# ============================================================================
echo -e "${YELLOW}==[5/20] Check Internet Connection ==${NC}"
HAS_INTERNET=false
TEST_HOSTS=("plugins.gradle.org" "repo.maven.apache.org" "repo1.maven.org")
for TEST_HOST in "${TEST_HOSTS[@]}"; do
    # macOS ping: -t is the overall timeout in seconds (Linux uses -W for this)
    if ping -c 1 -t 3 "$TEST_HOST" &>/dev/null; then
HAS_INTERNET=true
echo -e "${GREEN}[OK] Connected to $TEST_HOST${NC}"
break
fi
done
if [ "$HAS_INTERNET" = false ]; then
# Try DNS resolution as fallback
if nslookup google.com &>/dev/null || host google.com &>/dev/null; then
HAS_INTERNET=true
echo -e "${GREEN}[OK] Internet available (DNS)${NC}"
fi
fi
if [ "$HAS_INTERNET" = false ]; then
echo ""
echo -e "${RED}============================================================${NC}"
echo -e "${RED} ERROR: No Internet Connection!${NC}"
echo -e "${RED}============================================================${NC}"
echo ""
echo -e "${YELLOW}This script requires internet for initial build.${NC}"
echo -e "${YELLOW}Please connect to internet and run again.${NC}"
echo ""
exit 1
fi
echo ""
# ============================================================================
# [6/20] Initial Gradle Setup
# ============================================================================
echo -e "${YELLOW}==[6/20] Initial Gradle Setup ==${NC}"
echo -e "${GRAY}[INFO] Downloading Gradle distribution...${NC}"
if ./gradlew --version &>/dev/null; then
GRADLE_VERSION=$(./gradlew --version 2>&1 | grep "^Gradle" | awk '{print $2}')
echo -e "${GREEN}[OK] Gradle $GRADLE_VERSION${NC}"
else
echo -e "${RED}[ERROR] Gradle setup failed${NC}"
exit 1
fi
echo ""
# ============================================================================
# [7/20] ONLINE BUILD - bootJar (Download All Dependencies)
# ============================================================================
echo -e "${YELLOW}==[7/20] ONLINE BUILD - bootJar ==${NC}"
echo ""
echo -e "${CYAN}============================================================${NC}"
echo -e "${CYAN} ONLINE BUILD (with Internet)${NC}"
echo -e "${CYAN} Downloading all dependencies to local cache${NC}"
echo -e "${CYAN}============================================================${NC}"
echo ""
BUILD_SUCCESS=false
# Run the build inside the condition so a failure is handled here even under `set -e`
if ./gradlew clean bootJar --no-daemon; then
BUILD_SUCCESS=true
echo ""
echo -e "${GREEN}============================================================${NC}"
echo -e "${GREEN} ONLINE BUILD SUCCESS!${NC}"
echo -e "${GREEN}============================================================${NC}"
echo ""
if [ -d "./build/libs" ]; then
echo -e "${CYAN}JAR files:${NC}"
ls -lh ./build/libs/*.jar 2>/dev/null | awk '{print " " $9 " (" $5 ")"}'
fi
else
echo ""
echo -e "${RED}============================================================${NC}"
echo -e "${RED} BUILD FAILED!${NC}"
echo -e "${RED}============================================================${NC}"
echo ""
echo -e "${YELLOW}Build failed. Cannot continue.${NC}"
exit 1
fi
echo ""
# ============================================================================
# [8/20] Stop Daemons
# ============================================================================
echo -e "${YELLOW}==[8/20] Stop Daemons ==${NC}"
./gradlew --stop &>/dev/null || true
sleep 2
echo -e "${GREEN}[OK] Daemons stopped${NC}"
echo ""
# ============================================================================
# [9/20] ONLINE TEST - bootRun (Verify Application Works)
# ============================================================================
echo -e "${YELLOW}==[9/20] ONLINE TEST - bootRun ==${NC}"
echo ""
echo -e "${CYAN}============================================================${NC}"
echo -e "${CYAN} Testing application startup (timeout: ${BOOTRUN_TIMEOUT_SECONDS}s)${NC}"
echo -e "${CYAN} Will automatically stop after successful startup${NC}"
echo -e "${CYAN}============================================================${NC}"
echo ""
BOOTRUN_SUCCESS=false
# Prefer gtimeout (GNU coreutils) if available; otherwise start in the
# background and enforce the timeout with a watchdog subshell.
if command -v gtimeout &>/dev/null; then
    gtimeout ${BOOTRUN_TIMEOUT_SECONDS}s ./gradlew bootRun --no-daemon &
    BOOTRUN_PID=$!
else
    ./gradlew bootRun --no-daemon &
    BOOTRUN_PID=$!
    # Watchdog: kill bootRun if it is still running after the timeout
    ( sleep "${BOOTRUN_TIMEOUT_SECONDS}"; kill "$BOOTRUN_PID" &>/dev/null ) &
fi
sleep 10
if ps -p $BOOTRUN_PID &>/dev/null; then
BOOTRUN_SUCCESS=true
echo ""
echo -e "${GREEN}[OK] Application started successfully${NC}"
kill $BOOTRUN_PID &>/dev/null || true
sleep 2
else
echo ""
echo -e "${YELLOW}[WARN] Application may not have started properly${NC}"
fi
# Cleanup - macOS process cleanup
pkill -f "gradle.*bootRun" &>/dev/null || true
sleep 2
echo ""
# ============================================================================
# [10/20] Stop Daemons Again
# ============================================================================
echo -e "${YELLOW}==[10/20] Stop Daemons Again ==${NC}"
./gradlew --stop &>/dev/null || true
sleep 2
echo -e "${GREEN}[OK] Daemons stopped${NC}"
echo ""
# ============================================================================
# [11/20] OFFLINE BUILD TEST (Verify Cache Completeness)
# ============================================================================
echo -e "${YELLOW}==[11/20] OFFLINE BUILD TEST ==${NC}"
echo ""
echo -e "${CYAN}============================================================${NC}"
echo -e "${CYAN} OFFLINE BUILD TEST (--offline flag)${NC}"
echo -e "${CYAN} Verifying all dependencies are cached${NC}"
echo -e "${CYAN}============================================================${NC}"
echo ""
OFFLINE_SUCCESS=false
if ./gradlew clean bootJar --offline --no-daemon; then
OFFLINE_SUCCESS=true
echo ""
echo -e "${GREEN}============================================================${NC}"
echo -e "${GREEN} OFFLINE BUILD TEST PASSED!${NC}"
echo -e "${GREEN}============================================================${NC}"
echo ""
echo -e "${GREEN}[OK] All dependencies are cached${NC}"
else
echo ""
echo -e "${RED}============================================================${NC}"
echo -e "${RED} OFFLINE BUILD TEST FAILED!${NC}"
echo -e "${RED}============================================================${NC}"
echo ""
echo -e "${YELLOW}Some dependencies may be missing from cache.${NC}"
echo -e "${YELLOW}The bundle may not work in air-gapped environment.${NC}"
echo ""
read -p "Continue anyway? (y/N): " -n 1 -r
echo
if [[ ! $REPLY =~ ^[Yy]$ ]]; then
exit 1
fi
fi
echo ""
# ============================================================================
# [12/20] Stop Daemons Before Archive
# ============================================================================
echo -e "${YELLOW}==[12/20] Stop Daemons Before Archive ==${NC}"
./gradlew --stop &>/dev/null || true
sleep 2
echo -e "${GREEN}[OK] Daemons stopped${NC}"
echo ""
# ============================================================================
# [13/20] Verify settings.gradle for Offline
# ============================================================================
echo -e "${YELLOW}==[13/20] Verify settings.gradle ==${NC}"
if [ -n "$SETTINGS_FILE" ]; then
if grep -q "mavenLocal()" "$SETTINGS_FILE" && grep -q "pluginManagement" "$SETTINGS_FILE"; then
echo -e "${GREEN}[OK] settings.gradle configured for offline${NC}"
else
echo -e "${YELLOW}[WARN] settings.gradle may need offline configuration${NC}"
echo -e "${GRAY}[INFO] Consider adding mavenLocal() to pluginManagement and repositories${NC}"
fi
else
echo -e "${GRAY}[INFO] No settings.gradle found${NC}"
fi
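# Example (sketch) of the settings.gradle shape the check above looks for.
# The repository blocks below use standard Gradle APIs; the exact layout is
# an assumption, not taken from this project's actual settings.gradle:
#
#   pluginManagement {
#       repositories {
#           mavenLocal()
#           gradlePluginPortal()
#       }
#   }
#   dependencyResolutionManagement {
#       repositories {
#           mavenLocal()
#           mavenCentral()
#       }
#   }
#
# With --offline these repositories are never contacted; dependencies are
# resolved from the GRADLE_USER_HOME cache shipped in the bundle.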
echo ""
# ============================================================================
# [14/20] Create Helper Scripts
# ============================================================================
echo -e "${YELLOW}==[14/20] Create Helper Scripts ==${NC}"
# run_offline_build.sh
cat > "$ROOT/run_offline_build.sh" << 'EOF'
#!/bin/bash
# run_offline_build.sh - Build JAR offline
export GRADLE_USER_HOME="$(pwd)/_offline_gradle_home"
echo "GRADLE_USER_HOME = $GRADLE_USER_HOME"
echo ""
./gradlew --offline bootJar --no-daemon
if [ $? -eq 0 ]; then
echo ""
echo "BUILD SUCCESS!"
echo ""
echo "JAR files:"
ls -lh ./build/libs/*.jar 2>/dev/null | awk '{print " " $9}'
else
echo "BUILD FAILED"
fi
EOF
chmod +x "$ROOT/run_offline_build.sh"
echo -e "${GREEN}[OK] run_offline_build.sh${NC}"
# run_offline_bootrun.sh
cat > "$ROOT/run_offline_bootrun.sh" << 'EOF'
#!/bin/bash
# run_offline_bootrun.sh - Run application offline
export GRADLE_USER_HOME="$(pwd)/_offline_gradle_home"
echo "GRADLE_USER_HOME = $GRADLE_USER_HOME"
echo ""
echo "Starting application (Ctrl+C to stop)..."
echo ""
./gradlew --offline bootRun --no-daemon
EOF
chmod +x "$ROOT/run_offline_bootrun.sh"
echo -e "${GREEN}[OK] run_offline_bootrun.sh${NC}"
echo ""
# ============================================================================
# [15/20] Final Daemon Cleanup
# ============================================================================
echo -e "${YELLOW}==[15/20] Final Daemon Cleanup ==${NC}"
./gradlew --stop &>/dev/null || true
sleep 2
echo -e "${GREEN}[OK] Daemons stopped${NC}"
echo ""
# ============================================================================
# [16/20] Clean Lock Files
# ============================================================================
echo -e "${YELLOW}==[16/20] Clean Lock Files ==${NC}"
DAEMON_DIR="$OFFLINE_HOME/daemon"
if [ -d "$DAEMON_DIR" ]; then
rm -rf "$DAEMON_DIR" 2>/dev/null || true
fi
find "$OFFLINE_HOME" -type f \( -name "*.lock" -o -name "*.log" -o -name "*.tmp" \) -delete 2>/dev/null || true
echo -e "${GREEN}[OK] Lock files cleaned${NC}"
echo ""
# ============================================================================
# [17/20] Calculate Cache Size
# ============================================================================
echo -e "${YELLOW}==[17/20] Cache Summary ==${NC}"
CACHES_DIR="$OFFLINE_HOME/caches"
WRAPPER_DISTS="$OFFLINE_HOME/wrapper/dists"
TOTAL_SIZE=0
if [ -d "$CACHES_DIR" ]; then
# macOS uses different options for du
if du -k "$CACHES_DIR" &>/dev/null; then
SIZE=$(du -sk "$CACHES_DIR" 2>/dev/null | cut -f1)
SIZE=$((SIZE * 1024)) # Convert to bytes
else
SIZE=0
fi
TOTAL_SIZE=$((TOTAL_SIZE + SIZE))
SIZE_MB=$(awk "BEGIN {printf \"%.2f\", $SIZE / 1048576}")
echo -e "${CYAN}[INFO] Dependencies: ${SIZE_MB} MB${NC}"
fi
if [ -d "$WRAPPER_DISTS" ]; then
if du -k "$WRAPPER_DISTS" &>/dev/null; then
SIZE=$(du -sk "$WRAPPER_DISTS" 2>/dev/null | cut -f1)
SIZE=$((SIZE * 1024))
else
SIZE=0
fi
TOTAL_SIZE=$((TOTAL_SIZE + SIZE))
SIZE_MB=$(awk "BEGIN {printf \"%.2f\", $SIZE / 1048576}")
echo -e "${CYAN}[INFO] Gradle dist: ${SIZE_MB} MB${NC}"
fi
TOTAL_MB=$(awk "BEGIN {printf \"%.2f\", $TOTAL_SIZE / 1048576}")
echo -e "${CYAN}[INFO] Total cache: ${TOTAL_MB} MB${NC}"
echo ""
# ============================================================================
# [18/20] Create Archive
# ============================================================================
echo -e "${YELLOW}==[18/20] Create Archive ==${NC}"
BASE_NAME=$(basename "$ROOT")
TIMESTAMP=$(date +"%Y%m%d_%H%M%S")
PARENT=$(dirname "$ROOT")
ARCHIVE_PATH="${PARENT}/${BASE_NAME}_offline_bundle_${TIMESTAMP}.tar.gz"
echo "Archive: $ARCHIVE_PATH"
echo -e "${GRAY}[INFO] Creating archive (this may take several minutes)...${NC}"
# macOS tar with BSD options
if ! tar -czf "$ARCHIVE_PATH" \
    --exclude=".git" \
    --exclude=".idea" \
    --exclude=".DS_Store" \
    --exclude="*.log" \
    --exclude="*.lock" \
    --exclude="_offline_gradle_home/daemon" \
    --exclude="_offline_gradle_home/native" \
    --exclude="_offline_gradle_home/jdks" \
    --exclude="build" \
    --exclude="out" \
    --exclude=".gradle" \
    -C "$ROOT" .; then
    echo -e "${RED}ERROR: tar failed${NC}"
    exit 1
fi
# macOS stat command
ARCHIVE_SIZE=$(stat -f%z "$ARCHIVE_PATH" 2>/dev/null)
ARCHIVE_SIZE_MB=$(awk "BEGIN {printf \"%.2f\", $ARCHIVE_SIZE / 1048576}")
echo -e "${GREEN}[OK] Archive created: ${ARCHIVE_SIZE_MB} MB${NC}"
echo ""
# ============================================================================
# [19/20] Verify Archive
# ============================================================================
echo -e "${YELLOW}==[19/20] Verify Archive ==${NC}"
CHECKS=(
"gradle/wrapper/gradle-wrapper.jar"
"gradlew"
"_offline_gradle_home/caches"
"run_offline_build.sh"
)
for CHECK in "${CHECKS[@]}"; do
if tar -tzf "$ARCHIVE_PATH" | grep -q "$CHECK"; then
echo -e " ${GREEN}[OK] $CHECK${NC}"
else
echo -e " ${YELLOW}[WARN] $CHECK${NC}"
fi
done
echo ""
# ============================================================================
# [20/20] Complete
# ============================================================================
echo -e "${GREEN}============================================================${NC}"
echo -e "${GREEN} BUNDLE CREATION COMPLETE!${NC}"
echo -e "${GREEN}============================================================${NC}"
echo ""
echo -e "${CYAN}Archive: $ARCHIVE_PATH${NC}"
echo -e "${CYAN}Size: ${ARCHIVE_SIZE_MB} MB${NC}"
echo ""
echo -e "${CYAN}============================================================${NC}"
echo -e "${CYAN} Test Results${NC}"
echo -e "${CYAN}============================================================${NC}"
if [ "$BUILD_SUCCESS" = true ]; then
echo -e " Online build (bootJar): ${GREEN}PASSED${NC}"
else
echo -e " Online build (bootJar): ${RED}FAILED${NC}"
fi
if [ "$BOOTRUN_SUCCESS" = true ]; then
echo -e " Online test (bootRun): ${GREEN}PASSED${NC}"
else
echo -e " Online test (bootRun): ${YELLOW}SKIPPED${NC}"
fi
if [ "$OFFLINE_SUCCESS" = true ]; then
echo -e " Offline build test: ${GREEN}PASSED${NC}"
else
echo -e " Offline build test: ${RED}FAILED${NC}"
fi
echo ""
echo -e "${YELLOW}============================================================${NC}"
echo -e "${YELLOW} Usage in Air-gapped Environment${NC}"
echo -e "${YELLOW}============================================================${NC}"
echo ""
echo -e "${WHITE}Option 1: Use unpack script${NC}"
echo -e "${GRAY} ./unpack_and_offline_build_airgap.sh${NC}"
echo ""
echo -e "${WHITE}Option 2: Manual extraction${NC}"
echo -e "${GRAY} tar -xzf <archive>.tar.gz${NC}"
echo -e "${GRAY} cd <project>${NC}"
echo -e "${GRAY} ./run_offline_build.sh${NC}"
echo ""
echo -e "${WHITE}Option 3: Direct commands${NC}"
echo -e "${GRAY} export GRADLE_USER_HOME=\"./_offline_gradle_home\"${NC}"
echo -e "${GRAY} ./gradlew --offline bootJar --no-daemon${NC}"
echo ""


@@ -0,0 +1,359 @@
#!/bin/bash
# unpack_and_offline_build_airgap_macos.sh
# ============================================================================
# Execution Environment: OFFLINE (Air-gapped, No Internet)
# Purpose: Extract bundle and run offline build
# ============================================================================
# macOS Bash Script
# Version: 3.1
#
# IMPORTANT: This script automatically:
# 1. Extracts the archive
# 2. Sets GRADLE_USER_HOME to project local cache
# 3. Configures settings.gradle for offline resolution
# 4. Runs build with --offline flag
# ============================================================================
set -e
# ============================================================================
# Configuration
# ============================================================================
WRAPPER_SEED_PATH="wrapper_jar_seed"
OFFLINE_HOME_NAME="_offline_gradle_home"
# Color codes
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
CYAN='\033[0;36m'
GRAY='\033[0;90m'
WHITE='\033[1;37m'
NC='\033[0m' # No Color
echo ""
echo -e "${CYAN}============================================================${NC}"
echo -e "${CYAN} Gradle Offline Build Runner (macOS)${NC}"
echo -e "${CYAN} Environment: AIR-GAPPED (No Internet)${NC}"
echo -e "${CYAN} Mode: Fully Offline (--offline enforced)${NC}"
echo -e "${CYAN}============================================================${NC}"
echo ""
# ============================================================================
# [1/16] Check Current Directory
# ============================================================================
echo -e "${YELLOW}==[1/16] Check Current Directory ==${NC}"
START_DIR="$(pwd)"
echo "PWD: $START_DIR"
echo ""
# ============================================================================
# [2/16] Select Archive
# ============================================================================
echo -e "${YELLOW}==[2/16] Select Archive ==${NC}"
ARCHIVE=""
if [ $# -ge 1 ]; then
ARCHIVE="$1"
fi
if [ -z "$ARCHIVE" ]; then
# Auto-detect most recent .tar.gz file (macOS compatible)
ARCHIVE=$(find "$START_DIR" -maxdepth 1 -type f \( -name "*.tar.gz" -o -name "*.tgz" \) -exec stat -f "%m %N" {} \; 2>/dev/null | sort -rn | head -1 | cut -d' ' -f2-)
if [ -z "$ARCHIVE" ]; then
echo -e "${RED}[ERROR] No archive found${NC}"
ls -lh "$START_DIR"
exit 1
fi
echo -e "${CYAN}[AUTO] $(basename "$ARCHIVE")${NC}"
else
if [ ! -f "$ARCHIVE" ]; then
ARCHIVE="$START_DIR/$ARCHIVE"
fi
echo -e "${CYAN}[USER] $(basename "$ARCHIVE")${NC}"
fi
if [ ! -f "$ARCHIVE" ]; then
echo -e "${RED}ERROR: Archive not found: $ARCHIVE${NC}"
exit 1
fi
# macOS stat command
ARCHIVE_SIZE=$(stat -f%z "$ARCHIVE" 2>/dev/null)
ARCHIVE_SIZE_MB=$(awk "BEGIN {printf \"%.2f\", $ARCHIVE_SIZE / 1048576}")
echo "Size: ${ARCHIVE_SIZE_MB} MB"
echo ""
# ============================================================================
# [3/16] Check tar
# ============================================================================
echo -e "${YELLOW}==[3/16] Check tar ==${NC}"
if ! command -v tar &>/dev/null; then
echo -e "${RED}ERROR: tar not found${NC}"
exit 1
fi
echo -e "${GREEN}[OK] tar found${NC}"
echo ""
# ============================================================================
# [4/16] Extract Archive
# ============================================================================
echo -e "${YELLOW}==[4/16] Extract Archive ==${NC}"
echo -e "${GRAY}[INFO] Extracting...${NC}"
# The script runs under `set -e`, so the command must be tested in the
# condition itself; a bare `$?` check after the command would never run.
if ! tar -xzf "$ARCHIVE" -C "$START_DIR"; then
    echo -e "${RED}ERROR: Extraction failed${NC}"
    exit 1
fi
echo -e "${GREEN}[OK] Extracted${NC}"
echo ""
# ============================================================================
# [5/16] Set Permissions
# ============================================================================
echo -e "${YELLOW}==[5/16] Set Permissions ==${NC}"
chmod -R u+rw "$START_DIR" 2>/dev/null || true
# Remove extended attributes that macOS may add
xattr -cr "$START_DIR" 2>/dev/null || true
echo -e "${GREEN}[OK] Permissions set${NC}"
echo ""
# ============================================================================
# [6/16] Find Project Root
# ============================================================================
echo -e "${YELLOW}==[6/16] Find Project Root ==${NC}"
GRADLEW=$(find "$START_DIR" -name "gradlew" -type f 2>/dev/null | sort | head -1)
if [ -z "$GRADLEW" ]; then
echo -e "${RED}ERROR: gradlew not found${NC}"
exit 1
fi
PROJECT_DIR=$(dirname "$GRADLEW")
echo -e "${CYAN}Project: $PROJECT_DIR${NC}"
cd "$PROJECT_DIR"
echo ""
# ============================================================================
# [7/16] Fix Permissions
# ============================================================================
echo -e "${YELLOW}==[7/16] Fix Permissions ==${NC}"
chmod +x ./gradlew
find . -name "*.sh" -type f -exec chmod +x {} \; 2>/dev/null || true
# Remove quarantine attributes that macOS adds to downloaded files
xattr -d com.apple.quarantine ./gradlew 2>/dev/null || true
find . -name "*.jar" -exec xattr -d com.apple.quarantine {} \; 2>/dev/null || true
echo -e "${GREEN}[OK] Permissions fixed${NC}"
echo ""
# ============================================================================
# [8/16] Verify Wrapper
# ============================================================================
echo -e "${YELLOW}==[8/16] Verify Wrapper ==${NC}"
WRAPPER_DIR="$PROJECT_DIR/gradle/wrapper"
WRAPPER_JAR="$WRAPPER_DIR/gradle-wrapper.jar"
WRAPPER_PROP="$WRAPPER_DIR/gradle-wrapper.properties"
if [ ! -f "$WRAPPER_PROP" ]; then
echo -e "${RED}ERROR: gradle-wrapper.properties missing${NC}"
exit 1
fi
if [ ! -f "$WRAPPER_JAR" ]; then
SEED_JAR="$PROJECT_DIR/$WRAPPER_SEED_PATH/gradle-wrapper.jar"
if [ -f "$SEED_JAR" ]; then
mkdir -p "$WRAPPER_DIR"
cp "$SEED_JAR" "$WRAPPER_JAR"
echo -e "${GREEN}[OK] Injected from seed${NC}"
else
echo -e "${RED}ERROR: wrapper jar missing${NC}"
exit 1
fi
else
echo -e "${GREEN}[OK] Wrapper verified${NC}"
fi
echo ""
# ============================================================================
# [9/16] Set GRADLE_USER_HOME
# ============================================================================
echo -e "${YELLOW}==[9/16] Set GRADLE_USER_HOME ==${NC}"
OFFLINE_HOME="$PROJECT_DIR/$OFFLINE_HOME_NAME"
if [ ! -d "$OFFLINE_HOME" ]; then
echo -e "${RED}ERROR: _offline_gradle_home not found in archive${NC}"
exit 1
fi
export GRADLE_USER_HOME="$OFFLINE_HOME"
echo -e "${CYAN}GRADLE_USER_HOME = $GRADLE_USER_HOME${NC}"
# Check cache
CACHES_DIR="$OFFLINE_HOME/caches"
if [ -d "$CACHES_DIR" ]; then
# macOS du command
if du -k "$CACHES_DIR" &>/dev/null; then
CACHE_SIZE=$(du -sk "$CACHES_DIR" 2>/dev/null | cut -f1)
CACHE_SIZE=$((CACHE_SIZE * 1024))
else
CACHE_SIZE=0
fi
CACHE_SIZE_MB=$(awk "BEGIN {printf \"%.2f\", $CACHE_SIZE / 1048576}")
echo -e "${CYAN}[INFO] Cache size: ${CACHE_SIZE_MB} MB${NC}"
else
echo -e "${YELLOW}[WARN] No cache folder found${NC}"
fi
echo ""
# ============================================================================
# [10/16] Verify settings.gradle
# ============================================================================
echo -e "${YELLOW}==[10/16] Verify settings.gradle ==${NC}"
SETTINGS_FILE=""
if [ -f "./settings.gradle" ]; then
SETTINGS_FILE="settings.gradle"
elif [ -f "./settings.gradle.kts" ]; then
SETTINGS_FILE="settings.gradle.kts"
fi
if [ -n "$SETTINGS_FILE" ]; then
if grep -q "mavenLocal()" "$SETTINGS_FILE" && grep -q "pluginManagement" "$SETTINGS_FILE"; then
echo -e "${GREEN}[OK] settings.gradle configured for offline${NC}"
else
echo -e "${YELLOW}[WARN] settings.gradle may not be configured for offline${NC}"
echo -e "${GRAY}[INFO] Build may fail if plugins not cached${NC}"
fi
fi
echo ""
# ============================================================================
# [11/16] Test Gradle
# ============================================================================
echo -e "${YELLOW}==[11/16] Test Gradle ==${NC}"
GRADLE_WORKS=false
if ./gradlew --offline --version &>/dev/null; then
GRADLE_WORKS=true
echo -e "${GREEN}[OK] Gradle working in offline mode${NC}"
else
echo -e "${YELLOW}[WARN] Gradle --version failed${NC}"
fi
echo ""
# ============================================================================
# [12/16] Stop Daemon
# ============================================================================
echo -e "${YELLOW}==[12/16] Stop Daemon ==${NC}"
./gradlew --stop &>/dev/null || true
sleep 2
echo -e "${GREEN}[OK] Daemon stopped${NC}"
echo ""
# ============================================================================
# [13/16] Run Offline Build
# ============================================================================
echo -e "${YELLOW}==[13/16] Run Offline Build ==${NC}"
echo ""
echo -e "${CYAN}============================================================${NC}"
echo -e "${CYAN} Building with --offline flag${NC}"
echo -e "${CYAN} All dependencies from local cache${NC}"
echo -e "${CYAN}============================================================${NC}"
echo ""
BUILD_SUCCESS=false
BUILD_TASK=""
# Try bootJar
echo -e "${GRAY}[TRY] --offline bootJar...${NC}"
if ./gradlew --offline clean bootJar --no-daemon; then
BUILD_SUCCESS=true
BUILD_TASK="bootJar"
fi
# Try jar
if [ "$BUILD_SUCCESS" = false ]; then
echo -e "${GRAY}[TRY] --offline jar...${NC}"
if ./gradlew --offline clean jar --no-daemon; then
BUILD_SUCCESS=true
BUILD_TASK="jar"
fi
fi
# Try build
if [ "$BUILD_SUCCESS" = false ]; then
echo -e "${GRAY}[TRY] --offline build...${NC}"
if ./gradlew --offline build --no-daemon; then
BUILD_SUCCESS=true
BUILD_TASK="build"
fi
fi
echo ""
if [ "$BUILD_SUCCESS" = true ]; then
echo -e "${GREEN}============================================================${NC}"
echo -e "${GREEN} BUILD SUCCESS! (task: $BUILD_TASK)${NC}"
echo -e "${GREEN}============================================================${NC}"
else
echo -e "${RED}============================================================${NC}"
echo -e "${RED} BUILD FAILED!${NC}"
echo -e "${RED}============================================================${NC}"
echo ""
echo -e "${YELLOW}Possible causes:${NC}"
echo -e "${WHITE} - Dependencies not in cache${NC}"
echo -e "${WHITE} - Plugin resolution failed${NC}"
echo -e "${WHITE} - Need complete build in online env first${NC}"
exit 1
fi
echo ""
# ============================================================================
# [14/16] Show Build Output
# ============================================================================
echo -e "${YELLOW}==[14/16] Build Output ==${NC}"
LIBS_DIR="$PROJECT_DIR/build/libs"
if [ -d "$LIBS_DIR" ]; then
echo -e "${CYAN}build/libs contents:${NC}"
ls -lh "$LIBS_DIR"/*.jar 2>/dev/null | awk '{printf " %-40s %10s\n", $9, $5}'
MAIN_JAR=$(find "$LIBS_DIR" -name "*.jar" -type f ! -name "*-plain.jar" ! -name "*-sources.jar" ! -name "*-javadoc.jar" 2>/dev/null | head -1)
else
echo -e "${YELLOW}[WARN] build/libs not found${NC}"
fi
echo ""
# ============================================================================
# [15/16] Run Instructions
# ============================================================================
echo -e "${YELLOW}==[15/16] Run Instructions ==${NC}"
echo ""
if [ -n "$MAIN_JAR" ]; then
echo -e "${CYAN}To run the application:${NC}"
    echo -e "${WHITE}  java -jar build/libs/$(basename "$MAIN_JAR")${NC}"
echo ""
fi
echo -e "${CYAN}To rebuild:${NC}"
echo -e "${WHITE} export GRADLE_USER_HOME=\"./_offline_gradle_home\"${NC}"
echo -e "${WHITE} ./gradlew --offline bootJar --no-daemon${NC}"
echo ""
# ============================================================================
# [16/16] Complete
# ============================================================================
echo -e "${GREEN}============================================================${NC}"
echo -e "${GREEN} Offline Build Complete!${NC}"
echo -e "${GREEN}============================================================${NC}"
echo ""
echo -e "${CYAN}Project: $PROJECT_DIR${NC}"
echo ""


@@ -0,0 +1,14 @@
How to Use
[1] Move The Two scripts to a Last location
1. pack_offline_bundle_airgap.ps1
2. unpack_and_offline_build_airgap.ps1
[2] Packing Scripts Start --Options: Internet connect Require when you packing, after then you will get all gradle File
command on powershell: powershell -ExecutionPolicy Bypass -File .\pack_offline_bundle_airgap.ps1
[3] UnPacking Scripts Start --Options: The JPA Spring boot Project All gradle, gradle cache File, Build File etc (Check using .\gradlew.bat bootRun --offline)
command on powershell: powershell -ExecutionPolicy Bypass -File .\unpack_and_offline_build_airgap.ps1 .\kamco-dabeeo-backoffice_offline_bundle_20260121_145830.tar.gz (The tar File name have to be changed)
PS. You can check the ALL Gradle Backup File location Users/d-pn-0071/Desktop/lala/kamco-dabeeo-backoffice/gradle/win/kamco-dabeeo-backoffice_gradle.tar.gz
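The repository also ships macOS/bash variants of these scripts. On that side the manual offline workflow reduces to pointing GRADLE_USER_HOME at the bundled cache before invoking the wrapper; a minimal sketch, assuming the bundle has already been extracted into the current directory:

```shell
# Sketch of the manual offline build on macOS/Linux. Assumes the bundle was
# extracted here and shipped a _offline_gradle_home cache directory.
export GRADLE_USER_HOME="$(pwd)/_offline_gradle_home"
mkdir -p "$GRADLE_USER_HOME"   # harmless if the cache already exists
echo "GRADLE_USER_HOME=$GRADLE_USER_HOME"
# Inside the extracted project, the actual build would then be:
# ./gradlew --offline bootJar --no-daemon
```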


@@ -0,0 +1,672 @@
# pack_offline_bundle_airgap.ps1
# ============================================================================
# Gradle Offline Bundle Packer
# ============================================================================
# Version: 4.0
#
# WORKFLOW:
# 1. [ONLINE] Build project (./gradlew.bat bootJar) - downloads all deps
# 2. [ONLINE] Test run (./gradlew.bat bootRun) - verify app works
# 3. [OFFLINE TEST] Verify offline build works
# 4. Create bundle with all cached dependencies
#
# REQUIREMENTS:
# - Internet connection (for initial build)
# - Project with gradlew.bat
# ============================================================================
$ErrorActionPreference = "Stop"
$ProgressPreference = "SilentlyContinue"
# ============================================================================
# Configuration
# ============================================================================
$WRAPPER_SEED_PATH = "wrapper_jar_seed"
$OFFLINE_HOME_NAME = "_offline_gradle_home"
$BOOTRUN_TIMEOUT_SECONDS = 60
Write-Host ""
Write-Host "============================================================" -ForegroundColor Cyan
Write-Host " Gradle Offline Bundle Packer v4.0" -ForegroundColor Cyan
Write-Host "============================================================" -ForegroundColor Cyan
Write-Host ""
Write-Host " This script will:" -ForegroundColor White
Write-Host " 1. Build project with internet (download dependencies)" -ForegroundColor Gray
Write-Host " 2. Test run application (verify it works)" -ForegroundColor Gray
Write-Host " 3. Test offline build (verify cache is complete)" -ForegroundColor Gray
Write-Host " 4. Create offline bundle for air-gapped environment" -ForegroundColor Gray
Write-Host ""
Write-Host "============================================================" -ForegroundColor Cyan
Write-Host ""
# ============================================================================
# [1/20] Check Current Directory
# ============================================================================
Write-Host "==[1/20] Check Current Directory ==" -ForegroundColor Yellow
$Root = (Get-Location).Path
Write-Host ("ROOT_DIR: " + $Root)
Write-Host ""
# ============================================================================
# [2/20] Check Required Files
# ============================================================================
Write-Host "==[2/20] Check Required Files ==" -ForegroundColor Yellow
if (!(Test-Path -LiteralPath ".\gradlew.bat")) {
throw "ERROR: gradlew.bat not found. Run from project root."
}
Write-Host "[OK] gradlew.bat" -ForegroundColor Green
$buildFile = $null
if (Test-Path -LiteralPath ".\build.gradle") { $buildFile = "build.gradle" }
elseif (Test-Path -LiteralPath ".\build.gradle.kts") { $buildFile = "build.gradle.kts" }
else { throw "ERROR: build.gradle(.kts) not found." }
Write-Host ("[OK] " + $buildFile) -ForegroundColor Green
$settingsFile = $null
if (Test-Path -LiteralPath ".\settings.gradle") { $settingsFile = "settings.gradle" }
elseif (Test-Path -LiteralPath ".\settings.gradle.kts") { $settingsFile = "settings.gradle.kts" }
if ($settingsFile) { Write-Host ("[OK] " + $settingsFile) -ForegroundColor Green }
Write-Host ""
# ============================================================================
# [3/20] Check Gradle Wrapper
# ============================================================================
Write-Host "==[3/20] Check Gradle Wrapper ==" -ForegroundColor Yellow
$WrapperDir = Join-Path $Root "gradle\wrapper"
$WrapperJar = Join-Path $WrapperDir "gradle-wrapper.jar"
$WrapperProp = Join-Path $WrapperDir "gradle-wrapper.properties"
New-Item -ItemType Directory -Force -Path $WrapperDir | Out-Null
if (!(Test-Path -LiteralPath $WrapperProp)) {
throw "ERROR: gradle-wrapper.properties not found."
}
if (!(Test-Path -LiteralPath $WrapperJar)) {
$SeedJar = Join-Path $Root "$WRAPPER_SEED_PATH\gradle-wrapper.jar"
if (Test-Path -LiteralPath $SeedJar) {
Copy-Item -Force -LiteralPath $SeedJar -Destination $WrapperJar
Write-Host "[OK] Wrapper jar injected from seed" -ForegroundColor Green
} else {
throw "ERROR: gradle-wrapper.jar missing"
}
} else {
Write-Host "[OK] gradle-wrapper.jar exists" -ForegroundColor Green
}
# Create seed backup
$SeedDir = Join-Path $Root $WRAPPER_SEED_PATH
if (!(Test-Path -LiteralPath $SeedDir)) {
New-Item -ItemType Directory -Force -Path $SeedDir | Out-Null
Copy-Item -Force -LiteralPath $WrapperJar -Destination (Join-Path $SeedDir "gradle-wrapper.jar")
}
Write-Host ""
# ============================================================================
# [4/20] Set GRADLE_USER_HOME (Project Local)
# ============================================================================
Write-Host "==[4/20] Set GRADLE_USER_HOME ==" -ForegroundColor Yellow
$OfflineHome = Join-Path $Root $OFFLINE_HOME_NAME
New-Item -ItemType Directory -Force -Path $OfflineHome | Out-Null
$env:GRADLE_USER_HOME = $OfflineHome
Write-Host ("GRADLE_USER_HOME = " + $env:GRADLE_USER_HOME) -ForegroundColor Cyan
Write-Host "[INFO] All dependencies will be cached in the project folder" -ForegroundColor Gray
Write-Host ""
# ============================================================================
# [5/20] Check Internet Connection
# ============================================================================
Write-Host "==[5/20] Check Internet Connection ==" -ForegroundColor Yellow
$hasInternet = $false
$testHosts = @("plugins.gradle.org", "repo.maven.apache.org", "repo1.maven.org")
foreach ($testHost in $testHosts) {
try {
# NOTE: -TimeoutSeconds requires PowerShell 7+; omitted for Windows PowerShell 5.1 compatibility
$result = Test-Connection -ComputerName $testHost -Count 1 -Quiet -ErrorAction SilentlyContinue
if ($result) {
$hasInternet = $true
Write-Host ("[OK] Connected to " + $testHost) -ForegroundColor Green
break
}
} catch { }
}
if (-not $hasInternet) {
# Try DNS resolution as fallback
try {
[System.Net.Dns]::GetHostAddresses("google.com") | Out-Null
$hasInternet = $true
Write-Host "[OK] Internet available (DNS)" -ForegroundColor Green
} catch { }
}
if (-not $hasInternet) {
Write-Host ""
Write-Host "============================================================" -ForegroundColor Red
Write-Host " ERROR: No Internet Connection!" -ForegroundColor Red
Write-Host "============================================================" -ForegroundColor Red
Write-Host ""
Write-Host "This script requires internet for initial build." -ForegroundColor Yellow
Write-Host "Please connect to internet and run again." -ForegroundColor Yellow
Write-Host ""
throw "No internet connection"
}
Write-Host ""
# ============================================================================
# [6/20] Initial Gradle Setup
# ============================================================================
Write-Host "==[6/20] Initial Gradle Setup ==" -ForegroundColor Yellow
Write-Host "[INFO] Downloading Gradle distribution..." -ForegroundColor Gray
try {
$output = cmd /c ".\gradlew.bat --version 2>&1"
if ($LASTEXITCODE -eq 0) {
$gradleVersion = $output | Select-String "Gradle\s+(\d+\.\d+)" |
ForEach-Object { $_.Matches[0].Groups[1].Value }
Write-Host ("[OK] Gradle " + $gradleVersion) -ForegroundColor Green
} else {
throw "Gradle setup failed"
}
} catch {
Write-Host "[ERROR] Gradle setup failed" -ForegroundColor Red
throw $_
}
Write-Host ""
# ============================================================================
# [7/20] ONLINE BUILD - bootJar (Download All Dependencies)
# ============================================================================
Write-Host "==[7/20] ONLINE BUILD - bootJar ==" -ForegroundColor Yellow
Write-Host ""
Write-Host "============================================================" -ForegroundColor Cyan
Write-Host " Building project (downloading all dependencies)" -ForegroundColor Cyan
Write-Host " This may take several minutes on first run..." -ForegroundColor Cyan
Write-Host "============================================================" -ForegroundColor Cyan
Write-Host ""
$buildSuccess = $false
try {
cmd /c ".\gradlew.bat clean bootJar --no-daemon"
if ($LASTEXITCODE -eq 0) {
$buildSuccess = $true
}
} catch { }
if (-not $buildSuccess) {
Write-Host ""
Write-Host "============================================================" -ForegroundColor Red
Write-Host " BUILD FAILED!" -ForegroundColor Red
Write-Host "============================================================" -ForegroundColor Red
Write-Host ""
Write-Host "Please fix build errors and run this script again." -ForegroundColor Yellow
throw "Build failed"
}
Write-Host ""
Write-Host "[OK] bootJar build SUCCESS" -ForegroundColor Green
Write-Host ""
# ============================================================================
# [8/20] ONLINE TEST - bootRun (Verify Application Works)
# ============================================================================
Write-Host "==[8/20] ONLINE TEST - bootRun ==" -ForegroundColor Yellow
Write-Host ""
Write-Host "============================================================" -ForegroundColor Cyan
Write-Host " Starting application to verify it works..." -ForegroundColor Cyan
Write-Host " Will run for $BOOTRUN_TIMEOUT_SECONDS seconds then stop" -ForegroundColor Cyan
Write-Host "============================================================" -ForegroundColor Cyan
Write-Host ""
$bootRunSuccess = $false
try {
# Start bootRun as background job
$job = Start-Job -ScriptBlock {
param($projectDir, $gradleHome)
Set-Location $projectDir
$env:GRADLE_USER_HOME = $gradleHome
cmd /c ".\gradlew.bat bootRun --no-daemon 2>&1"
} -ArgumentList $Root, $OfflineHome
Write-Host "[INFO] Application starting..." -ForegroundColor Gray
# Wait for startup (check for typical Spring Boot messages)
$startTime = Get-Date
$startupDetected = $false
while (((Get-Date) - $startTime).TotalSeconds -lt $BOOTRUN_TIMEOUT_SECONDS) {
Start-Sleep -Seconds 3
# Check if job has output
$jobOutput = Receive-Job -Job $job -Keep -ErrorAction SilentlyContinue
if ($jobOutput) {
$outputText = $jobOutput -join "`n"
# Check for Spring Boot startup success indicators
if ($outputText -match "Started .+ in .+ seconds" -or
$outputText -match "Tomcat started on port" -or
$outputText -match "Netty started on port" -or
$outputText -match "Application .+ started") {
$startupDetected = $true
Write-Host "[OK] Application started successfully!" -ForegroundColor Green
break
}
# Check for errors
if ($outputText -match "APPLICATION FAILED TO START" -or
$outputText -match "Error starting ApplicationContext") {
Write-Host "[ERROR] Application failed to start" -ForegroundColor Red
break
}
}
$elapsed = [math]::Round(((Get-Date) - $startTime).TotalSeconds)
Write-Host ("[INFO] Waiting... " + $elapsed + "s") -ForegroundColor Gray
}
# Stop the job
Write-Host "[INFO] Stopping application..." -ForegroundColor Gray
Stop-Job -Job $job -ErrorAction SilentlyContinue
Remove-Job -Job $job -Force -ErrorAction SilentlyContinue
# Also stop any remaining Gradle processes
cmd /c ".\gradlew.bat --stop 2>&1" | Out-Null
if ($startupDetected) {
$bootRunSuccess = $true
}
} catch {
Write-Host "[WARN] bootRun test encountered error: $_" -ForegroundColor DarkYellow
}
# Cleanup any remaining processes
try {
# Get-Process does not expose CommandLine in Windows PowerShell 5.1, so query WMI instead
Get-CimInstance Win32_Process -Filter "Name='java.exe'" -ErrorAction SilentlyContinue |
Where-Object { $_.CommandLine -match "bootRun|spring" } |
ForEach-Object { Stop-Process -Id $_.ProcessId -Force -ErrorAction SilentlyContinue }
} catch { }
Write-Host ""
if ($bootRunSuccess) {
Write-Host "[OK] bootRun test PASSED" -ForegroundColor Green
} else {
Write-Host "[WARN] bootRun test could not verify startup" -ForegroundColor DarkYellow
Write-Host "[INFO] Continuing anyway - bootJar succeeded" -ForegroundColor Gray
}
Write-Host ""
# ============================================================================
# [9/20] Stop All Gradle Daemons
# ============================================================================
Write-Host "==[9/20] Stop Gradle Daemons ==" -ForegroundColor Yellow
cmd /c ".\gradlew.bat --stop 2>&1" | Out-Null
Start-Sleep -Seconds 3
Write-Host "[OK] Daemons stopped" -ForegroundColor Green
Write-Host ""
# ============================================================================
# [10/20] OFFLINE TEST - Verify Offline Build Works
# ============================================================================
Write-Host "==[10/20] OFFLINE TEST - Verify Cache ==" -ForegroundColor Yellow
Write-Host ""
Write-Host "============================================================" -ForegroundColor Cyan
Write-Host " Testing offline build (--offline flag)" -ForegroundColor Cyan
Write-Host " This verifies all dependencies are cached" -ForegroundColor Cyan
Write-Host "============================================================" -ForegroundColor Cyan
Write-Host ""
$offlineSuccess = $false
try {
cmd /c ".\gradlew.bat --offline clean bootJar --no-daemon"
if ($LASTEXITCODE -eq 0) {
$offlineSuccess = $true
}
} catch { }
Write-Host ""
if ($offlineSuccess) {
Write-Host "[OK] Offline build SUCCESS - Cache is complete!" -ForegroundColor Green
} else {
Write-Host "============================================================" -ForegroundColor Red
Write-Host " OFFLINE BUILD FAILED!" -ForegroundColor Red
Write-Host "============================================================" -ForegroundColor Red
Write-Host ""
Write-Host "Some dependencies may not be cached properly." -ForegroundColor Yellow
Write-Host ""
$continue = Read-Host "Continue creating bundle anyway? (y/N)"
if ($continue -ne "y" -and $continue -ne "Y") {
throw "Offline verification failed"
}
}
Write-Host ""
# ============================================================================
# [11/20] Backup Original settings.gradle
# ============================================================================
Write-Host "==[11/20] Backup settings.gradle ==" -ForegroundColor Yellow
if ($settingsFile) {
$settingsPath = Join-Path $Root $settingsFile
$settingsBackup = Join-Path $Root "${settingsFile}.original.bak"
if (!(Test-Path -LiteralPath $settingsBackup)) {
Copy-Item -Force -LiteralPath $settingsPath -Destination $settingsBackup
Write-Host "[OK] Backup: ${settingsFile}.original.bak" -ForegroundColor Green
} else {
Write-Host "[OK] Backup exists" -ForegroundColor Green
}
}
Write-Host ""
# ============================================================================
# [12/20] Modify settings.gradle for Offline
# ============================================================================
Write-Host "==[12/20] Configure settings.gradle for Offline ==" -ForegroundColor Yellow
if ($settingsFile) {
$settingsPath = Join-Path $Root $settingsFile
$content = Get-Content -LiteralPath $settingsPath -Raw
# --- Always strip BOM if present (prevents Groovy 'Unexpected character: ')
$hadBom = $false
if ($content.Length -gt 0 -and $content[0] -eq [char]0xFEFF) {
$hadBom = $true
$content = $content -replace "^\uFEFF", ""
}
$isOfflineConfigured =
($content -match "mavenLocal\(\)") -and
($content -match "pluginManagement[\s\S]*repositories")
if ($isOfflineConfigured) {
# Even if already configured, re-save without BOM if needed
if ($hadBom) {
[System.IO.File]::WriteAllText(
$settingsPath,
$content,
(New-Object System.Text.UTF8Encoding($false))
)
Write-Host "[FIX] settings.gradle BOM removed (saved as UTF-8 without BOM)" -ForegroundColor Green
} else {
Write-Host "[OK] Already configured for offline" -ForegroundColor Green
}
} else {
$newHeader = @"
// ============================================================================
// OFFLINE BUILD CONFIGURATION (Auto-generated by pack script)
// Original backup: ${settingsFile}.original.bak
// ============================================================================
pluginManagement {
repositories {
mavenLocal()
gradlePluginPortal()
}
}
"@
# Remove existing pluginManagement
$cleaned = $content -replace '(?s)pluginManagement\s*\{[^{}]*(\{[^{}]*\}[^{}]*)*\}\s*', ''
$final = $newHeader + $cleaned.Trim()
# --- Write as UTF-8 WITHOUT BOM
[System.IO.File]::WriteAllText(
$settingsPath,
$final,
(New-Object System.Text.UTF8Encoding($false))
)
Write-Host "[OK] settings.gradle updated for offline (UTF-8 without BOM)" -ForegroundColor Green
}
}
Write-Host ""
# ============================================================================
# [13/20] Copy Maven Local Repository
# ============================================================================
Write-Host "==[13/20] Copy Maven Local (.m2) ==" -ForegroundColor Yellow
$m2Repo = Join-Path $env:USERPROFILE ".m2\repository"
$localM2 = Join-Path $OfflineHome "m2_repository"
if (Test-Path -LiteralPath $m2Repo) {
$m2Size = (Get-ChildItem -Path $m2Repo -Recurse -File -ErrorAction SilentlyContinue |
Measure-Object -Property Length -Sum).Sum
if ($m2Size -gt 1MB) {
$m2SizeMB = [math]::Round($m2Size / 1MB, 2)
Write-Host ("[INFO] .m2 size: " + $m2SizeMB + " MB") -ForegroundColor Cyan
# Copy important plugin directories
$pluginDirs = @(
"org\springframework\boot",
"io\spring",
"com\diffplug"
)
foreach ($dir in $pluginDirs) {
$src = Join-Path $m2Repo $dir
if (Test-Path -LiteralPath $src) {
$dst = Join-Path $localM2 $dir
New-Item -ItemType Directory -Force -Path (Split-Path $dst -Parent) | Out-Null
if (!(Test-Path -LiteralPath $dst)) {
Copy-Item -Recurse -Force -LiteralPath $src -Destination $dst -ErrorAction SilentlyContinue
Write-Host ("[OK] Copied " + $dir) -ForegroundColor Green
}
}
}
}
} else {
Write-Host "[INFO] No .m2 repository found" -ForegroundColor Gray
}
Write-Host ""
# ============================================================================
# [14/20] Create Helper Scripts
# ============================================================================
Write-Host "==[14/20] Create Helper Scripts ==" -ForegroundColor Yellow
# run_offline_build.ps1
$runScript = @'
# run_offline_build.ps1 - Quick offline build script
$env:GRADLE_USER_HOME = Join-Path (Get-Location).Path "_offline_gradle_home"
Write-Host "GRADLE_USER_HOME = $env:GRADLE_USER_HOME" -ForegroundColor Cyan
Write-Host ""
.\gradlew.bat --offline bootJar --no-daemon
if ($LASTEXITCODE -eq 0) {
Write-Host ""
Write-Host "BUILD SUCCESS!" -ForegroundColor Green
Write-Host ""
Write-Host "JAR files:" -ForegroundColor Cyan
Get-ChildItem .\build\libs\*.jar | ForEach-Object { Write-Host (" " + $_.Name) }
} else {
Write-Host "BUILD FAILED" -ForegroundColor Red
}
'@
[System.IO.File]::WriteAllText((Join-Path $Root "run_offline_build.ps1"), $runScript, (New-Object System.Text.UTF8Encoding($false)))
Write-Host "[OK] run_offline_build.ps1" -ForegroundColor Green
# run_offline_bootrun.ps1
$bootRunScript = @'
# run_offline_bootrun.ps1 - Run application offline
$env:GRADLE_USER_HOME = Join-Path (Get-Location).Path "_offline_gradle_home"
Write-Host "GRADLE_USER_HOME = $env:GRADLE_USER_HOME" -ForegroundColor Cyan
Write-Host ""
Write-Host "Starting application (Ctrl+C to stop)..." -ForegroundColor Yellow
Write-Host ""
.\gradlew.bat --offline bootRun --no-daemon
'@
[System.IO.File]::WriteAllText((Join-Path $Root "run_offline_bootrun.ps1"), $bootRunScript, (New-Object System.Text.UTF8Encoding($false)))
Write-Host "[OK] run_offline_bootrun.ps1" -ForegroundColor Green
Write-Host ""
# ============================================================================
# [15/20] Stop Daemons Again
# ============================================================================
Write-Host "==[15/20] Final Daemon Cleanup ==" -ForegroundColor Yellow
cmd /c ".\gradlew.bat --stop 2>&1" | Out-Null
Start-Sleep -Seconds 2
Write-Host "[OK] Daemons stopped" -ForegroundColor Green
Write-Host ""
# ============================================================================
# [16/20] Clean Lock Files
# ============================================================================
Write-Host "==[16/20] Clean Lock Files ==" -ForegroundColor Yellow
try {
$DaemonDir = Join-Path $OfflineHome "daemon"
if (Test-Path -LiteralPath $DaemonDir) {
Remove-Item -Recurse -Force -LiteralPath $DaemonDir -ErrorAction SilentlyContinue
}
Get-ChildItem -Path $OfflineHome -Recurse -Include "*.lock","*.log","*.tmp" -File -ErrorAction SilentlyContinue |
ForEach-Object { Remove-Item -Force -LiteralPath $_.FullName -ErrorAction SilentlyContinue }
Write-Host "[OK] Lock files cleaned" -ForegroundColor Green
} catch {
Write-Host "[WARN] Some files could not be cleaned" -ForegroundColor DarkYellow
}
Write-Host ""
# ============================================================================
# [17/20] Calculate Cache Size
# ============================================================================
Write-Host "==[17/20] Cache Summary ==" -ForegroundColor Yellow
$CachesDir = Join-Path $OfflineHome "caches"
$WrapperDists = Join-Path $OfflineHome "wrapper\dists"
$totalSize = 0
if (Test-Path -LiteralPath $CachesDir) {
$size = (Get-ChildItem -Path $CachesDir -Recurse -File -ErrorAction SilentlyContinue |
Measure-Object -Property Length -Sum).Sum
$totalSize += $size
Write-Host ("[INFO] Dependencies: " + [math]::Round($size/1MB, 2) + " MB") -ForegroundColor Cyan
}
if (Test-Path -LiteralPath $WrapperDists) {
$size = (Get-ChildItem -Path $WrapperDists -Recurse -File -ErrorAction SilentlyContinue |
Measure-Object -Property Length -Sum).Sum
$totalSize += $size
Write-Host ("[INFO] Gradle dist: " + [math]::Round($size/1MB, 2) + " MB") -ForegroundColor Cyan
}
Write-Host ("[INFO] Total cache: " + [math]::Round($totalSize/1MB, 2) + " MB") -ForegroundColor Cyan
Write-Host ""
# ============================================================================
# [18/20] Create Archive
# ============================================================================
Write-Host "==[18/20] Create Archive ==" -ForegroundColor Yellow
$BaseName = Split-Path $Root -Leaf
$Ts = Get-Date -Format "yyyyMMdd_HHmmss"
$Parent = Split-Path $Root -Parent
$ArchivePath = Join-Path $Parent "${BaseName}_offline_bundle_${Ts}.tar.gz"
Write-Host ("Archive: " + $ArchivePath)
$tar = Get-Command tar.exe -ErrorAction SilentlyContinue
if (-not $tar) { throw "ERROR: tar.exe not found" }
Write-Host "[INFO] Creating archive (this may take several minutes)..." -ForegroundColor Gray
& tar.exe -czf $ArchivePath `
--exclude ".git" `
--exclude ".idea" `
--exclude ".DS_Store" `
--exclude "*.log" `
--exclude "*.lock" `
--exclude "_offline_gradle_home/daemon" `
--exclude "_offline_gradle_home/native" `
--exclude "_offline_gradle_home/jdks" `
--exclude "build" `
--exclude "out" `
--exclude ".gradle" `
-C $Root .
if ($LASTEXITCODE -ne 0) { throw "ERROR: tar failed" }
$archiveSize = (Get-Item -LiteralPath $ArchivePath).Length
$archiveSizeMB = [math]::Round($archiveSize / 1MB, 2)
Write-Host ("[OK] Archive created: " + $archiveSizeMB + " MB") -ForegroundColor Green
Write-Host ""
# ============================================================================
# [19/20] Verify Archive
# ============================================================================
Write-Host "==[19/20] Verify Archive ==" -ForegroundColor Yellow
$listOutput = & tar.exe -tzf $ArchivePath 2>&1
$checks = @(
"gradle/wrapper/gradle-wrapper.jar",
"gradlew.bat",
"_offline_gradle_home/caches",
"run_offline_build.ps1"
)
foreach ($check in $checks) {
$found = $listOutput | Select-String -SimpleMatch $check
if ($found) {
Write-Host (" [OK] " + $check) -ForegroundColor Green
} else {
Write-Host (" [WARN] " + $check) -ForegroundColor DarkYellow
}
}
Write-Host ""
# ============================================================================
# [20/20] Complete
# ============================================================================
Write-Host "============================================================" -ForegroundColor Green
Write-Host " BUNDLE CREATION COMPLETE!" -ForegroundColor Green
Write-Host "============================================================" -ForegroundColor Green
Write-Host ""
Write-Host ("Archive: " + $ArchivePath) -ForegroundColor Cyan
Write-Host ("Size: " + $archiveSizeMB + " MB") -ForegroundColor Cyan
Write-Host ""
Write-Host "============================================================" -ForegroundColor Cyan
Write-Host " Test Results" -ForegroundColor Cyan
Write-Host "============================================================" -ForegroundColor Cyan
Write-Host (" Online build (bootJar): " + $(if($buildSuccess){"PASSED"}else{"FAILED"})) -ForegroundColor $(if($buildSuccess){"Green"}else{"Red"})
Write-Host (" Online test (bootRun): " + $(if($bootRunSuccess){"PASSED"}else{"SKIPPED"})) -ForegroundColor $(if($bootRunSuccess){"Green"}else{"Yellow"})
Write-Host (" Offline build test: " + $(if($offlineSuccess){"PASSED"}else{"FAILED"})) -ForegroundColor $(if($offlineSuccess){"Green"}else{"Red"})
Write-Host ""
Write-Host "============================================================" -ForegroundColor Yellow
Write-Host " Usage in Air-gapped Environment" -ForegroundColor Yellow
Write-Host "============================================================" -ForegroundColor Yellow
Write-Host ""
Write-Host "Option 1: Use unpack script" -ForegroundColor White
Write-Host " .\unpack_and_offline_build_airgap.ps1" -ForegroundColor Gray
Write-Host ""
Write-Host "Option 2: Manual extraction" -ForegroundColor White
Write-Host " tar -xzf <archive>.tar.gz" -ForegroundColor Gray
Write-Host " cd <project>" -ForegroundColor Gray
Write-Host " .\run_offline_build.ps1" -ForegroundColor Gray
Write-Host ""
Write-Host "Option 3: Direct commands" -ForegroundColor White
Write-Host ' $env:GRADLE_USER_HOME = ".\_offline_gradle_home"' -ForegroundColor Gray
Write-Host " .\gradlew.bat --offline bootJar --no-daemon" -ForegroundColor Gray
Write-Host ""

View File

# unpack_and_offline_build_airgap.ps1
# ============================================================================
# Execution Environment: OFFLINE (Air-gapped, No Internet)
# Purpose: Extract bundle and run offline build
# ============================================================================
# Windows PowerShell Only
# Version: 3.1
#
# IMPORTANT: This script automatically:
# 1. Extracts the archive
# 2. Sets GRADLE_USER_HOME to project local cache
# 3. Configures settings.gradle for offline resolution
# 4. Runs build with --offline flag
# ============================================================================
$ErrorActionPreference = "Stop"
$ProgressPreference = "SilentlyContinue"
# ============================================================================
# Configuration
# ============================================================================
$WRAPPER_SEED_PATH = "wrapper_jar_seed"
$OFFLINE_HOME_NAME = "_offline_gradle_home"
Write-Host ""
Write-Host "============================================================" -ForegroundColor Cyan
Write-Host " Gradle Offline Build Runner" -ForegroundColor Cyan
Write-Host " Environment: AIR-GAPPED (No Internet)" -ForegroundColor Cyan
Write-Host " Mode: Fully Offline (--offline enforced)" -ForegroundColor Cyan
Write-Host "============================================================" -ForegroundColor Cyan
Write-Host ""
# ============================================================================
# [1/16] Check Current Directory
# ============================================================================
Write-Host "==[1/16] Check Current Directory ==" -ForegroundColor Yellow
$Start = (Get-Location).Path
Write-Host ("PWD: " + $Start)
Write-Host ""
# ============================================================================
# [2/16] Select Archive
# ============================================================================
Write-Host "==[2/16] Select Archive ==" -ForegroundColor Yellow
$Archive = $null
if ($args.Count -ge 1) {
$Archive = $args[0].Trim().Trim('"').Trim("'")
}
if ([string]::IsNullOrWhiteSpace($Archive)) {
$candidates = Get-ChildItem -Path $Start -File -ErrorAction SilentlyContinue |
Where-Object { $_.Name -match "\.(tar\.gz|tgz)$" } |
Sort-Object LastWriteTime -Descending
if (-not $candidates -or $candidates.Count -eq 0) {
Write-Host "[ERROR] No archive found" -ForegroundColor Red
Get-ChildItem -Path $Start -File | Format-Table Name, Length -AutoSize
throw "ERROR: No .tar.gz file found"
}
$Archive = $candidates[0].FullName
Write-Host ("[AUTO] " + (Split-Path $Archive -Leaf)) -ForegroundColor Cyan
} else {
if (-not [System.IO.Path]::IsPathRooted($Archive)) {
$Archive = Join-Path $Start $Archive
}
Write-Host ("[USER] " + (Split-Path $Archive -Leaf)) -ForegroundColor Cyan
}
if (-not (Test-Path -LiteralPath $Archive)) {
throw "ERROR: Archive not found: $Archive"
}
$archiveSizeMB = [math]::Round((Get-Item -LiteralPath $Archive).Length / 1MB, 2)
Write-Host ("Size: " + $archiveSizeMB + " MB")
Write-Host ""
# ============================================================================
# [3/16] Check tar.exe
# ============================================================================
Write-Host "==[3/16] Check tar.exe ==" -ForegroundColor Yellow
$tar = Get-Command tar.exe -ErrorAction SilentlyContinue
if (-not $tar) { throw "ERROR: tar.exe not found" }
Write-Host "[OK] tar.exe found" -ForegroundColor Green
Write-Host ""
# ============================================================================
# [4/16] Extract Archive
# ============================================================================
Write-Host "==[4/16] Extract Archive ==" -ForegroundColor Yellow
Write-Host "[INFO] Extracting..." -ForegroundColor Gray
& tar.exe -xzf $Archive -C $Start
if ($LASTEXITCODE -ne 0) { throw "ERROR: Extraction failed" }
Write-Host "[OK] Extracted" -ForegroundColor Green
Write-Host ""
# ============================================================================
# [5/16] Unblock Files
# ============================================================================
Write-Host "==[5/16] Unblock Files ==" -ForegroundColor Yellow
try {
Get-ChildItem -Path $Start -Recurse -Force -ErrorAction SilentlyContinue |
Unblock-File -ErrorAction SilentlyContinue
Write-Host "[OK] Files unblocked" -ForegroundColor Green
} catch {
Write-Host "[WARN] Unblock failed" -ForegroundColor DarkYellow
}
Write-Host ""
# ============================================================================
# [6/16] Find Project Root
# ============================================================================
Write-Host "==[6/16] Find Project Root ==" -ForegroundColor Yellow
$gradlewList = Get-ChildItem -Path $Start -Recurse -Filter "gradlew.bat" -File -ErrorAction SilentlyContinue
if (-not $gradlewList) { throw "ERROR: gradlew.bat not found" }
$gradlew = $gradlewList | Sort-Object { $_.FullName.Length } | Select-Object -First 1
$ProjectDir = $gradlew.Directory.FullName
Write-Host ("Project: " + $ProjectDir) -ForegroundColor Cyan
Set-Location -LiteralPath $ProjectDir
Write-Host ""
# ============================================================================
# [7/16] Fix Permissions
# ============================================================================
Write-Host "==[7/16] Fix Permissions ==" -ForegroundColor Yellow
try {
$currentUser = [System.Security.Principal.WindowsIdentity]::GetCurrent().Name
cmd /c "icacls `"$ProjectDir`" /grant `"$currentUser`:(OI)(CI)F`" /t /q" 2>&1 | Out-Null
Write-Host "[OK] Permissions fixed" -ForegroundColor Green
} catch {
Write-Host "[WARN] icacls failed" -ForegroundColor DarkYellow
}
Write-Host ""
# ============================================================================
# [8/16] Verify Wrapper
# ============================================================================
Write-Host "==[8/16] Verify Wrapper ==" -ForegroundColor Yellow
$WrapperDir = Join-Path $ProjectDir "gradle\wrapper"
$WrapperJar = Join-Path $WrapperDir "gradle-wrapper.jar"
$WrapperProp = Join-Path $WrapperDir "gradle-wrapper.properties"
if (!(Test-Path -LiteralPath $WrapperProp)) {
throw "ERROR: gradle-wrapper.properties missing"
}
if (!(Test-Path -LiteralPath $WrapperJar)) {
$SeedJar = Join-Path $ProjectDir "$WRAPPER_SEED_PATH\gradle-wrapper.jar"
if (Test-Path -LiteralPath $SeedJar) {
New-Item -ItemType Directory -Force -Path $WrapperDir | Out-Null
Copy-Item -Force -LiteralPath $SeedJar -Destination $WrapperJar
Write-Host "[OK] Injected from seed" -ForegroundColor Green
} else {
throw "ERROR: wrapper jar missing"
}
} else {
Write-Host "[OK] Wrapper verified" -ForegroundColor Green
}
Write-Host ""
# ============================================================================
# [9/16] Set GRADLE_USER_HOME
# ============================================================================
Write-Host "==[9/16] Set GRADLE_USER_HOME ==" -ForegroundColor Yellow
$OfflineHome = Join-Path $ProjectDir $OFFLINE_HOME_NAME
if (!(Test-Path -LiteralPath $OfflineHome)) {
throw "ERROR: _offline_gradle_home not found in archive"
}
$env:GRADLE_USER_HOME = $OfflineHome
Write-Host ("GRADLE_USER_HOME = " + $env:GRADLE_USER_HOME) -ForegroundColor Cyan
# Check cache
$CachesDir = Join-Path $OfflineHome "caches"
if (Test-Path -LiteralPath $CachesDir) {
$cacheSize = (Get-ChildItem -Path $CachesDir -Recurse -File -ErrorAction SilentlyContinue |
Measure-Object -Property Length -Sum).Sum
$cacheSizeMB = [math]::Round($cacheSize / 1MB, 2)
Write-Host ("[INFO] Cache size: " + $cacheSizeMB + " MB") -ForegroundColor Cyan
} else {
Write-Host "[WARN] No cache folder found" -ForegroundColor DarkYellow
}
Write-Host ""
# ============================================================================
# [10/16] Verify settings.gradle
# ============================================================================
Write-Host "==[10/16] Verify settings.gradle ==" -ForegroundColor Yellow
$settingsFile = $null
if (Test-Path -LiteralPath ".\settings.gradle") { $settingsFile = "settings.gradle" }
elseif (Test-Path -LiteralPath ".\settings.gradle.kts") { $settingsFile = "settings.gradle.kts" }
if ($settingsFile) {
$content = Get-Content -LiteralPath ".\$settingsFile" -Raw
if ($content -match "mavenLocal\(\)" -and $content -match "pluginManagement") {
Write-Host ("[OK] " + $settingsFile + " configured for offline") -ForegroundColor Green
} else {
Write-Host ("[WARN] " + $settingsFile + " may not be configured for offline") -ForegroundColor DarkYellow
Write-Host "[INFO] Build may fail if required plugins are not cached" -ForegroundColor Gray
}
}
Write-Host ""
# ============================================================================
# [11/16] Test Gradle
# ============================================================================
Write-Host "==[11/16] Test Gradle ==" -ForegroundColor Yellow
$gradleWorks = $false
try {
$output = cmd /c ".\gradlew.bat --offline --version 2>&1"
if ($LASTEXITCODE -eq 0) {
$gradleWorks = $true
Write-Host "[OK] Gradle working in offline mode" -ForegroundColor Green
}
} catch { }
if (-not $gradleWorks) {
Write-Host "[WARN] Gradle --version failed" -ForegroundColor DarkYellow
}
Write-Host ""
# ============================================================================
# [12/16] Stop Daemon
# ============================================================================
Write-Host "==[12/16] Stop Daemon ==" -ForegroundColor Yellow
try { cmd /c ".\gradlew.bat --stop 2>&1" | Out-Null } catch { }
Start-Sleep -Seconds 2
Write-Host "[OK] Daemon stopped" -ForegroundColor Green
Write-Host ""
# ============================================================================
# [13/16] Run Offline Build
# ============================================================================
Write-Host "==[13/16] Run Offline Build ==" -ForegroundColor Yellow
Write-Host ""
Write-Host "============================================================" -ForegroundColor Cyan
Write-Host " Building with --offline flag" -ForegroundColor Cyan
Write-Host " All dependencies from local cache" -ForegroundColor Cyan
Write-Host "============================================================" -ForegroundColor Cyan
Write-Host ""
$buildSuccess = $false
$buildTask = $null
# Try bootJar
Write-Host "[TRY] --offline bootJar..." -ForegroundColor Gray
try {
cmd /c ".\gradlew.bat --offline clean bootJar --no-daemon"
if ($LASTEXITCODE -eq 0) {
$buildSuccess = $true
$buildTask = "bootJar"
}
} catch { }
# Try jar
if (-not $buildSuccess) {
Write-Host "[TRY] --offline jar..." -ForegroundColor Gray
try {
cmd /c ".\gradlew.bat --offline clean jar --no-daemon"
if ($LASTEXITCODE -eq 0) {
$buildSuccess = $true
$buildTask = "jar"
}
} catch { }
}
# Try build
if (-not $buildSuccess) {
Write-Host "[TRY] --offline build..." -ForegroundColor Gray
try {
cmd /c ".\gradlew.bat --offline build --no-daemon"
if ($LASTEXITCODE -eq 0) {
$buildSuccess = $true
$buildTask = "build"
}
} catch { }
}
Write-Host ""
if ($buildSuccess) {
Write-Host "============================================================" -ForegroundColor Green
Write-Host (" BUILD SUCCESS! (task: " + $buildTask + ")") -ForegroundColor Green
Write-Host "============================================================" -ForegroundColor Green
} else {
Write-Host "============================================================" -ForegroundColor Red
Write-Host " BUILD FAILED!" -ForegroundColor Red
Write-Host "============================================================" -ForegroundColor Red
Write-Host ""
Write-Host "Possible causes:" -ForegroundColor Yellow
Write-Host " - Dependencies not in cache" -ForegroundColor White
Write-Host " - Plugin resolution failed" -ForegroundColor White
Write-Host " - Need complete build in online env first" -ForegroundColor White
throw "Build failed"
}
Write-Host ""
# ============================================================================
# [14/16] Show Build Output
# ============================================================================
Write-Host "==[14/16] Build Output ==" -ForegroundColor Yellow
$libsDir = Join-Path $ProjectDir "build\libs"
if (Test-Path -LiteralPath $libsDir) {
Write-Host "build/libs contents:" -ForegroundColor Cyan
Get-ChildItem -LiteralPath $libsDir |
Format-Table Name, @{L="Size(KB)";E={[math]::Round($_.Length/1KB,1)}} -AutoSize |
Out-Host
$mainJar = Get-ChildItem -LiteralPath $libsDir -Filter "*.jar" |
Where-Object { $_.Name -notmatch "plain|sources|javadoc" } |
Select-Object -First 1
} else {
Write-Host "[WARN] build/libs not found" -ForegroundColor DarkYellow
}
Write-Host ""
# ============================================================================
# [15/16] Run Instructions
# ============================================================================
Write-Host "==[15/16] Run Instructions ==" -ForegroundColor Yellow
Write-Host ""
if ($mainJar) {
Write-Host "To run the application:" -ForegroundColor Cyan
Write-Host (" java -jar build\libs\" + $mainJar.Name) -ForegroundColor White
Write-Host ""
}
Write-Host "To rebuild:" -ForegroundColor Cyan
Write-Host ' $env:GRADLE_USER_HOME = ".\_offline_gradle_home"' -ForegroundColor White
Write-Host " .\gradlew.bat --offline bootJar --no-daemon" -ForegroundColor White
Write-Host ""
# ============================================================================
# [16/16] Complete
# ============================================================================
Write-Host "============================================================" -ForegroundColor Green
Write-Host " Offline Build Complete!" -ForegroundColor Green
Write-Host "============================================================" -ForegroundColor Green
Write-Host ""
Write-Host ("Project: " + $ProjectDir) -ForegroundColor Cyan
Write-Host ""


@@ -0,0 +1,7 @@
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-8.14.3-bin.zip
networkTimeout=10000
validateDistributionUrl=true
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists

kamco-make-dataset-generation/gradlew vendored Executable file

@@ -0,0 +1,251 @@
#!/bin/sh
#
# Copyright © 2015-2021 the original authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# SPDX-License-Identifier: Apache-2.0
#
##############################################################################
#
# Gradle start up script for POSIX generated by Gradle.
#
# Important for running:
#
# (1) You need a POSIX-compliant shell to run this script. If your /bin/sh is
# noncompliant, but you have some other compliant shell such as ksh or
# bash, then to run this script, type that shell name before the whole
# command line, like:
#
# ksh Gradle
#
# Busybox and similar reduced shells will NOT work, because this script
# requires all of these POSIX shell features:
# * functions;
# * expansions «$var», «${var}», «${var:-default}», «${var+SET}»,
# «${var#prefix}», «${var%suffix}», and «$( cmd )»;
# * compound commands having a testable exit status, especially «case»;
# * various built-in commands including «command», «set», and «ulimit».
#
# Important for patching:
#
# (2) This script targets any POSIX shell, so it avoids extensions provided
# by Bash, Ksh, etc; in particular arrays are avoided.
#
# The "traditional" practice of packing multiple parameters into a
# space-separated string is a well documented source of bugs and security
# problems, so this is (mostly) avoided, by progressively accumulating
# options in "$@", and eventually passing that to Java.
#
# Where the inherited environment variables (DEFAULT_JVM_OPTS, JAVA_OPTS,
# and GRADLE_OPTS) rely on word-splitting, this is performed explicitly;
# see the in-line comments for details.
#
# There are tweaks for specific operating systems such as AIX, CygWin,
# Darwin, MinGW, and NonStop.
#
# (3) This script is generated from the Groovy template
# https://github.com/gradle/gradle/blob/HEAD/platforms/jvm/plugins-application/src/main/resources/org/gradle/api/internal/plugins/unixStartScript.txt
# within the Gradle project.
#
# You can find Gradle at https://github.com/gradle/gradle/.
#
##############################################################################
# Attempt to set APP_HOME
# Resolve links: $0 may be a link
app_path=$0
# Need this for daisy-chained symlinks.
while
APP_HOME=${app_path%"${app_path##*/}"} # leaves a trailing /; empty if no leading path
[ -h "$app_path" ]
do
ls=$( ls -ld "$app_path" )
link=${ls#*' -> '}
case $link in #(
/*) app_path=$link ;; #(
*) app_path=$APP_HOME$link ;;
esac
done
# This is normally unused
# shellcheck disable=SC2034
APP_BASE_NAME=${0##*/}
# Discard cd standard output in case $CDPATH is set (https://github.com/gradle/gradle/issues/25036)
APP_HOME=$( cd -P "${APP_HOME:-./}" > /dev/null && printf '%s\n' "$PWD" ) || exit
# Use the maximum available, or set MAX_FD != -1 to use that value.
MAX_FD=maximum
warn () {
echo "$*"
} >&2
die () {
echo
echo "$*"
echo
exit 1
} >&2
# OS specific support (must be 'true' or 'false').
cygwin=false
msys=false
darwin=false
nonstop=false
case "$( uname )" in #(
CYGWIN* ) cygwin=true ;; #(
Darwin* ) darwin=true ;; #(
MSYS* | MINGW* ) msys=true ;; #(
NONSTOP* ) nonstop=true ;;
esac
CLASSPATH="\\\"\\\""
# Determine the Java command to use to start the JVM.
if [ -n "$JAVA_HOME" ] ; then
if [ -x "$JAVA_HOME/jre/sh/java" ] ; then
# IBM's JDK on AIX uses strange locations for the executables
JAVACMD=$JAVA_HOME/jre/sh/java
else
JAVACMD=$JAVA_HOME/bin/java
fi
if [ ! -x "$JAVACMD" ] ; then
die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
else
JAVACMD=java
if ! command -v java >/dev/null 2>&1
then
die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH.
Please set the JAVA_HOME variable in your environment to match the
location of your Java installation."
fi
fi
# Increase the maximum file descriptors if we can.
if ! "$cygwin" && ! "$darwin" && ! "$nonstop" ; then
case $MAX_FD in #(
max*)
# In POSIX sh, ulimit -H is undefined. That's why the result is checked to see if it worked.
# shellcheck disable=SC2039,SC3045
MAX_FD=$( ulimit -H -n ) ||
warn "Could not query maximum file descriptor limit"
esac
case $MAX_FD in #(
'' | soft) :;; #(
*)
# In POSIX sh, ulimit -n is undefined. That's why the result is checked to see if it worked.
# shellcheck disable=SC2039,SC3045
ulimit -n "$MAX_FD" ||
warn "Could not set maximum file descriptor limit to $MAX_FD"
esac
fi
# Collect all arguments for the java command, stacking in reverse order:
# * args from the command line
# * the main class name
# * -classpath
# * -D...appname settings
# * --module-path (only if needed)
# * DEFAULT_JVM_OPTS, JAVA_OPTS, and GRADLE_OPTS environment variables.
# For Cygwin or MSYS, switch paths to Windows format before running java
if "$cygwin" || "$msys" ; then
APP_HOME=$( cygpath --path --mixed "$APP_HOME" )
CLASSPATH=$( cygpath --path --mixed "$CLASSPATH" )
JAVACMD=$( cygpath --unix "$JAVACMD" )
# Now convert the arguments - kludge to limit ourselves to /bin/sh
for arg do
if
case $arg in #(
-*) false ;; # don't mess with options #(
/?*) t=${arg#/} t=/${t%%/*} # looks like a POSIX filepath
[ -e "$t" ] ;; #(
*) false ;;
esac
then
arg=$( cygpath --path --ignore --mixed "$arg" )
fi
# Roll the args list around exactly as many times as the number of
# args, so each arg winds up back in the position where it started, but
# possibly modified.
#
# NB: a `for` loop captures its iteration list before it begins, so
# changing the positional parameters here affects neither the number of
# iterations, nor the values presented in `arg`.
shift # remove old arg
set -- "$@" "$arg" # push replacement arg
done
fi
# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
DEFAULT_JVM_OPTS='"-Xmx64m" "-Xms64m"'
# Collect all arguments for the java command:
# * DEFAULT_JVM_OPTS, JAVA_OPTS, and optsEnvironmentVar are not allowed to contain shell fragments,
# and any embedded shellness will be escaped.
# * For example: A user cannot expect ${Hostname} to be expanded, as it is an environment variable and will be
# treated as '${Hostname}' itself on the command line.
set -- \
"-Dorg.gradle.appname=$APP_BASE_NAME" \
-classpath "$CLASSPATH" \
-jar "$APP_HOME/gradle/wrapper/gradle-wrapper.jar" \
"$@"
# Stop when "xargs" is not available.
if ! command -v xargs >/dev/null 2>&1
then
die "xargs is not available"
fi
# Use "xargs" to parse quoted args.
#
# With -n1 it outputs one arg per line, with the quotes and backslashes removed.
#
# In Bash we could simply go:
#
# readarray ARGS < <( xargs -n1 <<<"$var" ) &&
# set -- "${ARGS[@]}" "$@"
#
# but POSIX shell has neither arrays nor command substitution, so instead we
# post-process each arg (as a line of input to sed) to backslash-escape any
# character that might be a shell metacharacter, then use eval to reverse
# that process (while maintaining the separation between arguments), and wrap
# the whole thing up as a single "set" statement.
#
# This will of course break if any of these variables contains a newline or
# an unmatched quote.
#
eval "set -- $(
printf '%s\n' "$DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS" |
xargs -n1 |
sed ' s~[^-[:alnum:]+,./:=@_]~\\&~g; ' |
tr '\n' ' '
)" '"$@"'
exec "$JAVACMD" "$@"


@@ -0,0 +1,94 @@
@rem
@rem Copyright 2015 the original author or authors.
@rem
@rem Licensed under the Apache License, Version 2.0 (the "License");
@rem you may not use this file except in compliance with the License.
@rem You may obtain a copy of the License at
@rem
@rem https://www.apache.org/licenses/LICENSE-2.0
@rem
@rem Unless required by applicable law or agreed to in writing, software
@rem distributed under the License is distributed on an "AS IS" BASIS,
@rem WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
@rem See the License for the specific language governing permissions and
@rem limitations under the License.
@rem
@rem SPDX-License-Identifier: Apache-2.0
@rem
@if "%DEBUG%"=="" @echo off
@rem ##########################################################################
@rem
@rem Gradle startup script for Windows
@rem
@rem ##########################################################################
@rem Set local scope for the variables with windows NT shell
if "%OS%"=="Windows_NT" setlocal
set DIRNAME=%~dp0
if "%DIRNAME%"=="" set DIRNAME=.
@rem This is normally unused
set APP_BASE_NAME=%~n0
set APP_HOME=%DIRNAME%
@rem Resolve any "." and ".." in APP_HOME to make it shorter.
for %%i in ("%APP_HOME%") do set APP_HOME=%%~fi
@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script.
set DEFAULT_JVM_OPTS="-Xmx64m" "-Xms64m"
@rem Find java.exe
if defined JAVA_HOME goto findJavaFromJavaHome
set JAVA_EXE=java.exe
%JAVA_EXE% -version >NUL 2>&1
if %ERRORLEVEL% equ 0 goto execute
echo. 1>&2
echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH. 1>&2
echo. 1>&2
echo Please set the JAVA_HOME variable in your environment to match the 1>&2
echo location of your Java installation. 1>&2
goto fail
:findJavaFromJavaHome
set JAVA_HOME=%JAVA_HOME:"=%
set JAVA_EXE=%JAVA_HOME%/bin/java.exe
if exist "%JAVA_EXE%" goto execute
echo. 1>&2
echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME% 1>&2
echo. 1>&2
echo Please set the JAVA_HOME variable in your environment to match the 1>&2
echo location of your Java installation. 1>&2
goto fail
:execute
@rem Setup the command line
set CLASSPATH=
@rem Execute Gradle
"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" -jar "%APP_HOME%\gradle\wrapper\gradle-wrapper.jar" %*
:end
@rem End local scope for the variables with windows NT shell
if %ERRORLEVEL% equ 0 goto mainEnd
:fail
rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
rem the _cmd.exe /c_ return code!
set EXIT_CODE=%ERRORLEVEL%
if %EXIT_CODE% equ 0 set EXIT_CODE=1
if not ""=="%GRADLE_EXIT_CONSOLE%" exit %EXIT_CODE%
exit /b %EXIT_CODE%
:mainEnd
if "%OS%"=="Windows_NT" endlocal
:omega


@@ -0,0 +1 @@
rootProject.name = 'kamco-geojson-scheduler'


@@ -0,0 +1,15 @@
package com.kamco.cd.geojsonscheduler;
import com.kamco.cd.geojsonscheduler.config.DockerProperties;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
@SpringBootApplication
@EnableConfigurationProperties(DockerProperties.class)
public class GeoJsonSchedulerApplication {
public static void main(String[] args) {
SpringApplication.run(GeoJsonSchedulerApplication.class, args);
}
}


@@ -0,0 +1,108 @@
package com.kamco.cd.geojsonscheduler.batch;
import com.kamco.cd.geojsonscheduler.service.DockerRunnerService;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
/**
 * Tasklet that runs the Docker container.
 *
 * <p>Runs the Docker container that hosts the training-data generation pipeline. It takes the
 * GeoJSON files produced by the previous step (makeGeoJsonStep) as input and generates the
 * training dataset.
 *
 * <p><b>Main responsibilities:</b>
 *
 * <ul>
 *   <li>Run the Docker container (kamco_full_pipeline.py)
 *   <li>Input: /dataset/request/{resultUid}/*.geojson
 *   <li>Output: /dataset/response/{resultUid}/*
 *   <li>Throw RuntimeException when the Docker run fails (marks the Step as failed)
 * </ul>
 *
 * <p><b>Preconditions:</b>
 *
 * <ul>
 *   <li>makeGeoJsonStep must have completed successfully
 *   <li>GeoJSON files must exist under the request/{resultUid}/ directory
 * </ul>
 *
 * <p><b>Docker exit-code handling:</b>
 *
 * <ul>
 *   <li>Exit code = 0: normal termination (Step succeeds)
 *   <li>Exit code != 0: abnormal termination (RuntimeException is thrown and the Step fails)
 * </ul>
 *
 * @author KAMCO Development Team
 * @since 1.0.0
 * @see DockerRunnerService
 */
@Log4j2
@Component
@RequiredArgsConstructor
public class DockerRunTasklet implements Tasklet {
  /** Service responsible for running the Docker container */
private final DockerRunnerService dockerRunnerService;
  /** Result UID (UUID) passed in as a Job Parameter */
@Value("#{jobParameters['resultUid']}")
private String resultUid;
  /**
   * Executes the Docker container run.
   *
   * <p>Runs the training-data generation pipeline through DockerRunnerService. If the Docker
   * process terminates abnormally, a RuntimeException is thrown and the Step is marked as failed.
   *
   * @param contribution object holding Step execution information
   * @param chunkContext chunk execution context
   * @return RepeatStatus.FINISHED - work complete
   * @throws RuntimeException if the Docker process terminated abnormally (exitCode != 0)
   */
@Override
public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
    log.info("========================================");
    log.info("Starting Docker container run (ResultUid={})", resultUid);
    log.info("========================================");
    // Log context before running
    log.info("[Pre-run info]");
    log.info("  - Input directory: /dataset/request/{}/", resultUid);
    log.info("  - Output directory: /dataset/response/{}/", resultUid);
    log.info("  - Pipeline: kamco_full_pipeline.py");
    try {
      // Run the Docker container through DockerRunnerService.
      // Throws RuntimeException when exitCode != 0.
      log.info("[Docker run] Starting container...");
      dockerRunnerService.run(resultUid);
      log.info("[Docker run] ✓ Container exited normally (exitCode=0)");
    } catch (RuntimeException e) {
      // Docker run failed (exception thrown by DockerRunnerService)
      log.error("[Docker run] ✗ Container run failed!", e);
      log.error("  - ResultUid: {}", resultUid);
      log.error("  - Error message: {}", e.getMessage());
      log.error("  - Things to check:");
      log.error("    1. GeoJSON files exist under request/{}/", resultUid);
      log.error("    2. The Docker image exists");
      log.error("    3. Docker volume mount paths are correct");
      log.error("    4. Logs of the kamco_full_pipeline.py script");
      throw e; // rethrow → Step fails
    }
    log.info("========================================");
    log.info("Docker container run complete (ResultUid={})", resultUid);
    log.info("  - Results saved under: /dataset/response/{}/", resultUid);
    log.info("========================================");
return RepeatStatus.FINISHED;
}
}
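The exit-code policy documented above (0 = success, anything else fails the Step) can be sketched independently of Docker and Spring Batch. This is a minimal illustration, not the project's `DockerRunnerService`; the class and helper names are hypothetical.

```java
// Minimal sketch of the exit-code policy: a zero exit code is treated as
// success, any other value raises a RuntimeException so the calling Spring
// Batch Step would be marked FAILED. Names here are hypothetical.
public class ExitCodePolicy {

    // Throws when the (Docker) process exited abnormally.
    static void failOnNonZeroExit(int exitCode) {
        if (exitCode != 0) {
            throw new RuntimeException("Docker process exited with code " + exitCode);
        }
    }

    public static void main(String[] args) {
        failOnNonZeroExit(0); // normal termination: no exception

        try {
            failOnNonZeroExit(137); // e.g. container killed (128 + SIGKILL)
        } catch (RuntimeException e) {
            System.out.println("step failed: " + e.getMessage());
        }
    }
}
```

A Tasklet wrapping a real process would obtain the code from `Process.waitFor()` and apply the same check before returning `RepeatStatus.FINISHED`.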


@@ -0,0 +1,36 @@
package com.kamco.cd.geojsonscheduler.batch;
import com.kamco.cd.geojsonscheduler.listener.BatchHistoryListener;
import lombok.RequiredArgsConstructor;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.job.builder.JobBuilder;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.builder.StepBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;
@Configuration
@RequiredArgsConstructor
public class ExportGeoJsonJobConfig {
private final JobRepository jobRepository;
private final PlatformTransactionManager transactionManager;
private final LaunchChildJobsTasklet launchChildJobsTasklet;
@Bean
public Job exportGeoJsonJob(BatchHistoryListener historyListener) {
return new JobBuilder("exportGeoJsonJob", jobRepository)
.listener(historyListener)
.start(launchChildJobsStep())
.build();
}
@Bean
public Step launchChildJobsStep() {
return new StepBuilder("launchChildJobsStep", jobRepository)
.tasklet(launchChildJobsTasklet, transactionManager)
.build();
}
}


@@ -0,0 +1,152 @@
package com.kamco.cd.geojsonscheduler.batch;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.kamco.cd.geojsonscheduler.dto.TrainingDataReviewJobDto.AnalCntInfo;
import com.kamco.cd.geojsonscheduler.dto.TrainingDataReviewJobDto.AnalMapSheetList;
import com.kamco.cd.geojsonscheduler.dto.TrainingDataReviewJobDto.CompleteLabelData;
import com.kamco.cd.geojsonscheduler.dto.TrainingDataReviewJobDto.CompleteLabelData.GeoJsonFeature;
import com.kamco.cd.geojsonscheduler.dto.TrainingDataReviewJobDto.FeatureCollection;
import com.kamco.cd.geojsonscheduler.repository.TrainingDataReviewJobRepository;
import com.kamco.cd.geojsonscheduler.service.DockerRunnerService;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.Objects;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
@Log4j2
@Component
@RequiredArgsConstructor
public class ExportGeoJsonTasklet implements Tasklet {
private final TrainingDataReviewJobRepository repository;
private final DockerRunnerService dockerRunnerService;
@Value("${training-data.geojson-dir}")
private String trainingDataDir;
@Override
public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
    log.info("========================================");
    log.info("Batch job started");
    log.info("========================================");
    // 1. Fetch the job name directly via the StepContext (recommended)
    String jobName = chunkContext.getStepContext().getJobName();
    log.info("Job Name: {}", jobName);
    // Fetch the in-progress analysis rounds that have a complete_cnt
    log.info("Querying in-progress analysis rounds...");
    List<AnalCntInfo> analList = repository.findAnalCntInfoList();
    log.info("Number of in-progress rounds: {}", analList.size());
int processedAnalCount = 0;
for (AnalCntInfo info : analList) {
      log.info("----------------------------------------");
      log.info("Processing round: AnalUid={}, ResultUid={}", info.getAnalUid(), info.getResultUid());
      log.info("Total count: {}, file count: {}", info.getAllCnt(), info.getFileCnt());
      if (Objects.equals(info.getAllCnt(), info.getFileCnt())) {
        log.info("All files already processed. Skipping.");
        continue;
}
      // Inference ID
      String resultUid = info.getResultUid();
      log.info("ResultUid: {}", resultUid);
      // TODO: insert job name, resultUid, and start time
      // Fetch the per-map-sheet list of all data whose review completed as of yesterday
      log.info("Querying reviewed map sheets... (AnalUid={})", info.getAnalUid());
      List<AnalMapSheetList> analMapList = repository.findCompletedAnalMapSheetList(info.getAnalUid());
      log.info("Number of reviewed map sheets: {}", analMapList.size());
      // TODO: run only when there are at least 4 map sheets
      if (analMapList.isEmpty()) {
        log.warn("No reviewed map sheets. Skipping.");
        continue;
      }
      // TODO: insert job name, resultUid, and start time
boolean anyProcessed = false;
int processedMapSheetCount = 0;
int totalGeoJsonFiles = 0;
for (AnalMapSheetList mapSheet : analMapList) {
        log.info("  Processing map sheet: MapSheetNum={}", mapSheet.getMapSheetNum());
        // Fetch the geometry data for this map sheet and build GeoJSON
List<CompleteLabelData> completeList =
repository.findCompletedYesterdayLabelingList(
info.getAnalUid(), mapSheet.getMapSheetNum());
        log.info("  Number of completed labeling records: {}", completeList.size());
if (!completeList.isEmpty()) {
          List<Long> geoUids = completeList.stream().map(CompleteLabelData::getGeoUid).toList();
          log.info("  GeoUID list built: {} entries", geoUids.size());
          List<GeoJsonFeature> features = completeList.stream().map(GeoJsonFeature::from).toList();
          log.info("  Converted to GeoJSON features: {} items", features.size());
          FeatureCollection collection = new FeatureCollection(features);
          String filename = mapSheet.buildFilename(resultUid);
          log.info("  GeoJSON filename: {}", filename);
          // Layout: /kamco-nfs/dataset/request/uuid/filename
          Path outputPath = Paths.get(trainingDataDir + File.separator + "request" + File.separator + resultUid, filename);
          log.info("  Output path: {}", outputPath);
          try {
            Files.createDirectories(outputPath.getParent());
            log.info("  Directory created: {}", outputPath.getParent());
            ObjectMapper objectMapper = new ObjectMapper();
            objectMapper.enable(SerializationFeature.INDENT_OUTPUT);
            objectMapper.writeValue(outputPath.toFile(), collection);
            log.info("  GeoJSON file written: {}", outputPath);
            repository.updateLearnDataGeomFileCreateYn(geoUids);
            log.info("  DB updated: {} rows", geoUids.size());
            anyProcessed = true;
            processedMapSheetCount++;
            totalGeoJsonFiles++;
          } catch (IOException e) {
            log.error("  Failed to create GeoJSON file: {}", e.getMessage(), e);
          }
}
}
      log.info("Round processed: ResultUid={}", resultUid);
      log.info("  Map sheets processed: {}", processedMapSheetCount);
      log.info("  GeoJSON files created: {}", totalGeoJsonFiles);
      if (anyProcessed) {
        log.info("Running Docker container... (ResultUid={})", resultUid);
        dockerRunnerService.run(resultUid);
        log.info("Docker container run complete (ResultUid={})", resultUid);
        processedAnalCount++;
      } else {
        log.warn("No map sheets processed; skipping Docker run (ResultUid={})", resultUid);
      }
    }
    log.info("========================================");
    log.info("Batch job complete");
    log.info("Rounds processed: {}", processedAnalCount);
    log.info("========================================");
return RepeatStatus.FINISHED;
}
}
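The request path built above with string concatenation (`trainingDataDir + File.separator + "request" + File.separator + resultUid`) can also be built by passing each segment to `Paths.get`, which handles separators itself. A minimal sketch with illustrative values (the UUID below is made up; the filename follows the documented naming rule):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

// Sketch: build /{base}/request/{resultUid}/{filename} from path segments
// instead of manual File.separator concatenation. Values are illustrative.
public class OutputPathSketch {
    public static void main(String[] args) {
        String trainingDataDir = "/kamco-nfs/dataset";   // example base directory
        String resultUid = "ed80d700-1234-5678-9abc-def012345678"; // hypothetical UUID
        String filename = "ED80D700_2022_2023_3724036_D15.geojson";

        Path outputPath = Paths.get(trainingDataDir, "request", resultUid, filename);
        System.out.println(outputPath);
    }
}
```

Behavior is equivalent on POSIX paths; the varargs form simply avoids separator bookkeeping and stray doubled separators.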


@@ -0,0 +1,209 @@
package com.kamco.cd.geojsonscheduler.batch;
import com.kamco.cd.geojsonscheduler.dto.TrainingDataReviewJobDto.AnalCntInfo;
import com.kamco.cd.geojsonscheduler.repository.TrainingDataReviewJobRepository;
import java.util.List;
import java.util.Objects;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Component;
/**
 * Tasklet that launches child jobs (used by the parent job).
 *
 * <p>Queries every in-progress analysis round (AnalCntInfo) and launches an independent child
 * job for each round. Each child job runs three steps (makeGeoJson → dockerRun → zipResponse)
 * in order.
 *
 * <p><b>Main responsibilities:</b>
 *
 * <ul>
 *   <li>Query in-progress analysis rounds (tb_map_sheet_anal_inference, anal_state='ING')
 *   <li>Decide per round whether processing is needed (all_cnt != file_cnt)
 *   <li>Launch the child job (processAnalCntInfoJob) for each round
 *   <li>Tolerate partial failure (other rounds keep running when one round fails)
 * </ul>
 *
 * <p><b>Preconditions:</b>
 *
 * <ul>
 *   <li>tb_map_sheet_anal_inference.anal_state = 'ING' (in progress)
 *   <li>Reviewed count (complete_cnt) > 0
 *   <li>all_cnt != file_cnt (file generation not yet complete)
 * </ul>
 *
 * <p><b>Failure policy:</b>
 *
 * <ul>
 *   <li>Current: partial failure tolerated (the parent job succeeds even if some child jobs fail)
 *   <li>Configurable: uncomment the RuntimeException block at the end of execute() to fail the
 *       parent job when any child job fails
 * </ul>
 *
 * @author KAMCO Development Team
 * @since 1.0.0
 * @see com.kamco.cd.geojsonscheduler.batch.ProcessAnalCntInfoJobConfig
 */
@Log4j2
@Component
@RequiredArgsConstructor
public class LaunchChildJobsTasklet implements Tasklet {
  /** Repository for querying analysis-round information */
private final TrainingDataReviewJobRepository repository;
  /** JobLauncher used to launch the child job */
private final JobLauncher jobLauncher;
  /** Child job to launch (processAnalCntInfoJob) */
@Qualifier("processAnalCntInfoJob")
private final Job processAnalCntInfoJob;
  /**
   * Runs the parent job's main logic.
   *
   * <p>Queries every in-progress analysis round and launches a child job per round. When one
   * round fails the others keep running, and summary statistics are logged at the end.
   *
   * @param contribution object holding Step execution information
   * @param chunkContext chunk execution context
   * @return RepeatStatus.FINISHED - work complete
   */
@Override
public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
    log.info("========================================");
    log.info("Parent job started: querying AnalCntInfo list and launching child jobs");
    log.info("========================================");
    // Step 1: query in-progress analysis rounds
    log.info("[Step 1/3] Querying in-progress analysis rounds...");
    log.info("  - Criteria: anal_state='ING' AND complete_cnt > 0");
    List<AnalCntInfo> analList = repository.findAnalCntInfoList();
    log.info("[Step 1/3] Query complete");
    log.info("  - In-progress rounds: {}", analList.size());
    if (analList.isEmpty()) {
      log.warn("[Warning] No analysis rounds to process.");
      log.warn("  - Things to check:");
      log.warn("    1. Rows with anal_state='ING' in tb_map_sheet_anal_inference");
      log.warn("    2. Whether any reviews are COMPLETE");
      return RepeatStatus.FINISHED;
    }
    // Step 2: launch a child job per round
    log.info("[Step 2/3] Launching child jobs per round");
    // Counters for execution statistics
    int processedCount = 0; // rounds processed successfully
    int skippedCount = 0; // rounds skipped (already complete)
    int failedCount = 0; // rounds that failed
    // Launch a child job for each analysis round
for (int i = 0; i < analList.size(); i++) {
AnalCntInfo info = analList.get(i);
      log.info("========================================");
      log.info("[Round {}/{}] Processing AnalCntInfo", i + 1, analList.size());
      log.info("  - AnalUid: {}", info.getAnalUid());
      log.info("  - ResultUid: {}", info.getResultUid());
      log.info("  - Total count (all_cnt): {}", info.getAllCnt());
      log.info("  - Reviewed count (complete_cnt): {}", info.getCompleteCnt());
      log.info("  - Files created (file_cnt): {}", info.getFileCnt());
      // Skip when all_cnt == file_cnt: every file has already been generated
      if (Objects.equals(info.getAllCnt(), info.getFileCnt())) {
        log.info("[Skipped] All files already processed (all_cnt={}, file_cnt={})",
            info.getAllCnt(), info.getFileCnt());
        log.info("  - To reprocess, reset the file_create_yn flag.");
        skippedCount++;
        continue;
}
      try {
        // Build child job parameters
        JobParameters jobParameters =
            new JobParametersBuilder()
                .addLong("analUid", info.getAnalUid())
                .addString("resultUid", info.getResultUid())
                .addLong("timestamp", System.currentTimeMillis()) // ensures JobInstance uniqueness
                .toJobParameters();
        log.info("[Child job] Launching processAnalCntInfoJob...");
        log.info("  - JobParameters: analUid={}, resultUid={}", info.getAnalUid(),
            info.getResultUid());
        // Launch the child job synchronously.
        // Internally runs makeGeoJsonStep → dockerRunStep → zipResponseStep in order.
        long startTime = System.currentTimeMillis();
        jobLauncher.run(processAnalCntInfoJob, jobParameters);
        long duration = System.currentTimeMillis() - startTime;
        log.info("[Child job] ✓ Finished normally");
        log.info("  - AnalUid: {}", info.getAnalUid());
        log.info("  - ResultUid: {}", info.getResultUid());
        log.info("  - Duration: {} ms ({} s)", duration, duration / 1000);
        processedCount++;
      } catch (Exception e) {
        // Child job failed (a step failed or an exception was thrown)
        log.error("[Child job] ✗ Failed during execution", e);
        log.error("  - AnalUid: {}", info.getAnalUid());
        log.error("  - ResultUid: {}", info.getResultUid());
        log.error("  - Error message: {}", e.getMessage());
        log.error("  - Things to check:");
        log.error("    1. The failed step in the batch_step_history table");
        log.error("    2. Details in the error_message column");
        failedCount++;
        // Keep processing the remaining rounds (partial failure tolerated)
        log.info("[Continuing] Moving on to the next round.");
      }
}
    // Step 3: log final statistics
    log.info("========================================");
    log.info("[Step 3/3] Parent job execution summary");
    log.info("  - Total rounds: {}", analList.size());
    log.info("  - Succeeded: {}", processedCount);
    log.info("  - Skipped: {}", skippedCount);
    log.info("  - Failed: {}", failedCount);
    // Success rate over the rounds that were actually attempted.
    // Guard on the attempted count (not the list size) so that a run where
    // every round was skipped does not divide by zero and log NaN.
    int attemptedCount = analList.size() - skippedCount;
    if (attemptedCount > 0) {
      double successRate = (double) processedCount / attemptedCount * 100;
      log.info("  - Success rate: {}% (excluding skipped rounds)", String.format("%.2f", successRate));
    }
log.info("========================================");
    // Failure policy
    if (failedCount > 0) {
      log.warn("[Warning] {} child job(s) failed.", failedCount);
      log.warn("  - See the batch_step_history table for details.");
      log.warn("  - SQL: SELECT * FROM batch_step_history WHERE status='FAILED' ORDER BY"
          + " started_dttm DESC;");
      // Even with failures the parent job is treated as successful (partial-success policy).
      // Uncomment the block below to fail the parent job when any child job fails.
      // throw new RuntimeException(
      //     String.format("%d child job(s) failed. (succeeded: %d, failed: %d)",
      //         failedCount, processedCount, failedCount));
    } else {
      log.info("[Done] All child jobs completed successfully.");
    }
return RepeatStatus.FINISHED;
}
}
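The summary statistics above compute a success rate over the rounds that were actually attempted (total minus skipped). That calculation can be sketched on its own, with an explicit guard for the case where every round was skipped (which would otherwise divide by zero and yield NaN); the class and method names are hypothetical:

```java
// Sketch of the parent job's success-rate summary: the rate is computed over
// rounds that were actually attempted (total minus skipped). When every round
// is skipped there is nothing to rate, so return 0 instead of dividing by zero.
public class SuccessRateSketch {

    static double successRate(int total, int processed, int skipped) {
        int attempted = total - skipped;
        if (attempted <= 0) {
            return 0.0; // nothing attempted: avoid 0/0 (NaN)
        }
        return (double) processed / attempted * 100.0;
    }

    public static void main(String[] args) {
        System.out.println(successRate(5, 3, 1)); // 3 succeeded out of 4 attempted → 75.0
        System.out.println(successRate(2, 0, 2)); // everything skipped → 0.0
    }
}
```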


@@ -0,0 +1,205 @@
package com.kamco.cd.geojsonscheduler.batch;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.kamco.cd.geojsonscheduler.dto.TrainingDataReviewJobDto.AnalMapSheetList;
import com.kamco.cd.geojsonscheduler.dto.TrainingDataReviewJobDto.CompleteLabelData;
import com.kamco.cd.geojsonscheduler.dto.TrainingDataReviewJobDto.CompleteLabelData.GeoJsonFeature;
import com.kamco.cd.geojsonscheduler.dto.TrainingDataReviewJobDto.FeatureCollection;
import com.kamco.cd.geojsonscheduler.repository.TrainingDataReviewJobRepository;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
/**
 * GeoJSON file generation Tasklet
 *
 * <p>Converts labeling data that has completed review into GeoJSON and writes it to files. A
 * separate GeoJSON file is generated per map sheet, and a completion flag is updated in the DB
 * once each file is written.
 *
 * <p><b>Main responsibilities:</b>
 *
 * <ul>
 *   <li>Fetch the list of review-complete map sheets
 *   <li>Fetch labeling data per map sheet and convert it to GeoJSON Features
 *   <li>Write GeoJSON files (/dataset/request/{resultUid}/*.geojson)
 *   <li>Update the file-creation flag in the DB
 * </ul>
 *
 * <p><b>File naming convention:</b> {first 8 chars of resultUid}_{compareYyyy}_{targetYyyy}_{mapSheetNum}_D15.geojson
 *
 * <p><b>Example:</b> ED80D700_2022_2023_3724036_D15.geojson
 *
 * @author KAMCO Development Team
 * @since 1.0.0
 */
@Log4j2
@Component
@RequiredArgsConstructor
public class MakeGeoJsonTasklet implements Tasklet {
/** Repository for querying labeling data */
private final TrainingDataReviewJobRepository repository;
/** Base directory where GeoJSON files are written (e.g., /kamco-nfs/dataset) */
@Value("${training-data.geojson-dir}")
private String trainingDataDir;
/** Analysis round UID passed in as a Job Parameter */
@Value("#{jobParameters['analUid']}")
private Long analUid;
/** Result unique ID (UUID) passed in as a Job Parameter */
@Value("#{jobParameters['resultUid']}")
private String resultUid;
/**
 * Runs the GeoJSON file generation step
 *
 * <p>Fetches review-complete labeling data and generates one GeoJSON file per map sheet.
 *
 * @param contribution object carrying Step execution info
 * @param chunkContext chunk execution context
 * @return RepeatStatus.FINISHED when the work is done
 * @throws RuntimeException if no GeoJSON file was generated or file creation failed
 */
@Override
public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
log.info("========================================");
log.info("GeoJSON 생성 시작 (AnalUid={}, ResultUid={})", analUid, resultUid);
log.info("========================================");
// Step 1: fetch the list of review-complete map sheets
log.info("[Step 1/4] 검수 완료된 도엽 목록 조회 중... (AnalUid={})", analUid);
List<AnalMapSheetList> analMapList = repository.findCompletedAnalMapSheetList(analUid);
log.info("[Step 1/4] 검수 완료된 도엽 수: {}", analMapList.size());
// If there are no review-complete map sheets, finish the step
if (analMapList.isEmpty()) {
log.warn("[경고] 검수 완료된 도엽이 없음. 작업을 건너뜁니다.");
log.warn(" - 확인사항: tb_labeling_assignment 테이블의 inspect_state='COMPLETE' 데이터가 있는지 확인");
return RepeatStatus.FINISHED;
}
// Counters for GeoJSON generation statistics
int processedMapSheetCount = 0; // number of map sheets processed
int totalGeoJsonFiles = 0; // number of GeoJSON files generated
// Step 2: generate a GeoJSON file per map sheet
log.info("[Step 2/4] 도엽별 GeoJSON 파일 생성 시작 (총 {} 개 도엽)", analMapList.size());
for (AnalMapSheetList mapSheet : analMapList) {
log.info("----------------------------------------");
log.info(" [도엽 처리] MapSheetNum={}", mapSheet.getMapSheetNum());
// Step 2-1: fetch review-complete labeling data for this map sheet
log.info(" [2-1] 라벨링 데이터 조회 중...");
List<CompleteLabelData> completeList =
repository.findCompletedYesterdayLabelingList(analUid, mapSheet.getMapSheetNum());
log.info(" [2-1] 조회 완료: {} 건", completeList.size());
// If there is no labeling data, move on to the next map sheet
if (completeList.isEmpty()) {
log.info(" [건너뜀] 라벨링 데이터가 없어 GeoJSON 파일 생성을 건너뜁니다.");
continue;
}
// Step 2-2: extract the GeoUID list (used for the DB update)
log.info(" [2-2] GeoUID 목록 추출 중...");
List<Long> geoUids = completeList.stream().map(CompleteLabelData::getGeoUid).toList();
log.info(" [2-2] GeoUID 목록 생성 완료: {} 건", geoUids.size());
// Step 2-3: convert to GeoJSON Feature objects
log.info(" [2-3] GeoJSON Feature 변환 중...");
List<GeoJsonFeature> features = completeList.stream().map(GeoJsonFeature::from).toList();
log.info(" [2-3] GeoJSON Feature 변환 완료: {} 개", features.size());
// Step 2-4: build the FeatureCollection and decide the filename
log.info(" [2-4] FeatureCollection 생성 중...");
FeatureCollection collection = new FeatureCollection(features);
String filename = mapSheet.buildFilename(resultUid);
log.info(" [2-4] GeoJSON 파일명: {}", filename);
// Step 2-5: build the output path
// Format: /kamco-nfs/dataset/request/{resultUid}/(unknown).geojson
Path outputPath =
Paths.get(
trainingDataDir + File.separator + "request" + File.separator + resultUid, filename);
log.info(" [2-5] 출력 경로: {}", outputPath);
try {
// Step 2-6: create the directory if it does not exist
log.info(" [2-6] 디렉토리 생성 중...");
Files.createDirectories(outputPath.getParent());
log.info(" [2-6] 디렉토리 생성 완료: {}", outputPath.getParent());
// Step 2-7: write the GeoJSON file (pretty-printed)
log.info(" [2-7] GeoJSON 파일 저장 중...");
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.enable(SerializationFeature.INDENT_OUTPUT); // enable JSON indentation
objectMapper.writeValue(outputPath.toFile(), collection);
log.info(" [2-7] ✓ GeoJSON 파일 저장 완료: {}", outputPath);
// Step 2-8: update the file-creation flag in the DB
log.info(" [2-8] DB 파일 생성 플래그 업데이트 중...");
repository.updateLearnDataGeomFileCreateYn(geoUids);
log.info(" [2-8] ✓ DB 업데이트 완료: {} 건", geoUids.size());
// Increment the statistics counters
processedMapSheetCount++;
totalGeoJsonFiles++;
log.info(" [완료] 도엽 '{}' 처리 성공", mapSheet.getMapSheetNum());
} catch (IOException e) {
// On file-write failure, rethrow so the Step is marked failed
log.error(" [실패] GeoJSON 파일 생성 실패", e);
log.error(" - 파일명: {}", filename);
log.error(" - 경로: {}", outputPath);
log.error(" - 에러 메시지: {}", e.getMessage());
throw new RuntimeException("GeoJSON 파일 생성 실패: " + filename, e);
}
}
// Step 3: summarize and verify the results
log.info("========================================");
log.info("[Step 3/4] GeoJSON 생성 작업 완료");
log.info(" - ResultUid: {}", resultUid);
log.info(" - AnalUid: {}", analUid);
log.info(" - 처리 대상 도엽 수: {}", analMapList.size());
log.info(" - 처리 완료 도엽 수: {}", processedMapSheetCount);
log.info(" - 생성된 GeoJSON 파일 수: {}", totalGeoJsonFiles);
log.info("========================================");
// Step 4: mandatory validation - at least one file must have been generated
log.info("[Step 4/4] 필수 검증 수행 중...");
if (totalGeoJsonFiles == 0) {
log.error("[실패] 생성된 GeoJSON 파일이 없습니다!");
log.error(" - AnalUid: {}", analUid);
log.error(" - ResultUid: {}", resultUid);
log.error(" - 조회된 도엽 수: {}", analMapList.size());
log.error(" - 확인사항:");
log.error(" 1. 각 도엽에 라벨링 데이터가 있는지 확인");
log.error(" 2. findCompletedYesterdayLabelingList() 쿼리 결과 확인");
throw new RuntimeException(
String.format(
"생성된 GeoJSON 파일이 없습니다. (AnalUid=%d, ResultUid=%s)", analUid, resultUid));
}
log.info("[Step 4/4] ✓ 검증 완료: {} 개의 GeoJSON 파일이 정상적으로 생성되었습니다.", totalGeoJsonFiles);
log.info("GeoJSON 파일 저장 위치: {}/request/{}/", trainingDataDir, resultUid);
return RepeatStatus.FINISHED;
}
}


@@ -0,0 +1,57 @@
package com.kamco.cd.geojsonscheduler.batch;
import com.kamco.cd.geojsonscheduler.listener.StepHistoryListener;
import lombok.RequiredArgsConstructor;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.job.builder.JobBuilder;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.builder.StepBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;
@Configuration
@RequiredArgsConstructor
public class ProcessAnalCntInfoJobConfig {
private final JobRepository jobRepository;
private final PlatformTransactionManager transactionManager;
private final MakeGeoJsonTasklet makeGeoJsonTasklet;
private final DockerRunTasklet dockerRunTasklet;
private final ZipResponseTasklet zipResponseTasklet;
private final StepHistoryListener stepHistoryListener;
@Bean
public Job processAnalCntInfoJob() {
return new JobBuilder("processAnalCntInfoJob", jobRepository)
.start(makeGeoJsonStep())
.next(dockerRunStep())
.next(zipResponseStep())
.build();
}
@Bean
public Step makeGeoJsonStep() {
return new StepBuilder("makeGeoJsonStep", jobRepository)
.tasklet(makeGeoJsonTasklet, transactionManager)
.listener(stepHistoryListener)
.build();
}
@Bean
public Step dockerRunStep() {
return new StepBuilder("dockerRunStep", jobRepository)
.tasklet(dockerRunTasklet, transactionManager)
.listener(stepHistoryListener)
.build();
}
@Bean
public Step zipResponseStep() {
return new StepBuilder("zipResponseStep", jobRepository)
.tasklet(zipResponseTasklet, transactionManager)
.listener(stepHistoryListener)
.build();
}
}
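processAnalCntInfoJob above wires its three tasklet steps strictly in sequence; when one step fails, the following steps are never attempted. Outside of Spring Batch, with purely illustrative step names and bodies, the `.start(a).next(b).next(c)` fail-fast semantics can be sketched like this:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.BooleanSupplier;

// Minimal sketch of .start(a).next(b).next(c) semantics: run steps in order,
// stop at the first failure. Step names and bodies are illustrative only.
public class SequentialFlowSketch {
    static String run(Map<String, BooleanSupplier> steps) {
        for (Map.Entry<String, BooleanSupplier> step : steps.entrySet()) {
            if (!step.getValue().getAsBoolean()) {
                return "FAILED at " + step.getKey(); // later steps never run
            }
        }
        return "COMPLETED";
    }

    public static void main(String[] args) {
        Map<String, BooleanSupplier> steps = new LinkedHashMap<>();
        steps.put("makeGeoJsonStep", () -> true);
        steps.put("dockerRunStep", () -> false); // simulate a Docker failure
        steps.put("zipResponseStep", () -> true);
        System.out.println(run(steps)); // FAILED at dockerRunStep
    }
}
```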


@@ -0,0 +1,36 @@
package com.kamco.cd.geojsonscheduler.batch;
import com.kamco.cd.geojsonscheduler.listener.BatchHistoryListener;
import lombok.RequiredArgsConstructor;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.job.builder.JobBuilder;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.builder.StepBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;
@Configuration
@RequiredArgsConstructor
public class TrainModelJobConfig {
private final JobRepository jobRepository;
private final PlatformTransactionManager transactionManager;
private final TrainModelTasklet trainModelTasklet;
@Bean
public Job trainModelJob(BatchHistoryListener historyListener) {
return new JobBuilder("trainModelJob", jobRepository)
.listener(historyListener)
.start(trainModelStep())
.build();
}
@Bean
public Step trainModelStep() {
return new StepBuilder("trainModelStep", jobRepository)
.tasklet(trainModelTasklet, transactionManager)
.build();
}
}


@@ -0,0 +1,60 @@
package com.kamco.cd.geojsonscheduler.batch;
import com.kamco.cd.geojsonscheduler.service.TrainDockerRunnerService;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.stereotype.Component;
@Log4j2
@Component
@RequiredArgsConstructor
public class TrainModelTasklet implements Tasklet {
private final TrainDockerRunnerService trainDockerRunnerService;
@Override
public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
log.info("========================================");
log.info("학습 배치 작업 시작");
log.info("========================================");
String jobName = chunkContext.getStepContext().getJobName();
log.info("Job Name: {}", jobName);
// Read dataset-folder and output-folder from the Job parameters
String datasetFolder = (String) chunkContext.getStepContext()
.getJobParameters()
.get("dataset-folder");
String outputFolder = (String) chunkContext.getStepContext()
.getJobParameters()
.get("output-folder");
log.info("Dataset Folder Parameter: {}", datasetFolder);
log.info("Output Folder Parameter: {}", outputFolder);
if (datasetFolder == null || datasetFolder.isBlank()) {
log.error("dataset-folder 파라미터가 없습니다!");
throw new IllegalArgumentException("dataset-folder parameter is required");
}
if (outputFolder == null || outputFolder.isBlank()) {
log.error("output-folder 파라미터가 없습니다!");
throw new IllegalArgumentException("output-folder parameter is required");
}
// Run the training Docker container
log.info("Train Docker 실행 중...");
trainDockerRunnerService.runTraining(datasetFolder, outputFolder);
log.info("Train Docker 실행 완료");
log.info("========================================");
log.info("학습 배치 작업 완료");
log.info("========================================");
return RepeatStatus.FINISHED;
}
}


@@ -0,0 +1,264 @@
package com.kamco.cd.geojsonscheduler.batch;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
/**
 * Result ZIP compression Tasklet
 *
 * <p>Compresses the training-data artifacts produced by the Docker container run into a ZIP file.
 * The compressed file is used for download or distribution.
 *
 * <p><b>Main responsibilities:</b>
 *
 * <ul>
 *   <li>Validate the response/{resultUid}/ directory
 *   <li>Recursively compress every file and subdirectory in it
 *   <li>Create the archive: response/{resultUid}.zip
 * </ul>
 *
 * <p><b>Compression settings:</b>
 *
 * <ul>
 *   <li>Hidden files are excluded
 *   <li>Directory structure is preserved
 *   <li>Buffer size: 1024 bytes
 * </ul>
 *
 * <p><b>Preconditions:</b>
 *
 * <ul>
 *   <li>dockerRunStep must have completed successfully
 *   <li>The response/{resultUid}/ directory must exist
 *   <li>The directory must contain at least one file
 * </ul>
 *
 * @author KAMCO Development Team
 * @since 1.0.0
 */
@Log4j2
@Component
@RequiredArgsConstructor
public class ZipResponseTasklet implements Tasklet {
/** Base directory for training-data storage (e.g., /kamco-nfs/dataset) */
@Value("${training-data.geojson-dir}")
private String trainingDataDir;
/** Result unique ID (UUID) passed in as a Job Parameter */
@Value("#{jobParameters['resultUid']}")
private String resultUid;
/**
 * Runs the result-compression step
 *
 * <p>Validates the response/{resultUid}/ directory and compresses it into a ZIP file.
 *
 * @param contribution object carrying Step execution info
 * @param chunkContext chunk execution context
 * @return RepeatStatus.FINISHED when the work is done
 * @throws RuntimeException if the response directory does not exist or compression fails
 */
@Override
public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
log.info("========================================");
log.info("결과물 압축 시작 (ResultUid={})", resultUid);
log.info("========================================");
// Step 1: resolve the source directory and the output ZIP path
log.info("[Step 1/5] 경로 설정 중...");
Path responseDir =
Paths.get(trainingDataDir + File.separator + "response" + File.separator + resultUid);
Path zipFile =
Paths.get(
trainingDataDir + File.separator + "response" + File.separator + resultUid + ".zip");
log.info("[Step 1/5] 경로 설정 완료");
log.info(" - 압축 대상 디렉토리: {}", responseDir);
log.info(" - 압축 파일 저장 경로: {}", zipFile);
// Step 2: verify the response directory exists
log.info("[Step 2/5] Response 디렉토리 검증 중...");
if (!Files.exists(responseDir)) {
log.error("[실패] Response 디렉토리가 존재하지 않습니다!");
log.error(" - 경로: {}", responseDir);
log.error(" - ResultUid: {}", resultUid);
log.error(" - 확인사항:");
log.error(" 1. dockerRunStep이 정상적으로 완료되었는지 확인");
log.error(" 2. Docker 컨테이너가 결과물을 생성했는지 확인");
log.error(" 3. Docker 볼륨 마운트 경로가 올바른지 확인");
throw new RuntimeException("Response 디렉토리가 존재하지 않습니다: " + responseDir);
}
log.info("[Step 2/5] ✓ Response 디렉토리 존재 확인");
// Step 3: inspect the directory contents (count the files)
log.info("[Step 3/5] 디렉토리 내용 분석 중...");
File responseDirFile = responseDir.toFile();
long fileCount = countFilesRecursively(responseDirFile);
log.info("[Step 3/5] 디렉토리 분석 완료");
log.info(" - 총 파일 수: {} 개", fileCount);
if (fileCount == 0) {
log.warn("[경고] 압축할 파일이 없습니다. (디렉토리가 비어있음)");
log.warn(" - 디렉토리: {}", responseDir);
}
// Step 4: run the ZIP compression
log.info("[Step 4/5] ZIP 압축 시작...");
try {
long startTime = System.currentTimeMillis();
zipDirectory(responseDirFile, zipFile.toFile());
long endTime = System.currentTimeMillis();
long duration = endTime - startTime;
long zipSize = Files.size(zipFile);
double zipSizeMB = zipSize / (1024.0 * 1024.0);
log.info("[Step 4/5] ✓ ZIP 압축 완료");
log.info(" - 압축 파일: {}", zipFile);
log.info(" - 압축 파일 크기: {} bytes ({} MB)", zipSize, String.format("%.2f", zipSizeMB));
log.info(" - 압축 소요 시간: {} ms", duration);
} catch (IOException e) {
log.error("[실패] ZIP 압축 중 오류 발생!", e);
log.error(" - 소스 디렉토리: {}", responseDir);
log.error(" - 대상 ZIP 파일: {}", zipFile);
log.error(" - 에러 메시지: {}", e.getMessage());
throw new RuntimeException("Response 디렉토리 압축 실패: " + responseDir, e);
}
// Step 5: final validation and completion
log.info("[Step 5/5] 최종 검증 중...");
if (Files.exists(zipFile)) {
log.info("[Step 5/5] ✓ ZIP 파일 생성 확인 완료");
} else {
log.error("[실패] ZIP 파일이 생성되지 않았습니다!");
throw new RuntimeException("ZIP 파일 생성 실패: " + zipFile);
}
log.info("========================================");
log.info("결과물 압축 완료 (ResultUid={})", resultUid);
log.info(" - ZIP 파일 위치: {}", zipFile);
log.info("========================================");
return RepeatStatus.FINISHED;
}
/**
 * Compresses a directory into a ZIP file
 *
 * @param sourceDir source directory to compress
 * @param zipFile ZIP file to create
 * @throws IOException if an IO error occurs during compression
 */
private void zipDirectory(File sourceDir, File zipFile) throws IOException {
log.debug("ZIP 압축 시작: {} -> {}", sourceDir.getAbsolutePath(), zipFile.getAbsolutePath());
try (FileOutputStream fos = new FileOutputStream(zipFile);
ZipOutputStream zos = new ZipOutputStream(fos)) {
zipDirectoryRecursive(sourceDir, sourceDir.getName(), zos);
}
log.debug("ZIP 압축 완료: {}", zipFile.getAbsolutePath());
}
/**
 * Recursively adds a directory tree to the ZIP
 *
 * <p>Adds every file and directory, including subdirectories, to the ZIP. Hidden files are
 * skipped.
 *
 * @param fileToZip file or directory to compress
 * @param fileName ZIP entry name
 * @param zos ZipOutputStream
 * @throws IOException if an IO error occurs during compression
 */
private void zipDirectoryRecursive(File fileToZip, String fileName, ZipOutputStream zos)
throws IOException {
// Skip hidden files
if (fileToZip.isHidden()) {
log.debug("Hidden 파일 제외: {}", fileToZip.getName());
return;
}
// Directory case
if (fileToZip.isDirectory()) {
log.debug("디렉토리 추가: {}", fileName);
// Create the directory entry (entry names for directories must end with "/")
String dirEntryName = fileName.endsWith("/") ? fileName : fileName + "/";
zos.putNextEntry(new ZipEntry(dirEntryName));
zos.closeEntry();
// Recurse into child files and directories
File[] children = fileToZip.listFiles();
if (children != null) {
for (File childFile : children) {
zipDirectoryRecursive(childFile, fileName + "/" + childFile.getName(), zos);
}
}
return;
}
// File case: copy the file contents into the ZIP
log.debug("파일 추가: {}", fileName);
try (FileInputStream fis = new FileInputStream(fileToZip)) {
ZipEntry zipEntry = new ZipEntry(fileName);
zos.putNextEntry(zipEntry);
// Copy the file contents through a buffer
byte[] buffer = new byte[1024];
int length;
while ((length = fis.read(buffer)) >= 0) {
zos.write(buffer, 0, length);
}
}
}
/**
 * Recursively counts the files in a directory
 *
 * @param directory directory to count
 * @return total file count (including subdirectories)
 */
private long countFilesRecursively(File directory) {
if (!directory.isDirectory()) {
return directory.isFile() ? 1 : 0;
}
File[] children = directory.listFiles();
if (children == null) {
return 0;
}
long count = 0;
for (File child : children) {
if (child.isFile()) {
count++;
} else if (child.isDirectory()) {
count += countFilesRecursively(child);
}
}
return count;
}
}
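The recursive zip logic above can be exercised end-to-end with only the JDK: build a small directory tree, compress it the same way (directory entries plus streamed file bodies, names relative to the source's parent so the top-level folder name is kept), then read the entry names back with ZipFile. This is a self-contained sketch, not the tasklet itself:

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.stream.Stream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;

public class ZipSketch {
    // Zip a directory tree; entry names are made relative to the directory's
    // parent, so the archive keeps the top-level folder name.
    static void zip(Path source, Path zipFile) throws IOException {
        try (ZipOutputStream zos = new ZipOutputStream(Files.newOutputStream(zipFile));
                Stream<Path> paths = Files.walk(source)) {
            paths.forEach(p -> {
                try {
                    String name = source.getParent().relativize(p).toString().replace('\\', '/');
                    if (Files.isDirectory(p)) {
                        zos.putNextEntry(new ZipEntry(name + "/")); // directory entry
                    } else {
                        zos.putNextEntry(new ZipEntry(name));
                        Files.copy(p, zos); // stream the file body into the entry
                    }
                    zos.closeEntry();
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
        }
    }

    // Read back the entry names, sorted, to verify the archive contents.
    static List<String> entries(Path zipFile) throws IOException {
        List<String> names = new ArrayList<>();
        try (ZipFile zf = new ZipFile(zipFile.toFile())) {
            zf.stream().forEach(e -> names.add(e.getName()));
        }
        Collections.sort(names);
        return names;
    }

    public static void main(String[] args) throws IOException {
        Path root = Files.createTempDirectory("response");
        Files.createDirectories(root.resolve("sub"));
        Files.writeString(root.resolve("a.txt"), "hello");
        Files.writeString(root.resolve("sub/b.txt"), "world");
        Path archive = root.getParent().resolve(root.getFileName() + ".zip");
        zip(root, archive);
        System.out.println(entries(archive)); // 4 entries: root dir, a.txt, sub/, sub/b.txt
    }
}
```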


@@ -0,0 +1,23 @@
package com.kamco.cd.geojsonscheduler.config;
import java.util.List;
import lombok.Getter;
import lombok.Setter;
import org.springframework.boot.context.properties.ConfigurationProperties;
@Getter
@Setter
@ConfigurationProperties(prefix = "training-data.docker")
public class DockerProperties {
private String image;
private String user;
private String datasetVolume;
private String imagesVolume;
private String inputRoot;
private String outputRoot;
private int patchSize;
private int overlapPct;
private List<String> trainValTestRatio;
private double keepEmptyRatio;
}
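DockerProperties binds the `training-data.docker` prefix, where Spring Boot's relaxed binding maps kebab-case keys such as `dataset-volume` onto camelCase fields such as `datasetVolume`. The name mapping alone (not the binder itself) can be sketched in plain Java:

```java
public class RelaxedBindingSketch {
    // Reproduces only the kebab-case -> camelCase part of Spring Boot's relaxed
    // binding; the real binder also handles type conversion, lists, etc.
    static String kebabToCamel(String key) {
        StringBuilder sb = new StringBuilder();
        boolean upper = false;
        for (char c : key.toCharArray()) {
            if (c == '-') {
                upper = true; // next character starts a new camelCase word
            } else {
                sb.append(upper ? Character.toUpperCase(c) : c);
                upper = false;
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(kebabToCamel("dataset-volume")); // datasetVolume
        System.out.println(kebabToCamel("keep-empty-ratio")); // keepEmptyRatio
    }
}
```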


@@ -0,0 +1,25 @@
package com.kamco.cd.geojsonscheduler.config;
import lombok.Getter;
import lombok.Setter;
import org.springframework.boot.context.properties.ConfigurationProperties;
@Getter
@Setter
@ConfigurationProperties(prefix = "train-data.docker")
public class TrainDockerProperties {
private String image;
private String dataVolume;
private String checkpointsVolume;
private String datasetFolder;
private String outputFolder;
private String inputSize;
private String cropSize;
private int batchSize;
private String gpuIds;
private int gpus;
private String lr;
private String backbone;
private int epochs;
}


@@ -0,0 +1,121 @@
package com.kamco.cd.geojsonscheduler.dto;
import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonPropertyOrder;
import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;
import com.kamco.cd.geojsonscheduler.dto.TrainingDataReviewJobDto.CompleteLabelData.GeoJsonFeature;
import java.util.List;
import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Getter;
import lombok.RequiredArgsConstructor;
import lombok.Setter;
public class TrainingDataReviewJobDto {
@Getter
@Setter
@RequiredArgsConstructor
@AllArgsConstructor
public static class AnalCntInfo {
Long analUid;
String resultUid;
Long allCnt;
Long completeCnt;
Long fileCnt;
}
@Getter
@Builder
@AllArgsConstructor
public static class AnalMapSheetList {
private Integer compareYyyy;
private Integer targetYyyy;
private String mapSheetNum;
public String buildFilename(String resultUid) {
return String.format(
"%s_%s_%s_%s_D15.geojson",
resultUid.substring(0, 8),
compareYyyy,
targetYyyy,
mapSheetNum);
}
}
@Getter
@Setter
@JsonPropertyOrder({"type", "features"})
public static class FeatureCollection {
private final String type = "FeatureCollection";
private List<GeoJsonFeature> features;
public FeatureCollection(List<GeoJsonFeature> features) {
this.features = features;
}
}
@Getter
@Setter
@JsonPropertyOrder({"type", "geometry", "properties"})
public static class CompleteLabelData {
private Long geoUid;
private String type;
@JsonIgnore private String geomStr;
private JsonNode geometry;
private Properties properties;
public CompleteLabelData(Long geoUid, String type, String geomStr, Properties properties) {
this.geoUid = geoUid;
this.type = type;
this.geomStr = geomStr;
ObjectMapper mapper = new ObjectMapper();
JsonNode jsonNode = null;
try {
if (geomStr != null) {
jsonNode = mapper.readTree(this.geomStr);
}
} catch (JsonProcessingException e) {
throw new RuntimeException(e);
}
this.geometry = jsonNode;
if (jsonNode != null && jsonNode.isObject()) {
((ObjectNode) jsonNode).remove("crs");
}
this.properties = properties;
}
@Getter
@Setter
@RequiredArgsConstructor
@AllArgsConstructor
public static class Properties {
private String modelId;
private String before;
private String after;
}
@Getter
@AllArgsConstructor
public static class GeoJsonFeature {
private String type;
private JsonNode geometry;
private Properties properties;
public static GeoJsonFeature from(CompleteLabelData data) {
return new GeoJsonFeature(data.getType(), data.getGeometry(), data.getProperties());
}
}
}
}
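AnalMapSheetList.buildFilename above can be checked in isolation: the first eight characters of the resultUid are joined with the two years and the map-sheet number. A plain-Java sketch reproducing the same String.format outside the DTO (the UUID value here is illustrative, extended from the ED80D700 example in the Javadoc):

```java
public class FilenameSketch {
    // Mirrors AnalMapSheetList.buildFilename:
    // {first 8 chars of resultUid}_{compareYyyy}_{targetYyyy}_{mapSheetNum}_D15.geojson
    static String buildFilename(String resultUid, int compareYyyy, int targetYyyy, String mapSheetNum) {
        return String.format("%s_%s_%s_%s_D15.geojson",
                resultUid.substring(0, 8), compareYyyy, targetYyyy, mapSheetNum);
    }

    public static void main(String[] args) {
        // Matches the example in the MakeGeoJsonTasklet Javadoc
        System.out.println(buildFilename("ED80D700-1234-5678-9abc-def012345678", 2022, 2023, "3724036"));
        // -> ED80D700_2022_2023_3724036_D15.geojson
    }
}
```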


@@ -0,0 +1,18 @@
package com.kamco.cd.geojsonscheduler.enums;
import lombok.AllArgsConstructor;
import lombok.Getter;
@Getter
@AllArgsConstructor
public enum InspectState {
UNCONFIRM("미확인"),
EXCEPT("제외"),
COMPLETE("완료");
private final String desc;
public String getId() {
return name();
}
}


@@ -0,0 +1,19 @@
package com.kamco.cd.geojsonscheduler.enums;
import lombok.AllArgsConstructor;
import lombok.Getter;
@Getter
@AllArgsConstructor
public enum LabelMngState {
PENDING("작업대기"),
ASSIGNED("작업할당"),
ING("진행중"),
FINISH("종료");
private final String desc;
public String getId() {
return name();
}
}


@@ -0,0 +1,90 @@
package com.kamco.cd.geojsonscheduler.listener;
import lombok.extern.log4j.Log4j2;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobExecutionListener;
import org.springframework.batch.core.BatchStatus;
import org.springframework.stereotype.Component;
import java.time.Duration;
import java.util.UUID;
@Log4j2
@Component
public class BatchHistoryListener implements JobExecutionListener {
private final BatchHistoryService batchHistoryService;
public BatchHistoryListener(BatchHistoryService batchHistoryService) {
this.batchHistoryService = batchHistoryService;
}
@Override
public void beforeJob(JobExecution jobExecution) {
log.info("=========================================================");
log.info("배치 Job 시작 - BatchHistoryListener");
log.info("=========================================================");
// 1. Generate a UUID (or take one from the job parameters)
UUID uuid = UUID.randomUUID();
log.info("배치 UUID 생성: {}", uuid);
// 2. Store the UUID in the ExecutionContext (so afterJob can read it)
jobExecution.getExecutionContext().put("batch_uuid", uuid);
// 3. Read the job name and the business ID (from the parameters)
String jobName = jobExecution.getJobInstance().getJobName();
String businessId = jobExecution.getJobParameters().getString("id", "UNKNOWN"); // assumes a parameter named 'id'
log.info("Job Name: {}", jobName);
log.info("Business ID: {}", businessId);
log.info("Job Instance ID: {}", jobExecution.getJobInstance().getInstanceId());
log.info("Job Execution ID: {}", jobExecution.getId());
// 4. Record the start
log.info("batch_history 테이블에 시작 기록 저장 중...");
batchHistoryService.startBatch(uuid, jobName, businessId);
log.info("배치 시작 기록 저장 완료");
}
@Override
public void afterJob(JobExecution jobExecution) {
log.info("=========================================================");
log.info("배치 Job 종료 - BatchHistoryListener");
log.info("=========================================================");
// 1. Retrieve the stored UUID
UUID uuid = (UUID) jobExecution.getExecutionContext().get("batch_uuid");
log.info("배치 UUID: {}", uuid);
// 2. Determine success (COMPLETED means success; anything else is failure)
boolean isSuccess = jobExecution.getStatus() == BatchStatus.COMPLETED;
log.info("배치 상태: {}", jobExecution.getStatus());
log.info("배치 성공 여부: {}", isSuccess ? "성공" : "실패");
if (jobExecution.getStatus() == BatchStatus.FAILED) {
log.error("배치 실행 실패!");
jobExecution.getAllFailureExceptions().forEach(t ->
log.error("실패 원인: {}", t.getMessage(), t)
);
}
// Compute the execution time
if (jobExecution.getStartTime() != null && jobExecution.getEndTime() != null) {
Duration duration = Duration.between(jobExecution.getStartTime(), jobExecution.getEndTime());
long seconds = duration.getSeconds();
long millis = duration.toMillis();
log.info("배치 실행 시간: {} ms ({} 초)", millis, seconds);
}
// 3. Record the finish
if (uuid != null) {
log.info("batch_history 테이블에 종료 기록 저장 중...");
batchHistoryService.finishBatch(uuid, isSuccess);
log.info("배치 종료 기록 저장 완료");
} else {
log.warn("배치 UUID가 없어 종료 기록을 저장할 수 없습니다.");
}
log.info("=========================================================");
}
}


@@ -0,0 +1,89 @@
package com.kamco.cd.geojsonscheduler.listener;
import lombok.extern.log4j.Log4j2;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.sql.Timestamp;
import java.time.LocalDateTime;
import java.util.UUID;
@Log4j2
@Service
public class BatchHistoryService {
private final JdbcTemplate jdbcTemplate;
public BatchHistoryService(JdbcTemplate jdbcTemplate) {
this.jdbcTemplate = jdbcTemplate;
}
/**
 * Records batch history at job start (INSERT)
 */
@Transactional
public void startBatch(UUID uuid, String jobName, String businessId) {
log.info("[BatchHistoryService] 배치 시작 기록 저장");
log.info(" UUID: {}", uuid);
log.info(" Job Name: {}", jobName);
log.info(" Business ID: {}", businessId);
String sql = """
INSERT INTO public.batch_history
(uuid, job, id, created_dttm, updated_dttm, status)
VALUES (?, ?, ?, ?, ?, ?)
""";
Timestamp now = Timestamp.valueOf(LocalDateTime.now());
log.info(" 시작 시간: {}", now);
// The initial status is saved as 'STARTED'
int rowsAffected = jdbcTemplate.update(sql,
uuid,
jobName,
businessId,
now, // created_dttm
now, // updated_dttm
"STARTED"
);
log.info("[BatchHistoryService] 배치 시작 기록 저장 완료 ({} rows affected)", rowsAffected);
}
/**
 * Updates batch history at job finish (UPDATE)
 */
@Transactional
public void finishBatch(UUID uuid, boolean isSuccess) {
log.info("[BatchHistoryService] 배치 종료 기록 업데이트");
log.info(" UUID: {}", uuid);
log.info(" 성공 여부: {}", isSuccess);
String sql = """
UPDATE public.batch_history
SET status = ?,
updated_dttm = ?,
completed_dttm = ?
WHERE uuid = ?
""";
Timestamp now = Timestamp.valueOf(LocalDateTime.now());
String status = isSuccess ? "COMPLETED" : "FAILED";
log.info(" 완료 시간: {}", now);
log.info(" 최종 상태: {}", status);
int rowsAffected = jdbcTemplate.update(sql,
status,
now, // updated_dttm (last modified time)
now, // completed_dttm (completion time)
uuid
);
if (rowsAffected > 0) {
log.info("[BatchHistoryService] 배치 종료 기록 업데이트 완료 ({} rows affected)", rowsAffected);
} else {
log.warn("[BatchHistoryService] 업데이트된 row가 없습니다. UUID가 존재하지 않을 수 있습니다: {}", uuid);
}
}
}


@@ -0,0 +1,148 @@
package com.kamco.cd.geojsonscheduler.listener;
import com.kamco.cd.geojsonscheduler.repository.BatchStepHistoryRepository;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.StepExecutionListener;
import org.springframework.stereotype.Component;
/**
 * Step execution history Listener
 *
 * <p>Runs at the start and end of each Step and records the execution history in the
 * batch_step_history table: an INSERT with status STARTED when the Step begins, and an UPDATE to
 * SUCCESS or FAILED when it ends.
 *
 * <p><b>Main responsibilities:</b>
 *
 * <ul>
 *   <li>Record STARTED in the DB when a Step begins
 *   <li>Update the DB to SUCCESS or FAILED when a Step ends
 *   <li>Automatically record the error message on failure
 *   <li>Handle its own exceptions so listener errors never affect Step execution
 * </ul>
 *
 * <p><b>Applied to:</b>
 *
 * <ul>
 *   <li>makeGeoJsonStep
 *   <li>dockerRunStep
 *   <li>zipResponseStep
 * </ul>
 *
 * <p><b>Required JobParameters:</b>
 *
 * <ul>
 *   <li>analUid (Long): analysis round UID
 *   <li>resultUid (String): result unique ID
 * </ul>
 *
 * @author KAMCO Development Team
 * @since 1.0.0
 * @see BatchStepHistoryRepository
 */
@Log4j2
@Component
@RequiredArgsConstructor
public class StepHistoryListener implements StepExecutionListener {
/** Repository for persisting Step history to the DB */
private final BatchStepHistoryRepository batchStepHistoryRepository;
@Override
public void beforeStep(StepExecution stepExecution) {
log.info("=========================================================");
log.info("Step 시작 - StepHistoryListener");
log.info("=========================================================");
String stepName = stepExecution.getStepName();
log.info("Step Name: {}", stepName);
try {
Long analUid = stepExecution.getJobParameters().getLong("analUid");
String resultUid = stepExecution.getJobParameters().getString("resultUid");
if (analUid == null || resultUid == null) {
log.warn(
"JobParameters에 analUid 또는 resultUid가 없어 Step 이력을 기록할 수 없습니다.");
return;
}
log.info("AnalUid: {}, ResultUid: {}", analUid, resultUid);
// Record the Step start
batchStepHistoryRepository.startStep(analUid, resultUid, stepName);
log.info("Step 시작 기록 저장 완료");
} catch (Exception e) {
log.error("Step 시작 기록 저장 실패: {}", e.getMessage(), e);
// Do not rethrow so a listener error cannot interfere with Step execution
}
}
@Override
public ExitStatus afterStep(StepExecution stepExecution) {
log.info("=========================================================");
log.info("Step 종료 - StepHistoryListener");
log.info("=========================================================");
String stepName = stepExecution.getStepName();
log.info("Step Name: {}", stepName);
log.info("Step Exit Status: {}", stepExecution.getExitStatus());
try {
Long analUid = stepExecution.getJobParameters().getLong("analUid");
String resultUid = stepExecution.getJobParameters().getString("resultUid");
if (analUid == null || resultUid == null) {
log.warn(
"JobParameters에 analUid 또는 resultUid가 없어 Step 이력을 기록할 수 없습니다.");
return stepExecution.getExitStatus();
}
log.info("AnalUid: {}, ResultUid: {}", analUid, resultUid);
// Determine whether the Step succeeded
boolean isSuccess = ExitStatus.COMPLETED.equals(stepExecution.getExitStatus());
log.info("Step 성공 여부: {}", isSuccess ? "성공" : "실패");
if (isSuccess) {
// Record the Step success
batchStepHistoryRepository.finishStepSuccess(analUid, resultUid, stepName);
log.info("Step 성공 기록 저장 완료");
} else {
// Record the Step failure
String errorMessage = buildErrorMessage(stepExecution);
batchStepHistoryRepository.finishStepFailed(analUid, resultUid, stepName, errorMessage);
log.info("Step 실패 기록 저장 완료");
}
} catch (Exception e) {
log.error("Step 종료 기록 저장 실패: {}", e.getMessage(), e);
// Do not rethrow so a listener error cannot interfere with Step execution
}
log.info("=========================================================");
return stepExecution.getExitStatus();
}
/**
 * Builds the error message recorded when a Step fails
 */
private String buildErrorMessage(StepExecution stepExecution) {
StringBuilder sb = new StringBuilder();
sb.append("ExitStatus: ").append(stepExecution.getExitStatus()).append("\n");
if (!stepExecution.getFailureExceptions().isEmpty()) {
sb.append("Failure Exceptions:\n");
for (Throwable t : stepExecution.getFailureExceptions()) {
sb.append("- ").append(t.getClass().getSimpleName()).append(": ").append(t.getMessage())
.append("\n");
}
}
return sb.toString();
}
}
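The message format produced by buildErrorMessage can be reproduced standalone; the exit status string and exception list below are stand-ins, not Spring Batch types:

```java
import java.util.List;

public class ErrorMessageSketch {
    // Same shape as buildErrorMessage above: an ExitStatus line, then one
    // "- <ExceptionClass>: <message>" line per recorded failure.
    static String build(String exitStatus, List<Throwable> failures) {
        StringBuilder sb = new StringBuilder();
        sb.append("ExitStatus: ").append(exitStatus).append("\n");
        if (!failures.isEmpty()) {
            sb.append("Failure Exceptions:\n");
            for (Throwable t : failures) {
                sb.append("- ").append(t.getClass().getSimpleName())
                        .append(": ").append(t.getMessage()).append("\n");
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        String msg = build("FAILED", List.of(new RuntimeException("boom")));
        System.out.print(msg);
    }
}
```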


@@ -0,0 +1,169 @@
package com.kamco.cd.geojsonscheduler.repository;
import java.sql.Timestamp;
import java.time.LocalDateTime;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.Transactional;
/**
 * Repository for batch step execution history
 *
 * <p>Saves and looks up per-step execution history for each AnalCntInfo in the batch_step_history
 * table. A STARTED row is inserted when a step begins, then updated to SUCCESS or FAILED when the
 * step ends.
 *
 * <p><b>Main responsibilities:</b>
 *
 * <ul>
 *   <li>Record step start (status=STARTED)
 *   <li>Record step success (status=SUCCESS)
 *   <li>Record step failure (status=FAILED, with error_message)
 * </ul>
 *
 * <p><b>Table columns:</b>
 *
 * <ul>
 *   <li>id: unique step history ID (BIGSERIAL)
 *   <li>anal_uid: analysis UID
 *   <li>result_uid: result UID
 *   <li>step_name: step name (makeGeoJsonStep/dockerRunStep/zipResponseStep)
 *   <li>status: status (STARTED/SUCCESS/FAILED)
 *   <li>error_message: error message (on failure)
 *   <li>started_dttm: step start time
 *   <li>completed_dttm: step completion time
 * </ul>
 *
 * @author KAMCO Development Team
 * @since 1.0.0
 */
@Log4j2
@Repository
@RequiredArgsConstructor
public class BatchStepHistoryRepository {
/** Template for executing JDBC queries */
private final JdbcTemplate jdbcTemplate;
/**
 * Records the start of a step
 */
@Transactional
public void startStep(Long analUid, String resultUid, String stepName) {
log.info("[BatchStepHistoryRepository] Saving step start record");
log.info(" AnalUid: {}, ResultUid: {}, StepName: {}", analUid, resultUid, stepName);
String sql =
"""
INSERT INTO public.batch_step_history
(anal_uid, result_uid, step_name, status, started_dttm, created_dttm, updated_dttm)
VALUES (?, ?, ?, ?, ?, ?, ?)
""";
Timestamp now = Timestamp.valueOf(LocalDateTime.now());
int rowsAffected =
jdbcTemplate.update(
sql,
analUid,
resultUid,
stepName,
"STARTED",
now, // started_dttm
now, // created_dttm
now // updated_dttm
);
log.info(
"[BatchStepHistoryRepository] Step start record saved ({} rows affected)", rowsAffected);
}
/**
 * Records a step as successful
 */
@Transactional
public void finishStepSuccess(Long analUid, String resultUid, String stepName) {
log.info("[BatchStepHistoryRepository] Updating step success record");
log.info(" AnalUid: {}, ResultUid: {}, StepName: {}", analUid, resultUid, stepName);
String sql =
"""
UPDATE public.batch_step_history
SET status = ?,
completed_dttm = ?,
updated_dttm = ?
WHERE anal_uid = ?
AND result_uid = ?
AND step_name = ?
AND status = 'STARTED'
""";
Timestamp now = Timestamp.valueOf(LocalDateTime.now());
int rowsAffected =
jdbcTemplate.update(sql, "SUCCESS", now, now, analUid, resultUid, stepName);
if (rowsAffected > 0) {
log.info(
"[BatchStepHistoryRepository] Step success record updated ({} rows affected)",
rowsAffected);
} else {
log.warn(
"[BatchStepHistoryRepository] No rows updated. AnalUid: {}, ResultUid: {},"
+ " StepName: {}",
analUid,
resultUid,
stepName);
}
}
/**
 * Records a step as failed
 */
@Transactional
public void finishStepFailed(
Long analUid, String resultUid, String stepName, String errorMessage) {
log.info("[BatchStepHistoryRepository] Updating step failure record");
log.info(" AnalUid: {}, ResultUid: {}, StepName: {}", analUid, resultUid, stepName);
log.info(" ErrorMessage: {}", errorMessage);
String sql =
"""
UPDATE public.batch_step_history
SET status = ?,
error_message = ?,
completed_dttm = ?,
updated_dttm = ?
WHERE anal_uid = ?
AND result_uid = ?
AND step_name = ?
AND status = 'STARTED'
""";
Timestamp now = Timestamp.valueOf(LocalDateTime.now());
// Limit error_message to at most 1000 characters
String truncatedError =
errorMessage != null && errorMessage.length() > 1000
? errorMessage.substring(0, 1000)
: errorMessage;
int rowsAffected =
jdbcTemplate.update(
sql, "FAILED", truncatedError, now, now, analUid, resultUid, stepName);
if (rowsAffected > 0) {
log.info(
"[BatchStepHistoryRepository] Step failure record updated ({} rows affected)",
rowsAffected);
} else {
log.warn(
"[BatchStepHistoryRepository] No rows updated. AnalUid: {}, ResultUid: {},"
+ " StepName: {}",
analUid,
resultUid,
stepName);
}
}
}
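The 1000-character guard in `finishStepFailed` can be checked in isolation. A minimal sketch (the helper name `truncate` is illustrative, not part of the repository class):

```java
public class TruncateSketch {
    // Same guard as finishStepFailed: cap the error message at 1000 characters, pass null through
    static String truncate(String errorMessage) {
        return errorMessage != null && errorMessage.length() > 1000
                ? errorMessage.substring(0, 1000)
                : errorMessage;
    }

    public static void main(String[] args) {
        String longMsg = "x".repeat(1500);
        System.out.println(truncate(longMsg).length()); // 1000
        System.out.println(truncate("short"));          // short
    }
}
```

Since the column is `TEXT`, the cap is a policy choice rather than a schema constraint; it keeps oversized stack traces out of the history table.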


@@ -0,0 +1,127 @@
package com.kamco.cd.geojsonscheduler.repository;
import com.kamco.cd.geojsonscheduler.dto.TrainingDataReviewJobDto.AnalCntInfo;
import com.kamco.cd.geojsonscheduler.dto.TrainingDataReviewJobDto.AnalMapSheetList;
import com.kamco.cd.geojsonscheduler.dto.TrainingDataReviewJobDto.CompleteLabelData;
import com.kamco.cd.geojsonscheduler.dto.TrainingDataReviewJobDto.CompleteLabelData.Properties;
import java.sql.Timestamp;
import java.time.LocalDate;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.util.List;
import java.util.stream.Collectors;
import lombok.RequiredArgsConstructor;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Repository;
@Repository
@RequiredArgsConstructor
public class TrainingDataReviewJobRepository {
private final JdbcTemplate jdbcTemplate;
public List<AnalCntInfo> findAnalCntInfoList() {
String sql =
"""
SELECT
la.anal_uid,
msl.uid AS result_uid,
SUM(CASE WHEN la.inspect_state IN ('UNCONFIRM', 'COMPLETE') OR la.inspect_state IS NULL THEN 1 ELSE 0 END) AS all_cnt,
SUM(CASE WHEN la.inspect_state = 'COMPLETE' THEN 1 ELSE 0 END) AS complete_cnt,
SUM(CASE WHEN mslg.file_create_yn = true THEN 1 ELSE 0 END) AS file_cnt
FROM tb_labeling_assignment la
INNER JOIN tb_map_sheet_anal_inference msai ON la.anal_uid = msai.anal_uid AND msai.anal_state = 'ING'
LEFT JOIN tb_map_sheet_learn msl ON msai.learn_id = msl.id
LEFT JOIN tb_map_sheet_learn_data_geom mslg ON la.inference_geom_uid = mslg.geo_uid
GROUP BY la.anal_uid, msl.uid
HAVING SUM(CASE WHEN la.inspect_state = 'COMPLETE' THEN 1 ELSE 0 END) > 0
""";
return jdbcTemplate.query(
sql,
(rs, rowNum) ->
new AnalCntInfo(
rs.getLong("anal_uid"),
rs.getString("result_uid"),
rs.getLong("all_cnt"),
rs.getLong("complete_cnt"),
rs.getLong("file_cnt")));
}
public List<AnalMapSheetList> findCompletedAnalMapSheetList(Long analUid) {
ZonedDateTime end =
LocalDate.now(ZoneId.of("Asia/Seoul")).atStartOfDay(ZoneId.of("Asia/Seoul"));
String sql =
"""
SELECT
msai.compare_yyyy,
msai.target_yyyy,
la.assign_group_id
FROM tb_labeling_assignment la
INNER JOIN tb_map_sheet_anal_inference msai ON la.anal_uid = msai.anal_uid
WHERE la.anal_uid = ?
AND la.inspect_state = 'COMPLETE'
AND la.inspect_stat_dttm < ?
GROUP BY msai.compare_yyyy, msai.target_yyyy, la.assign_group_id
""";
return jdbcTemplate.query(
sql,
(rs, rowNum) ->
AnalMapSheetList.builder()
.compareYyyy(rs.getInt("compare_yyyy"))
.targetYyyy(rs.getInt("target_yyyy"))
.mapSheetNum(rs.getString("assign_group_id"))
.build(),
analUid,
Timestamp.from(end.toInstant()));
}
public List<CompleteLabelData> findCompletedYesterdayLabelingList(
Long analUid, String mapSheetNum) {
ZonedDateTime end =
LocalDate.now(ZoneId.of("Asia/Seoul")).atStartOfDay(ZoneId.of("Asia/Seoul"));
String sql =
"""
SELECT
mslg.geo_uid,
'Feature' AS type,
ST_AsGeoJSON(ST_Transform(mslg.geom, 4326)) AS geom_str,
CASE
WHEN mslg.class_after_cd IN ('building', 'container') THEN 'M1'
WHEN mslg.class_after_cd = 'waste' THEN 'M2'
ELSE 'M3'
END AS model_id,
mslg.class_before_cd,
mslg.class_after_cd
FROM tb_labeling_assignment la
LEFT JOIN tb_map_sheet_learn_data_geom mslg ON la.inference_geom_uid = mslg.geo_uid
WHERE la.anal_uid = ?
AND la.assign_group_id = ?
AND la.inspect_state = 'COMPLETE'
AND la.inspect_stat_dttm < ?
""";
return jdbcTemplate.query(
sql,
(rs, rowNum) ->
new CompleteLabelData(
rs.getLong("geo_uid"),
rs.getString("type"),
rs.getString("geom_str"),
new Properties(
rs.getString("model_id"),
rs.getString("class_before_cd"),
rs.getString("class_after_cd"))),
analUid,
mapSheetNum,
Timestamp.from(end.toInstant()));
}
public void updateLearnDataGeomFileCreateYn(List<Long> geoUids) {
String placeholders = geoUids.stream().map(id -> "?").collect(Collectors.joining(","));
String sql =
"UPDATE tb_map_sheet_learn_data_geom SET file_create_yn = true, updated_dttm = NOW()"
+ " WHERE geo_uid IN ("
+ placeholders
+ ")";
jdbcTemplate.update(sql, geoUids.toArray());
}
}
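The dynamic IN clause in `updateLearnDataGeomFileCreateYn` hinges on generating one `?` per geo_uid. A minimal sketch of that placeholder step (class and method names are illustrative):

```java
import java.util.List;
import java.util.stream.Collectors;

public class PlaceholderSketch {
    // Same expression the repository uses to build "?,?,?" for the IN (...) clause
    static String placeholders(List<Long> geoUids) {
        return geoUids.stream().map(id -> "?").collect(Collectors.joining(","));
    }

    public static void main(String[] args) {
        List<Long> geoUids = List.of(101L, 102L, 103L);
        String sql = "UPDATE tb_map_sheet_learn_data_geom SET file_create_yn = true"
                + " WHERE geo_uid IN (" + placeholders(geoUids) + ")";
        System.out.println(sql);
    }
}
```

Note that an empty `geoUids` list would render `IN ()`, which PostgreSQL rejects; presumably callers only invoke the update with a non-empty list, but a guard would make that explicit.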


@@ -0,0 +1,215 @@
package com.kamco.cd.geojsonscheduler.service;
import com.kamco.cd.geojsonscheduler.config.DockerProperties;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.stereotype.Service;
/**
 * Docker container execution service
 *
 * <p>Runs the Docker container that hosts the training-data generation pipeline and monitors the
 * result. The container's standard output is logged in real time; on abnormal exit a
 * RuntimeException is thrown so that the batch step is marked failed.
 *
 * <p><b>Main responsibilities:</b>
 *
 * <ul>
 *   <li>Build the docker run command (volume mounts, environment variables, parameters)
 *   <li>Start the Docker process and stream its log output
 *   <li>Validate the exit code (throws when non-zero)
 *   <li>Handle process interruption
 * </ul>
 *
 * <p><b>Docker command structure:</b>
*
* <pre>
* docker run --rm
* --user {dockerUser}
* -v {datasetVolume}
* -v {imagesVolume}
* --entrypoint python
* {dockerImage}
* code/kamco_full_pipeline.py
* --labelling-folder request/{resultUid}
* --output-folder response/{resultUid}
* --input_root {inputRoot}
* --output_root {outputRoot}
* --patch_size {patchSize}
* --overlap_pct {overlapPct}
* --train_val_test_ratio {train} {val} {test}
* --keep_empty_ratio {keepEmptyRatio}
* </pre>
*
* @author KAMCO Development Team
* @since 1.0.0
* @see DockerProperties
*/
@Log4j2
@Service
@RequiredArgsConstructor
public class DockerRunnerService {
/** Docker execution settings (application.yml) */
private final DockerProperties dockerProperties;
/**
 * Runs and monitors the Docker container
 *
 * <p>Executes the training-data generation pipeline as a Docker container. The container's
 * standard output is logged in real time; on abnormal exit a RuntimeException is thrown and the
 * step is treated as failed.
 *
 * @param resultUid unique result ID (UUID)
 * @throws RuntimeException when the Docker process fails (exitCode != 0), on IO error, or when
 *     interrupted
 */
public void run(String resultUid) {
// Step 1: Build the docker command
log.info("[Step 1/4] Building docker command...");
List<String> command = buildCommand(resultUid);
log.info("[Step 1/4] Docker command built");
log.debug(" - Command: {}", String.join(" ", command));
try {
// Step 2: Start the Docker process
log.info("[Step 2/4] Starting docker process...");
ProcessBuilder pb = new ProcessBuilder(command);
pb.redirectErrorStream(true); // redirect stderr to stdout
Process process = pb.start();
log.info("[Step 2/4] Docker process started (PID={})", process.pid());
// Step 3: Stream the Docker process output
log.info("[Step 3/4] Monitoring docker process output...");
int lineCount = 0;
try (BufferedReader reader =
new BufferedReader(new InputStreamReader(process.getInputStream()))) {
String line;
while ((line = reader.readLine()) != null) {
log.info("[docker] {}", line);
lineCount++;
}
}
log.info("[Step 3/4] Docker process output finished ({} lines total)", lineCount);
// Step 4: Validate the exit code
log.info("[Step 4/4] Waiting for docker process to exit...");
int exitCode = process.waitFor();
log.info("[Step 4/4] Docker process exited (exitCode={})", exitCode);
if (exitCode != 0) {
// Abnormal Docker exit (mark the step as failed)
log.error("[FAILED] Docker process exited abnormally!");
log.error(" - Exit Code: {}", exitCode);
log.error(" - ResultUid: {}", resultUid);
log.error(" - Things to check:");
log.error(" 1. Docker container logs (see the [docker] lines above)");
log.error(" 2. GeoJSON files under the request/{}/ directory", resultUid);
log.error(" 3. Docker image and volume mount paths");
throw new RuntimeException(
String.format(
"Docker process failed with exit code %d for resultUid: %s",
exitCode, resultUid));
} else {
// Normal Docker exit
log.info("[SUCCESS] Docker process finished normally.");
log.info(" - ResultUid: {}", resultUid);
log.info(" - Output location: /dataset/response/{}/", resultUid);
}
} catch (IOException e) {
// Failed to run the docker command (filesystem error, etc.)
log.error("[FAILED] IO error while running docker command!", e);
log.error(" - ResultUid: {}", resultUid);
log.error(" - Error message: {}", e.getMessage());
throw new RuntimeException("Failed to run docker command for resultUid: " + resultUid, e);
} catch (InterruptedException e) {
// Docker process interrupted (user cancellation or system shutdown)
log.error("[INTERRUPTED] Docker process was interrupted!", e);
log.error(" - ResultUid: {}", resultUid);
log.error(" - Error message: {}", e.getMessage());
Thread.currentThread().interrupt(); // restore interrupt status
throw new RuntimeException("Docker process interrupted for resultUid: " + resultUid, e);
}
}
/**
 * Builds the docker command
 *
 * <p>Builds the docker run command from the DockerProperties settings and the resultUid.
 *
 * @param resultUid unique result ID (used in the input/output folder paths)
 * @return docker command tokens (for ProcessBuilder)
 */
private List<String> buildCommand(String resultUid) {
log.debug("Building docker command parameters...");
List<String> cmd = new ArrayList<>();
// Base docker command
cmd.add("docker");
cmd.add("run");
cmd.add("--rm"); // remove the container automatically on exit
// User and permission settings
cmd.add("--user");
cmd.add(dockerProperties.getUser()); // e.g., "1000:1000"
log.debug(" - User: {}", dockerProperties.getUser());
// Volume mounts (host:container)
cmd.add("-v");
cmd.add(dockerProperties.getDatasetVolume()); // e.g., "/kamco-nfs/dataset:/dataset"
log.debug(" - Dataset Volume: {}", dockerProperties.getDatasetVolume());
cmd.add("-v");
cmd.add(dockerProperties.getImagesVolume()); // e.g., "/kamco-nfs/images:/images"
log.debug(" - Images Volume: {}", dockerProperties.getImagesVolume());
// Entrypoint and image
cmd.add("--entrypoint");
cmd.add("python"); // run the script with Python
cmd.add(dockerProperties.getImage()); // e.g., "kamco/dataset-generator:latest"
log.debug(" - Image: {}", dockerProperties.getImage());
// Python script and parameters
cmd.add("code/kamco_full_pipeline.py");
// Input/output folder settings
cmd.add("--labelling-folder");
cmd.add("request/" + resultUid);
log.debug(" - Labelling Folder: request/{}", resultUid);
cmd.add("--output-folder");
cmd.add("response/" + resultUid);
log.debug(" - Output Folder: response/{}", resultUid);
// Pipeline parameters
cmd.add("--input_root");
cmd.add(dockerProperties.getInputRoot());
cmd.add("--output_root");
cmd.add(dockerProperties.getOutputRoot());
cmd.add("--patch_size");
cmd.add(String.valueOf(dockerProperties.getPatchSize()));
log.debug(" - Patch Size: {}", dockerProperties.getPatchSize());
cmd.add("--overlap_pct");
cmd.add(String.valueOf(dockerProperties.getOverlapPct()));
log.debug(" - Overlap Percent: {}", dockerProperties.getOverlapPct());
cmd.add("--train_val_test_ratio");
cmd.addAll(dockerProperties.getTrainValTestRatio()); // e.g., ["0.7", "0.2", "0.1"]
log.debug(" - Train/Val/Test Ratio: {}", dockerProperties.getTrainValTestRatio());
cmd.add("--keep_empty_ratio");
cmd.add(String.valueOf(dockerProperties.getKeepEmptyRatio()));
log.debug(" - Keep Empty Ratio: {}", dockerProperties.getKeepEmptyRatio());
log.debug("Docker command parameters built");
return cmd;
}
}
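The run-and-capture pattern used above (merge stderr into stdout, stream lines, then waitFor) can be tried with any short-lived command. A minimal sketch, assuming a POSIX `echo` on the PATH in place of docker:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.List;

public class ProcessSketch {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(List.of("echo", "hello"));
        pb.redirectErrorStream(true); // merge stderr into stdout, as DockerRunnerService does
        Process process = pb.start();
        // Stream output line by line; for docker this is where the [docker] log lines come from
        try (BufferedReader reader =
                new BufferedReader(new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println("[proc] " + line);
            }
        }
        int exitCode = process.waitFor(); // non-zero would be treated as step failure
        System.out.println("exitCode=" + exitCode);
    }
}
```

Draining stdout before `waitFor()` matters: with `redirectErrorStream(true)` a process that fills the pipe buffer would otherwise block, so the read loop doubles as both logging and back-pressure relief.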


@@ -0,0 +1,121 @@
package com.kamco.cd.geojsonscheduler.service;
import com.kamco.cd.geojsonscheduler.config.TrainDockerProperties;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.stereotype.Service;
@Log4j2
@Service
@RequiredArgsConstructor
public class TrainDockerRunnerService {
private final TrainDockerProperties trainDockerProperties;
public void runTraining(String datasetFolder, String outputFolder) {
log.info("========================================");
log.info("Starting train docker run");
log.info("Dataset Folder: {}", datasetFolder);
log.info("Output Folder: {}", outputFolder);
log.info("========================================");
List<String> command = buildTrainCommand(datasetFolder, outputFolder);
log.info("Docker command: {}", String.join(" ", command));
try {
ProcessBuilder pb = new ProcessBuilder(command);
pb.redirectErrorStream(true);
Process process = pb.start();
log.info("Docker process started (detached mode)");
try (BufferedReader reader =
new BufferedReader(new InputStreamReader(process.getInputStream()))) {
String line;
while ((line = reader.readLine()) != null) {
log.info("[train-docker] {}", line);
}
}
int exitCode = process.waitFor();
if (exitCode != 0) {
log.error("Train docker process failed: exitCode={}, datasetFolder={}, outputFolder={}",
exitCode, datasetFolder, outputFolder);
} else {
log.info("Train docker process launched: datasetFolder={}, outputFolder={}",
datasetFolder, outputFolder);
}
} catch (IOException e) {
log.error("Failed to run train docker: datasetFolder={}, outputFolder={}, error={}",
datasetFolder, outputFolder, e.getMessage(), e);
} catch (InterruptedException e) {
log.error("Train docker process interrupted: datasetFolder={}, outputFolder={}, error={}",
datasetFolder, outputFolder, e.getMessage(), e);
Thread.currentThread().interrupt();
}
log.info("========================================");
log.info("Train docker run finished");
log.info("========================================");
}
private List<String> buildTrainCommand(String datasetFolder, String outputFolder) {
List<String> cmd = new ArrayList<>();
cmd.add("docker");
cmd.add("run");
cmd.add("-d"); // detached mode
cmd.add("--name");
cmd.add("train-cd");
cmd.add("--rm");
cmd.add("--gpus");
cmd.add("all");
cmd.add("--ipc=host");
cmd.add("--shm-size=16g");
cmd.add("--ulimit");
cmd.add("memlock=-1");
cmd.add("--ulimit");
cmd.add("stack=67108864");
cmd.add("-e");
cmd.add("NCCL_DEBUG=INFO");
cmd.add("-e");
cmd.add("NCCL_IB_DISABLE=1");
cmd.add("-e");
cmd.add("NCCL_P2P_DISABLE=0");
cmd.add("-e");
cmd.add("NCCL_SOCKET_IFNAME=eth0");
cmd.add("-v");
cmd.add(trainDockerProperties.getDataVolume());
cmd.add("-v");
cmd.add(trainDockerProperties.getCheckpointsVolume());
cmd.add("-it");
cmd.add(trainDockerProperties.getImage());
cmd.add("python");
cmd.add("/workspace/change-detection-code/train_wrapper.py");
cmd.add("--dataset-folder");
cmd.add(datasetFolder);
cmd.add("--output-folder");
cmd.add(outputFolder);
cmd.add("--input-size");
cmd.add(trainDockerProperties.getInputSize());
cmd.add("--crop-size");
cmd.add(trainDockerProperties.getCropSize());
cmd.add("--batch-size");
cmd.add(String.valueOf(trainDockerProperties.getBatchSize()));
cmd.add("--gpu-ids");
cmd.add(trainDockerProperties.getGpuIds());
cmd.add("--gpus");
cmd.add(String.valueOf(trainDockerProperties.getGpus()));
cmd.add("--lr");
cmd.add(trainDockerProperties.getLr());
cmd.add("--backbone");
cmd.add(trainDockerProperties.getBackbone());
cmd.add("--epochs");
cmd.add(String.valueOf(trainDockerProperties.getEpochs()));
return cmd;
}
}


@@ -0,0 +1,11 @@
spring:
datasource:
url: jdbc:postgresql://192.168.2.127:15432/kamco_cds
username: kamco_cds
password: kamco_cds_Q!W@E#R$
hikari:
minimum-idle: 2
maximum-pool-size: 5
training-data:
geojson-dir: /kamco-nfs/dataset


@@ -0,0 +1,8 @@
spring:
datasource:
url: jdbc:postgresql://localhost:5432/kamco_cds
username: kamco_cds
password: kamco_cds
training-data:
geojson-dir: /tmp/geojson


@@ -0,0 +1,11 @@
spring:
datasource:
url: jdbc:postgresql://127.0.0.1:15432/kamco_cds
username: kamco_cds
password: kamco_cds_Q!W@E#R$
hikari:
minimum-idle: 2
maximum-pool-size: 5
training-data:
geojson-dir: /kamco-nfs/dataset


@@ -0,0 +1,41 @@
spring:
application:
name: kamco-geojson-scheduler
profiles:
active: local
datasource:
driver-class-name: org.postgresql.Driver
hikari:
minimum-idle: 2
maximum-pool-size: 2
connection-timeout: 20000
idle-timeout: 300000
max-lifetime: 1800000
sql:
init:
mode: never
# schema-locations: classpath:sql/schema.sql
# Note: SQL initialization is disabled because the batch_history table already exists.
# If tables need to be created in a new environment, run the SQL below manually:
# src/main/resources/sql/schema.sql
batch:
job:
enabled: true
jdbc:
initialize-schema: always
training-data:
docker:
image: kamco-cd-dataset:latest
user: "1000:1000"
dataset-volume: /kamco-nfs/dataset:/dataset
images-volume: /kamco-nfs/images:/kamco-nfs:ro
input-root: /dataset
output-root: /dataset
patch-size: 512
overlap-pct: 50
train-val-test-ratio:
- "0.7"
- "0.2"
- "0.1"
keep-empty-ratio: 0.1


@@ -0,0 +1,58 @@
-- Create batch_history table
CREATE TABLE IF NOT EXISTS public.batch_history (
uuid UUID PRIMARY KEY,
job VARCHAR(255) NOT NULL,
id VARCHAR(255) NOT NULL,
created_dttm TIMESTAMP NOT NULL,
updated_dttm TIMESTAMP NOT NULL,
status VARCHAR(50) NOT NULL,
completed_dttm TIMESTAMP
);
-- Indexes (to speed up lookups)
CREATE INDEX IF NOT EXISTS idx_batch_history_job ON public.batch_history(job);
CREATE INDEX IF NOT EXISTS idx_batch_history_status ON public.batch_history(status);
CREATE INDEX IF NOT EXISTS idx_batch_history_created ON public.batch_history(created_dttm DESC);
-- Comments
COMMENT ON TABLE public.batch_history IS 'Batch job execution history';
COMMENT ON COLUMN public.batch_history.uuid IS 'Unique batch run ID';
COMMENT ON COLUMN public.batch_history.job IS 'Batch job name';
COMMENT ON COLUMN public.batch_history.id IS 'Business ID';
COMMENT ON COLUMN public.batch_history.created_dttm IS 'Created at';
COMMENT ON COLUMN public.batch_history.updated_dttm IS 'Updated at';
COMMENT ON COLUMN public.batch_history.status IS 'Status (STARTED/COMPLETED/FAILED)';
COMMENT ON COLUMN public.batch_history.completed_dttm IS 'Completed at';
-- Create batch_step_history table
CREATE TABLE IF NOT EXISTS public.batch_step_history (
id BIGSERIAL PRIMARY KEY,
anal_uid BIGINT NOT NULL,
result_uid VARCHAR(255) NOT NULL,
step_name VARCHAR(100) NOT NULL,
status VARCHAR(50) NOT NULL,
error_message TEXT,
started_dttm TIMESTAMP NOT NULL,
completed_dttm TIMESTAMP,
created_dttm TIMESTAMP NOT NULL,
updated_dttm TIMESTAMP NOT NULL
);
-- Indexes
CREATE INDEX IF NOT EXISTS idx_batch_step_history_anal_uid ON public.batch_step_history(anal_uid);
CREATE INDEX IF NOT EXISTS idx_batch_step_history_result_uid ON public.batch_step_history(result_uid);
CREATE INDEX IF NOT EXISTS idx_batch_step_history_status ON public.batch_step_history(status);
CREATE INDEX IF NOT EXISTS idx_batch_step_history_step_name ON public.batch_step_history(step_name);
-- Comments
COMMENT ON TABLE public.batch_step_history IS 'Batch step execution history';
COMMENT ON COLUMN public.batch_step_history.id IS 'Unique step history ID';
COMMENT ON COLUMN public.batch_step_history.anal_uid IS 'Analysis UID';
COMMENT ON COLUMN public.batch_step_history.result_uid IS 'Result UID';
COMMENT ON COLUMN public.batch_step_history.step_name IS 'Step name (makeGeoJsonStep/dockerRunStep/zipResponseStep)';
COMMENT ON COLUMN public.batch_step_history.status IS 'Status (STARTED/SUCCESS/FAILED)';
COMMENT ON COLUMN public.batch_step_history.error_message IS 'Error message';
COMMENT ON COLUMN public.batch_step_history.started_dttm IS 'Step started at';
COMMENT ON COLUMN public.batch_step_history.completed_dttm IS 'Step completed at';
COMMENT ON COLUMN public.batch_step_history.created_dttm IS 'Created at';
COMMENT ON COLUMN public.batch_step_history.updated_dttm IS 'Updated at';