diff --git a/imagery-make-dataset/CODE_STYLE_SETUP.md b/imagery-make-dataset/CODE_STYLE_SETUP.md
new file mode 100755
index 0000000..2c07ff5
--- /dev/null
+++ b/imagery-make-dataset/CODE_STYLE_SETUP.md
@@ -0,0 +1,112 @@
+# Code Style Setup Guide
+
+This guide explains how to configure the project so that Google Java Style is applied automatically.
+
+## Automatic Formatting Setup
+
+### 1. Format on Commit (Git Pre-commit Hook)
+
+Code is formatted and re-staged automatically before every commit.
+
+**Already configured:** the `.git/hooks/pre-commit` file runs automatically.
+
+**How it works:**
+- When a commit is attempted, `./gradlew spotlessApply` runs automatically
+- Staged Java files are formatted automatically
+- Formatted files are re-staged automatically
+- Once formatting finishes, the commit proceeds
+
+**Benefits:**
+- No need to run `spotlessApply` by hand
+- A consistent code style at all times
+- No forgotten formatting
+
+### 2. Format on Save in IntelliJ IDEA
+
+#### Method 1: Import the Code Style scheme (recommended)
+
+1. Open **IntelliJ IDEA**
+2. Open **Settings/Preferences** (Mac: `⌘,` / Windows: `Ctrl+Alt+S`)
+3. Go to **Editor > Code Style > Java**
+4. Click **⚙️ (gear icon)** > **Import Scheme > IntelliJ IDEA code style XML**
+5. Select `intellij-java-google-style.xml` in the project root
+6. Click **OK**
+
+#### Method 2: Enable format on save
+
+**Option A: Actions on Save**
+
+1. **Settings/Preferences** > **Tools > Actions on Save**
+2. Enable the following options:
+   - ✅ **Reformat code**
+   - ✅ **Optimize imports**
+   - ✅ **Rearrange code** (optional)
+3. Choose **Changed lines** or **Whole file**
+4. Click **OK**
+
+**Option B: Save Actions plugin (more options)**
+
+1. **Settings/Preferences** > **Plugins**
+2. Search for "Save Actions" in the **Marketplace** and install it
+3. **Settings/Preferences** > **Other Settings > Save Actions**
+4. Enable the following options:
+   - ✅ **Activate save actions on save**
+   - ✅ **Reformat file**
+   - ✅ **Optimize imports**
+   - ✅ **Rearrange fields and methods** (optional)
+
+### 3. Running the Gradle Spotless Plugin Manually
+
+#### Check formatting
+```bash
+# Report formatting problems only (makes no changes)
+./gradlew spotlessCheck
+```
+
+#### Apply formatting
+```bash
+# Auto-format every Java file
+./gradlew spotlessApply
+```
+
+#### Automatic check during builds
+```bash
+# spotlessCheck runs automatically before the build
+./gradlew build
+```
+
+## Code Style Rules
+
+The project follows the **Google Java Style Guide** with these settings:
+
+- **Indentation**: 2 spaces (no tabs)
+- **Line Length**: 180 characters
+- **Line Endings**: LF (Unix-style)
+- **Charset**: UTF-8
+- **Import Order**: static imports → blank line → regular imports
+- **Braces**: required on every if, for, while, and do statement
+
+## Troubleshooting
+
+### The pre-commit hook does not run
+```bash
+# Check and grant execute permission
+chmod +x .git/hooks/pre-commit
+```
+
+### The Spotless plugin does not work
+```bash
+# Re-download the Gradle dependencies
+./gradlew clean build --refresh-dependencies
+```
+
+### IntelliJ formats code differently
+1. Re-import `intellij-java-google-style.xml`
+2. **File > Invalidate Caches** > **Invalidate and Restart**
+
+## Further Reading
+
+- **Google Java Style Guide**: https://google.github.io/styleguide/javaguide.html
+- **Spotless Plugin**: https://github.com/diffplug/spotless
+- **IntelliJ Code Style**: https://www.jetbrains.com/help/idea/code-style.html
diff --git a/imagery-make-dataset/COMMON_CODE_CACHE_REDIS.md b/imagery-make-dataset/COMMON_CODE_CACHE_REDIS.md
new file mode 100755
index 0000000..fbff1ee
--- /dev/null
+++ b/imagery-make-dataset/COMMON_CODE_CACHE_REDIS.md
@@ -0,0 +1,282 @@
+# Common Code Redis Cache System - DanielLee
+
+## Requirements Review
+
+### 1. **Expose common codes through an API**
+- **Implemented**: `CommonCodeApiController` provides an API that returns every common code
+  ```
+  GET /api/code
+  → Retrieves all common codes
+  ```
+- **Also implemented**: cache refresh and cache status APIs
+  ```
+  POST /api/code/cache/refresh → Refresh the cache
+  GET /api/code/cache/status → Check the cache status
+  ```
+
+---
+
+### 2. **Load the cache into Redis at application startup**
+- **Implemented**: `CommonCodeCacheManager` class
+
+#### Initialization mechanism
+```java
+@Component
+@RequiredArgsConstructor
+public class CommonCodeCacheManager {
+
+  @EventListener(ApplicationReadyEvent.class)
+  public void initializeCommonCodeCache() {
+    // Preload common codes into Redis once the application has fully started
+    List allCommonCodes = commonCodeService.getFindAll();
+    // @Cacheable caches the result in Redis automatically
+  }
+}
+```
+
+#### Flow
+1. The application starts
+2. Spring finishes creating every bean (`ApplicationReadyEvent` fires)
+3. `CommonCodeCacheManager.initializeCommonCodeCache()` runs
+4. `commonCodeService.getFindAll()` is called (reads from the DB)
+5. The `@Cacheable(value = "commonCodes")` annotation stores the result in Redis
+
+---
+
+### 3. **Refresh the data when common codes change**
+
+#### Automatic refresh
+- **CREATE**: `@CacheEvict` → evicts the entire cache
+- **UPDATE**: `@CacheEvict` → evicts the entire cache
+- **DELETE**: `@CacheEvict` → evicts the entire cache
+- **Reorder**: `@CacheEvict` → evicts the entire cache
+
+```java
+@CacheEvict(value = "commonCodes", allEntries = true)
+public ResponseObj save(CommonCodeDto.AddReq req) {
+  // Save the common code
+  // ↓
+  // The entire cache is evicted (reloaded from the DB on the next read)
+}
+```
+
+#### Manual refresh (administrators)
+```java
+POST /api/code/cache/refresh
+```
+- Call this API after common code settings change to force a cache refresh
+
+#### Cache status monitoring
+```java
+GET /api/code/cache/status
+→ Response: { "data": 150 } // 150 common codes cached
+```
+
+---
+
+## Overall Architecture
+
+```
+┌─────────────────────────────────────────────┐
+│               Client request                │
+└──────────────────┬──────────────────────────┘
+                   │
+        ┌──────────▼──────────────┐
+        │ CommonCodeApiController │
+        └──────────┬──────────────┘
+                   │
+         ┌─────────┴──────────┐
+         │                    │
+    ┌────▼─────┐     ┌────────▼────────┐
+    │ Read API │     │ Cache admin API │
+    │  (GET)   │     │   (POST, GET)   │
+    └────┬─────┘     └────────┬────────┘
+         │                    │
+         │         ┌──────────▼─────────────┐
+         │         │ CommonCodeCacheManager │
+         │         │  (cache init/refresh)  │
+         │         └──────────┬─────────────┘
+         │                    │
+    ┌────▼────────────────────▼────┐
+    │      CommonCodeService       │
+    │        (@Cacheable)          │
+    │        (@CacheEvict)         │
+    └──────────────┬───────────────┘
+                   │
+          ┌────────▼────────┐
+          │   Redis cache   │
+          │ (common codes)  │
+          └────────┬────────┘
+                   │
+          ┌────────▼────────┐
+          │  PostgreSQL DB  │
+          │ (common codes)  │
+          └─────────────────┘
+```
+
+---
+
+## API Specification
+
+### 1. Get common codes (cached)
+```
+GET /api/code
+
+Response:
+{
+  "data": [
+    {
+      "id": 1,
+      "code": "STATUS",
+      "name": "상태",
+      "description": "상태 공통코드",
+      "used": true,
+      ...
+    },
+    ...
+  ]
+}
+```
+
+### 2. Refresh the common code cache
+```
+POST /api/code/cache/refresh
+
+Response:
+{
+  "data": "공통코드 캐시가 갱신되었습니다."
+}
+```
+
+### 3. Check the cache status
+```
+GET /api/code/cache/status
+
+Response:
+{
+  "data": 150 // number of common codes cached in Redis
+}
+```
+
+---
+
+## Cache Refresh Flow
+
+### Automatic refresh (CRUD operations)
+```
+An administrator creates/updates/deletes a common code
+    ↓
+CommonCodeService.save() / update() / removeCode()
+(@CacheEvict fires)
+    ↓
+The entire Redis cache is evicted
+    ↓
+The next read reloads from the DB
+```
+
+### Manual refresh (API call)
+```
+Administrator: POST /api/code/cache/refresh
+    ↓
+CommonCodeCacheManager.refreshCommonCodeCache()
+    ↓
+Cache cleared + fresh data loaded
+    ↓
+Redis cache update complete
+```
+
+---
+
+## Performance Gains
+
+| Item | Before | After |
+|------|--------|-------|
+| **Read latency** | Direct DB query (10-100ms) | Redis cache (1-5ms) |
+| **DB load** | Every read hits the DB | Only cache misses hit the DB |
+| **Network bandwidth** | High (DB round trips) | Low (cache hits) |
+| **Response time** | Variable | Consistent (cached) |
+
+---
+
+## Additional Features
+
+### CommonCodeUtil - global common code lookup
+```java
+@Component
+public class CommonCodeUtil {
+  // All common codes (served from the cache)
+  public List getAllCommonCodes()
+
+  // Codes matching a given code value
+  public List getCommonCodesByCode(String code)
+
+  // Single lookup by ID
+  public Optional getCommonCodeById(Long id)
+
+  // Code name lookup
+  public Optional getCodeName(String parentCode, String childCode)
+
+  // Child codes of a parent code
+  public List getChildCodesByParentCode(String parentCode)
+
+  // Whether a code value is available
+  public boolean isCodeAvailable(Long parentId, String code)
+}
+```
+
+### Usage example
+```java
+@RequiredArgsConstructor
+@RestController
+public class SomeController {
+
+  private final CommonCodeUtil commonCodeUtil;
+
+  @GetMapping("/example")
+  public void example() {
+    // 1. All common codes (cached)
+    List allCodes = commonCodeUtil.getAllCommonCodes();
+
+    // 2. Look up a specific code name
+    Optional name = commonCodeUtil.getCodeName("PARENT", "CHILD");
+
+    // 3. Check whether a code value is available
+    boolean available = commonCodeUtil.isCodeAvailable(1L, "NEW_CODE");
+  }
+}
+```
+
+---
+
+## Completion Checklist
+
+- Redis caching annotations applied (@Cacheable, @CacheEvict)
+- Cache initialized at application startup
+- Cache evicted automatically on CRUD operations
+- Manual cache refresh API provided
+- Cache status monitoring API
+- Global common code lookup utility
+- Comprehensive unit tests (12)
+
+---
+
+## Monitoring
+
+Monitor the cache status periodically:
+```bash
+# Check the cache status
+curl http://localhost:8080/api/code/cache/status
+
+# Refresh the cache
+curl -X POST http://localhost:8080/api/code/cache/refresh
+```
+
+Log output:
+```
+=== 공통코드 캐시 초기화 시작 ===
+✓ 공통코드 150개가 Redis 캐시에 로드되었습니다.
+  - [STATUS] 상태 (ID: 1)
+  - [TYPE] 타입 (ID: 2)
+  ...
+=== 공통코드 캐시 초기화 완료 ===
+```
diff --git a/imagery-make-dataset/Dockerfile-dev b/imagery-make-dataset/Dockerfile-dev
new file mode 100755
index 0000000..994ea58
--- /dev/null
+++ b/imagery-make-dataset/Dockerfile-dev
@@ -0,0 +1,29 @@
+# Single runtime stage (the Gradle build itself runs in Jenkins)
+FROM eclipse-temurin:21-jre-jammy
+
+# Install GDAL
+RUN apt-get update && apt-get install -y \
+    gdal-bin \
+    libgdal-dev \
+    && rm -rf /var/lib/apt/lists/*
+
+ARG UID=1000
+ARG GID=1000
+
+RUN groupadd -g ${GID} manager01 \
+    && useradd -u ${UID} -g ${GID} -m manager01
+
+USER manager01
+
+# Set the working directory
+WORKDIR /app
+
+# Copy the JAR built by Jenkins (ROOT.jar)
+COPY build/libs/ROOT.jar app.jar
+
+# Expose the application port
+EXPOSE 8080
+
+# Run the application with the dev profile
+ENTRYPOINT ["java", "-jar", "-Dspring.profiles.active=dev", "app.jar"]
diff --git a/imagery-make-dataset/Jenkinsfile-dev b/imagery-make-dataset/Jenkinsfile-dev
new file mode 100755
index 0000000..e018a11
--- /dev/null
+++ b/imagery-make-dataset/Jenkinsfile-dev
@@ -0,0 +1,94 @@
+pipeline {
+    agent any
+    tools {
+        jdk 'jdk21'
+    }
+    environment {
+        BRANCH = 'develop'
+        GIT_REPO = 'https://10.100.0.10:3210/dabeeo/kamco-dabeeo-backoffice.git'
+    }
+
+    stages {
+        stage('Checkout') {
+            steps {
+                checkout([
+                    $class: 'GitSCM',
+                    branches: [[name: "${env.BRANCH}"]],
+                    userRemoteConfigs: [[
+                        url: "${env.GIT_REPO}",
+                        credentialsId: 'jenkins-dev-token'
+                    ]]
+                ])
+            }
+        }
+        stage('Get Commit Hash') {
+            steps {
+                script {
+                    env.COMMIT_HASH = sh(script: "git rev-parse --short HEAD", returnStdout: true).trim()
+                    echo "Current commit hash: ${env.COMMIT_HASH}"
+                }
+            }
+        }
+
+        stage('Build') {
+            steps {
+                sh "./gradlew clean build -x test"
+            }
+        }
+
+        stage('Docker Build & Deploy') {
+            steps {
+                script {
+                    echo "Building Docker image with tag: ${env.COMMIT_HASH}"
+
+                    // Set the IMAGE_TAG environment variable, then build and deploy via docker-compose
+                    sh """
+                        export IMAGE_TAG=${env.COMMIT_HASH}
+
+                        # Stop and remove the existing containers
+                        docker-compose -f docker-compose-dev.yml down || true
+
+                        # Build the new image
+                        docker-compose -f docker-compose-dev.yml build
+
+                        # Also tag the image as latest
+                        docker tag kamco-changedetection-api:${env.COMMIT_HASH} kamco-changedetection-api:latest
+
+                        # Start the containers
+                        docker-compose -f docker-compose-dev.yml up -d
+                    """
+
+                    // Wait for the health check
+                    echo "Waiting for application to be ready..."
+                    sh """
+                        # seq keeps this POSIX sh compatible ({1..30} is a bash-only expansion)
+                        for i in \$(seq 1 30); do
+                            if docker exec kamco-changedetection-api curl -f http://localhost:8080/monitor/health > /dev/null 2>&1; then
+                                echo "✅ Application is healthy!"
+                                docker-compose -f docker-compose-dev.yml ps
+                                exit 0
+                            fi
+                            echo "⏳ Waiting for application... (\$i/30)"
+                            sleep 2
+                        done
+                        echo "⚠️ Warning: Health check timeout, checking container status..."
+                        docker-compose -f docker-compose-dev.yml ps
+                    """
+                }
+            }
+        }
+
+        stage('Cleanup Old Images') {
+            steps {
+                script {
+                    echo "Cleaning up old Docker images..."
+                    sh """
+                        # Keep latest 5 images, remove older ones
+                        docker images kamco-changedetection-api --format "{{.ID}} {{.Tag}}" | \
+                            grep -v latest | tail -n +6 | awk '{print \$1}' | xargs -r docker rmi || true
+                    """
+                }
+            }
+        }
+    }
+}
diff --git a/imagery-make-dataset/README.md b/imagery-make-dataset/README.md
new file mode 100755
index 0000000..d4227e4
--- /dev/null
+++ b/imagery-make-dataset/README.md
@@ -0,0 +1,21 @@
+# IMAGERY MAKE DATASET
+
+> A scheduled job that syncs imagery-management image files
+
+## 📋 About
+
+**imagery-make-dataset** is a scheduled job that inspects and syncs imagery-management image files.
+
+## 🚀 Getting Started
+The schedule runs while the `isSchedulerEnabled` flag in `MapSheetMngFileJobController` is `true`.
+
+Swagger UI: /swagger-ui/index.html
+
+`/api/job/mng-sync-job` toggles the flag between true and false.
+
+### Requirements
+
+- Java 21 (JDK 21)
+- PostgreSQL 12+ (PostGIS extension required)
+- Gradle 8.x (or use the Gradle Wrapper)
+- Docker & Docker Compose (optional)
\ No newline at end of file
diff --git a/imagery-make-dataset/build.gradle b/imagery-make-dataset/build.gradle
new file mode 100755
index 0000000..2d1088f
--- /dev/null
+++ b/imagery-make-dataset/build.gradle
@@ -0,0 +1,110 @@
+plugins {
+    id 'java'
+    id 'org.springframework.boot' version '3.5.7'
+    id 'io.spring.dependency-management' version '1.1.7'
+    id 'com.diffplug.spotless' version '6.25.0'
+}
+
+group = 'com.kamco.cd'
+version = '0.0.1-SNAPSHOT'
+description = 'imagery-make-dataset'
+
+java {
+    toolchain {
+        languageVersion = JavaLanguageVersion.of(21)
+    }
+}
+
+bootJar {
+    archiveFileName = "imagery-make-dataset.jar"
+}
+
+jar {
+    enabled = false // do not build the plain .jar (avoids confusion)
+}
+
+configurations {
+    compileOnly {
+        extendsFrom annotationProcessor
+    }
+}
+
+repositories {
+    mavenCentral()
+    maven { url "https://repo.osgeo.org/repository/release/" }
+}
+
+dependencies {
+    implementation 'org.springframework.boot:spring-boot-starter-data-jpa'
+    implementation 'org.springframework.boot:spring-boot-starter-web'
+    compileOnly 'org.projectlombok:lombok'
+    runtimeOnly 'org.postgresql:postgresql'
+    annotationProcessor 'org.projectlombok:lombok'
+    testImplementation 'org.springframework.boot:spring-boot-starter-test'
+    testRuntimeOnly 'org.junit.platform:junit-platform-launcher'
+    implementation 'org.springframework.boot:spring-boot-starter-validation'
+
+    // geometry
+    implementation 'com.fasterxml.jackson.core:jackson-databind'
+    implementation "org.geotools:gt-shapefile:30.0"
+    implementation "org.geotools:gt-referencing:30.0"
+    implementation "org.geotools:gt-geojson:30.0"
+    implementation 'org.locationtech.jts.io:jts-io-common:1.20.0'
+    implementation 'org.locationtech.jts:jts-core:1.19.0'
+    implementation 'org.hibernate:hibernate-spatial:6.2.7.Final'
+    implementation 'org.geotools:gt-main:30.0'
+    implementation("org.geotools:gt-geotiff:30.0") {
+        exclude group: "javax.media", module: "jai_core"
+    }
+    implementation 'org.geotools:gt-epsg-hsql:30.0'
+
+    // QueryDSL JPA
+    implementation 'com.querydsl:querydsl-jpa:5.0.0:jakarta'
+
+    // annotationProcessor for Q-class generation
+    annotationProcessor 'com.querydsl:querydsl-apt:5.0.0:jakarta'
+    annotationProcessor 'jakarta.annotation:jakarta.annotation-api'
+    annotationProcessor 'jakarta.persistence:jakarta.persistence-api'
+
+    // actuator
+    implementation 'org.springframework.boot:spring-boot-starter-actuator'
+
+    // Redis
+    implementation 'org.springframework.boot:spring-boot-starter-data-redis'
+
+    // SpringDoc OpenAPI (Swagger)
+    implementation 'org.springdoc:springdoc-openapi-starter-webmvc-ui:2.7.0'
+
+    // Apache Commons Compress for archive handling
+    implementation 'org.apache.commons:commons-compress:1.26.0'
+
+    implementation 'com.fasterxml.jackson.datatype:jackson-datatype-jsr310'
+    implementation 'org.reflections:reflections:0.10.2'
+}
+
+configurations.configureEach {
+    exclude group: 'javax.media', module: 'jai_core'
+}
+
+tasks.named('test') {
+    useJUnitPlatform()
+}
+
+// Spotless configuration for code formatting (2-space indent)
+spotless {
+    java {
+        target 'src/**/*.java'
+        googleJavaFormat('1.19.2') // Default Google Style = 2 spaces (NO .aosp()!)
+        trimTrailingWhitespace()
+        endWithNewline()
+    }
+}
+
+// Run spotlessCheck before build
+tasks.named('build') {
+    dependsOn 'spotlessCheck'
+}
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/KamcoBackApplication.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/KamcoBackApplication.class
new file mode 100644
index 0000000..b6f1b43
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/KamcoBackApplication.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/config/QuerydslConfig.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/config/QuerydslConfig.class
new file mode 100644
index 0000000..11f1c1a
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/config/QuerydslConfig.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/controller/MapSheetMngFileJobApiController.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/controller/MapSheetMngFileJobApiController.class
new file mode 100644
index 0000000..fb6c3cd
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/controller/MapSheetMngFileJobApiController.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/controller/MapSheetMngFileJobController.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/controller/MapSheetMngFileJobController.class
new file mode 100644
index 0000000..890099a
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/controller/MapSheetMngFileJobController.class differ
diff --git 
a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/ApiResponseDto$ApiResponseCode.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/ApiResponseDto$ApiResponseCode.class new file mode 100644 index 0000000..18effd2 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/ApiResponseDto$ApiResponseCode.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/ApiResponseDto$Error.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/ApiResponseDto$Error.class new file mode 100644 index 0000000..4aec570 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/ApiResponseDto$Error.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/ApiResponseDto$ResponseObj.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/ApiResponseDto$ResponseObj.class new file mode 100644 index 0000000..b60f2a2 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/ApiResponseDto$ResponseObj.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/ApiResponseDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/ApiResponseDto.class new file mode 100644 index 0000000..7e26f32 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/ApiResponseDto.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$Basic.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$Basic.class new file mode 100644 index 0000000..a13a8c7 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$Basic.class differ diff --git 
a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$FilesDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$FilesDto.class new file mode 100644 index 0000000..2c42af8 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$FilesDto.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$FolderDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$FolderDto.class new file mode 100644 index 0000000..1577579 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$FolderDto.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$FoldersDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$FoldersDto.class new file mode 100644 index 0000000..5e014dd Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$FoldersDto.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$SrchFilesDepthDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$SrchFilesDepthDto.class new file mode 100644 index 0000000..cda9ebe Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$SrchFilesDepthDto.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$SrchFilesDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$SrchFilesDto.class new file mode 100644 index 0000000..8b56425 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$SrchFilesDto.class differ diff --git 
a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$SrchFoldersDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$SrchFoldersDto.class new file mode 100644 index 0000000..9fac1ca Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto$SrchFoldersDto.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto.class new file mode 100644 index 0000000..0714747 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/FileDto.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$AddReq.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$AddReq.class new file mode 100644 index 0000000..e71d4d9 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$AddReq.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$DeleteFileReq.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$DeleteFileReq.class new file mode 100644 index 0000000..485811a Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$DeleteFileReq.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$DmlReturn.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$DmlReturn.class new file mode 100644 index 0000000..c2792fa Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$DmlReturn.class differ diff --git 
a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$ErrorDataDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$ErrorDataDto.class new file mode 100644 index 0000000..fe11142 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$ErrorDataDto.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$ErrorSearchReq.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$ErrorSearchReq.class new file mode 100644 index 0000000..c2bf153 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$ErrorSearchReq.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MapSheetState.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MapSheetState.class new file mode 100644 index 0000000..709034f Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MapSheetState.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngDto.class new file mode 100644 index 0000000..c21f331 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngDto.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngFIleDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngFIleDto.class new file mode 100644 index 0000000..4f74a3d Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngFIleDto.class differ diff 
--git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngFileAddReq.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngFileAddReq.class new file mode 100644 index 0000000..51f39a7 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngFileAddReq.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngFilesDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngFilesDto.class new file mode 100644 index 0000000..837c6df Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngFilesDto.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngListCompareDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngListCompareDto.class new file mode 100644 index 0000000..e027ecc Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngListCompareDto.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngListDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngListDto.class new file mode 100644 index 0000000..a385cca Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngListDto.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngSearchReq.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngSearchReq.class new file mode 100644 index 0000000..d4055b2 Binary files /dev/null and 
b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngSearchReq.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngYyyyDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngYyyyDto.class new file mode 100644 index 0000000..5d10dd8 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$MngYyyyDto.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$ResisterYearList.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$ResisterYearList.class new file mode 100644 index 0000000..e82f928 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$ResisterYearList.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$SyncCheckStateReqUpdateDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$SyncCheckStateReqUpdateDto.class new file mode 100644 index 0000000..e855489 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$SyncCheckStateReqUpdateDto.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$TotalListDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$TotalListDto.class new file mode 100644 index 0000000..08ff093 Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$TotalListDto.class differ diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$YearSearchReq$YearSearchReqBuilder.class 
b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$YearSearchReq$YearSearchReqBuilder.class
new file mode 100644
index 0000000..ecc8574
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$YearSearchReq$YearSearchReqBuilder.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$YearSearchReq.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$YearSearchReq.class
new file mode 100644
index 0000000..17b0c9f
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto$YearSearchReq.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto.class
new file mode 100644
index 0000000..d67f9e3
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetDto.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$DmlReturn.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$DmlReturn.class
new file mode 100644
index 0000000..0025380
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$DmlReturn.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngDto.class
new file mode 100644
index 0000000..e544186
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngDto.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngFileAddReq.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngFileAddReq.class
new file mode 100644
index 0000000..955c826
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngFileAddReq.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngFilesDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngFilesDto.class
new file mode 100644
index 0000000..a5c90fe
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngFilesDto.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngHstDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngHstDto.class
new file mode 100644
index 0000000..c0c5f5d
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngHstDto.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngListCompareDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngListCompareDto.class
new file mode 100644
index 0000000..733366a
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngListCompareDto.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngSearchReq.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngSearchReq.class
new file mode 100644
index 0000000..f510a34
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto$MngSearchReq.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto.class
new file mode 100644
index 0000000..a65577d
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/dto/MapSheetMngDto.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/ApiConfigEnum$EnumDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/ApiConfigEnum$EnumDto.class
new file mode 100644
index 0000000..aff3c11
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/ApiConfigEnum$EnumDto.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/ApiConfigEnum.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/ApiConfigEnum.class
new file mode 100644
index 0000000..fad51ec
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/ApiConfigEnum.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/CodeDto.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/CodeDto.class
new file mode 100644
index 0000000..b0b057c
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/CodeDto.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/CommonUseStatus.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/CommonUseStatus.class
new file mode 100644
index 0000000..6e2f34e
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/CommonUseStatus.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/Enums.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/Enums.class
new file mode 100644
index 0000000..6b9ca97
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/Enums.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/MngStateType.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/MngStateType.class
new file mode 100644
index 0000000..0e02cb7
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/MngStateType.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/SyncStateType.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/SyncStateType.class
new file mode 100644
index 0000000..95c1408
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/enums/SyncStateType.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/inferface/CodeExpose.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/inferface/CodeExpose.class
new file mode 100644
index 0000000..c5c9cc2
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/inferface/CodeExpose.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/inferface/CodeHidden.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/inferface/CodeHidden.class
new file mode 100644
index 0000000..ea41d29
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/inferface/CodeHidden.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/inferface/EnumType.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/inferface/EnumType.class
new file mode 100644
index 0000000..6700ff7
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/inferface/EnumType.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/inferface/JsonFormatDttm.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/inferface/JsonFormatDttm.class
new file mode 100644
index 0000000..190dc12
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/inferface/JsonFormatDttm.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/core/MapSheetMngFileJobCoreService.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/core/MapSheetMngFileJobCoreService.class
new file mode 100644
index 0000000..e80df79
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/core/MapSheetMngFileJobCoreService.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/CommonDateEntity.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/CommonDateEntity.class
new file mode 100644
index 0000000..e3895f9
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/CommonDateEntity.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapInkx50kEntity.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapInkx50kEntity.class
new file mode 100644
index 0000000..b910f80
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapInkx50kEntity.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapInkx5kEntity.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapInkx5kEntity.class
new file mode 100644
index 0000000..e5a36b8
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapInkx5kEntity.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngEntity.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngEntity.class
new file mode 100644
index 0000000..942a0ea
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngEntity.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngFileEntity.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngFileEntity.class
new file mode 100644
index 0000000..0ef6de2
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngFileEntity.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngHstEntity.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngHstEntity.class
new file mode 100644
index 0000000..dc0ddc5
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngHstEntity.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngYearYnEntity.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngYearYnEntity.class
new file mode 100644
index 0000000..35c04e4
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngYearYnEntity.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngYearYnEntityId.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngYearYnEntityId.class
new file mode 100644
index 0000000..cdc54d8
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngYearYnEntityId.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QCommonDateEntity.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QCommonDateEntity.class
new file mode 100644
index 0000000..bc26791
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QCommonDateEntity.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapInkx50kEntity.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapInkx50kEntity.class
new file mode 100644
index 0000000..c263e81
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapInkx50kEntity.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapInkx5kEntity.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapInkx5kEntity.class
new file mode 100644
index 0000000..fe95f6f
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapInkx5kEntity.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngEntity.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngEntity.class
new file mode 100644
index 0000000..96e48f5
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngEntity.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngFileEntity.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngFileEntity.class
new file mode 100644
index 0000000..e445378
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngFileEntity.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngHstEntity.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngHstEntity.class
new file mode 100644
index 0000000..efb5a5e
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngHstEntity.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngYearYnEntity.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngYearYnEntity.class
new file mode 100644
index 0000000..a316104
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngYearYnEntity.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngYearYnEntityId.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngYearYnEntityId.class
new file mode 100644
index 0000000..a7e6bca
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngYearYnEntityId.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepository.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepository.class
new file mode 100644
index 0000000..a215fb5
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepository.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepositoryCustom.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepositoryCustom.class
new file mode 100644
index 0000000..9401aa6
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepositoryCustom.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepositoryImpl.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepositoryImpl.class
new file mode 100644
index 0000000..27c0287
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepositoryImpl.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepository.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepository.class
new file mode 100644
index 0000000..d99fe8d
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepository.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepositoryCustom.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepositoryCustom.class
new file mode 100644
index 0000000..3bff8c5
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepositoryCustom.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepositoryImpl.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepositoryImpl.class
new file mode 100644
index 0000000..df4075a
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepositoryImpl.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/service/MapSheetMngFileJobService.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/service/MapSheetMngFileJobService.class
new file mode 100644
index 0000000..a229ef7
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/service/MapSheetMngFileJobService.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/service/NameValidator.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/service/NameValidator.class
new file mode 100644
index 0000000..99b9ffc
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/service/NameValidator.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/utils/FIleChecker$Basic.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/utils/FIleChecker$Basic.class
new file mode 100644
index 0000000..9ed2449
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/utils/FIleChecker$Basic.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/utils/FIleChecker$Folder.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/utils/FIleChecker$Folder.class
new file mode 100644
index 0000000..d1fdf3d
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/utils/FIleChecker$Folder.class differ
diff --git a/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/utils/FIleChecker.class b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/utils/FIleChecker.class
new file mode 100644
index 0000000..8045bc1
Binary files /dev/null and b/imagery-make-dataset/build/classes/java/main/com/kamco/cd/kamcoback/utils/FIleChecker.class differ
diff --git a/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QCommonDateEntity.java b/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QCommonDateEntity.java
new file mode 100644
index 0000000..2bb519c
--- /dev/null
+++ b/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QCommonDateEntity.java
@@ -0,0 +1,39 @@
+package com.kamco.cd.kamcoback.postgres.entity;
+
+import static com.querydsl.core.types.PathMetadataFactory.*;
+
+import com.querydsl.core.types.dsl.*;
+
+import com.querydsl.core.types.PathMetadata;
+import javax.annotation.processing.Generated;
+import com.querydsl.core.types.Path;
+
+
+/**
+ * QCommonDateEntity is a Querydsl query type for CommonDateEntity
+ */
+@Generated("com.querydsl.codegen.DefaultSupertypeSerializer")
+public class QCommonDateEntity extends EntityPathBase<CommonDateEntity> {
+
+    private static final long serialVersionUID = 1355779051L;
+
+    public static final QCommonDateEntity commonDateEntity = new QCommonDateEntity("commonDateEntity");
+
+    public final DateTimePath<java.time.ZonedDateTime> createdDate = createDateTime("createdDate", java.time.ZonedDateTime.class);
+
+    public final DateTimePath<java.time.ZonedDateTime> modifiedDate = createDateTime("modifiedDate", java.time.ZonedDateTime.class);
+
+    public QCommonDateEntity(String variable) {
+        super(CommonDateEntity.class, forVariable(variable));
+    }
+
+    public QCommonDateEntity(Path<? extends CommonDateEntity> path) {
+        super(path.getType(), path.getMetadata());
+    }
+
+    public QCommonDateEntity(PathMetadata metadata) {
+        super(CommonDateEntity.class, metadata);
+    }
+
+}
+
diff --git a/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapInkx50kEntity.java b/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapInkx50kEntity.java
new file mode 100644
index 0000000..471c2eb
--- /dev/null
+++ b/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapInkx50kEntity.java
@@ -0,0 +1,53 @@
+package com.kamco.cd.kamcoback.postgres.entity;
+
+import static com.querydsl.core.types.PathMetadataFactory.*;
+
+import com.querydsl.core.types.dsl.*;
+
+import com.querydsl.core.types.PathMetadata;
+import javax.annotation.processing.Generated;
+import com.querydsl.core.types.Path;
+
+
+/**
+ * QMapInkx50kEntity is a Querydsl query type for MapInkx50kEntity
+ */
+@Generated("com.querydsl.codegen.DefaultEntitySerializer")
+public class QMapInkx50kEntity extends EntityPathBase<MapInkx50kEntity> {
+
+    private static final long serialVersionUID = 1410103956L;
+
+    public static final QMapInkx50kEntity mapInkx50kEntity = new QMapInkx50kEntity("mapInkx50kEntity");
+
+    public final QCommonDateEntity _super = new QCommonDateEntity(this);
+
+    //inherited
+    public final DateTimePath<java.time.ZonedDateTime> createdDate = _super.createdDate;
+
+    public final NumberPath<Integer> fid = createNumber("fid", Integer.class);
+
+    public final ComparablePath<org.locationtech.jts.geom.Geometry> geom = createComparable("geom", org.locationtech.jts.geom.Geometry.class);
+
+    public final StringPath mapidcdNo = createString("mapidcdNo");
+
+    public final StringPath mapidNm = createString("mapidNm");
+
+    public final StringPath mapidNo = createString("mapidNo");
+
+    //inherited
+    public final DateTimePath<java.time.ZonedDateTime> modifiedDate = _super.modifiedDate;
+
+    public QMapInkx50kEntity(String variable) {
+        super(MapInkx50kEntity.class, forVariable(variable));
+    }
+
+    public QMapInkx50kEntity(Path<? extends MapInkx50kEntity> path) {
+        super(path.getType(), path.getMetadata());
+    }
+
+    public QMapInkx50kEntity(PathMetadata metadata) {
+        super(MapInkx50kEntity.class, metadata);
+    }
+
+}
+
diff --git a/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapInkx5kEntity.java b/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapInkx5kEntity.java
new file mode 100644
index 0000000..be5ad05
--- /dev/null
+++ b/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapInkx5kEntity.java
@@ -0,0 +1,67 @@
+package com.kamco.cd.kamcoback.postgres.entity;
+
+import static com.querydsl.core.types.PathMetadataFactory.*;
+
+import com.querydsl.core.types.dsl.*;
+
+import com.querydsl.core.types.PathMetadata;
+import javax.annotation.processing.Generated;
+import com.querydsl.core.types.Path;
+import com.querydsl.core.types.dsl.PathInits;
+
+
+/**
+ * QMapInkx5kEntity is a Querydsl query type for MapInkx5kEntity
+ */
+@Generated("com.querydsl.codegen.DefaultEntitySerializer")
+public class QMapInkx5kEntity extends EntityPathBase<MapInkx5kEntity> {
+
+    private static final long serialVersionUID = 372911320L;
+
+    private static final PathInits INITS = PathInits.DIRECT2;
+
+    public static final QMapInkx5kEntity mapInkx5kEntity = new QMapInkx5kEntity("mapInkx5kEntity");
+
+    public final QCommonDateEntity _super = new QCommonDateEntity(this);
+
+    //inherited
+    public final DateTimePath<java.time.ZonedDateTime> createdDate = _super.createdDate;
+
+    public final NumberPath<Integer> fid = createNumber("fid", Integer.class);
+
+    public final ComparablePath<org.locationtech.jts.geom.Geometry> geom = createComparable("geom", org.locationtech.jts.geom.Geometry.class);
+
+    public final StringPath mapidcdNo = createString("mapidcdNo");
+
+    public final StringPath mapidNm = createString("mapidNm");
+
+    public final QMapInkx50kEntity mapInkx50k;
+
+    //inherited
+    public final DateTimePath<java.time.ZonedDateTime> modifiedDate = _super.modifiedDate;
+
+    public final EnumPath<com.kamco.cd.kamcoback.enums.CommonUseStatus> useInference = createEnum("useInference", com.kamco.cd.kamcoback.enums.CommonUseStatus.class);
+
+    public QMapInkx5kEntity(String variable) {
+        this(MapInkx5kEntity.class, forVariable(variable), INITS);
+    }
+
+    public QMapInkx5kEntity(Path<? extends MapInkx5kEntity> path) {
+        this(path.getType(), path.getMetadata(), PathInits.getFor(path.getMetadata(), INITS));
+    }
+
+    public QMapInkx5kEntity(PathMetadata metadata) {
+        this(metadata, PathInits.getFor(metadata, INITS));
+    }
+
+    public QMapInkx5kEntity(PathMetadata metadata, PathInits inits) {
+        this(MapInkx5kEntity.class, metadata, inits);
+    }
+
+    public QMapInkx5kEntity(Class<? extends MapInkx5kEntity> type, PathMetadata metadata, PathInits inits) {
+        super(type, metadata, inits);
+        this.mapInkx50k = inits.isInitialized("mapInkx50k") ? new QMapInkx50kEntity(forProperty("mapInkx50k")) : null;
+    }
+
+}
+
diff --git a/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngEntity.java b/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngEntity.java
new file mode 100644
index 0000000..a0d38ac
--- /dev/null
+++ b/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngEntity.java
@@ -0,0 +1,65 @@
+package com.kamco.cd.kamcoback.postgres.entity;
+
+import static com.querydsl.core.types.PathMetadataFactory.*;
+
+import com.querydsl.core.types.dsl.*;
+
+import com.querydsl.core.types.PathMetadata;
+import javax.annotation.processing.Generated;
+import com.querydsl.core.types.Path;
+
+
+/**
+ * QMapSheetMngEntity is a Querydsl query type for MapSheetMngEntity
+ */
+@Generated("com.querydsl.codegen.DefaultEntitySerializer")
+public class QMapSheetMngEntity extends EntityPathBase<MapSheetMngEntity> {
+
+    private static final long serialVersionUID = 263967671L;
+
+    public static final QMapSheetMngEntity mapSheetMngEntity = new QMapSheetMngEntity("mapSheetMngEntity");
+
+    public final DateTimePath<java.time.ZonedDateTime> createdDttm = createDateTime("createdDttm", java.time.ZonedDateTime.class);
+
+    public final NumberPath<Long> createdUid = createNumber("createdUid", Long.class);
+
+    public final StringPath mngPath = createString("mngPath");
+
+    public final StringPath mngState = createString("mngState");
+
+    public final DateTimePath<java.time.ZonedDateTime> mngStateDttm = createDateTime("mngStateDttm", java.time.ZonedDateTime.class);
+
+    public final NumberPath<Integer> mngYyyy = createNumber("mngYyyy", Integer.class);
+
+    public final DateTimePath<java.time.ZonedDateTime> syncCheckEndDttm = createDateTime("syncCheckEndDttm", java.time.ZonedDateTime.class);
+
+    public final StringPath syncCheckState = createString("syncCheckState");
+
+    public final DateTimePath<java.time.ZonedDateTime> syncCheckStrtDttm = createDateTime("syncCheckStrtDttm", java.time.ZonedDateTime.class);
+
+    public final DateTimePath<java.time.ZonedDateTime> syncEndDttm = createDateTime("syncEndDttm", java.time.ZonedDateTime.class);
+
+    public final StringPath syncState = createString("syncState");
+
+    public final DateTimePath<java.time.ZonedDateTime> syncStateDttm = createDateTime("syncStateDttm", java.time.ZonedDateTime.class);
+
+    public final DateTimePath<java.time.ZonedDateTime> syncStrtDttm = createDateTime("syncStrtDttm", java.time.ZonedDateTime.class);
+
+    public final DateTimePath<java.time.ZonedDateTime> updatedDttm = createDateTime("updatedDttm", java.time.ZonedDateTime.class);
+
+    public final NumberPath<Long> updatedUid = createNumber("updatedUid", Long.class);
+
+    public QMapSheetMngEntity(String variable) {
+        super(MapSheetMngEntity.class, forVariable(variable));
+    }
+
+    public QMapSheetMngEntity(Path<? extends MapSheetMngEntity> path) {
+        super(path.getType(), path.getMetadata());
+    }
+
+    public QMapSheetMngEntity(PathMetadata metadata) {
+        super(MapSheetMngEntity.class, metadata);
+    }
+
+}
+
diff --git a/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngFileEntity.java b/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngFileEntity.java
new file mode 100644
index 0000000..be24f18
--- /dev/null
+++ b/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngFileEntity.java
@@ -0,0 +1,57 @@
+package com.kamco.cd.kamcoback.postgres.entity;
+
+import static com.querydsl.core.types.PathMetadataFactory.*;
+
+import com.querydsl.core.types.dsl.*;
+
+import com.querydsl.core.types.PathMetadata;
+import javax.annotation.processing.Generated;
+import com.querydsl.core.types.Path;
+
+
+/**
+ * QMapSheetMngFileEntity is a Querydsl query type for MapSheetMngFileEntity
+ */
+@Generated("com.querydsl.codegen.DefaultEntitySerializer")
+public class QMapSheetMngFileEntity extends EntityPathBase<MapSheetMngFileEntity> {
+
+    private static final long serialVersionUID = 224763475L;
+
+    public static final QMapSheetMngFileEntity mapSheetMngFileEntity = new QMapSheetMngFileEntity("mapSheetMngFileEntity");
+
+    public final BooleanPath fileDel = createBoolean("fileDel");
+
+    public final StringPath fileExt = createString("fileExt");
+
+    public final StringPath fileName = createString("fileName");
+
+    public final StringPath filePath = createString("filePath");
+
+    public final NumberPath<Long> fileSize = createNumber("fileSize", Long.class);
+
+    public final StringPath fileState = createString("fileState");
+
+    public final NumberPath<Long> fileUid = createNumber("fileUid", Long.class);
+
+    public final NumberPath<Long> hstUid = createNumber("hstUid", Long.class);
+
+    public final StringPath mapSheetNum = createString("mapSheetNum");
+
+    public final NumberPath<Integer> mngYyyy = createNumber("mngYyyy", Integer.class);
+
+    public final StringPath refMapSheetNum = createString("refMapSheetNum");
+
+    public QMapSheetMngFileEntity(String variable) {
+        super(MapSheetMngFileEntity.class, forVariable(variable));
+    }
+
+    public QMapSheetMngFileEntity(Path<? extends MapSheetMngFileEntity> path) {
+        super(path.getType(), path.getMetadata());
+    }
+
+    public QMapSheetMngFileEntity(PathMetadata metadata) {
+        super(MapSheetMngFileEntity.class, metadata);
+    }
+
+}
+
diff --git a/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngHstEntity.java b/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngHstEntity.java
new file mode 100644
index 0000000..0d331a7
--- /dev/null
+++ b/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngHstEntity.java
@@ -0,0 +1,111 @@
+package com.kamco.cd.kamcoback.postgres.entity;
+
+import static com.querydsl.core.types.PathMetadataFactory.*;
+
+import com.querydsl.core.types.dsl.*;
+
+import com.querydsl.core.types.PathMetadata;
+import javax.annotation.processing.Generated;
+import com.querydsl.core.types.Path;
+import com.querydsl.core.types.dsl.PathInits;
+
+
+/**
+ * QMapSheetMngHstEntity is a Querydsl query type for MapSheetMngHstEntity
+ */
+@Generated("com.querydsl.codegen.DefaultEntitySerializer")
+public class QMapSheetMngHstEntity extends EntityPathBase<MapSheetMngHstEntity> {
+
+    private static final long serialVersionUID = 2031221176L;
+
+    private static final PathInits INITS = PathInits.DIRECT2;
+
+    public static final QMapSheetMngHstEntity mapSheetMngHstEntity = new QMapSheetMngHstEntity("mapSheetMngHstEntity");
+
+    public final QCommonDateEntity _super = new QCommonDateEntity(this);
+
+    //inherited
+    public final DateTimePath<java.time.ZonedDateTime> createdDate = _super.createdDate;
+
+    public final NumberPath<Long> createdUid = createNumber("createdUid", Long.class);
+
+    public final StringPath dataState = createString("dataState");
+
+    public final DateTimePath<java.time.ZonedDateTime> dataStateDttm = createDateTime("dataStateDttm", java.time.ZonedDateTime.class);
+
+    public final NumberPath<Long> hstUid = createNumber("hstUid", Long.class);
+
+    public final QMapInkx5kEntity mapInkx5kByCode;
+
+    public final NumberPath<Integer> mapSheetCodeSrc = createNumber("mapSheetCodeSrc", Integer.class);
+
+    public final StringPath mapSheetName = createString("mapSheetName");
+
+    public final StringPath mapSheetNum = createString("mapSheetNum");
+
+    public final StringPath mapSheetPath = createString("mapSheetPath");
+
+    public final NumberPath<Integer> mngYyyy = createNumber("mngYyyy", Integer.class);
+
+    //inherited
+    public final DateTimePath<java.time.ZonedDateTime> modifiedDate = _super.modifiedDate;
+
+    public final StringPath refMapSheetNum = createString("refMapSheetNum");
+
+    public final NumberPath<Integer> scaleRatio = createNumber("scaleRatio", Integer.class);
+
+    public final DateTimePath<java.time.ZonedDateTime> syncCheckEndDttm = createDateTime("syncCheckEndDttm", java.time.ZonedDateTime.class);
+
+    public final StringPath syncCheckState = createString("syncCheckState");
+
+    public final DateTimePath<java.time.ZonedDateTime> syncCheckStrtDttm = createDateTime("syncCheckStrtDttm", java.time.ZonedDateTime.class);
+
+    public final StringPath syncCheckTfwFileName = createString("syncCheckTfwFileName");
+
+    public final StringPath syncCheckTifFileName = createString("syncCheckTifFileName");
+
+    public final DateTimePath<java.time.ZonedDateTime> syncEndDttm = createDateTime("syncEndDttm", java.time.ZonedDateTime.class);
+
+    public final StringPath syncState = createString("syncState");
+
+    public final DateTimePath<java.time.ZonedDateTime> syncStrtDttm = createDateTime("syncStrtDttm", java.time.ZonedDateTime.class);
+
+    public final StringPath syncTfwFileName = createString("syncTfwFileName");
+
+    public final StringPath syncTifFileName = createString("syncTifFileName");
+
+    public final NumberPath<Long> tfwSizeBytes = createNumber("tfwSizeBytes", Long.class);
+
+    public final NumberPath<Long> tifSizeBytes = createNumber("tifSizeBytes", Long.class);
+
+    public final NumberPath<Long> totalSizeBytes = createNumber("totalSizeBytes", Long.class);
+
+    public final NumberPath<Long> updatedUid = createNumber("updatedUid", Long.class);
+
+    public final StringPath useInference = createString("useInference");
+
+    public final DateTimePath<java.time.ZonedDateTime> useInferenceDttm = createDateTime("useInferenceDttm", java.time.ZonedDateTime.class);
+
+    public QMapSheetMngHstEntity(String variable) {
+        this(MapSheetMngHstEntity.class, forVariable(variable), INITS);
+    }
+
+    public QMapSheetMngHstEntity(Path<? extends MapSheetMngHstEntity> path) {
+        this(path.getType(), path.getMetadata(), PathInits.getFor(path.getMetadata(), INITS));
+    }
+
+    public QMapSheetMngHstEntity(PathMetadata metadata) {
+        this(metadata, PathInits.getFor(metadata, INITS));
+    }
+
+    public QMapSheetMngHstEntity(PathMetadata metadata, PathInits inits) {
+        this(MapSheetMngHstEntity.class, metadata, inits);
+    }
+
+    public QMapSheetMngHstEntity(Class<? extends MapSheetMngHstEntity> type, PathMetadata metadata, PathInits inits) {
+        super(type, metadata, inits);
+        this.mapInkx5kByCode = inits.isInitialized("mapInkx5kByCode") ? new QMapInkx5kEntity(forProperty("mapInkx5kByCode"), inits.get("mapInkx5kByCode")) : null;
+    }
+
+}
+
diff --git a/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngYearYnEntity.java b/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngYearYnEntity.java
new file mode 100644
index 0000000..7572d11
--- /dev/null
+++ b/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngYearYnEntity.java
@@ -0,0 +1,55 @@
+package com.kamco.cd.kamcoback.postgres.entity;
+
+import static com.querydsl.core.types.PathMetadataFactory.*;
+
+import com.querydsl.core.types.dsl.*;
+
+import com.querydsl.core.types.PathMetadata;
+import javax.annotation.processing.Generated;
+import com.querydsl.core.types.Path;
+import com.querydsl.core.types.dsl.PathInits;
+
+
+/**
+ * QMapSheetMngYearYnEntity is a Querydsl query type for MapSheetMngYearYnEntity
+ */
+@Generated("com.querydsl.codegen.DefaultEntitySerializer")
+public class QMapSheetMngYearYnEntity extends EntityPathBase<MapSheetMngYearYnEntity> {
+
+    private static final long serialVersionUID = 1594858377L;
+
+    private static final PathInits INITS = PathInits.DIRECT2;
+
+    public static final QMapSheetMngYearYnEntity mapSheetMngYearYnEntity = new QMapSheetMngYearYnEntity("mapSheetMngYearYnEntity");
+
+    public final DateTimePath<java.time.ZonedDateTime> createdDttm = createDateTime("createdDttm", java.time.ZonedDateTime.class);
+
+    public final QMapSheetMngYearYnEntityId id;
+
+    public final DateTimePath<java.time.ZonedDateTime> updatedDttm = createDateTime("updatedDttm", java.time.ZonedDateTime.class);
+
+    public final StringPath yn = createString("yn");
+
+    public QMapSheetMngYearYnEntity(String variable) {
+        this(MapSheetMngYearYnEntity.class, forVariable(variable), INITS);
+    }
+
+    public QMapSheetMngYearYnEntity(Path<? extends MapSheetMngYearYnEntity> path) {
+        this(path.getType(), path.getMetadata(), PathInits.getFor(path.getMetadata(), INITS));
+    }
+
+    public QMapSheetMngYearYnEntity(PathMetadata metadata) {
+        this(metadata, PathInits.getFor(metadata, INITS));
+    }
+
+    public QMapSheetMngYearYnEntity(PathMetadata metadata, PathInits inits) {
+        this(MapSheetMngYearYnEntity.class, metadata, inits);
+    }
+
+    public QMapSheetMngYearYnEntity(Class<? extends MapSheetMngYearYnEntity> type, PathMetadata metadata, PathInits inits) {
+        super(type, metadata, inits);
+        this.id = inits.isInitialized("id") ? new QMapSheetMngYearYnEntityId(forProperty("id")) : null;
+    }
+
+}
+
diff --git a/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngYearYnEntityId.java b/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngYearYnEntityId.java
new file mode 100644
index 0000000..4b66b9a
--- /dev/null
+++ b/imagery-make-dataset/build/generated/sources/annotationProcessor/java/main/com/kamco/cd/kamcoback/postgres/entity/QMapSheetMngYearYnEntityId.java
@@ -0,0 +1,39 @@
+package com.kamco.cd.kamcoback.postgres.entity;
+
+import static com.querydsl.core.types.PathMetadataFactory.*;
+
+import com.querydsl.core.types.dsl.*;
+
+import com.querydsl.core.types.PathMetadata;
+import javax.annotation.processing.Generated;
+import com.querydsl.core.types.Path;
+
+
+/**
+ * QMapSheetMngYearYnEntityId is a Querydsl query type for MapSheetMngYearYnEntityId
+ */
+@Generated("com.querydsl.codegen.DefaultEmbeddableSerializer")
+public class QMapSheetMngYearYnEntityId extends BeanPath<MapSheetMngYearYnEntityId> {
+
+    private static final long serialVersionUID = -644422012L;
+
+    public static final QMapSheetMngYearYnEntityId mapSheetMngYearYnEntityId = new QMapSheetMngYearYnEntityId("mapSheetMngYearYnEntityId");
+
+    public final StringPath mapSheetNum = createString("mapSheetNum");
+
+    public final NumberPath<Integer> mngYyyy = createNumber("mngYyyy", Integer.class);
+
+    public QMapSheetMngYearYnEntityId(String variable) {
super(MapSheetMngYearYnEntityId.class, forVariable(variable)); + } + + public QMapSheetMngYearYnEntityId(Path path) { + super(path.getType(), path.getMetadata()); + } + + public QMapSheetMngYearYnEntityId(PathMetadata metadata) { + super(MapSheetMngYearYnEntityId.class, metadata); + } + +} + diff --git a/imagery-make-dataset/build/libs/kamco-map-sheet-image-job.jar b/imagery-make-dataset/build/libs/kamco-map-sheet-image-job.jar new file mode 100755 index 0000000..787d0d5 Binary files /dev/null and b/imagery-make-dataset/build/libs/kamco-map-sheet-image-job.jar differ diff --git a/imagery-make-dataset/build/reports/problems/problems-report.html b/imagery-make-dataset/build/reports/problems/problems-report.html new file mode 100644 index 0000000..ac93c5c --- /dev/null +++ b/imagery-make-dataset/build/reports/problems/problems-report.html @@ -0,0 +1,663 @@ + + + + + + + + + + + + + Gradle Configuration Cache + + + +
+    Loading...
diff --git a/imagery-make-dataset/build/resolvedMainClassName b/imagery-make-dataset/build/resolvedMainClassName
new file mode 100755
index 0000000..d27cb6f
--- /dev/null
+++ b/imagery-make-dataset/build/resolvedMainClassName
@@ -0,0 +1 @@
+com.kamco.cd.kamcoback.KamcoBackApplication
\ No newline at end of file
diff --git a/imagery-make-dataset/build/resources/main/application.yml b/imagery-make-dataset/build/resources/main/application.yml
new file mode 100755
index 0000000..328045e
--- /dev/null
+++ b/imagery-make-dataset/build/resources/main/application.yml
@@ -0,0 +1,67 @@
+server:
+  port: 9080
+
+spring:
+  application:
+    name: imagery-make-dataset
+  profiles:
+    active: local # profile to use (e.g. dev, prod, test)
+
+  datasource:
+    url: jdbc:postgresql://192.168.2.127:15432/kamco_cds
+    #url: jdbc:postgresql://localhost:5432/kamco_cds
+    username: kamco_cds
+    password: kamco_cds_Q!W@E#R$
+    hikari:
+      minimum-idle: 1
+      maximum-pool-size: 5
+
+  jpa:
+    hibernate:
+      ddl-auto: update # create tables if missing, update them if they exist
+    properties:
+      hibernate:
+        jdbc:
+          batch_size: 50
+        default_batch_fetch_size: 100
+logging:
+  level:
+    root: INFO
+    org.springframework.web: DEBUG
+    org.springframework.security: DEBUG
+
+    # cut health-check log noise down to the essentials
+    org.springframework.security.web.FilterChainProxy: INFO
+    org.springframework.security.web.authentication.AnonymousAuthenticationFilter: INFO
+    org.springframework.security.web.authentication.Http403ForbiddenEntryPoint: INFO
+    org.springframework.web.servlet.DispatcherServlet: INFO
+# actuator
+management:
+  health:
+    readinessstate:
+      enabled: true
+    livenessstate:
+      enabled: true
+  endpoint:
+    health:
+      probes:
+        enabled: true
+      show-details: always
+  endpoints:
+    jmx:
+      exposure:
+        exclude: "*"
+    web:
+      base-path: /monitor
+      exposure:
+        include:
+          - "health"
+
+file:
+  #sync-root-dir: D:/kamco-nfs/images/
+  sync-root-dir: /kamco-nfs/images/
+  sync-tmp-dir: ${file.sync-root-dir}/tmp
+  sync-file-extention: tfw,tif
+  sync-auto-exception-start-year: 2025
+  sync-auto-exception-before-year-cnt: 3
+
diff --git a/imagery-make-dataset/build/resources/main/static/chunk_upload_test.html b/imagery-make-dataset/build/resources/main/static/chunk_upload_test.html
new file mode 100755
index 0000000..2c331d4
--- /dev/null
+++ b/imagery-make-dataset/build/resources/main/static/chunk_upload_test.html
@@ -0,0 +1,137 @@
+<!-- page markup stripped during extraction; title: "Chunk Upload Test" -->

+<!-- form markup stripped during extraction; surviving label text, translated from Korean: -->
+Large File Chunk Upload Test
+
+* Chunk test size: 10 MB (10 * 1024 * 1024) - adjustable depending on performance
+
+* Choose the upload API
+
+* Attach a file
+
+* On upload, a UUID is generated and passed along to track the upload history (used when merging the chunks) (see the script example)
+
+UUID :
+
+* Must be extracted from the file info and assigned automatically when the API is called (see the script example)
+
+chunkIndex :
+
+chunkTotalIndex :
+
+* Must be extracted from the file info and assigned automatically when the API is called (see the script example)
+
+fileSize :
+
+* Progress (%)
+
+* Result message
+
+ + + + diff --git a/imagery-make-dataset/build/tmp/bootJar/MANIFEST.MF b/imagery-make-dataset/build/tmp/bootJar/MANIFEST.MF new file mode 100755 index 0000000..f3b6e0b --- /dev/null +++ b/imagery-make-dataset/build/tmp/bootJar/MANIFEST.MF @@ -0,0 +1,12 @@ +Manifest-Version: 1.0 +Main-Class: org.springframework.boot.loader.launch.JarLauncher +Start-Class: com.kamco.cd.kamcoback.KamcoBackApplication +Spring-Boot-Version: 3.5.7 +Spring-Boot-Classes: BOOT-INF/classes/ +Spring-Boot-Lib: BOOT-INF/lib/ +Spring-Boot-Classpath-Index: BOOT-INF/classpath.idx +Spring-Boot-Layers-Index: BOOT-INF/layers.idx +Build-Jdk-Spec: 21 +Implementation-Title: kamco-map-sheet-image-job +Implementation-Version: 0.0.1-SNAPSHOT + diff --git a/imagery-make-dataset/build/tmp/compileJava/compileTransaction/stash-dir/MapSheetMngYearRepositoryImpl.class.uniqueId0 b/imagery-make-dataset/build/tmp/compileJava/compileTransaction/stash-dir/MapSheetMngYearRepositoryImpl.class.uniqueId0 new file mode 100644 index 0000000..d430a9b Binary files /dev/null and b/imagery-make-dataset/build/tmp/compileJava/compileTransaction/stash-dir/MapSheetMngYearRepositoryImpl.class.uniqueId0 differ diff --git a/imagery-make-dataset/build/tmp/compileJava/previous-compilation-data.bin b/imagery-make-dataset/build/tmp/compileJava/previous-compilation-data.bin new file mode 100644 index 0000000..08798a7 Binary files /dev/null and b/imagery-make-dataset/build/tmp/compileJava/previous-compilation-data.bin differ diff --git a/imagery-make-dataset/build/tmp/spotless-register-dependencies b/imagery-make-dataset/build/tmp/spotless-register-dependencies new file mode 100755 index 0000000..56a6051 --- /dev/null +++ b/imagery-make-dataset/build/tmp/spotless-register-dependencies @@ -0,0 +1 @@ +1 \ No newline at end of file diff --git a/imagery-make-dataset/dev.backup b/imagery-make-dataset/dev.backup new file mode 100755 index 0000000..e69de29 diff --git a/imagery-make-dataset/docker-compose-dev.yml 
b/imagery-make-dataset/docker-compose-dev.yml new file mode 100755 index 0000000..40b4346 --- /dev/null +++ b/imagery-make-dataset/docker-compose-dev.yml @@ -0,0 +1,35 @@ +services: + kamco-changedetection-api: + build: + context: . + dockerfile: Dockerfile-dev + args: + UID: 1000 # manager01 UID + GID: 1000 # manager01 GID + image: kamco-changedetection-api:${IMAGE_TAG:-latest} + container_name: kamco-changedetection-api + user: "1000:1000" + ports: + - "7100:8080" + environment: + - SPRING_PROFILES_ACTIVE=dev + - TZ=Asia/Seoul + volumes: + - /mnt/nfs_share/images:/app/original-images + - /mnt/nfs_share/model_output:/app/model-outputs + - /mnt/nfs_share/train_dataset:/app/train-dataset + - /mnt/nfs_share/tmp:/app/tmp + - /kamco-nfs:/kamco-nfs + networks: + - kamco-cds + restart: unless-stopped + healthcheck: + test: [ "CMD", "curl", "-f", "http://localhost:8080/monitor/health" ] + interval: 10s + timeout: 5s + retries: 5 + start_period: 40s + +networks: + kamco-cds: + external: true diff --git a/imagery-make-dataset/gradle/wrapper/gradle-wrapper.jar b/imagery-make-dataset/gradle/wrapper/gradle-wrapper.jar new file mode 100755 index 0000000..1b33c55 Binary files /dev/null and b/imagery-make-dataset/gradle/wrapper/gradle-wrapper.jar differ diff --git a/imagery-make-dataset/gradle/wrapper/gradle-wrapper.properties b/imagery-make-dataset/gradle/wrapper/gradle-wrapper.properties new file mode 100755 index 0000000..ca025c8 --- /dev/null +++ b/imagery-make-dataset/gradle/wrapper/gradle-wrapper.properties @@ -0,0 +1,7 @@ +distributionBase=GRADLE_USER_HOME +distributionPath=wrapper/dists +distributionUrl=https\://services.gradle.org/distributions/gradle-8.14-bin.zip +networkTimeout=10000 +validateDistributionUrl=true +zipStoreBase=GRADLE_USER_HOME +zipStorePath=wrapper/dists diff --git a/imagery-make-dataset/gradlew b/imagery-make-dataset/gradlew new file mode 100755 index 0000000..23d15a9 --- /dev/null +++ b/imagery-make-dataset/gradlew @@ -0,0 +1,251 @@ +#!/bin/sh + +# 
+# Copyright © 2015-2021 the original authors. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# https://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# +# SPDX-License-Identifier: Apache-2.0 +# + +############################################################################## +# +# Gradle start up script for POSIX generated by Gradle. +# +# Important for running: +# +# (1) You need a POSIX-compliant shell to run this script. If your /bin/sh is +# noncompliant, but you have some other compliant shell such as ksh or +# bash, then to run this script, type that shell name before the whole +# command line, like: +# +# ksh Gradle +# +# Busybox and similar reduced shells will NOT work, because this script +# requires all of these POSIX shell features: +# * functions; +# * expansions «$var», «${var}», «${var:-default}», «${var+SET}», +# «${var#prefix}», «${var%suffix}», and «$( cmd )»; +# * compound commands having a testable exit status, especially «case»; +# * various built-in commands including «command», «set», and «ulimit». +# +# Important for patching: +# +# (2) This script targets any POSIX shell, so it avoids extensions provided +# by Bash, Ksh, etc; in particular arrays are avoided. +# +# The "traditional" practice of packing multiple parameters into a +# space-separated string is a well documented source of bugs and security +# problems, so this is (mostly) avoided, by progressively accumulating +# options in "$@", and eventually passing that to Java. 
+# +# Where the inherited environment variables (DEFAULT_JVM_OPTS, JAVA_OPTS, +# and GRADLE_OPTS) rely on word-splitting, this is performed explicitly; +# see the in-line comments for details. +# +# There are tweaks for specific operating systems such as AIX, CygWin, +# Darwin, MinGW, and NonStop. +# +# (3) This script is generated from the Groovy template +# https://github.com/gradle/gradle/blob/HEAD/platforms/jvm/plugins-application/src/main/resources/org/gradle/api/internal/plugins/unixStartScript.txt +# within the Gradle project. +# +# You can find Gradle at https://github.com/gradle/gradle/. +# +############################################################################## + +# Attempt to set APP_HOME + +# Resolve links: $0 may be a link +app_path=$0 + +# Need this for daisy-chained symlinks. +while + APP_HOME=${app_path%"${app_path##*/}"} # leaves a trailing /; empty if no leading path + [ -h "$app_path" ] +do + ls=$( ls -ld "$app_path" ) + link=${ls#*' -> '} + case $link in #( + /*) app_path=$link ;; #( + *) app_path=$APP_HOME$link ;; + esac +done + +# This is normally unused +# shellcheck disable=SC2034 +APP_BASE_NAME=${0##*/} +# Discard cd standard output in case $CDPATH is set (https://github.com/gradle/gradle/issues/25036) +APP_HOME=$( cd -P "${APP_HOME:-./}" > /dev/null && printf '%s\n' "$PWD" ) || exit + +# Use the maximum available, or set MAX_FD != -1 to use that value. +MAX_FD=maximum + +warn () { + echo "$*" +} >&2 + +die () { + echo + echo "$*" + echo + exit 1 +} >&2 + +# OS specific support (must be 'true' or 'false'). +cygwin=false +msys=false +darwin=false +nonstop=false +case "$( uname )" in #( + CYGWIN* ) cygwin=true ;; #( + Darwin* ) darwin=true ;; #( + MSYS* | MINGW* ) msys=true ;; #( + NONSTOP* ) nonstop=true ;; +esac + +CLASSPATH="\\\"\\\"" + + +# Determine the Java command to use to start the JVM. 
+if [ -n "$JAVA_HOME" ] ; then + if [ -x "$JAVA_HOME/jre/sh/java" ] ; then + # IBM's JDK on AIX uses strange locations for the executables + JAVACMD=$JAVA_HOME/jre/sh/java + else + JAVACMD=$JAVA_HOME/bin/java + fi + if [ ! -x "$JAVACMD" ] ; then + die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME + +Please set the JAVA_HOME variable in your environment to match the +location of your Java installation." + fi +else + JAVACMD=java + if ! command -v java >/dev/null 2>&1 + then + die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH. + +Please set the JAVA_HOME variable in your environment to match the +location of your Java installation." + fi +fi + +# Increase the maximum file descriptors if we can. +if ! "$cygwin" && ! "$darwin" && ! "$nonstop" ; then + case $MAX_FD in #( + max*) + # In POSIX sh, ulimit -H is undefined. That's why the result is checked to see if it worked. + # shellcheck disable=SC2039,SC3045 + MAX_FD=$( ulimit -H -n ) || + warn "Could not query maximum file descriptor limit" + esac + case $MAX_FD in #( + '' | soft) :;; #( + *) + # In POSIX sh, ulimit -n is undefined. That's why the result is checked to see if it worked. + # shellcheck disable=SC2039,SC3045 + ulimit -n "$MAX_FD" || + warn "Could not set maximum file descriptor limit to $MAX_FD" + esac +fi + +# Collect all arguments for the java command, stacking in reverse order: +# * args from the command line +# * the main class name +# * -classpath +# * -D...appname settings +# * --module-path (only if needed) +# * DEFAULT_JVM_OPTS, JAVA_OPTS, and GRADLE_OPTS environment variables. 
+ +# For Cygwin or MSYS, switch paths to Windows format before running java +if "$cygwin" || "$msys" ; then + APP_HOME=$( cygpath --path --mixed "$APP_HOME" ) + CLASSPATH=$( cygpath --path --mixed "$CLASSPATH" ) + + JAVACMD=$( cygpath --unix "$JAVACMD" ) + + # Now convert the arguments - kludge to limit ourselves to /bin/sh + for arg do + if + case $arg in #( + -*) false ;; # don't mess with options #( + /?*) t=${arg#/} t=/${t%%/*} # looks like a POSIX filepath + [ -e "$t" ] ;; #( + *) false ;; + esac + then + arg=$( cygpath --path --ignore --mixed "$arg" ) + fi + # Roll the args list around exactly as many times as the number of + # args, so each arg winds up back in the position where it started, but + # possibly modified. + # + # NB: a `for` loop captures its iteration list before it begins, so + # changing the positional parameters here affects neither the number of + # iterations, nor the values presented in `arg`. + shift # remove old arg + set -- "$@" "$arg" # push replacement arg + done +fi + + +# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script. +DEFAULT_JVM_OPTS='"-Xmx64m" "-Xms64m"' + +# Collect all arguments for the java command: +# * DEFAULT_JVM_OPTS, JAVA_OPTS, and optsEnvironmentVar are not allowed to contain shell fragments, +# and any embedded shellness will be escaped. +# * For example: A user cannot expect ${Hostname} to be expanded, as it is an environment variable and will be +# treated as '${Hostname}' itself on the command line. + +set -- \ + "-Dorg.gradle.appname=$APP_BASE_NAME" \ + -classpath "$CLASSPATH" \ + -jar "$APP_HOME/gradle/wrapper/gradle-wrapper.jar" \ + "$@" + +# Stop when "xargs" is not available. +if ! command -v xargs >/dev/null 2>&1 +then + die "xargs is not available" +fi + +# Use "xargs" to parse quoted args. +# +# With -n1 it outputs one arg per line, with the quotes and backslashes removed. 
+# +# In Bash we could simply go: +# +# readarray ARGS < <( xargs -n1 <<<"$var" ) && +# set -- "${ARGS[@]}" "$@" +# +# but POSIX shell has neither arrays nor command substitution, so instead we +# post-process each arg (as a line of input to sed) to backslash-escape any +# character that might be a shell metacharacter, then use eval to reverse +# that process (while maintaining the separation between arguments), and wrap +# the whole thing up as a single "set" statement. +# +# This will of course break if any of these variables contains a newline or +# an unmatched quote. +# + +eval "set -- $( + printf '%s\n' "$DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS" | + xargs -n1 | + sed ' s~[^-[:alnum:]+,./:=@_]~\\&~g; ' | + tr '\n' ' ' + )" '"$@"' + +exec "$JAVACMD" "$@" diff --git a/imagery-make-dataset/gradlew.bat b/imagery-make-dataset/gradlew.bat new file mode 100755 index 0000000..db3a6ac --- /dev/null +++ b/imagery-make-dataset/gradlew.bat @@ -0,0 +1,94 @@ +@rem +@rem Copyright 2015 the original author or authors. +@rem +@rem Licensed under the Apache License, Version 2.0 (the "License"); +@rem you may not use this file except in compliance with the License. +@rem You may obtain a copy of the License at +@rem +@rem https://www.apache.org/licenses/LICENSE-2.0 +@rem +@rem Unless required by applicable law or agreed to in writing, software +@rem distributed under the License is distributed on an "AS IS" BASIS, +@rem WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +@rem See the License for the specific language governing permissions and +@rem limitations under the License. 
+@rem +@rem SPDX-License-Identifier: Apache-2.0 +@rem + +@if "%DEBUG%"=="" @echo off +@rem ########################################################################## +@rem +@rem Gradle startup script for Windows +@rem +@rem ########################################################################## + +@rem Set local scope for the variables with windows NT shell +if "%OS%"=="Windows_NT" setlocal + +set DIRNAME=%~dp0 +if "%DIRNAME%"=="" set DIRNAME=. +@rem This is normally unused +set APP_BASE_NAME=%~n0 +set APP_HOME=%DIRNAME% + +@rem Resolve any "." and ".." in APP_HOME to make it shorter. +for %%i in ("%APP_HOME%") do set APP_HOME=%%~fi + +@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script. +set DEFAULT_JVM_OPTS="-Xmx64m" "-Xms64m" + +@rem Find java.exe +if defined JAVA_HOME goto findJavaFromJavaHome + +set JAVA_EXE=java.exe +%JAVA_EXE% -version >NUL 2>&1 +if %ERRORLEVEL% equ 0 goto execute + +echo. 1>&2 +echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH. 1>&2 +echo. 1>&2 +echo Please set the JAVA_HOME variable in your environment to match the 1>&2 +echo location of your Java installation. 1>&2 + +goto fail + +:findJavaFromJavaHome +set JAVA_HOME=%JAVA_HOME:"=% +set JAVA_EXE=%JAVA_HOME%/bin/java.exe + +if exist "%JAVA_EXE%" goto execute + +echo. 1>&2 +echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME% 1>&2 +echo. 1>&2 +echo Please set the JAVA_HOME variable in your environment to match the 1>&2 +echo location of your Java installation. 
1>&2
+
+goto fail
+
+:execute
+@rem Setup the command line
+
+set CLASSPATH=
+
+
+@rem Execute Gradle
+"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -classpath "%CLASSPATH%" -jar "%APP_HOME%\gradle\wrapper\gradle-wrapper.jar" %*
+
+:end
+@rem End local scope for the variables with windows NT shell
+if %ERRORLEVEL% equ 0 goto mainEnd
+
+:fail
+rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of
+rem the _cmd.exe /c_ return code!
+set EXIT_CODE=%ERRORLEVEL%
+if %EXIT_CODE% equ 0 set EXIT_CODE=1
+if not ""=="%GRADLE_EXIT_CONSOLE%" exit %EXIT_CODE%
+exit /b %EXIT_CODE%
+
+:mainEnd
+if "%OS%"=="Windows_NT" endlocal
+
+:omega
diff --git a/imagery-make-dataset/http/CommonCode.http b/imagery-make-dataset/http/CommonCode.http
new file mode 100755
index 0000000..6083e22
--- /dev/null
+++ b/imagery-make-dataset/http/CommonCode.http
@@ -0,0 +1,4 @@
+### GET getByCodeId
+GET http://localhost:8080/api/code/1
+Content-Type: application/json
+###
diff --git a/imagery-make-dataset/intellij-java-google-style.xml b/imagery-make-dataset/intellij-java-google-style.xml
new file mode 100755
index 0000000..d63d731
--- /dev/null
+++ b/imagery-make-dataset/intellij-java-google-style.xml
@@ -0,0 +1,598 @@
+<!-- 598-line IntelliJ code style scheme; XML content stripped during extraction -->
diff --git a/imagery-make-dataset/pack_offline_bundle_airgap_macos.sh b/imagery-make-dataset/pack_offline_bundle_airgap_macos.sh
new file mode 100755
index 0000000..1a34086
--- /dev/null
+++ b/imagery-make-dataset/pack_offline_bundle_airgap_macos.sh
@@ -0,0 +1,571 @@
+#!/bin/bash
+# pack_offline_bundle_airgap_macos.sh
+# ============================================================================
+# Gradle Offline Bundle Packer (macOS)
+# ============================================================================
+# Version: 4.0
+#
+# WORKFLOW:
+# 1. [ONLINE] Build project (./gradlew bootJar) - downloads all deps
+# 2. [ONLINE] Test run (./gradlew bootRun) - verify app works
+# 3. 
[OFFLINE TEST] Verify offline build works +# 4. Create bundle with all cached dependencies +# +# REQUIREMENTS: +# - Internet connection (for initial build) +# - Project with gradlew +# - macOS 10.13+ (High Sierra or later) +# ============================================================================ + +set -e + +# ============================================================================ +# Configuration +# ============================================================================ +WRAPPER_SEED_PATH="wrapper_jar_seed" +OFFLINE_HOME_NAME="_offline_gradle_home" +BOOTRUN_TIMEOUT_SECONDS=60 + +# Color codes +RED='\033[0;31m' +GREEN='\033[0;32m' +YELLOW='\033[1;33m' +CYAN='\033[0;36m' +GRAY='\033[0;90m' +WHITE='\033[1;37m' +NC='\033[0m' # No Color + +echo "" +echo -e "${CYAN}============================================================${NC}" +echo -e "${CYAN} Gradle Offline Bundle Packer v4.0 (macOS)${NC}" +echo -e "${CYAN}============================================================${NC}" +echo "" +echo -e "${WHITE} This script will:${NC}" +echo -e "${GRAY} 1. Build project with internet (download dependencies)${NC}" +echo -e "${GRAY} 2. Test run application (verify it works)${NC}" +echo -e "${GRAY} 3. Test offline build (verify cache is complete)${NC}" +echo -e "${GRAY} 4. 
Create offline bundle for air-gapped environment${NC}" +echo "" +echo -e "${CYAN}============================================================${NC}" +echo "" + +# ============================================================================ +# [1/20] Check Current Directory +# ============================================================================ +echo -e "${YELLOW}==[1/20] Check Current Directory ==${NC}" +ROOT="$(pwd)" +echo "ROOT_DIR: $ROOT" +echo "" + +# ============================================================================ +# [2/20] Check Required Files +# ============================================================================ +echo -e "${YELLOW}==[2/20] Check Required Files ==${NC}" + +if [ ! -f "./gradlew" ]; then + echo -e "${RED}ERROR: gradlew not found. Run from project root.${NC}" + exit 1 +fi +chmod +x ./gradlew +echo -e "${GREEN}[OK] gradlew${NC}" + +BUILD_FILE="" +if [ -f "./build.gradle" ]; then + BUILD_FILE="build.gradle" +elif [ -f "./build.gradle.kts" ]; then + BUILD_FILE="build.gradle.kts" +else + echo -e "${RED}ERROR: build.gradle(.kts) not found.${NC}" + exit 1 +fi +echo -e "${GREEN}[OK] $BUILD_FILE${NC}" + +SETTINGS_FILE="" +if [ -f "./settings.gradle" ]; then + SETTINGS_FILE="settings.gradle" + echo -e "${GREEN}[OK] $SETTINGS_FILE${NC}" +elif [ -f "./settings.gradle.kts" ]; then + SETTINGS_FILE="settings.gradle.kts" + echo -e "${GREEN}[OK] $SETTINGS_FILE${NC}" +fi +echo "" + +# ============================================================================ +# [3/20] Check Gradle Wrapper +# ============================================================================ +echo -e "${YELLOW}==[3/20] Check Gradle Wrapper ==${NC}" + +WRAPPER_DIR="$ROOT/gradle/wrapper" +WRAPPER_JAR="$WRAPPER_DIR/gradle-wrapper.jar" +WRAPPER_PROP="$WRAPPER_DIR/gradle-wrapper.properties" + +mkdir -p "$WRAPPER_DIR" + +if [ ! -f "$WRAPPER_PROP" ]; then + echo -e "${RED}ERROR: gradle-wrapper.properties not found.${NC}" + exit 1 +fi + +if [ ! 
-f "$WRAPPER_JAR" ]; then + SEED_JAR="$ROOT/$WRAPPER_SEED_PATH/gradle-wrapper.jar" + if [ -f "$SEED_JAR" ]; then + cp "$SEED_JAR" "$WRAPPER_JAR" + echo -e "${GREEN}[OK] Wrapper jar injected from seed${NC}" + else + echo -e "${RED}ERROR: gradle-wrapper.jar missing${NC}" + exit 1 + fi +else + echo -e "${GREEN}[OK] gradle-wrapper.jar exists${NC}" +fi + +# Create seed backup +SEED_DIR="$ROOT/$WRAPPER_SEED_PATH" +if [ ! -d "$SEED_DIR" ]; then + mkdir -p "$SEED_DIR" + cp "$WRAPPER_JAR" "$SEED_DIR/gradle-wrapper.jar" +fi +echo "" + +# ============================================================================ +# [4/20] Set GRADLE_USER_HOME (Project Local) +# ============================================================================ +echo -e "${YELLOW}==[4/20] Set GRADLE_USER_HOME ==${NC}" + +OFFLINE_HOME="$ROOT/$OFFLINE_HOME_NAME" +mkdir -p "$OFFLINE_HOME" +export GRADLE_USER_HOME="$OFFLINE_HOME" + +echo -e "${CYAN}GRADLE_USER_HOME = $GRADLE_USER_HOME${NC}" +echo -e "${GRAY}[INFO] All dependencies will be cached in project folder${NC}" +echo "" + +# ============================================================================ +# [5/20] Check Internet Connection +# ============================================================================ +echo -e "${YELLOW}==[5/20] Check Internet Connection ==${NC}" + +HAS_INTERNET=false +TEST_HOSTS=("plugins.gradle.org" "repo.maven.apache.org" "repo1.maven.org") + +for TEST_HOST in "${TEST_HOSTS[@]}"; do + # macOS ping doesn't have -W, use -t instead + if ping -c 1 -t 3 "$TEST_HOST" &>/dev/null; then + HAS_INTERNET=true + echo -e "${GREEN}[OK] Connected to $TEST_HOST${NC}" + break + fi +done + +if [ "$HAS_INTERNET" = false ]; then + # Try DNS resolution as fallback + if nslookup google.com &>/dev/null || host google.com &>/dev/null; then + HAS_INTERNET=true + echo -e "${GREEN}[OK] Internet available (DNS)${NC}" + fi +fi + +if [ "$HAS_INTERNET" = false ]; then + echo "" + echo -e 
"${RED}============================================================${NC}"
+    echo -e "${RED} ERROR: No Internet Connection!${NC}"
+    echo -e "${RED}============================================================${NC}"
+    echo ""
+    echo -e "${YELLOW}This script requires internet for initial build.${NC}"
+    echo -e "${YELLOW}Please connect to internet and run again.${NC}"
+    echo ""
+    exit 1
+fi
+echo ""
+
+# ============================================================================
+# [6/20] Initial Gradle Setup
+# ============================================================================
+echo -e "${YELLOW}==[6/20] Initial Gradle Setup ==${NC}"
+echo -e "${GRAY}[INFO] Downloading Gradle distribution...${NC}"
+
+if ./gradlew --version &>/dev/null; then
+    GRADLE_VERSION=$(./gradlew --version 2>&1 | grep "^Gradle" | awk '{print $2}')
+    echo -e "${GREEN}[OK] Gradle $GRADLE_VERSION${NC}"
+else
+    echo -e "${RED}[ERROR] Gradle setup failed${NC}"
+    exit 1
+fi
+echo ""
+
+# ============================================================================
+# [7/20] ONLINE BUILD - bootJar (Download All Dependencies)
+# ============================================================================
+echo -e "${YELLOW}==[7/20] ONLINE BUILD - bootJar ==${NC}"
+echo ""
+echo -e "${CYAN}============================================================${NC}"
+echo -e "${CYAN} ONLINE BUILD (with Internet)${NC}"
+echo -e "${CYAN} Downloading all dependencies to local cache${NC}"
+echo -e "${CYAN}============================================================${NC}"
+echo ""
+
+BUILD_SUCCESS=false
+
+# Test the exit status in the `if` itself: under `set -e`, a bare
+# `./gradlew ...` followed by `if [ $? -eq 0 ]` would abort the script
+# on failure before the else branch could ever run.
+if ./gradlew clean bootJar --no-daemon; then
+    BUILD_SUCCESS=true
+    echo ""
+    echo -e "${GREEN}============================================================${NC}"
+    echo -e "${GREEN} ONLINE BUILD SUCCESS!${NC}"
+    echo -e "${GREEN}============================================================${NC}"
+    echo ""
+
+    if [ -d "./build/libs" ]; then
+        echo -e "${CYAN}JAR files:${NC}"
+        ls -lh ./build/libs/*.jar 2>/dev/null | awk '{print " " $9 " (" $5 ")"}'
+    fi
+else
+    echo ""
+    echo -e "${RED}============================================================${NC}"
+    echo -e "${RED} BUILD FAILED!${NC}"
+    echo -e "${RED}============================================================${NC}"
+    echo ""
+    echo -e "${YELLOW}Build failed. Cannot continue.${NC}"
+    exit 1
+fi
+echo ""
+
+# ============================================================================
+# [8/20] Stop Daemons
+# ============================================================================
+echo -e "${YELLOW}==[8/20] Stop Daemons ==${NC}"
+
+./gradlew --stop &>/dev/null || true
+sleep 2
+echo -e "${GREEN}[OK] Daemons stopped${NC}"
+echo ""
+
+# ============================================================================
+# [9/20] ONLINE TEST - bootRun (Verify Application Works)
+# ============================================================================
+echo -e "${YELLOW}==[9/20] ONLINE TEST - bootRun ==${NC}"
+echo ""
+echo -e "${CYAN}============================================================${NC}"
+echo -e "${CYAN} Testing application startup (timeout: ${BOOTRUN_TIMEOUT_SECONDS}s)${NC}"
+echo -e "${CYAN} Will automatically stop after successful startup${NC}"
+echo -e "${CYAN}============================================================${NC}"
+echo ""
+
+BOOTRUN_SUCCESS=false
+
+# macOS uses gtimeout if available (GNU coreutils)
+if command -v gtimeout &>/dev/null; then
+    gtimeout ${BOOTRUN_TIMEOUT_SECONDS}s ./gradlew bootRun --no-daemon &
+else
+    # Fallback: start in background; the startup check below kills it
+    ./gradlew bootRun --no-daemon &
+fi
+BOOTRUN_PID=$!
+
+sleep 10
+
+if ps -p $BOOTRUN_PID &>/dev/null; then
+    BOOTRUN_SUCCESS=true
+    echo ""
+    echo -e "${GREEN}[OK] Application started successfully${NC}"
+    kill $BOOTRUN_PID &>/dev/null || true
+    sleep 2
+else
+    echo ""
+    echo -e "${YELLOW}[WARN] Application may not have started properly${NC}"
+fi
+
+# Cleanup - macOS process cleanup
+pkill -f "gradle.*bootRun" &>/dev/null || true
+sleep 2
+echo ""
+
+# ============================================================================
+# [10/20] Stop Daemons Again
+# ============================================================================
+echo -e "${YELLOW}==[10/20] Stop Daemons Again ==${NC}"
+
+./gradlew --stop &>/dev/null || true
+sleep 2
+echo -e "${GREEN}[OK] Daemons stopped${NC}"
+echo ""
+
+# ============================================================================
+# [11/20] OFFLINE BUILD TEST (Verify Cache Completeness)
+# ============================================================================
+echo -e "${YELLOW}==[11/20] OFFLINE BUILD TEST ==${NC}"
+echo ""
+echo -e "${CYAN}============================================================${NC}"
+echo -e "${CYAN} OFFLINE BUILD TEST (--offline flag)${NC}"
+echo -e "${CYAN} Verifying all dependencies are cached${NC}"
+echo -e "${CYAN}============================================================${NC}"
+echo ""
+
+OFFLINE_SUCCESS=false
+
+if ./gradlew clean bootJar --offline --no-daemon; then
+    OFFLINE_SUCCESS=true
+    echo ""
+    echo -e "${GREEN}============================================================${NC}"
+    echo -e "${GREEN} OFFLINE BUILD TEST PASSED!${NC}"
+    echo -e "${GREEN}============================================================${NC}"
+    echo ""
+    echo -e "${GREEN}[OK] All dependencies are cached${NC}"
+else
+    echo ""
+    echo -e "${RED}============================================================${NC}"
+    echo -e "${RED} OFFLINE BUILD TEST FAILED!${NC}"
+    echo -e "${RED}============================================================${NC}"
+    echo ""
+    echo -e "${YELLOW}Some dependencies may be missing from cache.${NC}"
+    echo -e "${YELLOW}The bundle may not work in air-gapped environment.${NC}"
+    echo ""
+
+    read -p "Continue anyway? (y/N): " -n 1 -r
+    echo
+    if [[ ! $REPLY =~ ^[Yy]$ ]]; then
+        exit 1
+    fi
+fi
+echo ""
+
+# ============================================================================
+# [12/20] Stop Daemons Before Archive
+# ============================================================================
+echo -e "${YELLOW}==[12/20] Stop Daemons Before Archive ==${NC}"
+
+./gradlew --stop &>/dev/null || true
+sleep 2
+echo -e "${GREEN}[OK] Daemons stopped${NC}"
+echo ""
+
+# ============================================================================
+# [13/20] Verify settings.gradle for Offline
+# ============================================================================
+echo -e "${YELLOW}==[13/20] Verify settings.gradle ==${NC}"
+
+if [ -n "$SETTINGS_FILE" ]; then
+    if grep -q "mavenLocal()" "$SETTINGS_FILE" && grep -q "pluginManagement" "$SETTINGS_FILE"; then
+        echo -e "${GREEN}[OK] settings.gradle configured for offline${NC}"
+    else
+        echo -e "${YELLOW}[WARN] settings.gradle may need offline configuration${NC}"
+        echo -e "${GRAY}[INFO] Consider adding mavenLocal() to pluginManagement and repositories${NC}"
+    fi
+else
+    echo -e "${GRAY}[INFO] No settings.gradle found${NC}"
+fi
+echo ""
+
+# 
============================================================================ +# [14/20] Create Helper Scripts +# ============================================================================ +echo -e "${YELLOW}==[14/20] Create Helper Scripts ==${NC}" + +# run_offline_build.sh +cat > "$ROOT/run_offline_build.sh" << 'EOF' +#!/bin/bash +# run_offline_build.sh - Build JAR offline +export GRADLE_USER_HOME="$(pwd)/_offline_gradle_home" +echo "GRADLE_USER_HOME = $GRADLE_USER_HOME" +echo "" +./gradlew --offline bootJar --no-daemon +if [ $? -eq 0 ]; then + echo "" + echo "BUILD SUCCESS!" + echo "" + echo "JAR files:" + ls -lh ./build/libs/*.jar 2>/dev/null | awk '{print " " $9}' +else + echo "BUILD FAILED" +fi +EOF +chmod +x "$ROOT/run_offline_build.sh" +echo -e "${GREEN}[OK] run_offline_build.sh${NC}" + +# run_offline_bootrun.sh +cat > "$ROOT/run_offline_bootrun.sh" << 'EOF' +#!/bin/bash +# run_offline_bootrun.sh - Run application offline +export GRADLE_USER_HOME="$(pwd)/_offline_gradle_home" +echo "GRADLE_USER_HOME = $GRADLE_USER_HOME" +echo "" +echo "Starting application (Ctrl+C to stop)..." 
+echo "" +./gradlew --offline bootRun --no-daemon +EOF +chmod +x "$ROOT/run_offline_bootrun.sh" +echo -e "${GREEN}[OK] run_offline_bootrun.sh${NC}" +echo "" + +# ============================================================================ +# [15/20] Final Daemon Cleanup +# ============================================================================ +echo -e "${YELLOW}==[15/20] Final Daemon Cleanup ==${NC}" + +./gradlew --stop &>/dev/null || true +sleep 2 +echo -e "${GREEN}[OK] Daemons stopped${NC}" +echo "" + +# ============================================================================ +# [16/20] Clean Lock Files +# ============================================================================ +echo -e "${YELLOW}==[16/20] Clean Lock Files ==${NC}" + +DAEMON_DIR="$OFFLINE_HOME/daemon" +if [ -d "$DAEMON_DIR" ]; then + rm -rf "$DAEMON_DIR" 2>/dev/null || true +fi + +find "$OFFLINE_HOME" -type f \( -name "*.lock" -o -name "*.log" -o -name "*.tmp" \) -delete 2>/dev/null || true + +echo -e "${GREEN}[OK] Lock files cleaned${NC}" +echo "" + +# ============================================================================ +# [17/20] Calculate Cache Size +# ============================================================================ +echo -e "${YELLOW}==[17/20] Cache Summary ==${NC}" + +CACHES_DIR="$OFFLINE_HOME/caches" +WRAPPER_DISTS="$OFFLINE_HOME/wrapper/dists" + +TOTAL_SIZE=0 + +if [ -d "$CACHES_DIR" ]; then + # macOS uses different options for du + if du -k "$CACHES_DIR" &>/dev/null; then + SIZE=$(du -sk "$CACHES_DIR" 2>/dev/null | cut -f1) + SIZE=$((SIZE * 1024)) # Convert to bytes + else + SIZE=0 + fi + TOTAL_SIZE=$((TOTAL_SIZE + SIZE)) + SIZE_MB=$(awk "BEGIN {printf \"%.2f\", $SIZE / 1048576}") + echo -e "${CYAN}[INFO] Dependencies: ${SIZE_MB} MB${NC}" +fi + +if [ -d "$WRAPPER_DISTS" ]; then + if du -k "$WRAPPER_DISTS" &>/dev/null; then + SIZE=$(du -sk "$WRAPPER_DISTS" 2>/dev/null | cut -f1) + SIZE=$((SIZE * 1024)) + else + SIZE=0 + fi + TOTAL_SIZE=$((TOTAL_SIZE + 
SIZE)) + SIZE_MB=$(awk "BEGIN {printf \"%.2f\", $SIZE / 1048576}") + echo -e "${CYAN}[INFO] Gradle dist: ${SIZE_MB} MB${NC}" +fi + +TOTAL_MB=$(awk "BEGIN {printf \"%.2f\", $TOTAL_SIZE / 1048576}") +echo -e "${CYAN}[INFO] Total cache: ${TOTAL_MB} MB${NC}" +echo "" + +# ============================================================================ +# [18/20] Create Archive +# ============================================================================ +echo -e "${YELLOW}==[18/20] Create Archive ==${NC}" + +BASE_NAME=$(basename "$ROOT") +TIMESTAMP=$(date +"%Y%m%d_%H%M%S") +PARENT=$(dirname "$ROOT") +ARCHIVE_PATH="${PARENT}/${BASE_NAME}_offline_bundle_${TIMESTAMP}.tar.gz" + +echo "Archive: $ARCHIVE_PATH" +echo -e "${GRAY}[INFO] Creating archive (this may take several minutes)...${NC}" + +# macOS tar with BSD options +tar -czf "$ARCHIVE_PATH" \ + --exclude=".git" \ + --exclude=".idea" \ + --exclude=".DS_Store" \ + --exclude="*.log" \ + --exclude="*.lock" \ + --exclude="_offline_gradle_home/daemon" \ + --exclude="_offline_gradle_home/native" \ + --exclude="_offline_gradle_home/jdks" \ + --exclude="build" \ + --exclude="out" \ + --exclude=".gradle" \ + -C "$ROOT" . + +if [ $? 
-ne 0 ]; then + echo -e "${RED}ERROR: tar failed${NC}" + exit 1 +fi + +# macOS stat command +ARCHIVE_SIZE=$(stat -f%z "$ARCHIVE_PATH" 2>/dev/null) +ARCHIVE_SIZE_MB=$(awk "BEGIN {printf \"%.2f\", $ARCHIVE_SIZE / 1048576}") +echo -e "${GREEN}[OK] Archive created: ${ARCHIVE_SIZE_MB} MB${NC}" +echo "" + +# ============================================================================ +# [19/20] Verify Archive +# ============================================================================ +echo -e "${YELLOW}==[19/20] Verify Archive ==${NC}" + +CHECKS=( + "gradle/wrapper/gradle-wrapper.jar" + "gradlew" + "_offline_gradle_home/caches" + "run_offline_build.sh" +) + +for CHECK in "${CHECKS[@]}"; do + if tar -tzf "$ARCHIVE_PATH" | grep -q "$CHECK"; then + echo -e " ${GREEN}[OK] $CHECK${NC}" + else + echo -e " ${YELLOW}[WARN] $CHECK${NC}" + fi +done +echo "" + +# ============================================================================ +# [20/20] Complete +# ============================================================================ +echo -e "${GREEN}============================================================${NC}" +echo -e "${GREEN} BUNDLE CREATION COMPLETE!${NC}" +echo -e "${GREEN}============================================================${NC}" +echo "" +echo -e "${CYAN}Archive: $ARCHIVE_PATH${NC}" +echo -e "${CYAN}Size: ${ARCHIVE_SIZE_MB} MB${NC}" +echo "" + +echo -e "${CYAN}============================================================${NC}" +echo -e "${CYAN} Test Results${NC}" +echo -e "${CYAN}============================================================${NC}" +if [ "$BUILD_SUCCESS" = true ]; then + echo -e " Online build (bootJar): ${GREEN}PASSED${NC}" +else + echo -e " Online build (bootJar): ${RED}FAILED${NC}" +fi +if [ "$BOOTRUN_SUCCESS" = true ]; then + echo -e " Online test (bootRun): ${GREEN}PASSED${NC}" +else + echo -e " Online test (bootRun): ${YELLOW}SKIPPED${NC}" +fi +if [ "$OFFLINE_SUCCESS" = true ]; then + echo -e " Offline build test: ${GREEN}PASSED${NC}" 
+else + echo -e " Offline build test: ${RED}FAILED${NC}" +fi +echo "" + +echo -e "${YELLOW}============================================================${NC}" +echo -e "${YELLOW} Usage in Air-gapped Environment${NC}" +echo -e "${YELLOW}============================================================${NC}" +echo "" +echo -e "${WHITE}Option 1: Use unpack script${NC}" +echo -e "${GRAY} ./unpack_and_offline_build_airgap.sh${NC}" +echo "" +echo -e "${WHITE}Option 2: Manual extraction${NC}" +echo -e "${GRAY} tar -xzf .tar.gz${NC}" +echo -e "${GRAY} cd ${NC}" +echo -e "${GRAY} ./run_offline_build.sh${NC}" +echo "" +echo -e "${WHITE}Option 3: Direct commands${NC}" +echo -e "${GRAY} export GRADLE_USER_HOME=\"./_offline_gradle_home\"${NC}" +echo -e "${GRAY} ./gradlew --offline bootJar --no-daemon${NC}" +echo "" diff --git a/imagery-make-dataset/settings.gradle b/imagery-make-dataset/settings.gradle new file mode 100755 index 0000000..e7deda1 --- /dev/null +++ b/imagery-make-dataset/settings.gradle @@ -0,0 +1,6 @@ +pluginManagement { + plugins { + id 'org.jetbrains.kotlin.jvm' version '2.2.20' + } +} +rootProject.name = 'kamco-map-sheet-image-job' diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/KamcoBackApplication.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/KamcoBackApplication.java new file mode 100755 index 0000000..c6b1ae0 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/KamcoBackApplication.java @@ -0,0 +1,14 @@ +package com.kamco.cd.kamcoback; + +import org.springframework.boot.SpringApplication; +import org.springframework.boot.autoconfigure.SpringBootApplication; +import org.springframework.scheduling.annotation.EnableScheduling; + +@SpringBootApplication +@EnableScheduling +public class KamcoBackApplication { + + public static void main(String[] args) { + SpringApplication.run(KamcoBackApplication.class, args); + } +} diff --git 
a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/config/QuerydslConfig.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/config/QuerydslConfig.java new file mode 100755 index 0000000..7db36f4 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/config/QuerydslConfig.java @@ -0,0 +1,18 @@ +package com.kamco.cd.kamcoback.config; + +import com.querydsl.jpa.impl.JPAQueryFactory; +import jakarta.persistence.EntityManager; +import jakarta.persistence.PersistenceContext; +import org.springframework.context.annotation.Bean; +import org.springframework.context.annotation.Configuration; + +@Configuration +public class QuerydslConfig { + + @PersistenceContext private EntityManager entityManager; + + @Bean + public JPAQueryFactory jpaQueryFactory() { + return new JPAQueryFactory(entityManager); + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/controller/MapSheetMngFileJobApiController.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/controller/MapSheetMngFileJobApiController.java new file mode 100755 index 0000000..7d73ca3 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/controller/MapSheetMngFileJobApiController.java @@ -0,0 +1,46 @@ +package com.kamco.cd.kamcoback.controller; + +import com.kamco.cd.kamcoback.dto.ApiResponseDto; +import io.swagger.v3.oas.annotations.Operation; +import io.swagger.v3.oas.annotations.media.Content; +import io.swagger.v3.oas.annotations.media.Schema; +import io.swagger.v3.oas.annotations.responses.ApiResponse; +import io.swagger.v3.oas.annotations.responses.ApiResponses; +import io.swagger.v3.oas.annotations.tags.Tag; +import lombok.RequiredArgsConstructor; +import org.springframework.web.bind.annotation.PutMapping; +import org.springframework.web.bind.annotation.RequestMapping; +import org.springframework.web.bind.annotation.RequestParam; +import org.springframework.web.bind.annotation.RestController; + +@Tag(name = "스캐쥴러 
API", description = "스캐쥴러 API") +@RestController +@RequiredArgsConstructor +@RequestMapping({"/api/job"}) +public class MapSheetMngFileJobApiController { + + private final MapSheetMngFileJobController mapSheetMngFileJobController; + + @Operation(summary = "영상관리 파일 싱크 스캐쥴러 Start/Stop", description = "영상관리 파일 싱크 스캐쥴러 Start/Stop API") + @ApiResponses( + value = { + @ApiResponse( + responseCode = "200", + description = "조회 성공", + content = + @Content( + mediaType = "application/json", + schema = @Schema(implementation = String.class))), + @ApiResponse(responseCode = "404", description = "코드를 찾을 수 없음", content = @Content), + @ApiResponse(responseCode = "500", description = "서버 오류", content = @Content) + }) + @PutMapping("/mng-sync-job") + public ApiResponseDto mngSyncOnOff( + @RequestParam boolean jobStart, @RequestParam int pageSize) { + + mapSheetMngFileJobController.setSchedulerEnabled(jobStart); + mapSheetMngFileJobController.setMngSyncPageSize(pageSize); + + return ApiResponseDto.createOK("OK"); + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/controller/MapSheetMngFileJobController.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/controller/MapSheetMngFileJobController.java new file mode 100755 index 0000000..d8c3f9c --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/controller/MapSheetMngFileJobController.java @@ -0,0 +1,136 @@ +package com.kamco.cd.kamcoback.controller; + +import com.kamco.cd.kamcoback.service.MapSheetMngFileJobService; +import lombok.Getter; +import lombok.RequiredArgsConstructor; +import org.springframework.scheduling.annotation.Scheduled; +import org.springframework.stereotype.Component; + +@Component +@RequiredArgsConstructor +public class MapSheetMngFileJobController { + + private final MapSheetMngFileJobService mapSheetMngFileJobService; + + // 현재 상태 확인용 Getter + @Getter private boolean isSchedulerEnabled = true; + @Getter private boolean isFileSyncSchedulerEnabled = 
false; + @Getter private int mngSyncPageSize = 20; + + // 파일싱크 진행여부 확인하기 + @Scheduled(fixedDelay = 1000 * 10) + public void checkMngFileSync() { + if (!isSchedulerEnabled) return; + + Integer mng = 0; + // isFileSyncSchedulerEnabled = false; + if (mapSheetMngFileJobService.checkMngFileSync() != null) { + mng = mapSheetMngFileJobService.checkMngFileSync(); + this.isFileSyncSchedulerEnabled = true; + System.out.println( + "MngFileSyncJob ON --> mngYyyy : " + + mng + + ", currentTime : " + + System.currentTimeMillis()); + } else { + this.isFileSyncSchedulerEnabled = false; + System.out.println( + "MngFileSyncJob OFF --> mngYyyy : " + + mng + + ", currentTime : " + + System.currentTimeMillis()); + } + } + + @Scheduled(fixedDelay = 1000 * 10) + public void mngFileSyncJob00() { + if (!isSchedulerEnabled || !isFileSyncSchedulerEnabled) return; + + System.out.println("mngFileSyncJob 00 Processing currentTime : " + System.currentTimeMillis()); + mapSheetMngFileJobService.checkMapSheetFileProcess(0, mngSyncPageSize); + } + + @Scheduled(fixedDelay = 1000 * 5) + public void mngFileSyncJob01() { + if (!isSchedulerEnabled || !isFileSyncSchedulerEnabled) return; + + System.out.println("mngFileSyncJob 01 Processing currentTime : " + System.currentTimeMillis()); + mapSheetMngFileJobService.checkMapSheetFileProcess(1, mngSyncPageSize); + } + + @Scheduled(fixedDelay = 1000 * 5) + public void mngFileSyncJob02() { + if (!isSchedulerEnabled || !isFileSyncSchedulerEnabled) return; + + System.out.println("mngFileSyncJob 02 Processing currentTime : " + System.currentTimeMillis()); + mapSheetMngFileJobService.checkMapSheetFileProcess(2, mngSyncPageSize); + } + + @Scheduled(fixedDelay = 1000 * 5) + public void mngFileSyncJob03() { + if (!isSchedulerEnabled || !isFileSyncSchedulerEnabled) return; + + System.out.println("mngFileSyncJob 03 Processing currentTime : " + System.currentTimeMillis()); + mapSheetMngFileJobService.checkMapSheetFileProcess(3, mngSyncPageSize); + } + + 
@Scheduled(fixedDelay = 1000 * 5) + public void mngFileSyncJob04() { + if (!isSchedulerEnabled || !isFileSyncSchedulerEnabled) return; + + System.out.println("mngFileSyncJob 04 Processing currentTime : " + System.currentTimeMillis()); + mapSheetMngFileJobService.checkMapSheetFileProcess(4, mngSyncPageSize); + } + + @Scheduled(fixedDelay = 1000 * 5) + public void mngFileSyncJob05() { + if (!isSchedulerEnabled || !isFileSyncSchedulerEnabled) return; + + System.out.println("mngFileSyncJob 05 Processing currentTime : " + System.currentTimeMillis()); + mapSheetMngFileJobService.checkMapSheetFileProcess(5, mngSyncPageSize); + } + + @Scheduled(fixedDelay = 1000 * 5) + public void mngFileSyncJob06() { + if (!isSchedulerEnabled || !isFileSyncSchedulerEnabled) return; + + System.out.println("mngFileSyncJob 06 Processing currentTime : " + System.currentTimeMillis()); + mapSheetMngFileJobService.checkMapSheetFileProcess(6, mngSyncPageSize); + } + + @Scheduled(fixedDelay = 1000 * 5) + public void mngFileSyncJob07() { + if (!isSchedulerEnabled || !isFileSyncSchedulerEnabled) return; + + System.out.println("mngFileSyncJob 07 Processing currentTime : " + System.currentTimeMillis()); + mapSheetMngFileJobService.checkMapSheetFileProcess(7, mngSyncPageSize); + } + + @Scheduled(fixedDelay = 1000 * 5) + public void mngFileSyncJob08() { + if (!isSchedulerEnabled || !isFileSyncSchedulerEnabled) return; + + System.out.println("mngFileSyncJob 08 Processing currentTime : " + System.currentTimeMillis()); + mapSheetMngFileJobService.checkMapSheetFileProcess(8, mngSyncPageSize); + } + + @Scheduled(fixedDelay = 1000 * 5) + public void mngFileSyncJob09() { + if (!isSchedulerEnabled || !isFileSyncSchedulerEnabled) return; + + System.out.println("mngFileSyncJob 09 Processing currentTime : " + System.currentTimeMillis()); + mapSheetMngFileJobService.checkMapSheetFileProcess(9, mngSyncPageSize); + } + + // 3. 
외부에서 플래그를 변경할 수 있는 Setter 메서드 + public void setSchedulerEnabled(boolean enabled) { + this.isSchedulerEnabled = enabled; + this.isFileSyncSchedulerEnabled = false; + System.out.println("스케줄러 동작 상태 변경됨: " + (enabled ? "ON" : "OFF")); + } + + public void setMngSyncPageSize(int pageSize) { + this.mngSyncPageSize = pageSize; + System.out.println("스케줄러 처리 개수 변경됨: " + pageSize); + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/dto/ApiResponseDto.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/dto/ApiResponseDto.java new file mode 100755 index 0000000..56313e3 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/dto/ApiResponseDto.java @@ -0,0 +1,223 @@ +package com.kamco.cd.kamcoback.dto; + +import com.fasterxml.jackson.annotation.JsonIgnore; +import com.fasterxml.jackson.annotation.JsonInclude; +import com.kamco.cd.kamcoback.inferface.EnumType; +import lombok.Getter; +import lombok.RequiredArgsConstructor; +import lombok.ToString; +import org.springframework.http.HttpStatus; + +@Getter +@ToString +public class ApiResponseDto { + + private T data; + + @JsonInclude(JsonInclude.Include.NON_NULL) + private Error error; + + @JsonInclude(JsonInclude.Include.NON_NULL) + private T errorData; + + @JsonIgnore private HttpStatus httpStatus; + + @JsonIgnore private Long errorLogUid; + + public ApiResponseDto(T data) { + this.data = data; + } + + private ApiResponseDto(T data, HttpStatus httpStatus) { + this.data = data; + this.httpStatus = httpStatus; + } + + public ApiResponseDto(ApiResponseCode code) { + this.error = new Error(code.getId(), code.getMessage()); + } + + public ApiResponseDto(ApiResponseCode code, String message) { + this.error = new Error(code.getId(), message); + } + + public ApiResponseDto(ApiResponseCode code, String message, HttpStatus httpStatus) { + this.error = new Error(code.getId(), message); + this.httpStatus = httpStatus; + } + + public ApiResponseDto( + ApiResponseCode code, String 
message, HttpStatus httpStatus, Long errorLogUid) { + this.error = new Error(code.getId(), message); + this.httpStatus = httpStatus; + this.errorLogUid = errorLogUid; + } + + public ApiResponseDto(ApiResponseCode code, String message, T errorData) { + this.error = new Error(code.getId(), message); + this.errorData = errorData; + } + + // HTTP 상태 코드가 내장된 ApiResponseDto 반환 메서드들 + public static ApiResponseDto createOK(T data) { + return new ApiResponseDto<>(data, HttpStatus.CREATED); + } + + public static ApiResponseDto ok(T data) { + return new ApiResponseDto<>(data, HttpStatus.OK); + } + + public static ApiResponseDto okObject(ResponseObj data) { + if (data.getCode().equals(ApiResponseCode.OK)) { + return new ApiResponseDto<>(data, HttpStatus.NO_CONTENT); + } else { + return new ApiResponseDto<>(data.getCode(), data.getMessage(), HttpStatus.CONFLICT); + } + } + + public static ApiResponseDto deleteOk(T data) { + return new ApiResponseDto<>(data, HttpStatus.NO_CONTENT); + } + + public static ApiResponseDto createException(ApiResponseCode code) { + return new ApiResponseDto<>(code); + } + + public static ApiResponseDto createException(ApiResponseCode code, String message) { + return new ApiResponseDto<>(code, message); + } + + public static ApiResponseDto createException( + ApiResponseCode code, String message, HttpStatus httpStatus) { + return new ApiResponseDto<>(code, message, httpStatus); + } + + public static ApiResponseDto createException( + ApiResponseCode code, String message, HttpStatus httpStatus, Long errorLogUid) { + return new ApiResponseDto<>(code, message, httpStatus, errorLogUid); + } + + public static ApiResponseDto createException( + ApiResponseCode code, String message, T data) { + return new ApiResponseDto<>(code, message, data); + } + + @Getter + public static class Error { + + private final String code; + private final String message; + + public Error(String code, String message) { + this.code = code; + this.message = message; + } + } + + /** 
Error가 아닌 Business상 성공이거나 실패인 경우, 메세지 함께 전달하기 위한 object */ + @Getter + public static class ResponseObj { + + private final ApiResponseCode code; + private final String message; + + public ResponseObj(ApiResponseCode code, String message) { + this.code = code; + this.message = message; + } + } + + @Getter + @RequiredArgsConstructor + public enum ApiResponseCode implements EnumType { + + // @formatter:off + OK("요청이 성공하였습니다."), + BAD_REQUEST("요청 파라미터가 잘못되었습니다."), + BAD_GATEWAY("네트워크 상태가 불안정합니다."), + ALREADY_EXIST_MALL("이미 등록된 쇼핑센터입니다."), + NOT_FOUND_MAP("지도를 찾을 수 없습니다."), + UNAUTHORIZED("권한이 없습니다."), + CONFLICT("이미 등록된 컨텐츠입니다."), + NOT_FOUND("Resource를 찾을 수 없습니다."), + NOT_FOUND_DATA("데이터를 찾을 수 없습니다."), + NOT_FOUND_WEATHER_DATA("날씨 데이터를 찾을 수 없습니다."), + FAIL_SEND_MESSAGE("메시지를 전송하지 못했습니다."), + TOO_MANY_CONNECTED_MACHINES("연결된 기기가 너무 많습니다."), + UNAUTHENTICATED("인증에 실패하였습니다."), + INVALID_TOKEN("잘못된 토큰입니다."), + EXPIRED_TOKEN("만료된 토큰입니다."), + INTERNAL_SERVER_ERROR("서버에 문제가 발생 하였습니다."), + FORBIDDEN("권한을 확인해주세요."), + INVALID_PASSWORD("잘못된 비밀번호 입니다."), + NOT_FOUND_CAR_IN("입차정보가 없습니다."), + WRONG_STATUS("잘못된 상태입니다."), + FAIL_VERIFICATION("인증에 실패하였습니다."), + INVALID_EMAIL("잘못된 형식의 이메일입니다."), + REQUIRED_EMAIL("이메일은 필수 항목입니다."), + WRONG_PASSWORD("잘못된 패스워드입니다."), + DUPLICATE_EMAIL("이미 가입된 이메일입니다."), + DUPLICATE_DATA("이미 등록되어 있습니다."), + DATA_INTEGRITY_ERROR("데이터 무결성이 위반되어 요청을 처리할수 없습니다."), + FOREIGN_KEY_ERROR("참조 중인 데이터가 있어 삭제할 수 없습니다."), + DUPLICATE_EMPLOYEEID("이미 가입된 사번입니다."), + NOT_FOUND_USER_FOR_EMAIL("이메일로 유저를 찾을 수 없습니다."), + NOT_FOUND_USER("사용자를 찾을 수 없습니다."), + UNPROCESSABLE_ENTITY("이 데이터는 삭제할 수 없습니다."), + LOGIN_ID_NOT_FOUND("아이디를 잘못 입력하셨습니다."), + LOGIN_PASSWORD_MISMATCH("비밀번호를 잘못 입력하셨습니다."), + LOGIN_PASSWORD_EXCEEDED("비밀번호 오류 횟수를 초과하여 이용하실 수 없습니다.\n로그인 오류에 대해 관리자에게 문의하시기 바랍니다."), + INACTIVE_ID("사용할 수 없는 계정입니다."), + INVALID_EMAIL_TOKEN( + "You can only reset your password within 24 hours from when the email was sent.\n" + + "To reset your password again, please submit a new 
request through \"Forgot" + + " Password.\""), + PAYLOAD_TOO_LARGE("업로드 용량 제한을 초과했습니다."), + NOT_FOUND_TARGET_YEAR("기준년도 도엽을 찾을 수 없습니다."), + NOT_FOUND_COMPARE_YEAR("비교년도 도엽을 찾을 수 없습니다."), + FAIL_SAVE_MAP_SHEET("도엽 저장 중 오류가 발생했습니다."), + FAIL_CREATE_MAP_SHEET_FILE("도엽 설정파일 생성 중 오류가 발생했습니다."), + ; + // @formatter:on + private final String message; + + @Override + public String getId() { + return name(); + } + + @Override + public String getText() { + return message; + } + + public static ApiResponseCode getCode(String name) { + return ApiResponseCode.valueOf(name.toUpperCase()); + } + + public static String getMessage(String name) { + return ApiResponseCode.valueOf(name.toUpperCase()).getText(); + } + + public static ApiResponseCode from(String codeName, HttpStatus status) { + + if (codeName != null && !codeName.isBlank()) { + try { + return ApiResponseCode.valueOf(codeName.toUpperCase()); + } catch (IllegalArgumentException ignore) { + // fallback + } + } + + if (status != null) { + try { + return ApiResponseCode.valueOf(status.name()); + } catch (IllegalArgumentException ignore) { + // fallback + } + } + + return INTERNAL_SERVER_ERROR; + } + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/dto/FileDto.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/dto/FileDto.java new file mode 100755 index 0000000..45f7191 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/dto/FileDto.java @@ -0,0 +1,159 @@ +package com.kamco.cd.kamcoback.dto; + +import io.swagger.v3.oas.annotations.media.Schema; +import jakarta.validation.constraints.NotNull; +import java.util.List; +import lombok.AllArgsConstructor; +import lombok.Getter; +import lombok.NoArgsConstructor; +import lombok.Setter; + +public class FileDto { + + @Getter + @Setter + @NoArgsConstructor + @AllArgsConstructor + public static class SrchFoldersDto { + @Schema(description = "디렉토리경로(ROOT:/app/original-images)", example = "") + @NotNull + private String 
dirPath; + } + + @Getter + @Setter + @NoArgsConstructor + @AllArgsConstructor + public static class SrchFilesDto { + @Schema(description = "디렉토리경로", example = "D:\\kamco\\2022\\캠코_2021_2022_34602060_D1") + @NotNull + private String dirPath; + + @Schema(description = "전체(*), cpg,dbf,geojson등", example = "*") + @NotNull + private String extension; + + @Schema(description = "전체(*), 3878687.tif", example = "*") + @NotNull + private String fileNm; + + @Schema(description = "파일명(name), 최종수정일(date)", example = "name") + @NotNull + private String sortType; + + @Schema(description = "파일시작위치", example = "1") + @NotNull + private Integer startPos = 0; + + @Schema(description = "파일종료위치", example = "100") + @NotNull + private Integer endPos = 100; + } + + @Getter + @Setter + @NoArgsConstructor + @AllArgsConstructor + public static class SrchFilesDepthDto extends SrchFilesDto { + @Schema(description = "최대폴더Depth", example = "5") + @NotNull + private Integer maxDepth; + } + + @Schema(name = "FolderDto", description = "폴더 정보") + @Getter + public static class FolderDto { + private final String folderNm; + private final String parentFolderNm; + private final String parentPath; + private final String fullPath; + private final int depth; + private final long childCnt; + private final String lastModified; + private final Boolean isValid; + + public FolderDto( + String folderNm, + String parentFolderNm, + String parentPath, + String fullPath, + int depth, + long childCnt, + String lastModified, + Boolean isValid) { + this.folderNm = folderNm; + this.parentFolderNm = parentFolderNm; + this.parentPath = parentPath; + this.fullPath = fullPath; + this.depth = depth; + this.childCnt = childCnt; + this.lastModified = lastModified; + this.isValid = isValid; + } + } + + @Schema(name = "FoldersDto", description = "폴더목록 정보") + @Getter + public static class FoldersDto { + private final String dirPath; + private final int folderTotCnt; + private final int folderErrTotCnt; + private final List 
folders; + + public FoldersDto( + String dirPath, int folderTotCnt, int folderErrTotCnt, List folders) { + + this.dirPath = dirPath; + this.folderTotCnt = folderTotCnt; + this.folderErrTotCnt = folderErrTotCnt; + this.folders = folders; + } + } + + @Schema(name = "File Basic", description = "파일 기본 정보") + @Getter + public static class Basic { + + private final String fileNm; + private final String parentFolderNm; + private final String parentPath; + private final String fullPath; + private final String extension; + private final long fileSize; + private final String lastModified; + + public Basic( + String fileNm, + String parentFolderNm, + String parentPath, + String fullPath, + String extension, + long fileSize, + String lastModified) { + this.fileNm = fileNm; + this.parentFolderNm = parentFolderNm; + this.parentPath = parentPath; + this.fullPath = fullPath; + this.extension = extension; + this.fileSize = fileSize; + this.lastModified = lastModified; + } + } + + @Schema(name = "FilesDto", description = "파일 목록 정보") + @Getter + public static class FilesDto { + private final String dirPath; + private final int fileTotCnt; + private final long fileTotSize; + private final List files; + + public FilesDto(String dirPath, int fileTotCnt, long fileTotSize, List files) { + + this.dirPath = dirPath; + this.fileTotCnt = fileTotCnt; + this.fileTotSize = fileTotSize; + this.files = files; + } + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/dto/MapSheetDto.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/dto/MapSheetDto.java new file mode 100755 index 0000000..9a7e43e --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/dto/MapSheetDto.java @@ -0,0 +1,467 @@ +package com.kamco.cd.kamcoback.dto; + +import com.fasterxml.jackson.annotation.JsonIgnore; +import com.kamco.cd.kamcoback.enums.Enums; +import com.kamco.cd.kamcoback.enums.MngStateType; +import com.kamco.cd.kamcoback.enums.SyncStateType; +import 
com.kamco.cd.kamcoback.inferface.EnumType; +import com.kamco.cd.kamcoback.inferface.JsonFormatDttm; +import io.swagger.v3.oas.annotations.media.Schema; +import java.time.ZonedDateTime; +import java.util.List; +import lombok.AllArgsConstructor; +import lombok.Builder; +import lombok.Getter; +import lombok.NoArgsConstructor; +import lombok.Setter; +import org.springframework.data.domain.PageRequest; +import org.springframework.data.domain.Pageable; +import org.springframework.data.domain.Sort; + +public class MapSheetDto { + + @Schema(name = "MngSearchReq", description = "영상관리 검색 요청") + @Getter + @Setter + @NoArgsConstructor + @AllArgsConstructor + public static class MngSearchReq { + + // 페이징 파라미터 + @Schema(description = "페이지 번호 (0부터 시작) ", example = "0") + private int page = 0; + + @Schema(description = "페이지 크기", example = "20") + private int size = 20; + + @Schema(description = "년도", example = "2025") + private Integer mngYyyy; + + public Pageable toPageable() { + return PageRequest.of(page, size); + } + } + + @Schema(name = "MngAddReq", description = "영상관리 생성 요청") + @Getter + @Setter + @NoArgsConstructor + @AllArgsConstructor + public static class AddReq { + + @Schema(description = "관리년도", example = "2022") + private int mngYyyy; + + @Schema(description = "선택폴더경로", example = "D:\\app\\original-images\\2022") + private String mngPath; + + @JsonIgnore private Long createdUid; + } + + @Schema(name = "DeleteFileReq", description = "파일 삭제 요청") + @Getter + @Setter + @NoArgsConstructor + @AllArgsConstructor + public static class DeleteFileReq { + + @Schema(description = "파일 경로", example = "/app/original-images/2024/00000001.tif") + private String filePath; + } + + @Schema(name = "MngDto", description = "영상관리 검색 리턴") + @Getter + @Setter + @NoArgsConstructor + @AllArgsConstructor + public static class MngDto { + + private int rowNum; + private int mngYyyy; + private String mngPath; + private String mngState; + private String syncState; + private String syncDataCheckState; 
+    private Long syncTotCnt;
+    private Long syncStateDoneCnt;
+    private Long syncDataCheckDoneCnt;
+    private Long syncNotPaireCnt;
+    private Long syncNotPaireExecCnt;
+    private Long syncDuplicateCnt;
+    private Long syncDuplicateExecCnt;
+    private Long syncFaultCnt;
+    private Long syncFaultExecCnt;
+    private Long syncNoFileCnt;
+    private Long syncNoFileExecCnt;
+    @JsonFormatDttm private ZonedDateTime rgstStrtDttm;
+    @JsonFormatDttm private ZonedDateTime rgstEndDttm;
+
+    // 집계 전에는 카운트 컬럼이 null일 수 있으므로 0으로 간주 (언박싱 NPE 방지)
+    private static long nz(Long value) {
+      return value == null ? 0L : value;
+    }
+
+    public String getSyncState() {
+      if (nz(this.syncStateDoneCnt) == 0) {
+        return "NOTYET";
+      } else if (nz(this.syncStateDoneCnt) < nz(this.syncTotCnt)) {
+        return "PROCESSING";
+      }
+      return "DONE";
+    }
+
+    public String getDataCheckState() {
+      if (nz(this.syncDataCheckDoneCnt) == 0) {
+        return "NOTYET";
+      } else if (nz(this.syncDataCheckDoneCnt) < nz(this.syncTotCnt)) {
+        return "PROCESSING";
+      }
+      return "DONE";
+    }
+
+    public double getSyncStateDoneRate() {
+      if (this.syncTotCnt == null || this.syncTotCnt == 0) {
+        return 0.0;
+      }
+      return (double) nz(this.syncStateDoneCnt) / this.syncTotCnt * 100.0;
+    }
+
+    public double getSyncDataCheckDoneRate() {
+      if (this.syncTotCnt == null || this.syncTotCnt == 0) {
+        return 0.0;
+      }
+      return (double) nz(this.syncDataCheckDoneCnt) / this.syncTotCnt * 100.0;
+    }
+
+    public long getSyncErrorTotCnt() {
+      return nz(this.syncNotPaireCnt) + nz(this.syncDuplicateCnt) + nz(this.syncFaultCnt);
+    }
+
+    public long getSyncErrorExecTotCnt() {
+      return nz(this.syncNotPaireExecCnt) + nz(this.syncDuplicateExecCnt) + nz(this.syncFaultExecCnt);
+    }
+
+    public String getMngState() {
+
+      String mngState = "DONE";
+
+      if (nz(this.syncStateDoneCnt) == 0) {
+        mngState = "NOTYET";
+      } else if (nz(this.syncStateDoneCnt) < nz(this.syncTotCnt)) {
+        mngState = "PROCESSING";
+      }
+
+      if (getSyncErrorExecTotCnt() > 0) {
+        mngState = "TAKINGERROR";
+      }
+
+      return mngState;
+    }
+
+    public String getMngStateName() {
+      String enumId = this.getMngState();
+      if (enumId == null || enumId.isEmpty()) {
enumId = "NOTYET"; + } + + MngStateType type = Enums.fromId(MngStateType.class, enumId); + return type.getText(); + } + } + + @Schema(name = "ErrorSearchReq", description = "영상관리 오류데이터 검색 요청") + @Getter + @Setter + @NoArgsConstructor + @AllArgsConstructor + public static class ErrorSearchReq { + + // 페이징 파라미터 + @Schema(description = "페이지 번호 (0부터 시작) ", example = "0") + private int page = 0; + + @Schema(description = "페이지 크기", example = "20") + private int size = 20; + + @Schema(description = "정렬", example = "id desc") + private String sort; + + @Schema(description = "오류종류(페어누락:NOTPAIR,중복파일:DUPLICATE,손상파일:FAULT)", example = "NOTPAIR") + private String syncState; + + @Schema(description = "처리유형(처리:DONE,미처리:NOTYET)", example = "DONE") + private String syncCheckState; + + @Schema(description = "검색어", example = "부산3959") + private String searchValue; + + @Schema(description = "년도", example = "2025") + private Integer mngYyyy; + + public Pageable toPageable() { + if (sort != null && !sort.isEmpty()) { + String[] sortParams = sort.split(","); + String property = sortParams[0]; + Sort.Direction direction = + sortParams.length > 1 ? 
Sort.Direction.fromString(sortParams[1]) : Sort.Direction.ASC; + return PageRequest.of(page, size, Sort.by(direction, property)); + } + return PageRequest.of(page, size); + } + } + + @Schema(name = "ErrorDataDto", description = "영상관리 오류데이터 검색 리턴") + @Getter + @Setter + public static class ErrorDataDto { + + private Long hstUid; + private Integer mngYyyy; + private String mapSheetNum; + private String refMapSheetNum; + private String map50kName; + private String map5kName; + private String mapSrcName; + private Integer mapCodeSrc; + @JsonFormatDttm private ZonedDateTime createdDttm; + + private String syncState; + private String syncStateName; + private String syncTfwFileName; + private String syncTifFileName; + + private String errorCheckState; + private String errorCheckStateName; + private String errorCheckTfwFileName; + private String errorCheckTifFileName; + + // private List fileArray; + + public ErrorDataDto( + Long hstUid, + Integer mngYyyy, + String mapSheetNum, + String refMapSheetNum, + String map50kName, + String map5kName, + String mapSrcName, + Integer mapCodeSrc, + ZonedDateTime createdDttm, + String syncState, + String syncTfwFileName, + String syncTifFileName, + String errorCheckState, + String errorCheckTfwFileName, + String errorCheckTifFileName) { + this.hstUid = hstUid; + this.mngYyyy = mngYyyy; + this.mapSheetNum = mapSheetNum; + this.refMapSheetNum = refMapSheetNum; + this.map50kName = map50kName; + this.map5kName = map5kName; + this.mapSrcName = mapSrcName; + this.mapCodeSrc = mapCodeSrc; + this.createdDttm = createdDttm; + this.syncState = syncState; + this.syncStateName = getSyncStateName(syncState); + this.syncTfwFileName = syncTfwFileName; + this.syncTifFileName = syncTifFileName; + this.errorCheckState = errorCheckState; + this.errorCheckStateName = getSyncStateName(errorCheckState); + this.errorCheckTfwFileName = errorCheckTfwFileName; + this.errorCheckTifFileName = errorCheckTifFileName; + } + + private String getSyncStateName(String 
enumId) { + if (enumId == null || enumId.isEmpty()) { + enumId = "NOTYET"; + } + + SyncStateType type = Enums.fromId(SyncStateType.class, enumId); + return type.getText(); + } + } + + @Schema(name = "SyncCheckStateReqUpdateDto", description = "영상관리 오류처리 상태변경요청") + @Getter + @Setter + public static class SyncCheckStateReqUpdateDto { + + private Long hstUid; + private String filePath; + private String syncCheckTfwFileName; + private String syncCheckTifFileName; + private String syncCheckState; + } + + @Schema(name = "MngFIleDto", description = "관리파일정보") + @Getter + @Setter + public static class MngFIleDto { + + private Long fileUid; + private String filePath; + private String fileName; + private Long fileSize; + private String fileState; + private Long hstUid; + } + + @Schema(name = "DmlReturn", description = "영상관리 DML 수행 후 리턴") + @Getter + @Setter + @NoArgsConstructor + @AllArgsConstructor + public static class DmlReturn { + + private String flag; + private String message; + } + + @Schema(name = "MngFileAddReq", description = "영상관리파일 등록 요청") + @Getter + @Setter + @NoArgsConstructor + @AllArgsConstructor + public static class MngFileAddReq { + + private int mngYyyy; + private String mapSheetNum; + private String refMapSheetNum; + private String filePath; + private String fileName; + private String fileExt; + private Long hstUid; + private Long fileSize; + private String fileState; + } + + @Schema(name = "MngFilesDto", description = "영상파일내역 검색 리턴") + @Getter + @Setter + @NoArgsConstructor + @AllArgsConstructor + public static class MngFilesDto { + + private long fileUid; + private int mngYyyy; + private String mapSheetNum; + private String refMapSheetNum; + private String filePath; + private String fileName; + private String fileExt; + private Long hstUid; + private Long fileSize; + } + + @Schema(name = "ResisterYearList", description = "영상파일 등록을 위한 연도 list") + @Getter + public static class ResisterYearList { + + private Integer current; + private List years; + + 
public ResisterYearList(Integer current, List years) { + this.current = current; + this.years = years; + } + } + + @Getter + @AllArgsConstructor + public enum MapSheetState implements EnumType { + // @formatter:off + DONE("완료"), + NOTYET("처리대기"); + // @formatter:on + + private final String message; + + @Override + public String getId() { + return name(); + } + + @Override + public String getText() { + return message; + } + } + + // 연도리스틀 조회시 사용하는 request Dto + @Getter + @Setter + @NoArgsConstructor + public static class YearSearchReq { + + private String status; + + // 페이징 파라미터 + private int page = 0; + private int size = 20; + private String sort; + + @Builder + public YearSearchReq(String status, int page, int size, String sort) { + this.status = status; + this.page = page; + this.size = size; + this.sort = sort; + } + + public Pageable toPageable() { + if (sort != null && !sort.isEmpty()) { + String[] sortParams = sort.split(","); + String property = sortParams[0]; + Sort.Direction direction = + sortParams.length > 1 ? 
Sort.Direction.fromString(sortParams[1]) : Sort.Direction.ASC;
+        return PageRequest.of(page, size, Sort.by(direction, property));
+      }
+      return PageRequest.of(page, size);
+    }
+  }
+
+  @Schema(name = "MngListDto", description = "영상파일내역 검색 목록")
+  @Getter
+  @Setter
+  @NoArgsConstructor
+  @AllArgsConstructor
+  public static class MngListDto {
+
+    private int mngYyyy;
+    private String mapSheetNum;
+    private String mapSheetName;
+    private Integer beforeYear;
+    private Boolean isSuccess;
+  }
+
+  @Schema(name = "MngListCompareDto", description = "영상파일 비교가능 이전년도정보")
+  @Getter
+  @Setter
+  @NoArgsConstructor
+  @AllArgsConstructor
+  public static class MngListCompareDto {
+
+    private String mngYyyy;
+    private String mapSheetNum;
+    private Integer beforeYear;
+  }
+
+  @Getter
+  @Setter
+  @NoArgsConstructor
+  @AllArgsConstructor
+  public static class TotalListDto {
+
+    private String mapSheetNum;
+    private Integer beforeYear;
+  }
+
+  @Getter
+  @Setter
+  @NoArgsConstructor
+  @AllArgsConstructor
+  public static class MngYyyyDto {
+    private Integer yyyy;
+    private String mngPath;
+  }
+}
diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/dto/MapSheetMngDto.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/dto/MapSheetMngDto.java
new file mode 100755
index 0000000..57185c7
--- /dev/null
+++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/dto/MapSheetMngDto.java
@@ -0,0 +1,138 @@
+package com.kamco.cd.kamcoback.dto;
+
+import com.kamco.cd.kamcoback.inferface.JsonFormatDttm;
+import io.swagger.v3.oas.annotations.media.Schema;
+import java.time.ZonedDateTime;
+import lombok.AllArgsConstructor;
+import lombok.Getter;
+import lombok.NoArgsConstructor;
+import lombok.Setter;
+import org.springframework.data.domain.PageRequest;
+import org.springframework.data.domain.Pageable;
+
+public class MapSheetMngDto {
+
+  @Schema(name = "MngSearchReq", description = "영상관리 검색 요청")
+  @Getter
+  @Setter
+  @NoArgsConstructor
+  @AllArgsConstructor
+  public static class
MngSearchReq { + + // 페이징 파라미터 + @Schema(description = "페이지 번호 (0부터 시작) ", example = "0") + private int page = 0; + + @Schema(description = "페이지 크기", example = "20") + private int size = 20; + + @Schema(description = "년도", example = "2025") + private Integer mngYyyy; + + public Pageable toPageable() { + return PageRequest.of(page, size); + } + } + + @Schema(name = "MngDto", description = "영상관리 검색 리턴") + @Getter + @Setter + @NoArgsConstructor + @AllArgsConstructor + public static class MngDto { + private int rowNum; + private int mngYyyy; + private String mngState; + private String syncState; + private String syncCheckState; + private Long syncTotCnt; + private Long syncStateDoneCnt; + private Long syncCheckStateDoneCnt; + private Long syncNotFileCnt; + private Long syncTypeErrorCnt; + private Long syncSizeErrorCnt; + @JsonFormatDttm private ZonedDateTime rgstStrtDttm; + @JsonFormatDttm private ZonedDateTime rgstEndDttm; + } + + @Schema(name = "MngHstDto", description = "영상관리내역 검색 리턴") + @Getter + @Setter + @NoArgsConstructor + @AllArgsConstructor + public static class MngHstDto { + private long hstUid; + private int mngYyyy; + private String mapSheetNum; + private String refMapSheetNum; + private String dataState; + private String syncState; + private String syncCheckState; + @JsonFormatDttm private ZonedDateTime syncStrtDttm; + @JsonFormatDttm private ZonedDateTime syncEndDttm; + @JsonFormatDttm private ZonedDateTime syncCheckStrtDttm; + @JsonFormatDttm private ZonedDateTime syncCheckEndDttm; + + private String mapSheetPath; + private String syncTifFileName; + private String syncTfwFileName; + private String useInference; + private String syncMngPath; + } + + @Schema(name = "MngFileAddReq", description = "영상관리파일 등록 요청") + @Getter + @Setter + @NoArgsConstructor + @AllArgsConstructor + public static class MngFileAddReq { + private int mngYyyy; + private String mapSheetNum; + private String refMapSheetNum; + private String filePath; + private String fileName; + 
private String fileExt;
+    private Long hstUid;
+    private Long fileSize;
+    private String fileState;
+  }
+
+  @Schema(name = "MngFilesDto", description = "영상관리내역 검색 리턴")
+  @Getter
+  @Setter
+  @NoArgsConstructor
+  @AllArgsConstructor
+  public static class MngFilesDto {
+    private long fileUid;
+    private int mngYyyy;
+    private String mapSheetNum;
+    private String refMapSheetNum;
+    private String filePath;
+    private String fileName;
+    private String fileExt;
+    private Long hstUid;
+    private Long fileSize;
+  }
+
+  @Schema(name = "MngListCompareDto", description = "영상파일 비교가능 이전년도정보")
+  @Getter
+  @Setter
+  @NoArgsConstructor
+  @AllArgsConstructor
+  public static class MngListCompareDto {
+
+    private String mngYyyy;
+    private String mapSheetNum;
+    private Integer beforeYear;
+  }
+
+  @Schema(name = "DmlReturn", description = "영상관리 DML 수행 후 리턴")
+  @Getter
+  @Setter
+  @NoArgsConstructor
+  @AllArgsConstructor
+  public static class DmlReturn {
+    private String flag;
+    private String message;
+  }
+}
diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/ApiConfigEnum.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/ApiConfigEnum.java
new file mode 100755
index 0000000..79cd405
--- /dev/null
+++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/ApiConfigEnum.java
@@ -0,0 +1,22 @@
+package com.kamco.cd.kamcoback.enums;
+
+import lombok.EqualsAndHashCode;
+import lombok.Getter;
+
+public class ApiConfigEnum {
+
+  @Getter
+  @EqualsAndHashCode(of = "enumValue")
+  public static class EnumDto<T> {
+
+    private final T enumValue;
+    private final String id;
+    private final String text;
+
+    public EnumDto(T enumValue, String id, String text) {
+      this.enumValue = enumValue;
+      this.id = id;
+      this.text = text;
+    }
+  }
+}
diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/CodeDto.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/CodeDto.java
new file mode 100755
index 0000000..ae4bc00
--- 
/dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/CodeDto.java @@ -0,0 +1,20 @@ +package com.kamco.cd.kamcoback.enums; + +public class CodeDto { + + private String code; + private String name; + + public CodeDto(String code, String name) { + this.code = code; + this.name = name; + } + + public String getCode() { + return code; + } + + public String getName() { + return name; + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/CommonUseStatus.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/CommonUseStatus.java new file mode 100755 index 0000000..bcd0b39 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/CommonUseStatus.java @@ -0,0 +1,46 @@ +package com.kamco.cd.kamcoback.enums; + +import com.kamco.cd.kamcoback.enums.ApiConfigEnum.EnumDto; +import com.kamco.cd.kamcoback.inferface.EnumType; +import java.util.Arrays; +import lombok.AllArgsConstructor; +import lombok.Getter; + +/** + * Common usage status used across the system. + * + *
<p>
This enum represents whether a resource is active, excluded from processing, or inactive. It
+ * is commonly used for filtering, business rules, and status management.
+ */
+@Getter
+@AllArgsConstructor
+public enum CommonUseStatus implements EnumType {
+
+  // @formatter:off
+  /** Actively used and available */
+  USE("USE", "사용중", 100),
+  /** Explicitly excluded from use or processing */
+  EXCEPT("EXCEPT", "영구 추론제외", 200),
+  AUTO_EXCEPT("AUTO_EXCEPT", "자동추론 제외", 300),
+  /** Not used or disabled */
+  NOT_USE("NOT_USE", "사용안함", 999);
+  // @formatter:on
+
+  private final String id;
+  private final String text;
+  private final int ordering;
+
+  public static CommonUseStatus getEnumById(String id) {
+    return Arrays.stream(CommonUseStatus.values())
+        .filter(x -> x.getId().equals(id))
+        .findFirst()
+        .orElse(CommonUseStatus.NOT_USE);
+  }
+
+  public EnumDto<CommonUseStatus> getEnumDto() {
+    return new EnumDto<>(this, this.id, this.text);
+  }
+}
diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/Enums.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/Enums.java
new file mode 100755
index 0000000..c87856e
--- /dev/null
+++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/Enums.java
@@ -0,0 +1,86 @@
+package com.kamco.cd.kamcoback.enums;
+
+import com.kamco.cd.kamcoback.inferface.CodeExpose;
+import com.kamco.cd.kamcoback.inferface.CodeHidden;
+import com.kamco.cd.kamcoback.inferface.EnumType;
+import java.util.Arrays;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.Set;
+import org.reflections.Reflections;
+
+public class Enums {
+
+  private static final String BASE_PACKAGE = "com.kamco.cd.kamcoback";
+
+  /** 노출 가능한 enum만 모아둔 맵. key: enum simpleName (예: RoleType), value: enum Class */
+  private static final Map<String, Class<? extends Enum<?>>> exposedEnumMap = scanExposedEnumMap();
+
+  // code로 enum 찾기
+  public static <E extends Enum<E> & EnumType> E fromId(Class<E> enumClass, String id) {
+    if (id == null) {
+      return null;
+    }
+
+    for (E e :
enumClass.getEnumConstants()) {
+      if (id.equalsIgnoreCase(e.getId())) {
+        return e;
+      }
+    }
+    return null;
+  }
+
+  // enum -> CodeDto list
+  public static List<CodeDto> toList(Class<?> enumClass) {
+    Object[] enums = enumClass.getEnumConstants();
+
+    return Arrays.stream(enums)
+        .map(e -> (EnumType) e)
+        .filter(e -> !isHidden(enumClass, (Enum<?>) e))
+        .map(e -> new CodeDto(e.getId(), e.getText()))
+        .toList();
+  }
+
+  private static boolean isHidden(Class<?> enumClass, Enum<?> e) {
+    try {
+      return enumClass.getField(e.name()).isAnnotationPresent(CodeHidden.class);
+    } catch (NoSuchFieldException ex) {
+      return false;
+    }
+  }
+
+  /** 특정 타입(enum)만 조회 /codes/{type} -> type = RoleType 같은 값 */
+  public static List<CodeDto> getCodes(String type) {
+    Class<? extends Enum<?>> enumClass = exposedEnumMap.get(type);
+    if (enumClass == null) {
+      throw new IllegalArgumentException("지원하지 않는 코드 타입: " + type);
+    }
+    return toList(enumClass);
+  }
+
+  /** 전체 enum 코드 조회 */
+  public static Map<String, List<CodeDto>> getAllCodes() {
+    Map<String, List<CodeDto>> result = new HashMap<>();
+    for (Map.Entry<String, Class<? extends Enum<?>>> e : exposedEnumMap.entrySet()) {
+      result.put(e.getKey(), toList(e.getValue()));
+    }
+    return result;
+  }
+
+  /** CodeExpose + EnumType 인 enum만 스캔해서 Map 구성 */
+  @SuppressWarnings("unchecked")
+  private static Map<String, Class<? extends Enum<?>>> scanExposedEnumMap() {
+    Reflections reflections = new Reflections(BASE_PACKAGE);
+
+    Set<Class<?>> types = reflections.getTypesAnnotatedWith(CodeExpose.class);
+
+    Map<String, Class<? extends Enum<?>>> result = new HashMap<>();
+
+    for (Class<?> clazz : types) {
+      if (clazz.isEnum() && EnumType.class.isAssignableFrom(clazz)) {
+        result.put(clazz.getSimpleName(), (Class<? extends Enum<?>>) clazz);
+      }
+    }
+    return result;
+  }
+}
diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/MngStateType.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/MngStateType.java
new file mode 100755
index 0000000..f5fd245
--- /dev/null
+++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/MngStateType.java
@@ -0,0 +1,26 @@
+package com.kamco.cd.kamcoback.enums;
+
+import
com.kamco.cd.kamcoback.inferface.EnumType; +import lombok.AllArgsConstructor; +import lombok.Getter; + +@Getter +@AllArgsConstructor +public enum MngStateType implements EnumType { + NOTYET("동기화 시작"), + PROCESSING("데이터 체크"), + DONE("동기화 작업 종료"), + TAKINGERROR("오류 데이터 처리중"); + + private final String desc; + + @Override + public String getId() { + return name(); + } + + @Override + public String getText() { + return desc; + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/SyncStateType.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/SyncStateType.java new file mode 100755 index 0000000..adba1cc --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/enums/SyncStateType.java @@ -0,0 +1,33 @@ +package com.kamco.cd.kamcoback.enums; + +import com.kamco.cd.kamcoback.inferface.CodeExpose; +import com.kamco.cd.kamcoback.inferface.CodeHidden; +import com.kamco.cd.kamcoback.inferface.EnumType; +import lombok.AllArgsConstructor; +import lombok.Getter; + +@CodeExpose +@Getter +@AllArgsConstructor +public enum SyncStateType implements EnumType { + @CodeHidden + NOTYET("미처리"), + NOFILE("파일없음"), + NOTPAIR("페어파일누락"), + DUPLICATE("파일중복"), + TYPEERROR("손상파일"), + @CodeHidden + DONE("완료"); + + private final String desc; + + @Override + public String getId() { + return name(); + } + + @Override + public String getText() { + return desc; + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/inferface/CodeExpose.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/inferface/CodeExpose.java new file mode 100755 index 0000000..aa50c0e --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/inferface/CodeExpose.java @@ -0,0 +1,10 @@ +package com.kamco.cd.kamcoback.inferface; + +import java.lang.annotation.ElementType; +import java.lang.annotation.Retention; +import java.lang.annotation.RetentionPolicy; +import java.lang.annotation.Target; + 
+@Target(ElementType.TYPE) +@Retention(RetentionPolicy.RUNTIME) +public @interface CodeExpose {} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/inferface/CodeHidden.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/inferface/CodeHidden.java new file mode 100755 index 0000000..7516dfe --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/inferface/CodeHidden.java @@ -0,0 +1,10 @@ +package com.kamco.cd.kamcoback.inferface; + +import java.lang.annotation.ElementType; +import java.lang.annotation.Retention; +import java.lang.annotation.RetentionPolicy; +import java.lang.annotation.Target; + +@Retention(RetentionPolicy.RUNTIME) +@Target(ElementType.FIELD) +public @interface CodeHidden {} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/inferface/EnumType.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/inferface/EnumType.java new file mode 100755 index 0000000..2fa5f21 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/inferface/EnumType.java @@ -0,0 +1,8 @@ +package com.kamco.cd.kamcoback.inferface; + +public interface EnumType { + + String getId(); + + String getText(); +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/inferface/JsonFormatDttm.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/inferface/JsonFormatDttm.java new file mode 100755 index 0000000..aabb352 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/inferface/JsonFormatDttm.java @@ -0,0 +1,19 @@ +package com.kamco.cd.kamcoback.inferface; + +import com.fasterxml.jackson.annotation.JacksonAnnotationsInside; +import com.fasterxml.jackson.annotation.JsonFormat; +import java.lang.annotation.Documented; +import java.lang.annotation.ElementType; +import java.lang.annotation.Retention; +import java.lang.annotation.RetentionPolicy; +import java.lang.annotation.Target; + +@Target({ElementType.FIELD, ElementType.METHOD}) 
+@Retention(RetentionPolicy.RUNTIME)
+@Documented
+@JacksonAnnotationsInside
+@JsonFormat(
+    shape = JsonFormat.Shape.STRING,
+    pattern = "yyyy-MM-dd'T'HH:mm:ssXXX",
+    timezone = "Asia/Seoul")
+public @interface JsonFormatDttm {}
diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/core/MapSheetMngFileJobCoreService.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/core/MapSheetMngFileJobCoreService.java
new file mode 100755
index 0000000..024a40d
--- /dev/null
+++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/core/MapSheetMngFileJobCoreService.java
@@ -0,0 +1,83 @@
+package com.kamco.cd.kamcoback.postgres.core;
+
+import com.kamco.cd.kamcoback.dto.MapSheetMngDto;
+import com.kamco.cd.kamcoback.dto.MapSheetMngDto.MngDto;
+import com.kamco.cd.kamcoback.dto.MapSheetMngDto.MngHstDto;
+import com.kamco.cd.kamcoback.enums.CommonUseStatus;
+import com.kamco.cd.kamcoback.postgres.entity.MapSheetMngFileEntity;
+import com.kamco.cd.kamcoback.postgres.repository.MapSheetMngFileJobRepository;
+import com.kamco.cd.kamcoback.postgres.repository.MapSheetMngYearRepository;
+import jakarta.validation.Valid;
+import java.util.List;
+import lombok.RequiredArgsConstructor;
+import org.springframework.data.domain.Page;
+import org.springframework.stereotype.Service;
+
+@Service
+@RequiredArgsConstructor
+public class MapSheetMngFileJobCoreService {
+
+  private final MapSheetMngFileJobRepository mapSheetMngFileJobRepository;
+  private final MapSheetMngYearRepository mapSheetMngYearRepository;
+
+  public Page<MngDto> findMapSheetMngList(
+      MapSheetMngDto.@Valid MngSearchReq searchReq) {
+    return mapSheetMngFileJobRepository.findMapSheetMngList(searchReq);
+  }
+
+  public List<MngHstDto> findTargetMapSheetFileList(long targetNum, int pageSize) {
+    return mapSheetMngFileJobRepository.findTargetMapSheetFileList(targetNum, pageSize);
+  }
+
+  public MapSheetMngDto.DmlReturn mngHstDataSyncStateUpdate(
+      @Valid MapSheetMngDto.MngHstDto
updateReq) { + + mapSheetMngFileJobRepository.mngHstDataSyncStateUpdate(updateReq); + + return new MapSheetMngDto.DmlReturn("success", updateReq.getHstUid() + ""); + } + + public MapSheetMngDto.DmlReturn mngFileSave(@Valid MapSheetMngDto.MngFileAddReq addReq) { + + MapSheetMngFileEntity entity = new MapSheetMngFileEntity(); + entity.setMngYyyy(addReq.getMngYyyy()); + entity.setMapSheetNum(addReq.getMapSheetNum()); + entity.setRefMapSheetNum(addReq.getRefMapSheetNum()); + entity.setFilePath(addReq.getFilePath()); + entity.setFileName(addReq.getFileName()); + entity.setFileExt(addReq.getFileExt()); + entity.setHstUid(addReq.getHstUid()); + entity.setFileSize(addReq.getFileSize()); + entity.setFileState(addReq.getFileState()); + + MapSheetMngFileEntity saved = mapSheetMngFileJobRepository.save(entity); + // int hstCnt = mapSheetMngRepository.insertMapSheetOrgDataToMapSheetMngHst(saved.getMngYyyy()); + + return new MapSheetMngDto.DmlReturn("success", saved.getFileUid().toString()); + } + + public Long findByMngYyyyTargetMapSheetNotYetCount(int mngYyyy) { + return mapSheetMngFileJobRepository.findByMngYyyyTargetMapSheetNotYetCount(mngYyyy); + } + + public void mngDataState(int mngYyyy, String mngState) { + mapSheetMngFileJobRepository.mngDataState(mngYyyy, mngState); + } + + public Integer findNotYetMapSheetMng() { + return mapSheetMngFileJobRepository.findNotYetMapSheetMng(); + } + + public Long findByHstMapSheetBeforeYyyyListCount(int strtYyyy, int endYyyy, String mapSheetNum) { + return mapSheetMngFileJobRepository.findByHstMapSheetBeforeYyyyListCount( + strtYyyy, endYyyy, mapSheetNum); + } + + public void updateException5kMapSheet(String mapSheetNum, CommonUseStatus commonUseStatus) { + mapSheetMngFileJobRepository.updateException5kMapSheet(mapSheetNum, commonUseStatus); + } + + public void saveSheetMngYear() { + mapSheetMngYearRepository.saveFileInfo(); + } +} diff --git 
a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/CommonDateEntity.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/CommonDateEntity.java new file mode 100755 index 0000000..cb9abe9 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/CommonDateEntity.java @@ -0,0 +1,34 @@ +package com.kamco.cd.kamcoback.postgres.entity; + +import jakarta.persistence.Column; +import jakarta.persistence.MappedSuperclass; +import jakarta.persistence.PrePersist; +import jakarta.persistence.PreUpdate; +import java.time.ZonedDateTime; +import lombok.Getter; +import org.springframework.data.annotation.CreatedDate; +import org.springframework.data.annotation.LastModifiedDate; + +@Getter +@MappedSuperclass +public class CommonDateEntity { + + @CreatedDate + @Column(name = "created_dttm", updatable = false, nullable = false) + private ZonedDateTime createdDate; + + @LastModifiedDate + @Column(name = "updated_dttm", nullable = false) + private ZonedDateTime modifiedDate; + + @PrePersist + protected void onPersist() { + this.createdDate = ZonedDateTime.now(); + this.modifiedDate = ZonedDateTime.now(); + } + + @PreUpdate + protected void onUpdate() { + this.modifiedDate = ZonedDateTime.now(); + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapInkx50kEntity.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapInkx50kEntity.java new file mode 100755 index 0000000..c483e5c --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapInkx50kEntity.java @@ -0,0 +1,48 @@ +package com.kamco.cd.kamcoback.postgres.entity; + +import jakarta.persistence.Column; +import jakarta.persistence.Entity; +import jakarta.persistence.GeneratedValue; +import jakarta.persistence.GenerationType; +import jakarta.persistence.Id; +import jakarta.persistence.SequenceGenerator; +import jakarta.persistence.Table; 
+import lombok.Getter; +import lombok.NoArgsConstructor; +import lombok.Setter; +import org.locationtech.jts.geom.Geometry; + +@Getter +@Setter +@Table(name = "tb_map_inkx_50k") +@Entity +@NoArgsConstructor +public class MapInkx50kEntity extends CommonDateEntity { + + @Id + @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "tb_map_inkx_50k_fid_seq_gen") + @SequenceGenerator( + name = "tb_map_inkx_50k_fid_seq_gen", + sequenceName = "tb_map_inkx_50k_fid_seq", + allocationSize = 1) + private Integer fid; + + @Column(name = "mapidcd_no") + private String mapidcdNo; + + @Column(name = "mapid_nm") + private String mapidNm; + + @Column(name = "mapid_no") + private String mapidNo; + + @Column(name = "geom") + private Geometry geom; + + public MapInkx50kEntity(String mapidcdNo, String mapidNm, String mapidNo, Geometry geom) { + this.mapidcdNo = mapidcdNo; + this.mapidNm = mapidNm; + this.mapidNo = mapidNo; + this.geom = geom; + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapInkx5kEntity.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapInkx5kEntity.java new file mode 100755 index 0000000..53b6a0a --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapInkx5kEntity.java @@ -0,0 +1,69 @@ +package com.kamco.cd.kamcoback.postgres.entity; + +import com.kamco.cd.kamcoback.enums.CommonUseStatus; +import jakarta.persistence.Column; +import jakarta.persistence.Entity; +import jakarta.persistence.EnumType; +import jakarta.persistence.Enumerated; +import jakarta.persistence.FetchType; +import jakarta.persistence.GeneratedValue; +import jakarta.persistence.GenerationType; +import jakarta.persistence.Id; +import jakarta.persistence.JoinColumn; +import jakarta.persistence.ManyToOne; +import jakarta.persistence.SequenceGenerator; +import jakarta.persistence.Table; +import lombok.Getter; +import lombok.NoArgsConstructor; +import lombok.Setter; +import 
org.locationtech.jts.geom.Geometry; + +@Getter +@Setter +@Table(name = "tb_map_inkx_5k") +@Entity +@NoArgsConstructor +public class MapInkx5kEntity extends CommonDateEntity { + + @Id + @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "tb_map_inkx_5k_fid_seq_gen") + @SequenceGenerator( + name = "tb_map_inkx_5k_fid_seq_gen", + sequenceName = "tb_map_inkx_5k_fid_seq", + allocationSize = 1) + private Integer fid; + + @Column(name = "mapidcd_no") + private String mapidcdNo; + + @Column(name = "mapid_nm") + private String mapidNm; + + @Column(name = "geom") + private Geometry geom; + + @ManyToOne(fetch = FetchType.LAZY) + @JoinColumn(name = "fid_k50", referencedColumnName = "fid") + private MapInkx50kEntity mapInkx50k; + + // 사용상태 USE, + @Column(name = "use_inference") + @Enumerated(EnumType.STRING) + private CommonUseStatus useInference; + + // Constructor + public MapInkx5kEntity( + String mapidcdNo, String mapidNm, Geometry geom, MapInkx50kEntity mapInkx50k) { + this.mapidcdNo = mapidcdNo; + this.mapidNm = mapidNm; + this.geom = geom; + this.mapInkx50k = mapInkx50k; + // 생성시 default 사용함 (사용,제외,사용안함) + this.useInference = CommonUseStatus.USE; + } + + // 변경 사용상태 (추론사용여부) + public void updateUseInference(CommonUseStatus useInference) { + this.useInference = useInference; + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngEntity.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngEntity.java new file mode 100755 index 0000000..a92036e --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngEntity.java @@ -0,0 +1,72 @@ +package com.kamco.cd.kamcoback.postgres.entity; + +import jakarta.persistence.Column; +import jakarta.persistence.Entity; +import jakarta.persistence.Id; +import jakarta.persistence.Table; +import jakarta.validation.constraints.Size; +import java.time.ZonedDateTime; +import lombok.Getter; +import 
lombok.Setter; +import org.hibernate.annotations.ColumnDefault; + +@Getter +@Setter +@Entity +@Table(name = "tb_map_sheet_mng") +public class MapSheetMngEntity { + + @Id + @Column(name = "mng_yyyy", nullable = false) + private Integer mngYyyy; + + @Size(max = 20) + @ColumnDefault("'NOTYET'") + @Column(name = "mng_state", length = 20) + private String mngState = "NOTYET"; + + @Size(max = 20) + @ColumnDefault("'NOTYET'") + @Column(name = "sync_state", length = 20) + private String syncState = "NOTYET"; + + @Column(name = "mng_state_dttm") + private ZonedDateTime mngStateDttm = ZonedDateTime.now(); + + @Column(name = "sync_state_dttm") + private ZonedDateTime syncStateDttm = ZonedDateTime.now(); + + @Column(name = "created_dttm") + private ZonedDateTime createdDttm = ZonedDateTime.now(); + + @Column(name = "created_uid") + private Long createdUid; + + @Column(name = "updated_dttm") + private ZonedDateTime updatedDttm = ZonedDateTime.now(); + + @Column(name = "updated_uid") + private Long updatedUid; + + @Size(max = 255) + @ColumnDefault("'NULL::character varying'") + @Column(name = "mng_path") + private String mngPath; + + @Size(max = 20) + @ColumnDefault("'NOTYET'") + @Column(name = "sync_check_state", length = 20) + private String syncCheckState = "NOTYET"; + + @Column(name = "sync_strt_dttm") + private ZonedDateTime syncStrtDttm; + + @Column(name = "sync_end_dttm") + private ZonedDateTime syncEndDttm; + + @Column(name = "sync_check_strt_dttm") + private ZonedDateTime syncCheckStrtDttm; + + @Column(name = "sync_check_end_dttm") + private ZonedDateTime syncCheckEndDttm; +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngFileEntity.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngFileEntity.java new file mode 100755 index 0000000..cadf2ce --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngFileEntity.java @@ -0,0 +1,63 @@ 
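The `MapSheetMngEntity` above tracks its pipeline states (`mng_state`, `sync_state`, `sync_check_state`) as raw strings defaulting to `'NOTYET'`. As a hypothetical sketch only — `MngState` and `StateDemo` are assumed names, not types in this codebase — the same literals could be centralized in an enum, the way the diff already does for `CommonUseStatus`:

```java
import java.util.Objects;

// Hypothetical sketch: the diff stores these states as plain strings.
public class StateDemo {
  // Assumed enum mirroring the string literals used by MapSheetMngEntity.
  public enum MngState { NOTYET, PROCESSING, DONE }

  // The DB column stores the enum name as-is, matching the "'NOTYET'" column default.
  public static MngState parse(String dbValue) {
    return MngState.valueOf(Objects.requireNonNull(dbValue));
  }

  public static void main(String[] args) {
    System.out.println(parse("NOTYET")); // prints NOTYET
  }
}
```

With JPA this would map via `@Enumerated(EnumType.STRING)`, as `MapInkx5kEntity` in this same diff does for its `use_inference` column.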
+package com.kamco.cd.kamcoback.postgres.entity; + +import jakarta.persistence.Column; +import jakarta.persistence.Entity; +import jakarta.persistence.GeneratedValue; +import jakarta.persistence.GenerationType; +import jakarta.persistence.Id; +import jakarta.persistence.Table; +import jakarta.validation.constraints.NotNull; +import jakarta.validation.constraints.Size; +import lombok.Getter; +import lombok.Setter; +import org.hibernate.annotations.ColumnDefault; + +@Getter +@Setter +@Entity +@Table(name = "tb_map_sheet_mng_files") +public class MapSheetMngFileEntity { + + @Id + @GeneratedValue(strategy = GenerationType.IDENTITY) + @Column(name = "file_uid", nullable = false) + private Long fileUid; + + @NotNull + @Column(name = "mng_yyyy", nullable = false) + private Integer mngYyyy; + + @NotNull + @Column(name = "map_sheet_num", nullable = false) + private String mapSheetNum; + + @Column(name = "ref_map_sheet_num") + private String refMapSheetNum; + + @Size(max = 255) + @Column(name = "file_path") + private String filePath; + + @Size(max = 100) + @Column(name = "file_name", length = 100) + private String fileName; + + @Size(max = 20) + @Column(name = "file_ext", length = 20) + private String fileExt; + + @Column(name = "hst_uid") + private Long hstUid; + + @Column(name = "file_size") + private Long fileSize; + + @Size(max = 20) + @Column(name = "file_state", length = 20) + private String fileState; + + @NotNull + @ColumnDefault("false") + @Column(name = "file_del", nullable = false) + private Boolean fileDel = false; +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngHstEntity.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngHstEntity.java new file mode 100755 index 0000000..b23a9b2 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngHstEntity.java @@ -0,0 +1,169 @@ +package com.kamco.cd.kamcoback.postgres.entity; + 
+import jakarta.persistence.Column; +import jakarta.persistence.Entity; +import jakarta.persistence.FetchType; +import jakarta.persistence.GeneratedValue; +import jakarta.persistence.GenerationType; +import jakarta.persistence.Id; +import jakarta.persistence.JoinColumn; +import jakarta.persistence.ManyToOne; +import jakarta.persistence.Table; +import jakarta.validation.constraints.Size; +import java.time.ZonedDateTime; +import lombok.AccessLevel; +import lombok.Getter; +import lombok.NoArgsConstructor; + +/** + * This class represents the entity for managing the history of map sheets. It is mapped to the + * database table "tb_map_sheet_mng_hst" and contains various properties related to the 1:5k map + * sheet information, as well as metadata for file synchronization and management. + * + *
This entity: - Includes a primary key (hstUid) for unique identification. - Maintains + * information associated with map sheets such as code, name, scale ratio, and paths. - Tracks + * states, timestamps, and data synchronization details. - Maintains relationships with the + * `MapInkx5kEntity` entity through a many-to-one association. - Provides functionality to update + * file information and sizes (`tifSizeBytes`, `tfwSizeBytes`, and `totalSizeBytes`). + * + *
It extends the `CommonDateEntity` class to include common date management fields, such as + * creation and modification timestamps. + * + *
The `@Getter` annotation generates getter methods for all fields, while the access to setters + * is restricted to enforce controlled modifications. The entity uses `@NoArgsConstructor` with + * `AccessLevel.PROTECTED` to restrict direct instantiation. The `updateFileInfos` method allows + * dynamic updates of specific file information. + * + *
Fields include: - hstUid: Unique identifier for the history record. - mngYyyy: Year associated + * with the management record. - mapInkx5kByCode: Reference to the related `MapInkx5kEntity` object. + * - mapSheetNum: Map sheet number identifying specific map. - mapSheetName: Name of the map sheet. + * - mapSheetCodeSrc: Source code of the map sheet. - scaleRatio: Scale ratio of the map. - + * dataState: State/status of the map sheet data. - dataStateDttm: Timestamp of the data state. - + * useInference: Indicator or metadata for inference usage. - useInferenceDttm: Timestamp for + * inference-related use. - mapSheetPath: Path or location of the map sheet file. - refMapSheetNum: + * Reference to a related map sheet number. - createdUid: User ID of the record creator. - + * updatedUid: User ID of the last updater. - syncState and related fields: Fields to manage + * synchronization states and processes. - tifSizeBytes, tfwSizeBytes, totalSizeBytes: Fields to + * track file size details. - sync file name fields: Stores names of files relevant for + * synchronization and verification. + * + *
This entity is essential for tracking and managing map sheet revisions, status, and usage in a + * system leveraging 1:5k map data. + */ +@Getter +// entity의 접근제어를 위해 @setter를 사용 x +// @Setter +@NoArgsConstructor(access = AccessLevel.PROTECTED) +@Entity +// 영상관리이력 +@Table(name = "tb_map_sheet_mng_hst") +public class MapSheetMngHstEntity extends CommonDateEntity { + + @Id + @GeneratedValue(strategy = GenerationType.IDENTITY) + @Column(name = "hst_uid") + private Long hstUid; // id + + @Column(name = "mng_yyyy") + private Integer mngYyyy; // 년도 + + // JPA 연관관계: MapInkx5k 참조 (PK 기반) 소속도엽번호 1:5k + @ManyToOne(fetch = FetchType.LAZY) + @JoinColumn(name = "map_sheet_code", referencedColumnName = "fid") + private MapInkx5kEntity mapInkx5kByCode; + + // TODO 1:5k 관련 정보 추후 제거 필요 + @Column(name = "map_sheet_num") + private String mapSheetNum; // 도엽번호 + + @Column(name = "map_sheet_name") + private String mapSheetName; + + // TODO END + + // 도엽파일이 저장된 경로 + @Column(name = "map_sheet_code_src") + private Integer mapSheetCodeSrc; + + // 도엽비율? 
+ @Column(name = "scale_ratio") + private Integer scaleRatio; + + @Column(name = "data_state", length = 20) + private String dataState; + + @Column(name = "data_state_dttm") + private ZonedDateTime dataStateDttm; + + @Column(name = "use_inference") + private String useInference; + + @Column(name = "use_inference_dttm") + private ZonedDateTime useInferenceDttm; + + @Column(name = "map_sheet_path") + private String mapSheetPath; + + @Column(name = "ref_map_sheet_num") + private String refMapSheetNum; + + @Column(name = "created_uid") + private Long createdUid; + + @Column(name = "updated_uid") + private Long updatedUid; + + @Size(max = 20) + @Column(name = "sync_state", length = 20) + private String syncState; + + @Size(max = 20) + @Column(name = "sync_check_state", length = 20) + private String syncCheckState; + + @Column(name = "sync_strt_dttm") + private ZonedDateTime syncStrtDttm; + + @Column(name = "sync_end_dttm") + private ZonedDateTime syncEndDttm; + + @Column(name = "sync_check_strt_dttm") + private ZonedDateTime syncCheckStrtDttm; + + @Column(name = "sync_check_end_dttm") + private ZonedDateTime syncCheckEndDttm; + + @Column(name = "tif_size_bytes") + private Long tifSizeBytes; + + @Column(name = "tfw_size_bytes") + private Long tfwSizeBytes; + + @Column(name = "total_size_bytes") + private Long totalSizeBytes; + + @Size(max = 100) + @Column(name = "sync_tif_file_name", length = 100) + private String syncTifFileName; + + @Size(max = 100) + @Column(name = "sync_tfw_file_name", length = 100) + private String syncTfwFileName; + + @Size(max = 100) + @Column(name = "sync_check_tif_file_name", length = 100) + private String syncCheckTifFileName; + + @Size(max = 100) + @Column(name = "sync_check_tfw_file_name", length = 100) + private String syncCheckTfwFileName; + + // 파일정보 업데이트 + public void updateFileInfos(Long tifSizeBytes, Long tfwSizeBytes) { + tifSizeBytes = tifSizeBytes == null ? 0L : tifSizeBytes; + tfwSizeBytes = tfwSizeBytes == null ? 
0L : tfwSizeBytes; + this.tifSizeBytes = tifSizeBytes; + this.tfwSizeBytes = tfwSizeBytes; + this.totalSizeBytes = tifSizeBytes + tfwSizeBytes; + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngYearYnEntity.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngYearYnEntity.java new file mode 100644 index 0000000..8cc97b7 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngYearYnEntity.java @@ -0,0 +1,34 @@ +package com.kamco.cd.kamcoback.postgres.entity; + +import jakarta.persistence.Column; +import jakarta.persistence.EmbeddedId; +import jakarta.persistence.Entity; +import jakarta.persistence.Table; +import jakarta.validation.constraints.NotNull; +import java.time.ZonedDateTime; +import lombok.Getter; +import lombok.Setter; +import org.hibernate.annotations.ColumnDefault; + +@Getter +@Setter +@Entity +@Table(name = "tb_map_sheet_mng_year_yn") +public class MapSheetMngYearYnEntity { + + @EmbeddedId private MapSheetMngYearYnEntityId id; + + @NotNull + @Column(name = "yn", nullable = false, length = Integer.MAX_VALUE) + private String yn; + + @NotNull + @ColumnDefault("now()") + @Column(name = "created_dttm", nullable = false) + private ZonedDateTime createdDttm; + + @NotNull + @ColumnDefault("now()") + @Column(name = "updated_dttm", nullable = false) + private ZonedDateTime updatedDttm; +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngYearYnEntityId.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngYearYnEntityId.java new file mode 100644 index 0000000..45e8bcc --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/entity/MapSheetMngYearYnEntityId.java @@ -0,0 +1,46 @@ +package com.kamco.cd.kamcoback.postgres.entity; + +import jakarta.persistence.Column; +import jakarta.persistence.Embeddable; 
+import jakarta.validation.constraints.NotNull; +import jakarta.validation.constraints.Size; +import java.io.Serializable; +import java.util.Objects; +import lombok.Getter; +import lombok.Setter; +import org.hibernate.Hibernate; + +@Getter +@Setter +@Embeddable +public class MapSheetMngYearYnEntityId implements Serializable { + + private static final long serialVersionUID = 6282262062316057898L; + + @Size(max = 20) + @NotNull + @Column(name = "map_sheet_num", nullable = false, length = 20) + private String mapSheetNum; + + @NotNull + @Column(name = "mng_yyyy", nullable = false) + private Integer mngYyyy; + + @Override + public boolean equals(Object o) { + if (this == o) { + return true; + } + if (o == null || Hibernate.getClass(this) != Hibernate.getClass(o)) { + return false; + } + MapSheetMngYearYnEntityId entity = (MapSheetMngYearYnEntityId) o; + return Objects.equals(this.mngYyyy, entity.mngYyyy) + && Objects.equals(this.mapSheetNum, entity.mapSheetNum); + } + + @Override + public int hashCode() { + return Objects.hash(mngYyyy, mapSheetNum); + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepository.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepository.java new file mode 100755 index 0000000..09f1534 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepository.java @@ -0,0 +1,7 @@ +package com.kamco.cd.kamcoback.postgres.repository; + +import com.kamco.cd.kamcoback.postgres.entity.MapSheetMngFileEntity; +import org.springframework.data.jpa.repository.JpaRepository; + +public interface MapSheetMngFileJobRepository + extends JpaRepository<MapSheetMngFileEntity, Long>, MapSheetMngFileJobRepositoryCustom {} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepositoryCustom.java
b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepositoryCustom.java new file mode 100755 index 0000000..bac0923 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepositoryCustom.java @@ -0,0 +1,27 @@ +package com.kamco.cd.kamcoback.postgres.repository; + +import com.kamco.cd.kamcoback.dto.MapSheetMngDto; +import com.kamco.cd.kamcoback.dto.MapSheetMngDto.MngDto; +import com.kamco.cd.kamcoback.dto.MapSheetMngDto.MngHstDto; +import com.kamco.cd.kamcoback.enums.CommonUseStatus; +import java.util.List; +import org.springframework.data.domain.Page; + +public interface MapSheetMngFileJobRepositoryCustom { + + Page<MngDto> findMapSheetMngList(MapSheetMngDto.MngSearchReq searchReq); + + void mngHstDataSyncStateUpdate(MapSheetMngDto.MngHstDto updateReq); + + List<MngHstDto> findTargetMapSheetFileList(long targetNum, int pageSize); + + Long findByMngYyyyTargetMapSheetNotYetCount(int mngYyyy); + + void mngDataState(int mngYyyy, String mngState); + + Integer findNotYetMapSheetMng(); + + Long findByHstMapSheetBeforeYyyyListCount(int strtYyyy, int endYyyy, String mapSheetNum); + + void updateException5kMapSheet(String mapSheetNum, CommonUseStatus commonUseStatus); +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepositoryImpl.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepositoryImpl.java new file mode 100755 index 0000000..caf0406 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngFileJobRepositoryImpl.java @@ -0,0 +1,267 @@ +package com.kamco.cd.kamcoback.postgres.repository; + +import static com.kamco.cd.kamcoback.postgres.entity.QMapInkx5kEntity.mapInkx5kEntity; +import static com.kamco.cd.kamcoback.postgres.entity.QMapSheetMngEntity.mapSheetMngEntity; +import static
com.kamco.cd.kamcoback.postgres.entity.QMapSheetMngHstEntity.mapSheetMngHstEntity; + +import com.kamco.cd.kamcoback.dto.MapSheetMngDto; +import com.kamco.cd.kamcoback.dto.MapSheetMngDto.MngDto; +import com.kamco.cd.kamcoback.dto.MapSheetMngDto.MngHstDto; +import com.kamco.cd.kamcoback.dto.MapSheetMngDto.MngSearchReq; +import com.kamco.cd.kamcoback.enums.CommonUseStatus; +import com.querydsl.core.BooleanBuilder; +import com.querydsl.core.types.Projections; +import com.querydsl.core.types.dsl.CaseBuilder; +import com.querydsl.core.types.dsl.Expressions; +import com.querydsl.core.types.dsl.StringExpression; +import com.querydsl.jpa.impl.JPAQueryFactory; +import jakarta.persistence.EntityManager; +import jakarta.persistence.PersistenceContext; +import java.time.ZonedDateTime; +import java.util.List; +import lombok.RequiredArgsConstructor; +import org.springframework.data.domain.Page; +import org.springframework.data.domain.PageImpl; +import org.springframework.data.domain.Pageable; +import org.springframework.stereotype.Repository; + +@Repository +@RequiredArgsConstructor +public class MapSheetMngFileJobRepositoryImpl implements MapSheetMngFileJobRepositoryCustom { + + private final JPAQueryFactory queryFactory; + private final StringExpression NULL_STRING = Expressions.stringTemplate("cast(null as text)"); + + @PersistenceContext private EntityManager em; + + @Override + public Integer findNotYetMapSheetMng() { + return queryFactory + .select(mapSheetMngEntity.mngYyyy) + .from(mapSheetMngEntity) + .where( + mapSheetMngEntity + .mngState + .eq("NOTYET") + .or(mapSheetMngEntity.mngState.eq("PROCESSING"))) + .limit(1) + .fetchOne(); + } + + @Override + public Long findByMngYyyyTargetMapSheetNotYetCount(int mngYyyy) { + Long countQuery = + queryFactory + .select(mapSheetMngHstEntity.mngYyyy.count()) + .from(mapSheetMngHstEntity) + .where( + mapSheetMngHstEntity + .mngYyyy + .eq(mngYyyy) +
.and(mapSheetMngHstEntity.syncState.eq("NOTYET"))) + .fetchOne(); + + return countQuery; + } + + public void mngDataState(int mngYyyy, String mngState) { + long updateCount = + queryFactory + .update(mapSheetMngEntity) + .set(mapSheetMngEntity.mngState, mngState) + .set(mapSheetMngEntity.syncState, mngState) + .set(mapSheetMngEntity.syncCheckState, mngState) + .where(mapSheetMngEntity.mngYyyy.eq(mngYyyy)) + .execute(); + } + + @Override + public Page findMapSheetMngList(MapSheetMngDto.MngSearchReq searchReq) { + + Pageable pageable = searchReq.toPageable(); + BooleanBuilder whereBuilder = new BooleanBuilder(); + + if (searchReq.getMngYyyy() != null) { + whereBuilder.and(mapSheetMngEntity.mngYyyy.eq(searchReq.getMngYyyy())); + } + + List foundContent = + queryFactory + .select( + Projections.constructor( + MapSheetMngDto.MngDto.class, + Expressions.numberTemplate( + Integer.class, + "row_number() over(order by {0} desc)", + mapSheetMngEntity.createdDttm), + mapSheetMngEntity.mngYyyy, + mapSheetMngEntity.mngState, + mapSheetMngEntity.syncState, + mapSheetMngEntity.syncCheckState, + mapSheetMngHstEntity.count(), + new CaseBuilder() + .when(mapSheetMngHstEntity.syncState.eq("DONE")) + .then(1L) + .otherwise(0L) + .sum() + .as("syncStateDoneCnt"), + new CaseBuilder() + .when(mapSheetMngHstEntity.syncCheckState.eq("DONE")) + .then(1L) + .otherwise(0L) + .sum(), + new CaseBuilder() + .when(mapSheetMngHstEntity.dataState.eq("NOT")) + .then(1L) + .otherwise(0L) + .sum(), + new CaseBuilder() + .when(mapSheetMngHstEntity.dataState.eq("TYPEERROR")) + .then(1L) + .otherwise(0L) + .sum(), + new CaseBuilder() + .when(mapSheetMngHstEntity.dataState.eq("SIZEERROR")) + .then(1L) + .otherwise(0L) + .sum(), + mapSheetMngHstEntity.syncStrtDttm.min(), + mapSheetMngHstEntity.syncCheckEndDttm.max())) + .from(mapSheetMngEntity) + .leftJoin(mapSheetMngHstEntity) + .on(mapSheetMngEntity.mngYyyy.eq(mapSheetMngHstEntity.mngYyyy)) + .where(whereBuilder) + .offset(pageable.getOffset()) + 
.limit(pageable.getPageSize()) + .orderBy(mapSheetMngEntity.createdDttm.desc()) + .groupBy(mapSheetMngEntity.mngYyyy) + .fetch(); + + Long countQuery = + queryFactory + .select(mapSheetMngEntity.mngYyyy.count()) + .from(mapSheetMngEntity) + .where(whereBuilder) + .fetchOne(); + + return new PageImpl<>(foundContent, pageable, countQuery); + } + + public void mngHstDataSyncStateUpdate(MapSheetMngDto.MngHstDto updateReq) { + + ZonedDateTime now = ZonedDateTime.now(); + + if (updateReq.getDataState().equals("DONE")) { + long updateCount = + queryFactory + .update(mapSheetMngHstEntity) + .set(mapSheetMngHstEntity.dataState, updateReq.getDataState()) + .set(mapSheetMngHstEntity.dataStateDttm, now) + .set(mapSheetMngHstEntity.syncState, updateReq.getSyncState()) + .set(mapSheetMngHstEntity.syncEndDttm, now) + .set(mapSheetMngHstEntity.syncCheckState, "NOTYET") + .set(mapSheetMngHstEntity.syncCheckStrtDttm, now) + .set(mapSheetMngHstEntity.syncCheckEndDttm, now) + .set(mapSheetMngHstEntity.mapSheetPath, updateReq.getMapSheetPath()) + .set(mapSheetMngHstEntity.syncTfwFileName, updateReq.getSyncTfwFileName()) + .set(mapSheetMngHstEntity.syncTifFileName, updateReq.getSyncTifFileName()) + .set(mapSheetMngHstEntity.useInference, updateReq.getUseInference()) + .where(mapSheetMngHstEntity.hstUid.eq(updateReq.getHstUid())) + .execute(); + } else { + long updateCount = + queryFactory + .update(mapSheetMngHstEntity) + .set(mapSheetMngHstEntity.dataState, updateReq.getDataState()) + .set(mapSheetMngHstEntity.dataStateDttm, now) + .set(mapSheetMngHstEntity.syncState, updateReq.getSyncState()) + .set(mapSheetMngHstEntity.syncStrtDttm, now) + .set(mapSheetMngHstEntity.syncEndDttm, now) + .set(mapSheetMngHstEntity.syncCheckState, "NOTYET") + .set(mapSheetMngHstEntity.syncCheckStrtDttm, now) + .set(mapSheetMngHstEntity.syncCheckEndDttm, now) + .set(mapSheetMngHstEntity.mapSheetPath, updateReq.getMapSheetPath()) + .set(mapSheetMngHstEntity.syncTfwFileName, updateReq.getSyncTfwFileName()) + 
.set(mapSheetMngHstEntity.syncTifFileName, updateReq.getSyncTifFileName()) + .set(mapSheetMngHstEntity.useInference, updateReq.getUseInference()) + .where(mapSheetMngHstEntity.hstUid.eq(updateReq.getHstUid())) + .execute(); + } + } + + @Override + public List findTargetMapSheetFileList(long targetNum, int pageSize) { + // Pageable pageable = searchReq.toPageable(); + + List foundContent = + queryFactory + .select( + Projections.constructor( + MngHstDto.class, + mapSheetMngHstEntity.hstUid, + mapSheetMngHstEntity.mngYyyy, + mapSheetMngHstEntity.mapSheetNum, + mapSheetMngHstEntity.refMapSheetNum, + mapSheetMngHstEntity.dataState, + mapSheetMngHstEntity.syncState, + mapSheetMngHstEntity.syncCheckState, + mapSheetMngHstEntity.syncStrtDttm, + mapSheetMngHstEntity.syncEndDttm, + mapSheetMngHstEntity.syncCheckStrtDttm, + mapSheetMngHstEntity.syncCheckEndDttm, + mapSheetMngHstEntity.mapSheetPath, + mapSheetMngHstEntity.syncCheckTfwFileName, + mapSheetMngHstEntity.syncCheckTifFileName, + mapSheetMngHstEntity.useInference, + mapSheetMngEntity.mngPath)) + .from(mapSheetMngHstEntity) + .join(mapSheetMngEntity) + .on(mapSheetMngEntity.mngYyyy.eq(mapSheetMngHstEntity.mngYyyy)) + .where( + mapSheetMngHstEntity.syncState.eq("NOTYET"), + mapSheetMngHstEntity.hstUid.mod(10L).eq(targetNum)) + .limit(pageSize) + .orderBy(mapSheetMngHstEntity.hstUid.asc()) + .fetch(); + + return foundContent; + } + + @Override + public Long findByHstMapSheetBeforeYyyyListCount(int strtYyyy, int endYyyy, String mapSheetNum) { + + Long countQuery = + queryFactory + .select(mapSheetMngHstEntity.mngYyyy.count()) + .from(mapSheetMngHstEntity) + .where( + mapSheetMngHstEntity + .mngYyyy + .goe(strtYyyy) + .and(mapSheetMngHstEntity.mngYyyy.loe(endYyyy)) + .and(mapSheetMngHstEntity.mapSheetNum.eq(mapSheetNum)) + .and(mapSheetMngHstEntity.useInference.eq("USE")) + .and( + mapSheetMngHstEntity + .syncState + .eq("DONE") + .or(mapSheetMngHstEntity.syncCheckState.eq("DONE")))) + .fetchOne(); + + return countQuery; 
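`findTargetMapSheetFileList` above partitions pending history rows across workers with the predicate `hstUid.mod(10L).eq(targetNum)`. A plain-Java illustration of that sharding rule (a sketch, not code from the diff — `ShardDemo` and `shard` are assumed names):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.LongStream;

public class ShardDemo {
  // Keep only the rows a given worker (targetNum in 0..9) is responsible for,
  // mirroring the QueryDSL predicate hstUid.mod(10L).eq(targetNum).
  public static List<Long> shard(List<Long> hstUids, long targetNum) {
    return hstUids.stream().filter(uid -> uid % 10 == targetNum).collect(Collectors.toList());
  }

  public static void main(String[] args) {
    List<Long> uids = LongStream.rangeClosed(1, 25).boxed().collect(Collectors.toList());
    System.out.println(shard(uids, 3L)); // [3, 13, 23]
  }
}
```

Ten workers with `targetNum` 0–9 cover every pending row exactly once, so the file check can run in parallel without two workers touching the same `hst_uid`.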
+ } + + @Override + public void updateException5kMapSheet(String mapSheetNum, CommonUseStatus commonUseStatus) { + long updateCount = + queryFactory + .update(mapInkx5kEntity) + .set(mapInkx5kEntity.useInference, commonUseStatus) + .set(mapInkx5kEntity.modifiedDate, ZonedDateTime.now()) + .where(mapInkx5kEntity.mapidcdNo.eq(mapSheetNum)) + .execute(); + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepository.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepository.java new file mode 100644 index 0000000..fd805c7 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepository.java @@ -0,0 +1,9 @@ +package com.kamco.cd.kamcoback.postgres.repository; + +import com.kamco.cd.kamcoback.postgres.entity.MapSheetMngYearYnEntity; +import com.kamco.cd.kamcoback.postgres.entity.MapSheetMngYearYnEntityId; +import org.springframework.data.jpa.repository.JpaRepository; + +public interface MapSheetMngYearRepository + extends JpaRepository<MapSheetMngYearYnEntity, MapSheetMngYearYnEntityId>, + MapSheetMngYearRepositoryCustom {} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepositoryCustom.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepositoryCustom.java new file mode 100644 index 0000000..6587bbc --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepositoryCustom.java @@ -0,0 +1,10 @@ +package com.kamco.cd.kamcoback.postgres.repository; + +import com.kamco.cd.kamcoback.dto.MapSheetMngDto.MngListCompareDto; +import java.util.List; + +public interface MapSheetMngYearRepositoryCustom { + void saveFileInfo(); + + List<MngListCompareDto> findByHstMapSheetCompareList(int mngYyyy, List<String> mapIds); +} diff --git
a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepositoryImpl.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepositoryImpl.java new file mode 100644 index 0000000..ad96c23 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/postgres/repository/MapSheetMngYearRepositoryImpl.java @@ -0,0 +1,101 @@ +package com.kamco.cd.kamcoback.postgres.repository; + +import com.kamco.cd.kamcoback.dto.MapSheetMngDto.MngListCompareDto; +import com.kamco.cd.kamcoback.postgres.entity.QMapSheetMngYearYnEntity; +import com.querydsl.core.types.Projections; +import com.querydsl.core.types.dsl.Expressions; +import com.querydsl.core.types.dsl.StringExpression; +import com.querydsl.jpa.impl.JPAQueryFactory; +import jakarta.persistence.EntityManager; +import java.util.List; +import lombok.RequiredArgsConstructor; +import org.springframework.stereotype.Repository; + +@Repository +@RequiredArgsConstructor +public class MapSheetMngYearRepositoryImpl implements MapSheetMngYearRepositoryCustom { + + private final JPAQueryFactory queryFactory; + private final EntityManager em; + + /** 변화탐지 실행 가능 비교년도 저장 */ + @Override + public void saveFileInfo() { + + em.createNativeQuery("TRUNCATE TABLE tb_map_sheet_mng_year_yn").executeUpdate(); + + String sql = + """ + WITH bounds AS ( + SELECT + map_sheet_num, + MIN(mng_yyyy::int) AS min_y, + MAX(mng_yyyy::int) AS max_y + FROM tb_map_sheet_mng_files + GROUP BY map_sheet_num + ), + years AS ( + SELECT + b.map_sheet_num, + gs.y AS mng_yyyy + FROM bounds b + CROSS JOIN LATERAL generate_series(b.min_y, b.max_y) AS gs(y) + ), + exist AS ( + SELECT DISTINCT + map_sheet_num, + mng_yyyy::int AS mng_yyyy + FROM tb_map_sheet_mng_files + ), + src AS ( + SELECT + y.map_sheet_num, + y.mng_yyyy, + CASE + WHEN e.map_sheet_num IS NULL THEN 'N' + ELSE 'Y' + END AS yn + FROM years y + LEFT JOIN exist e + ON e.map_sheet_num = y.map_sheet_num + AND 
e.mng_yyyy = y.mng_yyyy + ) + INSERT INTO tb_map_sheet_mng_year_yn + (map_sheet_num, mng_yyyy, yn) + SELECT + map_sheet_num, + mng_yyyy, + yn + FROM src + ON CONFLICT (map_sheet_num, mng_yyyy) + DO UPDATE SET + yn = EXCLUDED.yn, + updated_dttm = now() + """; + + em.createNativeQuery(sql).executeUpdate(); + } + + /** + * 변화탐지 실행 가능 비교년도 조회 + * + * @param mngYyyy + * @param mapIds + * @return + */ + @Override + public List findByHstMapSheetCompareList(int mngYyyy, List mapIds) { + QMapSheetMngYearYnEntity y = QMapSheetMngYearYnEntity.mapSheetMngYearYnEntity; + + StringExpression mngYyyyStr = Expressions.stringTemplate("concat({0}, '')", mngYyyy); + + return queryFactory + .select( + Projections.constructor( + MngListCompareDto.class, mngYyyyStr, y.id.mapSheetNum, y.id.mngYyyy.max())) + .from(y) + .where(y.id.mapSheetNum.in(mapIds), y.yn.eq("Y"), y.id.mngYyyy.loe(mngYyyy)) + .groupBy(y.id.mapSheetNum) + .fetch(); + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/service/MapSheetMngFileJobService.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/service/MapSheetMngFileJobService.java new file mode 100755 index 0000000..f3579b1 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/service/MapSheetMngFileJobService.java @@ -0,0 +1,393 @@ +package com.kamco.cd.kamcoback.service; + +import static java.lang.String.CASE_INSENSITIVE_ORDER; + +import com.kamco.cd.kamcoback.dto.FileDto; +import com.kamco.cd.kamcoback.dto.FileDto.SrchFilesDepthDto; +import com.kamco.cd.kamcoback.dto.MapSheetMngDto.DmlReturn; +import com.kamco.cd.kamcoback.dto.MapSheetMngDto.MngFileAddReq; +import com.kamco.cd.kamcoback.dto.MapSheetMngDto.MngHstDto; +import com.kamco.cd.kamcoback.enums.CommonUseStatus; +import com.kamco.cd.kamcoback.postgres.core.MapSheetMngFileJobCoreService; +import com.kamco.cd.kamcoback.utils.FIleChecker; +import com.kamco.cd.kamcoback.utils.FIleChecker.Basic; +import com.kamco.cd.kamcoback.dto.MapSheetMngDto; 
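The native upsert in `saveFileInfo` above computes, per map sheet, a Y/N flag for every year between the sheet's first and last file year (`bounds` → `generate_series` → left join against `exist`). A runnable plain-Java mirror of that per-sheet logic (an illustration only; `YearYnDemo` and `coverage` are assumed names):

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;
import java.util.TreeSet;

public class YearYnDemo {
  // For one map sheet: every year from min to max gets "Y" if files exist for
  // that year, "N" otherwise — the CASE WHEN ... THEN 'N' ELSE 'Y' branch.
  public static Map<Integer, String> coverage(Set<Integer> yearsWithFiles) {
    int min = Collections.min(yearsWithFiles);
    int max = Collections.max(yearsWithFiles);
    Map<Integer, String> out = new LinkedHashMap<>();
    for (int y = min; y <= max; y++) {
      out.put(y, yearsWithFiles.contains(y) ? "Y" : "N");
    }
    return out;
  }

  public static void main(String[] args) {
    System.out.println(coverage(new TreeSet<>(Set.of(2020, 2022, 2023))));
    // {2020=Y, 2021=N, 2022=Y, 2023=Y}
  }
}
```

The `ON CONFLICT ... DO UPDATE` in the SQL then makes the job idempotent: re-running it refreshes `yn` and `updated_dttm` instead of failing on the composite key.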
+import java.io.File; +import java.io.IOException; +import java.nio.file.Files; +import java.nio.file.Path; +import java.nio.file.Paths; +import java.nio.file.attribute.FileTime; +import java.text.SimpleDateFormat; +import java.util.ArrayList; +import java.util.Arrays; +import java.util.Comparator; +import java.util.Date; +import java.util.List; +import java.util.Set; +import java.util.stream.Collectors; +import java.util.stream.Stream; +import lombok.RequiredArgsConstructor; +import org.apache.commons.io.FilenameUtils; +import org.springframework.beans.factory.annotation.Value; +import org.springframework.stereotype.Service; +import org.springframework.transaction.annotation.Transactional; + +@Service +@RequiredArgsConstructor +@Transactional(readOnly = true) +public class MapSheetMngFileJobService { + + private final MapSheetMngFileJobCoreService mapSheetMngFileJobCoreService; + + @Value("${file.sync-root-dir}") + private String syncRootDir; + + @Value("${file.sync-tmp-dir}") + private String syncTmpDir; + + @Value("${file.sync-file-extention}") + private String syncFileExtention; + + @Value("${file.sync-auto-exception-start-year}") + private int syncAutoExceptionStartYear; + + @Value("${file.sync-auto-exception-before-year-cnt}") + private int syncAutoExceptionBeforeYearCnt; + + public Integer checkMngFileSync() { + return mapSheetMngFileJobCoreService.findNotYetMapSheetMng(); + } + + @Transactional + public void checkMapSheetFileProcess(long targetNum, int mngSyncPageSize) { + + List mapSheetFileNotYetList = findTargetMapSheetFileList(targetNum, mngSyncPageSize); + + Long hstUid = 0L; + String syncState = ""; + String syncCheckState = ""; + String fileState = ""; + String dataState = ""; + int mngYyyy = 0; + + SrchFilesDepthDto srchDto = new SrchFilesDepthDto(); + List basicList = new ArrayList<>(); + + if (mapSheetFileNotYetList.size() >= 1) { + mngYyyy = mapSheetFileNotYetList.get(0).getMngYyyy(); + } + + for (MngHstDto item : mapSheetFileNotYetList) { + + // 
5K도엽 자동추론제외 + Long exceptCheckCnt = + this.mapSheetAutoExceptionUpdate(item.getMngYyyy(), item.getMapSheetNum()); + + // 도엽별 파일 체크 진행중으로 변경 + item.setDataState("PROCESSING"); + item.setUseInference("USE"); + if (exceptCheckCnt == 0) { + item.setUseInference("EXCEPT"); + } + mngHstDataSyncStateUpdate(item); + + // 1. MngHstDto 객체의 필드 값에 접근 + srchDto.setMaxDepth(10); + srchDto.setDirPath(item.getSyncMngPath()); + srchDto.setExtension("tif,tfw"); + srchDto.setFileNm(item.getMapSheetNum()); + + System.out.println( + "UID: " + + hstUid + + ", 상태: " + + syncState + + ", 관리경로: " + + item.getSyncMngPath() + + ", 파일명 " + + item.getMapSheetNum() + + " .tif,tfw"); + + // 도엽번호로 파일 찾기 + basicList = + FIleChecker.getFilesFromAllDepth( + srchDto.getDirPath(), + srchDto.getFileNm(), + srchDto.getExtension(), + srchDto.getMaxDepth(), + srchDto.getSortType(), + 0, + 100); + + int tfwCnt = + (int) + basicList.stream().filter(dto -> dto.getExtension().toString().equals("tfw")).count(); + + int tifCnt = + (int) + basicList.stream().filter(dto -> dto.getExtension().toString().equals("tif")).count(); + + syncState = ""; + syncCheckState = ""; + + if (tfwCnt == 0 && tifCnt == 0) { + syncState = "NOFILE"; + } + + for (Basic item2 : basicList) { + MngFileAddReq addReq = new MngFileAddReq(); + addReq.setMngYyyy(item.getMngYyyy()); + addReq.setMapSheetNum(item.getMapSheetNum()); + addReq.setRefMapSheetNum(item.getRefMapSheetNum()); + addReq.setFilePath(item2.getParentPath()); + addReq.setFileName(item2.getFileNm()); + addReq.setFileExt(item2.getExtension()); + addReq.setFileSize(item2.getFileSize()); + addReq.setHstUid(item.getHstUid()); + + fileState = "DONE"; + if ("tfw".equalsIgnoreCase(item2.getExtension())) { + if (tifCnt == 0) { + fileState = "NOTPAIR"; + syncState = fileState; + } else if (tfwCnt > 1) { + fileState = "DUPLICATE"; + syncState = fileState; + } else if (item2.getFileSize() == 0) { + fileState = "TYPEERROR"; + syncState = fileState; + } else if 
(!FIleChecker.checkTfw(item2.getFullPath())) { + fileState = "TYPEERROR"; + syncState = fileState; + } + + item.setMapSheetPath(item2.getParentPath()); + item.setSyncTfwFileName(item2.getFileNm()); + + } else if ("tif".equalsIgnoreCase(item2.getExtension())) { + if (tfwCnt == 0) { + fileState = "NOTPAIR"; + syncState = fileState; + } else if (tifCnt > 1) { + fileState = "DUPLICATE"; + syncState = fileState; + } else if (item2.getFileSize() == 0) { + fileState = "TYPEERROR"; + syncState = fileState; + } else if (!FIleChecker.cmmndGdalInfo(item2.getFullPath())) { + fileState = "TYPEERROR"; + syncState = fileState; + } + + item.setMapSheetPath(item2.getParentPath()); + item.setSyncTifFileName(item2.getFileNm()); + } + + addReq.setFileState(fileState); + DmlReturn DmlReturn = mngDataSave(addReq); + } + + // 도엽별 파일 체크 완료로 변경 + item.setDataState("DONE"); + + if (syncState.isEmpty()) { + syncState = "DONE"; + } + + item.setSyncState(syncState); + + mngHstDataSyncStateUpdate(item); + } + + // 사용할 수 있는 이전 년도 도엽 테이블 저장 + mapSheetMngFileJobCoreService.saveSheetMngYear(); + + Long notyetCnt = this.mngDataStateDoneUpdate(mngYyyy); + } + + public int checkIsNoFile(List basicList) { + if (basicList == null || basicList.size() == 0) { + return 0; + } + + return basicList.size(); + } + + public Long mngDataStateDoneUpdate(int mngYyyy) { + + Long notyetCnt = 0L; + if (mngYyyy > 0) { + notyetCnt = findByMngYyyyTargetMapSheetNotYetCount(mngYyyy); + if (notyetCnt == 0) { + mapSheetMngFileJobCoreService.mngDataState(mngYyyy, "DONE"); + } else { + mapSheetMngFileJobCoreService.mngDataState(mngYyyy, "PROCESSING"); + } + } + + return notyetCnt; + } + + public Long mapSheetAutoExceptionUpdate(int mngYyyy, String mapSheetNum) { + + // 2025년 이전 파일싱크는 무조건 이전3년이 존재하지 않으므로 자동추론제외를 진행하지 않는다.(전년도 파일이 무조건 존재하는 것으로 리턴) + // if (syncAutoExceptionStartYear > mngYyyy) { + // return 1L; + // } + + // int strtYyyy = mngYyyy - syncAutoExceptionBeforeYearCnt + 1; + int strtYyyy = 2020; + int endYyyy = 
mngYyyy; + + // 본년도+이전년도가 3개년인 도엽 확인 -> 2020년도부터 현재까지 + Long beforeCnt = + mapSheetMngFileJobCoreService.findByHstMapSheetBeforeYyyyListCount( + strtYyyy, endYyyy, mapSheetNum); + + if (beforeCnt == 0) { + System.out.println("mapSheetAutoExceptionUpdate inference == 자동추론제외"); + mapSheetMngFileJobCoreService.updateException5kMapSheet( + mapSheetNum, CommonUseStatus.AUTO_EXCEPT); + } else { + // 하나라도 있으면 USE + mapSheetMngFileJobCoreService.updateException5kMapSheet(mapSheetNum, CommonUseStatus.USE); + } + + return beforeCnt; + } + + public List findTargetMapSheetFileList(long targetNum, int pageSize) { + return mapSheetMngFileJobCoreService.findTargetMapSheetFileList(targetNum, pageSize); + } + + public Long findByMngYyyyTargetMapSheetNotYetCount(int mngYyyy) { + return mapSheetMngFileJobCoreService.findByMngYyyyTargetMapSheetNotYetCount(mngYyyy); + } + + public DmlReturn mngHstDataSyncStateUpdate(MngHstDto UpdateReq) { + return mapSheetMngFileJobCoreService.mngHstDataSyncStateUpdate(UpdateReq); + } + + public DmlReturn mngDataSave(MngFileAddReq AddReq) { + return mapSheetMngFileJobCoreService.mngFileSave(AddReq); + } + + public List getFilesDepthAll(SrchFilesDepthDto srchDto) { + + Path startPath = Paths.get(srchDto.getDirPath()); + int maxDepth = srchDto.getMaxDepth(); + String dirPath = srchDto.getDirPath(); + String targetFileNm = srchDto.getFileNm(); + String extension = srchDto.getExtension(); + String sortType = srchDto.getSortType(); + + int startPos = srchDto.getStartPos(); + int endPos = srchDto.getEndPos(); + int limit = endPos - startPos + 1; + + Set targetExtensions = createExtensionSet(extension); + + List fileDtoList = new ArrayList<>(); + SimpleDateFormat dttmFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss"); + + int fileTotCnt = 0; + long fileTotSize = 0; + + try (Stream stream = Files.walk(startPath, maxDepth)) { + + fileDtoList = + stream + .filter(Files::isRegularFile) + .filter( + p -> + extension == null + || extension.equals("") + || 
extension.equals("*") + || targetExtensions.contains(extractExtension(p))) + .sorted(getFileComparator(sortType)) + .filter(p -> p.getFileName().toString().contains(targetFileNm)) + .skip(startPos) + .limit(limit) + .map( + path -> { + int depth = path.getNameCount(); + + String fileNm = path.getFileName().toString(); + String ext = FilenameUtils.getExtension(fileNm); + String parentFolderNm = path.getParent().getFileName().toString(); + String parentPath = path.getParent().toString(); + String fullPath = path.toAbsolutePath().toString(); + + File file = new File(fullPath); + long fileSize = file.length(); + String lastModified = dttmFormat.format(new Date(file.lastModified())); + + return new FileDto.Basic( + fileNm, parentFolderNm, parentPath, fullPath, ext, fileSize, lastModified); + }) + .collect(Collectors.toList()); + + // fileTotCnt = fileDtoList.size(); + // fileTotSize = fileDtoList.stream().mapToLong(FileDto.Basic::getFileSize).sum(); + + } catch (IOException e) { + System.err.println("파일 I/O 오류 발생: " + e.getMessage()); + } + + return fileDtoList; + } + + public Set createExtensionSet(String extensionString) { + if (extensionString == null || extensionString.isBlank()) { + return Set.of(); + } + + // "java, class" -> ["java", " class"] -> [".java", ".class"] + return Arrays.stream(extensionString.split(",")) + .map(ext -> ext.trim()) + .filter(ext -> !ext.isEmpty()) + .map(ext -> "." 
+ ext.toLowerCase()) + .collect(Collectors.toSet()); + } + + public String extractExtension(Path path) { + String filename = path.getFileName().toString(); + int lastDotIndex = filename.lastIndexOf('.'); + + // 확장자가 없거나 파일명이 .으로 끝나는 경우 + if (lastDotIndex == -1 || lastDotIndex == filename.length() - 1) { + return ""; // 빈 문자열 반환 + } + + // 확장자 추출 및 소문자 변환 + return filename.substring(lastDotIndex).toLowerCase(); + } + + public Comparator getFileComparator(String sortType) { + + // 파일 이름 비교 기본 Comparator (대소문자 무시) + Comparator nameComparator = + Comparator.comparing(path -> path.getFileName().toString(), CASE_INSENSITIVE_ORDER); + + Comparator dateComparator = + Comparator.comparing( + path -> { + try { + return Files.getLastModifiedTime(path); + } catch (IOException e) { + return FileTime.fromMillis(0); + } + }); + + if ("name desc".equalsIgnoreCase(sortType)) { + return nameComparator.reversed(); + } else if ("date".equalsIgnoreCase(sortType)) { + return dateComparator; + } else if ("date desc".equalsIgnoreCase(sortType)) { + return dateComparator.reversed(); + } else { + return nameComparator; + } + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/service/NameValidator.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/service/NameValidator.java new file mode 100755 index 0000000..fe9464e --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/service/NameValidator.java @@ -0,0 +1,43 @@ +package com.kamco.cd.kamcoback.service; + +import java.util.regex.Matcher; +import java.util.regex.Pattern; + +public class NameValidator { + + private static final String HANGUL_REGEX = ".*\\p{IsHangul}.*"; + private static final Pattern HANGUL_PATTERN = Pattern.compile(HANGUL_REGEX); + + private static final String WHITESPACE_REGEX = ".*\\s.*"; + private static final Pattern WHITESPACE_PATTERN = Pattern.compile(WHITESPACE_REGEX); + + public static boolean containsKorean(String str) { + if (str == null || 
str.isEmpty()) { + return false; + } + Matcher matcher = HANGUL_PATTERN.matcher(str); + return matcher.matches(); + } + + public static boolean containsWhitespaceRegex(String str) { + if (str == null || str.isEmpty()) { + return false; + } + + Matcher matcher = WHITESPACE_PATTERN.matcher(str); + // use find() to check whether any part of the string matches the pattern + return matcher.find(); + } + + public static boolean isNullOrEmpty(String str) { + return str == null || str.isEmpty(); + } +} diff --git a/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/utils/FIleChecker.java b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/utils/FIleChecker.java new file mode 100755 index 0000000..e8cfbd9 --- /dev/null +++ b/imagery-make-dataset/src/main/java/com/kamco/cd/kamcoback/utils/FIleChecker.java @@ -0,0 +1,751 @@ +package com.kamco.cd.kamcoback.utils; + +import static java.lang.String.CASE_INSENSITIVE_ORDER; + +import com.kamco.cd.kamcoback.service.NameValidator; +import io.swagger.v3.oas.annotations.media.Schema; +import java.io.BufferedReader; +import java.io.File; +import java.io.FileInputStream; +import java.io.FileOutputStream; +import java.io.FileReader; +import java.io.IOException; +import java.io.InputStream; +import java.io.InputStreamReader; +import java.nio.file.Files; +import java.nio.file.Path; +import java.nio.file.Paths; +import java.nio.file.attribute.FileTime; +import java.security.MessageDigest; +import java.security.NoSuchAlgorithmException; +import java.text.SimpleDateFormat; +import java.util.ArrayList; +import java.util.Arrays; +import java.util.Comparator; +import java.util.Date; +import java.util.List; +import java.util.Set; +import java.util.function.Predicate; +import java.util.stream.Collectors; +import java.util.stream.Stream; +import java.util.zip.ZipEntry; +import java.util.zip.ZipInputStream; +import lombok.Getter; +import org.apache.commons.io.FilenameUtils; +import
org.geotools.coverage.grid.GridCoverage2D; +import org.geotools.gce.geotiff.GeoTiffReader; +import org.springframework.util.FileSystemUtils; +import org.springframework.web.multipart.MultipartFile; + +public class FIleChecker { + + static SimpleDateFormat dttmFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss"); + + public static boolean isValidFile(String pathStr) { + + Path path = Paths.get(pathStr); + + if (!Files.exists(path)) { + return false; + } + + if (!Files.isRegularFile(path)) { + return false; + } + + if (!Files.isReadable(path)) { + return false; + } + + try { + if (Files.size(path) <= 0) { + return false; + } + } catch (IOException e) { + return false; + } + + return true; + } + + public static boolean verifyFileIntegrity(Path path, String expectedHash) + throws IOException, NoSuchAlgorithmException { + + // 1. 알고리즘 선택 (SHA-256 권장, MD5는 보안상 비추천) + MessageDigest digest = MessageDigest.getInstance("SHA-256"); + + try (InputStream fis = Files.newInputStream(path)) { + byte[] buffer = new byte[8192]; // 8KB 버퍼 + int bytesRead; + while ((bytesRead = fis.read(buffer)) != -1) { + digest.update(buffer, 0, bytesRead); + } + } + + // 3. 계산된 바이트 배열을 16진수 문자열로 변환 + StringBuilder sb = new StringBuilder(); + for (byte b : digest.digest()) { + sb.append(String.format("%02x", b)); + } + String actualHash = sb.toString(); + + return actualHash.equalsIgnoreCase(expectedHash); + } + + public static boolean checkTfw(String filePath) { + + File file = new File(filePath); + + if (!file.exists()) { + return false; + } + + // 1. 파일의 모든 라인을 읽어옴 + List lines = new ArrayList<>(); + try (BufferedReader br = new BufferedReader(new FileReader(file))) { + String line; + while ((line = br.readLine()) != null) { + if (!line.trim().isEmpty()) { // 빈 줄 제외 + lines.add(Double.parseDouble(line.trim())); + } + } + } catch (IOException ignored) { + return false; + } + + // 2. 6줄이 맞는지 확인 + if (lines.size() < 6) { + // System.out.println("유효하지 않은 TFW 파일입니다. 
(데이터 부족)"); + return false; + } + + return true; + } + + public static boolean checkGeoTiff(String filePath) { + + File file = new File(filePath); + + if (!file.exists()) { + return false; + } + + GeoTiffReader reader = null; + try { + // 1. 파일 포맷 및 헤더 확인 + reader = new GeoTiffReader(file); + + // 2. 실제 데이터 로딩 (여기서 파일 깨짐 여부 확인됨) + // null을 넣으면 전체 영역을 읽지 않고 메타데이터 위주로 체크하여 빠름 + GridCoverage2D coverage = reader.read(null); + + if (coverage == null) return false; + + // 3. GIS 필수 정보(좌표계)가 있는지 확인 + // if (coverage.getCoordinateReferenceSystem() == null) { + // GeoTIFF가 아니라 일반 TIFF일 수도 있음(이미지는 정상이지만, 좌표계(CRS) 정보가 없습니다.) + // } + + return true; + + } catch (Exception e) { + System.err.println("손상된 TIF 파일입니다: " + e.getMessage()); + return false; + } finally { + // 리소스 해제 (필수) + if (reader != null) reader.dispose(); + } + } + + public static Boolean cmmndGdalInfo(String filePath) { + + File file = new File(filePath); + + if (!file.exists()) { + System.err.println("파일이 존재하지 않습니다: " + filePath); + return false; + } + + boolean hasDriver = false; + + // 운영체제 감지 + String osName = System.getProperty("os.name").toLowerCase(); + boolean isWindows = osName.contains("win"); + boolean isMac = osName.contains("mac"); + boolean isUnix = osName.contains("nix") || osName.contains("nux") || osName.contains("aix"); + + // gdalinfo 경로 찾기 (일반적인 설치 경로 우선 확인) + String gdalinfoPath = findGdalinfoPath(); + if (gdalinfoPath == null) { + System.err.println("gdalinfo 명령어를 찾을 수 없습니다. 
GDAL이 설치되어 있는지 확인하세요."); + System.err.println("macOS: brew install gdal"); + System.err.println("Ubuntu/Debian: sudo apt-get install gdal-bin"); + System.err.println("CentOS/RHEL: sudo yum install gdal"); + return false; + } + + List command = new ArrayList<>(); + + if (isWindows) { + // 윈도우용 + command.add("cmd.exe"); // 윈도우 명령 프롬프트 실행 + command.add("/c"); // 명령어를 수행하고 종료한다는 옵션 + command.add("gdalinfo"); + command.add(filePath); + command.add("|"); + command.add("findstr"); + command.add("/i"); + command.add("Geo"); + } else if (isMac || isUnix) { + // 리눅스, 맥용 + command.add("sh"); + command.add("-c"); + command.add(gdalinfoPath + " \"" + filePath + "\" | grep -i Geo"); + } else { + System.err.println("지원하지 않는 운영체제: " + osName); + return false; + } + + ProcessBuilder processBuilder = new ProcessBuilder(command); + processBuilder.redirectErrorStream(true); + + Process process = null; + BufferedReader reader = null; + try { + System.out.println("gdalinfo 명령어 실행 시작: " + filePath); + process = processBuilder.start(); + + reader = new BufferedReader(new InputStreamReader(process.getInputStream())); + + String line; + while ((line = reader.readLine()) != null) { + // System.out.println("gdalinfo 출력: " + line); + if (line.contains("Driver: GTiff/GeoTIFF")) { + hasDriver = true; + break; + } + } + + int exitCode = process.waitFor(); + System.out.println("gdalinfo 종료 코드: " + exitCode); + + // 프로세스가 정상 종료되지 않았고 Driver를 찾지 못한 경우 + if (exitCode != 0 && !hasDriver) { + System.err.println("gdalinfo 명령 실행 실패. 
Exit code: " + exitCode); + } + + } catch (IOException e) { + System.err.println("gdalinfo 실행 중 I/O 오류 발생: " + e.getMessage()); + e.printStackTrace(); + return false; + } catch (InterruptedException e) { + System.err.println("gdalinfo 실행 중 인터럽트 발생: " + e.getMessage()); + Thread.currentThread().interrupt(); + return false; + } catch (Exception e) { + System.err.println("gdalinfo 실행 중 예상치 못한 오류 발생: " + e.getMessage()); + e.printStackTrace(); + return false; + } finally { + // 리소스 정리 + if (reader != null) { + try { + reader.close(); + } catch (IOException e) { + System.err.println("BufferedReader 종료 중 오류: " + e.getMessage()); + } + } + if (process != null) { + process.destroy(); + } + } + + return hasDriver; + } + + public static boolean mkDir(String dirPath) { + Path uploadTargetPath = Paths.get(dirPath); + try { + Files.createDirectories(uploadTargetPath); + } catch (IOException e) { + return false; + } + + return true; + } + + public static List getFolderAll(String dirPath, String sortType, int maxDepth) { + + Path startPath = Paths.get(dirPath); + + List folderList = List.of(); + + try (Stream stream = Files.walk(startPath, maxDepth)) { + + folderList = + stream + .filter(Files::isDirectory) + .filter(p -> !p.toString().equals(dirPath)) + .map( + path -> { + int depth = path.getNameCount(); + + String folderNm = path.getFileName().toString(); + String parentFolderNm = path.getParent().getFileName().toString(); + String parentPath = path.getParent().toString(); + String fullPath = path.toAbsolutePath().toString(); + + boolean isValid = + !NameValidator.containsKorean(folderNm) + && !NameValidator.containsWhitespaceRegex(folderNm); + + File file = new File(fullPath); + int childCnt = getChildFolderCount(file); + String lastModified = getLastModified(file); + + return new Folder( + folderNm, + parentFolderNm, + parentPath, + fullPath, + depth, + childCnt, + lastModified, + isValid); + }) + .collect(Collectors.toList()); + + if (sortType.equals("name") || 
sortType.equals("name asc")) { + folderList.sort( + Comparator.comparing( + Folder::getFolderNm, CASE_INSENSITIVE_ORDER // 대소문자 구분 없이 + )); + } else if (sortType.equals("name desc")) { + folderList.sort( + Comparator.comparing( + Folder::getFolderNm, CASE_INSENSITIVE_ORDER // 대소문자 구분 없이 + ) + .reversed()); + } else if (sortType.equals("dttm desc")) { + folderList.sort( + Comparator.comparing( + Folder::getLastModified, CASE_INSENSITIVE_ORDER // 대소문자 구분 없이 + ) + .reversed()); + } else { + folderList.sort( + Comparator.comparing( + Folder::getLastModified, CASE_INSENSITIVE_ORDER // 대소문자 구분 없이 + )); + } + + } catch (IOException e) { + throw new RuntimeException(e); + } + + return folderList; + } + + public static List getFolderAll(String dirPath) { + return getFolderAll(dirPath, "name", 1); + } + + public static List getFolderAll(String dirPath, String sortType) { + return getFolderAll(dirPath, sortType, 1); + } + + public static int getChildFolderCount(String dirPath) { + File directory = new File(dirPath); + File[] childFolders = directory.listFiles(File::isDirectory); + + int childCnt = 0; + if (childFolders != null) { + childCnt = childFolders.length; + } + + return childCnt; + } + + public static int getChildFolderCount(File directory) { + File[] childFolders = directory.listFiles(File::isDirectory); + + int childCnt = 0; + if (childFolders != null) { + childCnt = childFolders.length; + } + + return childCnt; + } + + public static String getLastModified(String dirPath) { + File file = new File(dirPath); + return dttmFormat.format(new Date(file.lastModified())); + } + + public static String getLastModified(File file) { + return dttmFormat.format(new Date(file.lastModified())); + } + + public static List getFilesFromAllDepth( + String dir, + String targetFileNm, + String extension, + int maxDepth, + String sortType, + int startPos, + int endPos) { + + Path startPath = Paths.get(dir); + String dirPath = dir; + + int limit = endPos - startPos + 1; + + Set 
targetExtensions = createExtensionSet(extension); + + List fileList = new ArrayList<>(); + SimpleDateFormat dttmFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss"); + + Predicate isTargetName = + p -> { + if (targetFileNm == null + || targetFileNm.trim().isEmpty() + || targetFileNm.trim().equals("*")) { + return true; // 전체 파일 허용 + } + return p.getFileName().toString().contains(targetFileNm); + }; + + try (Stream stream = Files.walk(startPath, maxDepth)) { + + fileList = + stream + .filter(Files::isRegularFile) + .filter( + p -> + extension == null + || extension.equals("") + || extension.equals("*") + || targetExtensions.contains(extractExtension(p))) + .sorted(getFileComparator(sortType)) + .filter(isTargetName) + .skip(startPos) + .limit(limit) + .map( + path -> { + // int depth = path.getNameCount(); + + String fileNm = path.getFileName().toString(); + String ext = FilenameUtils.getExtension(fileNm); + String parentFolderNm = path.getParent().getFileName().toString(); + String parentPath = path.getParent().toString(); + String fullPath = path.toAbsolutePath().toString(); + + File file = new File(fullPath); + long fileSize = file.length(); + String lastModified = dttmFormat.format(new Date(file.lastModified())); + + return new Basic( + fileNm, parentFolderNm, parentPath, fullPath, ext, fileSize, lastModified); + }) + .collect(Collectors.toList()); + + } catch (IOException e) { + System.err.println("파일 I/O 오류 발생: " + e.getMessage()); + } + + return fileList; + } + + public static List getFilesFromAllDepth( + String dir, String targetFileNm, String extension) { + + return FIleChecker.getFilesFromAllDepth(dir, targetFileNm, extension, 100, "name", 0, 100); + } + + public static int getFileCountFromAllDepth(String dir, String targetFileNm, String extension) { + + List basicList = FIleChecker.getFilesFromAllDepth(dir, targetFileNm, extension); + + return (int) + basicList.stream().filter(dto -> dto.getExtension().toString().equals(extension)).count(); + } + + public 
static Long getFileTotSize(List<Basic> files) { + if (files == null || files.isEmpty()) { + return 0L; + } + return files.stream().mapToLong(Basic::getFileSize).sum(); + } + + public static boolean multipartSaveTo(MultipartFile mfile, String targetPath) { + Path tmpSavePath = Paths.get(targetPath); + try { + mfile.transferTo(tmpSavePath); + } catch (IOException e) { + return false; + } + + return true; + } + + public static boolean multipartChunkSaveTo( + MultipartFile mfile, String targetPath, int chunkIndex) { + File dest = new File(targetPath, String.valueOf(chunkIndex)); + try { + mfile.transferTo(dest); + } catch (IOException e) { + return false; + } + + return true; + } + + public static boolean deleteFolder(String path) { + return FileSystemUtils.deleteRecursively(new File(path)); + } + + public static boolean deleteFile(String filePath) { + Path path = Paths.get(filePath); + try { + return Files.deleteIfExists(path); + } catch (IOException e) { + return false; + } + } + + public static boolean validationMultipart(MultipartFile mfile) { + // validate the uploaded part: reject null, empty, or zero-byte files + if (mfile == null || mfile.isEmpty() || mfile.getSize() == 0) { + return false; + } + + return true; + } + + public static void unzip(String fileName, String destDirectory) throws IOException { + File destDir = new File(destDirectory); + if (!destDir.exists()) { + destDir.mkdirs(); // create the target folder if it does not exist + } + + String zipFilePath = destDirectory + "/" + fileName; + + try (ZipInputStream zis = new ZipInputStream(new FileInputStream(zipFilePath))) { + ZipEntry zipEntry = zis.getNextEntry(); + + while (zipEntry != null) { + File newFile = newFile(destDir, zipEntry); + + if (zipEntry.isDirectory()) { + if (!newFile.isDirectory() && !newFile.mkdirs()) { + throw new IOException("Failed to create directory: " + newFile); + } + + } else { + + // create missing parent directories + File parent = newFile.getParentFile();
+          // Zip-slip note: newFile() (defined below) canonicalizes each entry path against destDir and rejects entries that would escape it, so newFile here is already confined to the extraction root.
+ if (!parent.exists() && !parent.mkdirs()) { + throw new IOException("상위 디렉토리 생성 실패: " + parent); + } + + // 실제 파일 쓰기 + try (FileOutputStream fos = new FileOutputStream(newFile)) { + byte[] buffer = new byte[1024]; + int len; + while ((len = zis.read(buffer)) > 0) { + fos.write(buffer, 0, len); + } + } + } + zipEntry = zis.getNextEntry(); + } + zis.closeEntry(); + } + } + + public static File newFile(File destinationDir, ZipEntry zipEntry) throws IOException { + File destFile = new File(destinationDir, zipEntry.getName()); + + String destDirPath = destinationDir.getCanonicalPath(); + String destFilePath = destFile.getCanonicalPath(); + + if (!destFilePath.startsWith(destDirPath + File.separator)) { + throw new IOException("엔트리가 대상 디렉토리를 벗어남: " + zipEntry.getName()); + } + + return destFile; + } + + public static boolean checkExtensions(String fileName, String ext) { + if (fileName == null) return false; + + if (!fileName.substring(fileName.lastIndexOf('.') + 1).toLowerCase().equals(ext)) { + return false; + } + + return true; + } + + public static Set createExtensionSet(String extensionString) { + if (extensionString == null || extensionString.isBlank()) { + return Set.of(); + } + + // "java, class" -> ["java", " class"] -> [".java", ".class"] + return Arrays.stream(extensionString.split(",")) + .map(ext -> ext.trim()) + .filter(ext -> !ext.isEmpty()) + .map(ext -> "." 
+ ext.toLowerCase()) + .collect(Collectors.toSet()); + } + + public static String extractExtension(Path path) { + String filename = path.getFileName().toString(); + int lastDotIndex = filename.lastIndexOf('.'); + + // 확장자가 없거나 파일명이 .으로 끝나는 경우 + if (lastDotIndex == -1 || lastDotIndex == filename.length() - 1) { + return ""; // 빈 문자열 반환 + } + + // 확장자 추출 및 소문자 변환 + return filename.substring(lastDotIndex).toLowerCase(); + } + + public static Comparator getFileComparator(String sortType) { + + // 파일 이름 비교 기본 Comparator (대소문자 무시) + Comparator nameComparator = + Comparator.comparing(path -> path.getFileName().toString(), CASE_INSENSITIVE_ORDER); + + Comparator dateComparator = + Comparator.comparing( + path -> { + try { + return Files.getLastModifiedTime(path); + } catch (IOException e) { + return FileTime.fromMillis(0); + } + }); + + if ("name desc".equalsIgnoreCase(sortType)) { + return nameComparator.reversed(); + } else if ("date".equalsIgnoreCase(sortType)) { + return dateComparator; + } else if ("date desc".equalsIgnoreCase(sortType)) { + return dateComparator.reversed(); + } else { + return nameComparator; + } + } + + private static String findGdalinfoPath() { + // 일반적인 설치 경로 확인 + String[] possiblePaths = { + "/usr/local/bin/gdalinfo", // Homebrew (macOS) + "/opt/homebrew/bin/gdalinfo", // Homebrew (Apple Silicon macOS) + "/usr/bin/gdalinfo", // Linux + "gdalinfo" // PATH에 있는 경우 + }; + + for (String path : possiblePaths) { + if (isCommandAvailable(path)) { + return path; + } + } + + return null; + } + + private static boolean isCommandAvailable(String command) { + try { + ProcessBuilder pb = new ProcessBuilder(command, "--version"); + pb.redirectErrorStream(true); + Process process = pb.start(); + + // 프로세스 완료 대기 (최대 5초) + boolean finished = process.waitFor(5, java.util.concurrent.TimeUnit.SECONDS); + + if (!finished) { + process.destroy(); + return false; + } + + // 종료 코드가 0이면 정상 (일부 명령어는 --version에서 다른 코드 반환할 수 있음) + return process.exitValue() == 0 || 
process.exitValue() == 1; + } catch (Exception e) { + return false; + } + } + + @Schema(name = "Folder", description = "폴더 정보") + @Getter + public static class Folder { + private final String folderNm; + private final String parentFolderNm; + private final String parentPath; + private final String fullPath; + private final int depth; + private final long childCnt; + private final String lastModified; + private final Boolean isValid; + + public Folder( + String folderNm, + String parentFolderNm, + String parentPath, + String fullPath, + int depth, + long childCnt, + String lastModified, + Boolean isValid) { + this.folderNm = folderNm; + this.parentFolderNm = parentFolderNm; + this.parentPath = parentPath; + this.fullPath = fullPath; + this.depth = depth; + this.childCnt = childCnt; + this.lastModified = lastModified; + this.isValid = isValid; + } + } + + @Schema(name = "File Basic", description = "파일 기본 정보") + @Getter + public static class Basic { + + private final String fileNm; + private final String parentFolderNm; + private final String parentPath; + private final String fullPath; + private final String extension; + private final long fileSize; + private final String lastModified; + + public Basic( + String fileNm, + String parentFolderNm, + String parentPath, + String fullPath, + String extension, + long fileSize, + String lastModified) { + this.fileNm = fileNm; + this.parentFolderNm = parentFolderNm; + this.parentPath = parentPath; + this.fullPath = fullPath; + this.extension = extension; + this.fileSize = fileSize; + this.lastModified = lastModified; + } + } +} diff --git a/imagery-make-dataset/src/main/resources/application.yml b/imagery-make-dataset/src/main/resources/application.yml new file mode 100755 index 0000000..5d1aadc --- /dev/null +++ b/imagery-make-dataset/src/main/resources/application.yml @@ -0,0 +1,4 @@ +server: + port: 9080 + + diff --git a/imagery-make-dataset/src/main/resources/application_dev.yml 
b/imagery-make-dataset/src/main/resources/application_dev.yml new file mode 100755 index 0000000..ab6930c --- /dev/null +++ b/imagery-make-dataset/src/main/resources/application_dev.yml @@ -0,0 +1,67 @@ +server: + port: 9080 + +spring: + application: + name: imagery-make-dataset + profiles: + active: dev # 사용할 프로파일 지정 (ex. dev, prod, test) + + datasource: + url: jdbc:postgresql://192.168.2.127:15432/kamco_cds + #url: jdbc:postgresql://localhost:5432/kamco_cds + username: kamco_cds + password: kamco_cds_Q!W@E#R$ + hikari: + minimum-idle: 1 + maximum-pool-size: 5 + + jpa: + hibernate: + ddl-auto: update # 테이블이 없으면 생성, 있으면 업데이트 + properties: + hibernate: + jdbc: + batch_size: 50 + default_batch_fetch_size: 100 +logging: + level: + root: INFO + org.springframework.web: DEBUG + org.springframework.security: DEBUG + + # 헬스체크 노이즈 핵심만 다운 + org.springframework.security.web.FilterChainProxy: INFO + org.springframework.security.web.authentication.AnonymousAuthenticationFilter: INFO + org.springframework.security.web.authentication.Http403ForbiddenEntryPoint: INFO + org.springframework.web.servlet.DispatcherServlet: INFO +# actuator +management: + health: + readinessstate: + enabled: true + livenessstate: + enabled: true + endpoint: + health: + probes: + enabled: true + show-details: always + endpoints: + jmx: + exposure: + exclude: "*" + web: + base-path: /monitor + exposure: + include: + - "health" + +file: + #sync-root-dir: D:/kamco-nfs/images/ + sync-root-dir: /kamco-nfs/images/ + sync-tmp-dir: ${file.sync-root-dir}/tmp + sync-file-extention: tfw,tif + sync-auto-exception-start-year: 2025 + sync-auto-exception-before-year-cnt: 3 + diff --git a/imagery-make-dataset/src/main/resources/application_local.yml b/imagery-make-dataset/src/main/resources/application_local.yml new file mode 100755 index 0000000..328045e --- /dev/null +++ b/imagery-make-dataset/src/main/resources/application_local.yml @@ -0,0 +1,67 @@ +server: + port: 9080 + +spring: + application: + name: 
imagery-make-dataset + profiles: + active: local # 사용할 프로파일 지정 (ex. dev, prod, test) + + datasource: + url: jdbc:postgresql://192.168.2.127:15432/kamco_cds + #url: jdbc:postgresql://localhost:5432/kamco_cds + username: kamco_cds + password: kamco_cds_Q!W@E#R$ + hikari: + minimum-idle: 1 + maximum-pool-size: 5 + + jpa: + hibernate: + ddl-auto: update # 테이블이 없으면 생성, 있으면 업데이트 + properties: + hibernate: + jdbc: + batch_size: 50 + default_batch_fetch_size: 100 +logging: + level: + root: INFO + org.springframework.web: DEBUG + org.springframework.security: DEBUG + + # 헬스체크 노이즈 핵심만 다운 + org.springframework.security.web.FilterChainProxy: INFO + org.springframework.security.web.authentication.AnonymousAuthenticationFilter: INFO + org.springframework.security.web.authentication.Http403ForbiddenEntryPoint: INFO + org.springframework.web.servlet.DispatcherServlet: INFO +# actuator +management: + health: + readinessstate: + enabled: true + livenessstate: + enabled: true + endpoint: + health: + probes: + enabled: true + show-details: always + endpoints: + jmx: + exposure: + exclude: "*" + web: + base-path: /monitor + exposure: + include: + - "health" + +file: + #sync-root-dir: D:/kamco-nfs/images/ + sync-root-dir: /kamco-nfs/images/ + sync-tmp-dir: ${file.sync-root-dir}/tmp + sync-file-extention: tfw,tif + sync-auto-exception-start-year: 2025 + sync-auto-exception-before-year-cnt: 3 + diff --git a/imagery-make-dataset/src/main/resources/application_prod.yml b/imagery-make-dataset/src/main/resources/application_prod.yml new file mode 100755 index 0000000..1282d26 --- /dev/null +++ b/imagery-make-dataset/src/main/resources/application_prod.yml @@ -0,0 +1,67 @@ +server: + port: 9080 + +spring: + application: + name: imagery-make-dataset + profiles: + active: prod # 사용할 프로파일 지정 (ex. 
dev, prod, test) + + datasource: + url: jdbc:postgresql://192.168.2.127:15432/kamco_cds + #url: jdbc:postgresql://localhost:5432/kamco_cds + username: kamco_cds + password: kamco_cds_Q!W@E#R$ + hikari: + minimum-idle: 1 + maximum-pool-size: 5 + + jpa: + hibernate: + ddl-auto: update # 테이블이 없으면 생성, 있으면 업데이트 + properties: + hibernate: + jdbc: + batch_size: 50 + default_batch_fetch_size: 100 +logging: + level: + root: INFO + org.springframework.web: DEBUG + org.springframework.security: DEBUG + + # 헬스체크 노이즈 핵심만 다운 + org.springframework.security.web.FilterChainProxy: INFO + org.springframework.security.web.authentication.AnonymousAuthenticationFilter: INFO + org.springframework.security.web.authentication.Http403ForbiddenEntryPoint: INFO + org.springframework.web.servlet.DispatcherServlet: INFO +# actuator +management: + health: + readinessstate: + enabled: true + livenessstate: + enabled: true + endpoint: + health: + probes: + enabled: true + show-details: always + endpoints: + jmx: + exposure: + exclude: "*" + web: + base-path: /monitor + exposure: + include: + - "health" + +file: + #sync-root-dir: D:/kamco-nfs/images/ + sync-root-dir: /kamco-nfs/images/ + sync-tmp-dir: ${file.sync-root-dir}/tmp + sync-file-extention: tfw,tif + sync-auto-exception-start-year: 2025 + sync-auto-exception-before-year-cnt: 3 + diff --git a/imagery-make-dataset/src/main/resources/static/chunk_upload_test.html b/imagery-make-dataset/src/main/resources/static/chunk_upload_test.html new file mode 100755 index 0000000..2c331d4 --- /dev/null +++ b/imagery-make-dataset/src/main/resources/static/chunk_upload_test.html @@ -0,0 +1,137 @@ + + + + + Chunk Upload Test + + +

+Large-file chunk upload test
+
+* Chunk test size: 10 MB (10 * 1024 * 1024) - adjustable depending on performance
+
+* Select the upload API
+
+* Attach a file
+
+* On upload, a UUID is generated and passed along to track the upload history (used when merging the file) (see the script example)
+UUID :
+
+* Must be extracted from the file info and assigned automatically when calling the API (see the script example)
+chunkIndex :
+chunkTotalIndex :
+
+* Must be extracted from the file info and assigned automatically when calling the API (see the script example)
+fileSize :
+
+* Progress (%)
+
+* Result message
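The flow the test page describes (10 MB chunks, one UUID per upload reused at merge time, and `chunkIndex`/`chunkTotalIndex`/`fileSize` computed from the file) can be sketched in shell. This is a minimal sketch: the demo file, its size, and the commented-out endpoint URL are assumptions for illustration, not taken from the page.

```shell
#!/bin/sh
# Sketch of the chunked-upload flow from the test page above.
# The actual upload call is commented out and its URL is hypothetical;
# the chunk math follows the page (10 * 1024 * 1024 byte chunks).

CHUNK_SIZE=$((10 * 1024 * 1024))   # 10 MB, as stated on the page

# Demo input: a synthetic 25 MB file, so 3 chunks are expected (10 + 10 + 5 MB).
FILE=sample.bin
dd if=/dev/zero of="$FILE" bs=1048576 count=25 2>/dev/null

FILE_SIZE=$(( $(wc -c < "$FILE") ))                 # portable on macOS and Linux
CHUNK_TOTAL=$(( (FILE_SIZE + CHUNK_SIZE - 1) / CHUNK_SIZE ))

# One UUID per upload, passed with every chunk and reused when merging.
UPLOAD_ID=$(cat /proc/sys/kernel/random/uuid 2>/dev/null || uuidgen)

split -b "$CHUNK_SIZE" "$FILE" chunk_
i=0
for part in chunk_*; do
  # Hypothetical call; the real API is chosen in the page's API selector:
  # curl -F "file=@$part" -F "uuid=$UPLOAD_ID" \
  #      -F "chunkIndex=$i" -F "chunkTotalIndex=$CHUNK_TOTAL" \
  #      -F "fileSize=$FILE_SIZE" http://localhost:9080/api/upload/chunk
  i=$((i + 1))
done
echo "prepared $i of $CHUNK_TOTAL chunks for upload id $UPLOAD_ID"
```

Each chunk carries the same `uuid` and `fileSize`, so the server can verify completeness before merging the parts.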
+ + + + diff --git a/imagery-make-dataset/unpack_and_offline_build_airgap_macos.sh b/imagery-make-dataset/unpack_and_offline_build_airgap_macos.sh new file mode 100755 index 0000000..4ab128e --- /dev/null +++ b/imagery-make-dataset/unpack_and_offline_build_airgap_macos.sh @@ -0,0 +1,359 @@ +#!/bin/bash +# unpack_and_offline_build_airgap_macos.sh +# ============================================================================ +# Execution Environment: OFFLINE (Air-gapped, No Internet) +# Purpose: Extract bundle and run offline build +# ============================================================================ +# macOS Bash Script +# Version: 3.1 +# +# IMPORTANT: This script automatically: +# 1. Extracts the archive +# 2. Sets GRADLE_USER_HOME to project local cache +# 3. Configures settings.gradle for offline resolution +# 4. Runs build with --offline flag +# ============================================================================ + +set -e + +# ============================================================================ +# Configuration +# ============================================================================ +WRAPPER_SEED_PATH="wrapper_jar_seed" +OFFLINE_HOME_NAME="_offline_gradle_home" + +# Color codes +RED='\033[0;31m' +GREEN='\033[0;32m' +YELLOW='\033[1;33m' +CYAN='\033[0;36m' +GRAY='\033[0;90m' +WHITE='\033[1;37m' +NC='\033[0m' # No Color + +echo "" +echo -e "${CYAN}============================================================${NC}" +echo -e "${CYAN} Gradle Offline Build Runner (macOS)${NC}" +echo -e "${CYAN} Environment: AIR-GAPPED (No Internet)${NC}" +echo -e "${CYAN} Mode: Fully Offline (--offline enforced)${NC}" +echo -e "${CYAN}============================================================${NC}" +echo "" + +# ============================================================================ +# [1/16] Check Current Directory +# ============================================================================ +echo -e "${YELLOW}==[1/16] Check Current Directory 
==${NC}" +START_DIR="$(pwd)" +echo "PWD: $START_DIR" +echo "" + +# ============================================================================ +# [2/16] Select Archive +# ============================================================================ +echo -e "${YELLOW}==[2/16] Select Archive ==${NC}" + +ARCHIVE="" +if [ $# -ge 1 ]; then + ARCHIVE="$1" +fi + +if [ -z "$ARCHIVE" ]; then + # Auto-detect most recent .tar.gz file (macOS compatible) + ARCHIVE=$(find "$START_DIR" -maxdepth 1 -type f \( -name "*.tar.gz" -o -name "*.tgz" \) -exec stat -f "%m %N" {} \; 2>/dev/null | sort -rn | head -1 | cut -d' ' -f2-) + + if [ -z "$ARCHIVE" ]; then + echo -e "${RED}[ERROR] No archive found${NC}" + ls -lh "$START_DIR" + exit 1 + fi + + echo -e "${CYAN}[AUTO] $(basename "$ARCHIVE")${NC}" +else + if [ ! -f "$ARCHIVE" ]; then + ARCHIVE="$START_DIR/$ARCHIVE" + fi + echo -e "${CYAN}[USER] $(basename "$ARCHIVE")${NC}" +fi + +if [ ! -f "$ARCHIVE" ]; then + echo -e "${RED}ERROR: Archive not found: $ARCHIVE${NC}" + exit 1 +fi + +# macOS stat command +ARCHIVE_SIZE=$(stat -f%z "$ARCHIVE" 2>/dev/null) +ARCHIVE_SIZE_MB=$(awk "BEGIN {printf \"%.2f\", $ARCHIVE_SIZE / 1048576}") +echo "Size: ${ARCHIVE_SIZE_MB} MB" +echo "" + +# ============================================================================ +# [3/16] Check tar +# ============================================================================ +echo -e "${YELLOW}==[3/16] Check tar ==${NC}" + +if ! command -v tar &>/dev/null; then + echo -e "${RED}ERROR: tar not found${NC}" + exit 1 +fi +echo -e "${GREEN}[OK] tar found${NC}" +echo "" + +# ============================================================================ +# [4/16] Extract Archive +# ============================================================================ +echo -e "${YELLOW}==[4/16] Extract Archive ==${NC}" +echo -e "${GRAY}[INFO] Extracting...${NC}" + +tar -xzf "$ARCHIVE" -C "$START_DIR" +if [ $? 
-ne 0 ]; then + echo -e "${RED}ERROR: Extraction failed${NC}" + exit 1 +fi +echo -e "${GREEN}[OK] Extracted${NC}" +echo "" + +# ============================================================================ +# [5/16] Set Permissions +# ============================================================================ +echo -e "${YELLOW}==[5/16] Set Permissions ==${NC}" + +chmod -R u+rw "$START_DIR" 2>/dev/null || true +# Remove extended attributes that macOS may add +xattr -cr "$START_DIR" 2>/dev/null || true +echo -e "${GREEN}[OK] Permissions set${NC}" +echo "" + +# ============================================================================ +# [6/16] Find Project Root +# ============================================================================ +echo -e "${YELLOW}==[6/16] Find Project Root ==${NC}" + +GRADLEW=$(find "$START_DIR" -name "gradlew" -type f 2>/dev/null | sort | head -1) +if [ -z "$GRADLEW" ]; then + echo -e "${RED}ERROR: gradlew not found${NC}" + exit 1 +fi + +PROJECT_DIR=$(dirname "$GRADLEW") +echo -e "${CYAN}Project: $PROJECT_DIR${NC}" +cd "$PROJECT_DIR" +echo "" + +# ============================================================================ +# [7/16] Fix Permissions +# ============================================================================ +echo -e "${YELLOW}==[7/16] Fix Permissions ==${NC}" + +chmod +x ./gradlew +find . -name "*.sh" -type f -exec chmod +x {} \; 2>/dev/null || true +# Remove quarantine attributes that macOS adds to downloaded files +xattr -d com.apple.quarantine ./gradlew 2>/dev/null || true +find . 
-name "*.jar" -exec xattr -d com.apple.quarantine {} \; 2>/dev/null || true +echo -e "${GREEN}[OK] Permissions fixed${NC}" +echo "" + +# ============================================================================ +# [8/16] Verify Wrapper +# ============================================================================ +echo -e "${YELLOW}==[8/16] Verify Wrapper ==${NC}" + +WRAPPER_DIR="$PROJECT_DIR/gradle/wrapper" +WRAPPER_JAR="$WRAPPER_DIR/gradle-wrapper.jar" +WRAPPER_PROP="$WRAPPER_DIR/gradle-wrapper.properties" + +if [ ! -f "$WRAPPER_PROP" ]; then + echo -e "${RED}ERROR: gradle-wrapper.properties missing${NC}" + exit 1 +fi + +if [ ! -f "$WRAPPER_JAR" ]; then + SEED_JAR="$PROJECT_DIR/$WRAPPER_SEED_PATH/gradle-wrapper.jar" + if [ -f "$SEED_JAR" ]; then + mkdir -p "$WRAPPER_DIR" + cp "$SEED_JAR" "$WRAPPER_JAR" + echo -e "${GREEN}[OK] Injected from seed${NC}" + else + echo -e "${RED}ERROR: wrapper jar missing${NC}" + exit 1 + fi +else + echo -e "${GREEN}[OK] Wrapper verified${NC}" +fi +echo "" + +# ============================================================================ +# [9/16] Set GRADLE_USER_HOME +# ============================================================================ +echo -e "${YELLOW}==[9/16] Set GRADLE_USER_HOME ==${NC}" + +OFFLINE_HOME="$PROJECT_DIR/$OFFLINE_HOME_NAME" +if [ ! 
-d "$OFFLINE_HOME" ]; then + echo -e "${RED}ERROR: _offline_gradle_home not found in archive${NC}" + exit 1 +fi + +export GRADLE_USER_HOME="$OFFLINE_HOME" +echo -e "${CYAN}GRADLE_USER_HOME = $GRADLE_USER_HOME${NC}" + +# Check cache +CACHES_DIR="$OFFLINE_HOME/caches" +if [ -d "$CACHES_DIR" ]; then + # macOS du command + if du -k "$CACHES_DIR" &>/dev/null; then + CACHE_SIZE=$(du -sk "$CACHES_DIR" 2>/dev/null | cut -f1) + CACHE_SIZE=$((CACHE_SIZE * 1024)) + else + CACHE_SIZE=0 + fi + CACHE_SIZE_MB=$(awk "BEGIN {printf \"%.2f\", $CACHE_SIZE / 1048576}") + echo -e "${CYAN}[INFO] Cache size: ${CACHE_SIZE_MB} MB${NC}" +else + echo -e "${YELLOW}[WARN] No cache folder found${NC}" +fi +echo "" + +# ============================================================================ +# [10/16] Verify settings.gradle +# ============================================================================ +echo -e "${YELLOW}==[10/16] Verify settings.gradle ==${NC}" + +SETTINGS_FILE="" +if [ -f "./settings.gradle" ]; then + SETTINGS_FILE="settings.gradle" +elif [ -f "./settings.gradle.kts" ]; then + SETTINGS_FILE="settings.gradle.kts" +fi + +if [ -n "$SETTINGS_FILE" ]; then + if grep -q "mavenLocal()" "$SETTINGS_FILE" && grep -q "pluginManagement" "$SETTINGS_FILE"; then + echo -e "${GREEN}[OK] settings.gradle configured for offline${NC}" + else + echo -e "${YELLOW}[WARN] settings.gradle may not be configured for offline${NC}" + echo -e "${GRAY}[INFO] Build may fail if plugins not cached${NC}" + fi +fi +echo "" + +# ============================================================================ +# [11/16] Test Gradle +# ============================================================================ +echo -e "${YELLOW}==[11/16] Test Gradle ==${NC}" + +GRADLE_WORKS=false +if ./gradlew --offline --version &>/dev/null; then + GRADLE_WORKS=true + echo -e "${GREEN}[OK] Gradle working in offline mode${NC}" +else + echo -e "${YELLOW}[WARN] Gradle --version failed${NC}" +fi +echo "" + +# 
============================================================================ +# [12/16] Stop Daemon +# ============================================================================ +echo -e "${YELLOW}==[12/16] Stop Daemon ==${NC}" + +./gradlew --stop &>/dev/null || true +sleep 2 +echo -e "${GREEN}[OK] Daemon stopped${NC}" +echo "" + +# ============================================================================ +# [13/16] Run Offline Build +# ============================================================================ +echo -e "${YELLOW}==[13/16] Run Offline Build ==${NC}" +echo "" +echo -e "${CYAN}============================================================${NC}" +echo -e "${CYAN} Building with --offline flag${NC}" +echo -e "${CYAN} All dependencies from local cache${NC}" +echo -e "${CYAN}============================================================${NC}" +echo "" + +BUILD_SUCCESS=false +BUILD_TASK="" + +# Try bootJar +echo -e "${GRAY}[TRY] --offline bootJar...${NC}" +if ./gradlew --offline clean bootJar --no-daemon; then + BUILD_SUCCESS=true + BUILD_TASK="bootJar" +fi + +# Try jar +if [ "$BUILD_SUCCESS" = false ]; then + echo -e "${GRAY}[TRY] --offline jar...${NC}" + if ./gradlew --offline clean jar --no-daemon; then + BUILD_SUCCESS=true + BUILD_TASK="jar" + fi +fi + +# Try build +if [ "$BUILD_SUCCESS" = false ]; then + echo -e "${GRAY}[TRY] --offline build...${NC}" + if ./gradlew --offline build --no-daemon; then + BUILD_SUCCESS=true + BUILD_TASK="build" + fi +fi + +echo "" +if [ "$BUILD_SUCCESS" = true ]; then + echo -e "${GREEN}============================================================${NC}" + echo -e "${GREEN} BUILD SUCCESS! 
(task: $BUILD_TASK)${NC}" + echo -e "${GREEN}============================================================${NC}" +else + echo -e "${RED}============================================================${NC}" + echo -e "${RED} BUILD FAILED!${NC}" + echo -e "${RED}============================================================${NC}" + echo "" + echo -e "${YELLOW}Possible causes:${NC}" + echo -e "${WHITE} - Dependencies not in cache${NC}" + echo -e "${WHITE} - Plugin resolution failed${NC}" + echo -e "${WHITE} - Need complete build in online env first${NC}" + exit 1 +fi +echo "" + +# ============================================================================ +# [14/16] Show Build Output +# ============================================================================ +echo -e "${YELLOW}==[14/16] Build Output ==${NC}" + +LIBS_DIR="$PROJECT_DIR/build/libs" +if [ -d "$LIBS_DIR" ]; then + echo -e "${CYAN}build/libs contents:${NC}" + ls -lh "$LIBS_DIR"/*.jar 2>/dev/null | awk '{printf " %-40s %10s\n", $9, $5}' + + MAIN_JAR=$(find "$LIBS_DIR" -name "*.jar" -type f ! -name "*-plain.jar" ! -name "*-sources.jar" ! 
-name "*-javadoc.jar" 2>/dev/null | head -1) +else + echo -e "${YELLOW}[WARN] build/libs not found${NC}" +fi +echo "" + +# ============================================================================ +# [15/16] Run Instructions +# ============================================================================ +echo -e "${YELLOW}==[15/16] Run Instructions ==${NC}" +echo "" + +if [ -n "$MAIN_JAR" ]; then + echo -e "${CYAN}To run the application:${NC}" + echo -e "${WHITE} java -jar $(basename "$MAIN_JAR")${NC}" + echo "" +fi + +echo -e "${CYAN}To rebuild:${NC}" +echo -e "${WHITE} export GRADLE_USER_HOME=\"./_offline_gradle_home\"${NC}" +echo -e "${WHITE} ./gradlew --offline bootJar --no-daemon${NC}" +echo "" + +# ============================================================================ +# [16/16] Complete +# ============================================================================ +echo -e "${GREEN}============================================================${NC}" +echo -e "${GREEN} Offline Build Complete!${NC}" +echo -e "${GREEN}============================================================${NC}" +echo "" +echo -e "${CYAN}Project: $PROJECT_DIR${NC}" +echo ""
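The `MAIN_JAR` selection in step [14/16] works by excluding Gradle's secondary artifacts (`-plain`, `-sources`, `-javadoc` jars) from `build/libs`. A self-contained check of that filter against a throwaway directory; the jar names here are made up for illustration:

```shell
#!/bin/sh
# Recreate the jar-picking logic from step [14/16] against a mock build/libs.
# File names below are illustrative, not from the project.
mkdir -p build/libs
touch build/libs/app-1.0.jar \
      build/libs/app-1.0-plain.jar \
      build/libs/app-1.0-sources.jar \
      build/libs/app-1.0-javadoc.jar

# Same find expression as the script: keep only the runnable boot jar.
MAIN_JAR=$(find build/libs -name "*.jar" -type f \
  ! -name "*-plain.jar" ! -name "*-sources.jar" ! -name "*-javadoc.jar" \
  2>/dev/null | head -1)

echo "selected: $(basename "$MAIN_JAR")"   # prints: selected: app-1.0.jar
```

Because every secondary artifact is filtered out, only one candidate remains and the `head -1` pick is deterministic; step [15/16] then guards against an empty result with its `[ -n "$MAIN_JAR" ]` check.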