Merge pull request 'Add model management and inference management' (#11) from feat/demo-20251205 into develop

Reviewed-on: https://kamco.gitea.gs.dabeeo.com/dabeeo/kamco-dabeeo-backoffice/pulls/11
2025-11-26 10:10:30 +09:00
73 changed files with 44311 additions and 182 deletions

GEOJSON_MONITOR_README.md (new file, 429 lines)

@@ -0,0 +1,429 @@
# GeoJSON File Monitoring System
An automatic GeoJSON file monitoring and processing system added to the kamco-dabeeo-backoffice project.
## Key Features
### 1. Automatic File Monitoring
- Periodically monitors a designated folder (`/data/geojson/upload`); default: every 30 seconds
- Supported archive formats: ZIP, TAR, TAR.GZ, TGZ
- Maximum file size: 100MB
### 2. Automatic Archive Processing
- Automatically extracts GeoJSON files from uploaded archives
- Validates and parses the GeoJSON data
- Stores the data in the `tb_map_sheet_learn_data` table
### 3. Automatic Geometry Conversion
- Extracts geometry information from the JSON data
- Stores geometry rows in the `tb_map_sheet_learn_data_geom` table
- Converts to JTS Geometry objects for PostGIS integration
## System Architecture
### Folder Layout
```
/data/geojson/
├── upload/     # monitored folder (upload archives here)
├── processed/  # successfully processed files
└── error/      # files that failed processing

/tmp/geojson_extract/  # temporary extraction folder (absolute path, outside /data/geojson/)
```
### Database Tables
#### tb_map_sheet_learn_data (training data)
- `data_uid`: primary key
- `data_name`: file name
- `data_path`: file path
- `data_type`: data type (GeoJSON)
- `data_json`: raw GeoJSON data (JSON)
- `data_state`: processing state (PROCESSED/PENDING)
- `anal_state`: analysis state (PENDING/COMPLETED/ERROR)
#### tb_map_sheet_learn_data_geom (training data geometry)
- `geo_uid`: primary key
- `data_uid`: ID of the linked training data record
- `geom`: geometry data (PostGIS)
- `geo_type`: geometry type (Point, Polygon, etc.)
- `area`: area value
- additional attribute fields
## Configuration
### application.yml
```yaml
geojson:
  monitor:
    watch-directory: /data/geojson/upload
    processed-directory: /data/geojson/processed
    error-directory: /data/geojson/error
    temp-directory: /tmp/geojson_extract
    cron-expression: "0/30 * * * * *"  # every 30 seconds
    supported-extensions:
      - zip
      - tar
      - tar.gz
      - tgz
    max-file-size: 104857600  # 100MB
```
## Usage
### 1. Automatic Monitoring
1. The scheduler starts automatically when the system starts
2. Upload an archive to the `/data/geojson/upload` folder
3. It is processed automatically within 30 seconds
4. Results can be checked in the logs
### 2. Manual Control via API
#### Check monitoring status
```bash
GET /api/geojson/monitor/status
```
#### Process a specific file manually
```bash
POST /api/geojson/process/file?filePath=/path/to/your/file.zip
```
#### Convert unprocessed geometry
```bash
POST /api/geojson/process/geometry
```
#### Convert geometry for specific data
```bash
POST /api/geojson/process/geometry/convert
Content-Type: application/json

[1, 2, 3]  # array of training data IDs
```
### 3. Data Query APIs
#### List training data
```bash
GET /api/geojson/data/learn-data?page=0&size=10&dataState=PROCESSED
```
#### Get training data details
```bash
GET /api/geojson/data/learn-data/{id}
```
#### Query geometry data
```bash
GET /api/geojson/data/geometry?dataUid=123
```
#### System statistics
```bash
GET /api/geojson/data/statistics
```
#### Counts by state
```bash
GET /api/geojson/data/status-counts
```
## Processing Flow
1. **File monitoring**: the scheduler periodically scans the upload folder
2. **Archive validation**: file format and size are checked
3. **Archive extraction**: GeoJSON files are extracted
4. **Data validation**: the GeoJSON structure is validated
5. **Training data persistence**: rows are saved to `tb_map_sheet_learn_data`
6. **Geometry conversion**: JSON → JTS Geometry
7. **Geometry persistence**: rows are saved to `tb_map_sheet_learn_data_geom`
8. **File move**: processed files are moved to the processed folder
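Step 2 above (archive validation) can be sketched as two pure checks. This is an illustrative sketch only, not the project's actual service code; the class name `ArchiveChecks` is hypothetical, while the extension list and 100MB limit mirror the configuration shown in this README.

```java
import java.util.List;

public class ArchiveChecks {
    // values mirror the geojson.monitor configuration in this README
    static final List<String> SUPPORTED_EXTENSIONS = List.of("zip", "tar", "tar.gz", "tgz");
    static final long MAX_FILE_SIZE = 100L * 1024 * 1024; // 100MB

    // check 2a: the file name must end with one of the supported archive suffixes
    static boolean isSupportedArchive(String fileName) {
        String lower = fileName.toLowerCase();
        return SUPPORTED_EXTENSIONS.stream().anyMatch(ext -> lower.endsWith("." + ext));
    }

    // check 2b: the file size must not exceed the configured limit
    static boolean isFileSizeValid(long sizeBytes) {
        return sizeBytes <= MAX_FILE_SIZE;
    }
}
```

The project's `ArchiveExtractorService` (shown later in this diff) implements the same checks against its injected configuration.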
## Error Handling
### On failure
- The file is moved to the `/data/geojson/error/` folder
- Error details are written to a `.error.info` file
- Detailed error information is written to the log
### Recovery
- Fix the root cause, then move the file from the error folder back to the upload folder
- Or run manual processing via the API
## Monitoring and Logs
### Viewing logs
```bash
# full application log
tail -f logs/application.log
# GeoJSON-related entries only
tail -f logs/application.log | grep "geojson"
```
### Log levels
- `INFO`: processing start/completion
- `WARN`: abnormal but recoverable situations
- `ERROR`: processing failures and errors
- `DEBUG`: detailed processing steps
## Performance Considerations
### Limits
- Maximum file size: 100MB
- Maximum GeoJSON files per batch: 50
- Mind memory usage when processing large files
### Optimization Tips
- Keep the number of GeoJSON files per archive reasonable
- Split very large GeoJSON files before processing
- Adjust the cron expression to match available system resources
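For example, the polling interval can be relaxed to once per minute by overriding the cron expression (a hypothetical override using the same keys as the application.yml above; Spring uses a 6-field cron where the first field is seconds):

```yaml
geojson:
  monitor:
    cron-expression: "0 * * * * *"  # at second 0 of every minute
```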
## Troubleshooting
### Common Issues
1. **Folder permissions**
```bash
chmod 755 /data/geojson/upload
chown -R app-user:app-group /data/geojson/
```
2. **Insufficient disk space**
- Clean out old files in the processed folder
- Clean out the temporary folder
3. **Out of memory**
- Increase the JVM heap size: `-Xmx2g`
- Limit the size of files to be processed
4. **Database connection problems**
- Check the PostgreSQL connection
- Verify that the PostGIS extension is installed
### Debugging
1. **Adjust the log level**
```yaml
logging:
  level:
    com.kamco.cd.kamcoback.geojson: DEBUG
```
2. **Test the APIs**
```bash
# check status
curl http://localhost:8080/api/geojson/monitor/status
# check statistics
curl http://localhost:8080/api/geojson/data/statistics
```
## Possible Extensions
1. **Notifications**: email or Slack alerts on success/failure
2. **Web UI**: a web interface for monitoring and management
3. **Batch processing**: asynchronous batch processing for large files
4. **Data validation**: more detailed GeoJSON validation rules
5. **Performance monitoring**: metrics such as processing time and memory usage
#### Terminal API Test Log
```bash
deniallee@Denialui-MacBookPro-2 kamco-dabeeo-backoffice % curl -s "http://localhost:8080/api/geojson/monitor/status" | jq .
{
"cronExpression": "0/30 * * * * *",
"processedDirectory": "/Users/deniallee/geojson/processed",
"watchDirectory": "/Users/deniallee/geojson/upload",
"errorDirectory": "/Users/deniallee/geojson/error",
"maxFileSize": 104857600,
"maxFileSizeMB": 100,
"supportedExtensions": [
"zip",
"tar",
"tar.gz",
"tgz"
]
}
```
---
```bash
deniallee@Denialui-MacBookPro-2 kamco-dabeeo-backoffice % curl -s "http://localhost:8080/api/geojson/monitor/stats" | jq .
{
"fileSystem": {
"processedDirectoryCount": 1,
"errorDirectoryCount": 2,
"watchDirectoryCount": 1
},
"database": {
"totalLearnData": 1,
"totalGeomData": 1,
"pendingAnalysis": 0
},
"monitoring": {
"errorDirectory": "/Users/deniallee/geojson/error",
"isActive": true,
"watchDirectory": "/Users/deniallee/geojson/upload",
"processedDirectory": "/Users/deniallee/geojson/processed",
"cronExpression": "0/30 * * * * *"
}
}
```
---
```bash
deniallee@Denialui-MacBookPro-2 kamco-dabeeo-backoffice % curl -X POST -s "http://localhost:8080/api/geojson/monitor/init-directories" | jq .
{
"status": "success",
"message": "디렉토리 초기화가 완료되었습니다."
}
```
---
```bash
deniallee@Denialui-MacBookPro-2 kamco-dabeeo-backoffice % curl -X POST -s "http://localhost:8080/api/geojson/process/geometry" | jq .
{
"message": "Geometry 변환이 완료되었습니다.",
"processedCount": 0,
"processedIds": [],
"status": "success"
}
```
# GeoJSON File Monitoring System - Resolved API Issues
## Resolved Issues
### 1. **Missing stats API added**
- **Problem**: the `/monitor/stats` endpoint did not exist, causing 500 errors
- **Fix**: implemented a complete API that exposes system statistics
### 2. **Missing repository method**
- **Problem**: compile error because `countByAnalState` was not implemented
- **Fix**: added the method to `MapSheetLearnDataRepository`
### 3. **Missing imports**
- **Problem**: missing imports for `HashMap` and the repository classes
- **Fix**: added all required import statements
## **Currently Available APIs**
### **GET APIs**
#### 1. Monitoring status
```bash
curl "http://localhost:8080/api/geojson/monitor/status"
```
**Example response:**
```json
{
"cronExpression": "0/30 * * * * *",
"processedDirectory": "/Users/deniallee/geojson/processed",
"watchDirectory": "/Users/deniallee/geojson/upload",
"errorDirectory": "/Users/deniallee/geojson/error",
"maxFileSize": 104857600,
"maxFileSizeMB": 100,
"supportedExtensions": ["zip", "tar", "tar.gz", "tgz"]
}
```
#### 2. System statistics **[newly added]**
```bash
curl "http://localhost:8080/api/geojson/monitor/stats"
```
**Example response:**
```json
{
"fileSystem": {
"processedDirectoryCount": 1,
"errorDirectoryCount": 2,
"watchDirectoryCount": 1
},
"database": {
"totalLearnData": 1,
"totalGeomData": 1,
"pendingAnalysis": 0
},
"monitoring": {
"errorDirectory": "/Users/deniallee/geojson/error",
"isActive": true,
"watchDirectory": "/Users/deniallee/geojson/upload",
"processedDirectory": "/Users/deniallee/geojson/processed",
"cronExpression": "0/30 * * * * *"
}
}
```
### **POST APIs**
#### 1. Initialize directories
```bash
curl -X POST "http://localhost:8080/api/geojson/monitor/init-directories"
```
**Example response:**
```json
{
"status": "success",
"message": "디렉토리 초기화가 완료되었습니다."
}
```
#### 2. Manual file processing
```bash
curl -X POST "http://localhost:8080/api/geojson/process/file?filePath=/path/to/file.zip"
```
#### 3. Convert unprocessed geometry
```bash
curl -X POST "http://localhost:8080/api/geojson/process/geometry"
```
**Example response:**
```json
{
"message": "Geometry 변환이 완료되었습니다.",
"processedCount": 0,
"processedIds": [],
"status": "success"
}
```
#### 4. Convert geometry for specific training data
```bash
curl -X POST "http://localhost:8080/api/geojson/process/geometry/convert" \
-H "Content-Type: application/json" \
-d '[1, 2, 3]'
```
## **Key Improvements**
1. **Complete statistics**
- Database statistics (training data and geometry record counts)
- File system statistics (file counts per folder)
- Monitoring configuration details
2. **Robust error handling**
- Appropriate error handling per API
- Detailed error messages
- Logging to support debugging
3. **Consistent response format**
- Clear success/failure status
- Uniform JSON structure
- Appropriate HTTP status codes
## **Current System State**
- **Monitoring**: running normally (30-second interval)
- **API server**: running at http://localhost:8080
- **Database**: PostgreSQL + PostGIS connected
- **File processing**: automatic ZIP/TAR handling enabled
- **Statistics**: real-time system status available
---


@@ -39,6 +39,7 @@ dependencies {
implementation 'com.fasterxml.jackson.core:jackson-databind'
implementation 'org.locationtech.jts.io:jts-io-common:1.20.0'
implementation 'org.locationtech.jts:jts-core:1.19.0'
//implementation 'org.hibernate:hibernate-spatial:6.2.7.Final'
// QueryDSL JPA
implementation 'com.querydsl:querydsl-jpa:5.0.0:jakarta'
@@ -53,6 +54,9 @@ dependencies {
// SpringDoc OpenAPI (Swagger)
implementation 'org.springdoc:springdoc-openapi-starter-webmvc-ui:2.7.0'
// Apache Commons Compress for archive handling
implementation 'org.apache.commons:commons-compress:1.26.0'
}
tasks.named('test') {


@@ -2,8 +2,10 @@ package com.kamco.cd.kamcoback;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.scheduling.annotation.EnableScheduling;
@SpringBootApplication
@EnableScheduling
public class KamcoBackApplication {
public static void main(String[] args) {


@@ -0,0 +1,28 @@
package com.kamco.cd.kamcoback.changedetection;
import com.kamco.cd.kamcoback.changedetection.dto.ChangeDetectionDto;
import com.kamco.cd.kamcoback.changedetection.service.ChangeDetectionService;
import com.kamco.cd.kamcoback.config.api.ApiResponseDto;
import io.swagger.v3.oas.annotations.tags.Tag;
import jakarta.transaction.Transactional;
import lombok.RequiredArgsConstructor;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import java.util.List;
@Tag(name = "변화탐지", description = "변화탐지 API")
@RequiredArgsConstructor
@RestController
@RequestMapping("/api/change-detection")
@Transactional
public class ChangeDetectionApiController {
private final ChangeDetectionService changeDetectionService;
@GetMapping
public ApiResponseDto<List<ChangeDetectionDto>> getPolygonToPoint() {
return ApiResponseDto.ok(changeDetectionService.getPolygonToPoint());
}
}


@@ -0,0 +1,10 @@
package com.kamco.cd.kamcoback.changedetection.dto;
import org.locationtech.jts.geom.Geometry;
public record ChangeDetectionDto(
Long id,
Geometry polygon,
double centroidX,
double centroidY
) {}


@@ -0,0 +1,20 @@
package com.kamco.cd.kamcoback.changedetection.service;
import com.kamco.cd.kamcoback.changedetection.dto.ChangeDetectionDto;
import com.kamco.cd.kamcoback.config.api.ApiResponseDto;
import com.kamco.cd.kamcoback.postgres.core.ChangeDetectionCoreService;
import lombok.RequiredArgsConstructor;
import org.springframework.stereotype.Service;
import java.util.List;
@Service
@RequiredArgsConstructor
public class ChangeDetectionService {
private final ChangeDetectionCoreService changeDetectionCoreService;
public List<ChangeDetectionDto> getPolygonToPoint() {
return changeDetectionCoreService.getPolygonToPoint();
}
}


@@ -36,24 +36,24 @@ public class GlobalExceptionHandler {
this.errorLogRepository = errorLogRepository;
}
@ResponseStatus(HttpStatus.NOT_FOUND)
@ResponseStatus(HttpStatus.UNPROCESSABLE_ENTITY)
@ExceptionHandler(EntityNotFoundException.class)
public ApiResponseDto<String> handlerEntityNotFoundException(
EntityNotFoundException e, HttpServletRequest request) {
log.warn("[EntityNotFoundException] resource :{} ", e.getMessage());
String codeName = "NOT_FOUND";
String codeName = "NOT_FOUND_DATA";
ErrorLogEntity errorLog =
saveErrerLogData(
request,
ApiResponseCode.getCode(codeName),
HttpStatus.valueOf(codeName),
ErrorLogDto.LogErrorLevel.ERROR,
HttpStatus.valueOf("UNPROCESSABLE_ENTITY"),
ErrorLogDto.LogErrorLevel.WARNING,
e.getStackTrace());
return ApiResponseDto.createException(
ApiResponseCode.getCode(codeName),
ApiResponseCode.getMessage(codeName),
HttpStatus.valueOf(codeName),
HttpStatus.valueOf("UNPROCESSABLE_ENTITY"),
errorLog.getId());
}
@@ -130,7 +130,7 @@ public class GlobalExceptionHandler {
saveErrerLogData(
request,
ApiResponseCode.getCode(codeName),
HttpStatus.valueOf(codeName),
HttpStatus.valueOf("UNPROCESSABLE_ENTITY"),
ErrorLogDto.LogErrorLevel.CRITICAL,
e.getStackTrace());
@@ -204,6 +204,28 @@ public class GlobalExceptionHandler {
errorLog.getId());
}
@ResponseStatus(HttpStatus.UNPROCESSABLE_ENTITY)
@ExceptionHandler(IllegalStateException.class)
public ApiResponseDto<String> handlerIllegalStateException(
IllegalStateException e, HttpServletRequest request) {
log.warn("[IllegalStateException] resource :{} ", e.getMessage());
String codeName = "UNPROCESSABLE_ENTITY";
ErrorLogEntity errorLog =
saveErrerLogData(
request,
ApiResponseCode.getCode(codeName),
HttpStatus.valueOf(codeName),
ErrorLogDto.LogErrorLevel.WARNING,
e.getStackTrace());
return ApiResponseDto.createException(
ApiResponseCode.getCode(codeName),
ApiResponseCode.getMessage(codeName),
HttpStatus.valueOf(codeName),
errorLog.getId());
}
@ResponseStatus(HttpStatus.INTERNAL_SERVER_ERROR)
@ExceptionHandler(RuntimeException.class)
public ApiResponseDto<String> handlerRuntimeException(
@@ -266,12 +288,14 @@ public class GlobalExceptionHandler {
// TODO: wire this up once login is implemented
Long userid = Long.valueOf(Optional.ofNullable(ApiLogFunction.getUserId(request)).orElse("1"));
// TODO: confirm the stack trace limit (10 lines? 20?)
String stackTraceStr =
Arrays.stream(stackTrace)
.limit(10)
.map(StackTraceElement::toString)
.collect(Collectors.joining("\n"));
// truncate to 255 characters for the log column; the bound must be the joined
// string's length, not the frame count (stackTrace.length)
stackTraceStr = stackTraceStr.substring(0, Math.min(stackTraceStr.length(), 255));
ErrorLogEntity errorLogEntity =
new ErrorLogEntity(


@@ -142,6 +142,7 @@ public class ApiResponseDto<T> {
DUPLICATE_EMPLOYEEID("이미 가입된 사번입니다."),
NOT_FOUND_USER_FOR_EMAIL("이메일로 유저를 찾을 수 없습니다."),
NOT_FOUND_USER("사용자를 찾을 수 없습니다."),
UNPROCESSABLE_ENTITY("이 데이터는 삭제할 수 없습니다."),
INVALID_EMAIL_TOKEN(
"You can only reset your password within 24 hours from when the email was sent.\n"
+ "To reset your password again, please submit a new request through \"Forgot"


@@ -0,0 +1,72 @@
package com.kamco.cd.kamcoback.geojson.config;
import lombok.Getter;
import lombok.Setter;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;
import jakarta.annotation.PostConstruct;
/**
* GeoJSON file monitoring configuration
*/
@Component
@ConfigurationProperties(prefix = "geojson.monitor")
@Getter
@Setter
public class GeoJsonMonitorConfig {
/**
* Directory to watch for uploaded archives
*/
private String watchDirectory = "~/geojson/upload";
/**
* Directory that successfully processed files are moved to
*/
private String processedDirectory = "~/geojson/processed";
/**
* Directory that failed files are moved to
*/
private String errorDirectory = "~/geojson/error";
/**
* File monitoring schedule (cron expression)
* Default: run every 30 seconds
*/
private String cronExpression = "0/30 * * * * *";
/**
* Supported archive file extensions
*/
private String[] supportedExtensions = {"zip", "tar", "tar.gz", "tgz"};
/**
* Maximum file size to process, in bytes
*/
private long maxFileSize = 100 * 1024 * 1024; // 100MB
/**
* Temporary extraction directory
*/
private String tempDirectory = "/tmp/geojson_extract";
/**
* Expand home-directory ("~") prefixes in the configured paths
*/
@PostConstruct
public void expandPaths() {
watchDirectory = expandPath(watchDirectory);
processedDirectory = expandPath(processedDirectory);
errorDirectory = expandPath(errorDirectory);
tempDirectory = expandPath(tempDirectory);
}
private String expandPath(String path) {
if (path.startsWith("~")) {
return path.replace("~", System.getProperty("user.home"));
}
return path;
}
}
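One caveat with `expandPath` above: `String.replace` substitutes every `~` in the path, not only a leading one. A standalone sketch of the behavior (`PathExpansion` is a hypothetical name; the method body mirrors the one above):

```java
public class PathExpansion {
    // mirrors GeoJsonMonitorConfig.expandPath: expand a leading "~" to the user's home directory
    static String expandPath(String path) {
        if (path.startsWith("~")) {
            // note: replace() substitutes every "~", not just the first one
            return path.replace("~", System.getProperty("user.home"));
        }
        return path;
    }
}
```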


@@ -0,0 +1,203 @@
package com.kamco.cd.kamcoback.geojson.controller;
import com.kamco.cd.kamcoback.postgres.entity.MapSheetLearnDataEntity;
import com.kamco.cd.kamcoback.postgres.entity.MapSheetLearnDataGeomEntity;
import com.kamco.cd.kamcoback.postgres.repository.MapSheetLearnDataGeomRepository;
import com.kamco.cd.kamcoback.postgres.repository.MapSheetLearnDataRepository;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.data.domain.PageRequest;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Optional;
/**
* GeoJSON data query and test API controller
*/
@Slf4j
@RestController
@RequestMapping("/api/geojson/data")
@RequiredArgsConstructor
public class GeoJsonDataController {
private final MapSheetLearnDataRepository mapSheetLearnDataRepository;
private final MapSheetLearnDataGeomRepository mapSheetLearnDataGeomRepository;
/**
* List training data
*/
@GetMapping("/learn-data")
public ResponseEntity<Map<String, Object>> getLearnDataList(
@RequestParam(defaultValue = "0") int page,
@RequestParam(defaultValue = "10") int size,
@RequestParam(required = false) String dataState,
@RequestParam(required = false) String analState) {
try {
PageRequest pageRequest = PageRequest.of(page, size);
List<MapSheetLearnDataEntity> learnDataList;
if (dataState != null) {
learnDataList = mapSheetLearnDataRepository.findByDataState(dataState);
} else if (analState != null) {
learnDataList = mapSheetLearnDataRepository.findByAnalState(analState);
} else {
learnDataList = mapSheetLearnDataRepository.findAll(pageRequest).getContent();
}
Map<String, Object> response = new HashMap<>();
response.put("data", learnDataList);
response.put("totalCount", learnDataList.size());
response.put("page", page);
response.put("size", size);
return ResponseEntity.ok(response);
} catch (Exception e) {
log.error("학습 데이터 목록 조회 실패", e);
return ResponseEntity.internalServerError()
.body(Map.of("error", "데이터 조회 실패: " + e.getMessage()));
}
}
/**
* Get details for a specific training data record
*/
@GetMapping("/learn-data/{id}")
public ResponseEntity<Map<String, Object>> getLearnDataDetail(@PathVariable Long id) {
try {
if (id == null) {
return ResponseEntity.badRequest()
.body(Map.of("error", "ID가 필요합니다."));
}
Optional<MapSheetLearnDataEntity> learnDataOpt = mapSheetLearnDataRepository.findById(id);
if (learnDataOpt.isEmpty()) {
return ResponseEntity.notFound().build();
}
MapSheetLearnDataEntity learnData = learnDataOpt.get();
List<MapSheetLearnDataGeomEntity> geometryList = mapSheetLearnDataGeomRepository.findByDataUid(id);
Map<String, Object> response = new HashMap<>();
response.put("learnData", learnData);
response.put("geometryData", geometryList);
response.put("geometryCount", geometryList.size());
return ResponseEntity.ok(response);
} catch (Exception e) {
log.error("학습 데이터 상세 조회 실패: {}", id, e);
return ResponseEntity.internalServerError()
.body(Map.of("error", "데이터 조회 실패: " + e.getMessage()));
}
}
/**
* List geometry data
*/
@GetMapping("/geometry")
public ResponseEntity<Map<String, Object>> getGeometryDataList(
@RequestParam(defaultValue = "0") int page,
@RequestParam(defaultValue = "10") int size,
@RequestParam(required = false) Long dataUid,
@RequestParam(required = false) String geoType) {
try {
List<MapSheetLearnDataGeomEntity> geometryList;
if (dataUid != null) {
geometryList = mapSheetLearnDataGeomRepository.findByDataUid(dataUid);
} else if (geoType != null) {
geometryList = mapSheetLearnDataGeomRepository.findByGeoType(geoType);
} else {
PageRequest pageRequest = PageRequest.of(page, size);
geometryList = mapSheetLearnDataGeomRepository.findAll(pageRequest).getContent();
}
Map<String, Object> response = new HashMap<>();
response.put("data", geometryList);
response.put("totalCount", geometryList.size());
response.put("page", page);
response.put("size", size);
return ResponseEntity.ok(response);
} catch (Exception e) {
log.error("Geometry 데이터 목록 조회 실패", e);
return ResponseEntity.internalServerError()
.body(Map.of("error", "데이터 조회 실패: " + e.getMessage()));
}
}
/**
* Get system statistics
*/
@GetMapping("/statistics")
public ResponseEntity<Map<String, Object>> getStatistics() {
try {
long totalLearnData = mapSheetLearnDataRepository.count();
long totalGeometryData = mapSheetLearnDataGeomRepository.count();
List<MapSheetLearnDataEntity> processedData = mapSheetLearnDataRepository.findByDataState("PROCESSED");
List<MapSheetLearnDataEntity> pendingAnalysis = mapSheetLearnDataRepository.findByAnalState("PENDING");
List<MapSheetLearnDataEntity> completedAnalysis = mapSheetLearnDataRepository.findByAnalState("COMPLETED");
List<MapSheetLearnDataEntity> errorAnalysis = mapSheetLearnDataRepository.findByAnalState("ERROR");
Map<String, Object> statistics = new HashMap<>();
statistics.put("totalLearnData", totalLearnData);
statistics.put("totalGeometryData", totalGeometryData);
statistics.put("processedDataCount", processedData.size());
statistics.put("pendingAnalysisCount", pendingAnalysis.size());
statistics.put("completedAnalysisCount", completedAnalysis.size());
statistics.put("errorAnalysisCount", errorAnalysis.size());
// compute the completion rate
if (totalLearnData > 0) {
double completionRate = (double) completedAnalysis.size() / totalLearnData * 100;
statistics.put("completionRate", Math.round(completionRate * 100.0) / 100.0);
} else {
statistics.put("completionRate", 0.0);
}
return ResponseEntity.ok(Map.of(
"statistics", statistics,
"timestamp", java.time.Instant.now()
));
} catch (Exception e) {
log.error("통계 정보 조회 실패", e);
return ResponseEntity.internalServerError()
.body(Map.of("error", "통계 조회 실패: " + e.getMessage()));
}
}
/**
* Get counts grouped by state
*/
@GetMapping("/status-counts")
public ResponseEntity<Map<String, Object>> getStatusCounts() {
try {
Map<String, Long> dataStateCounts = new HashMap<>();
Map<String, Long> analStateCounts = new HashMap<>();
// counts by data state
dataStateCounts.put("PROCESSED", (long) mapSheetLearnDataRepository.findByDataState("PROCESSED").size());
dataStateCounts.put("PENDING", (long) mapSheetLearnDataRepository.findByDataStateIsNullOrDataState("PENDING").size());
// counts by analysis state
analStateCounts.put("PENDING", (long) mapSheetLearnDataRepository.findByAnalState("PENDING").size());
analStateCounts.put("COMPLETED", (long) mapSheetLearnDataRepository.findByAnalState("COMPLETED").size());
analStateCounts.put("ERROR", (long) mapSheetLearnDataRepository.findByAnalState("ERROR").size());
return ResponseEntity.ok(Map.of(
"dataStateCounts", dataStateCounts,
"analStateCounts", analStateCounts,
"timestamp", java.time.Instant.now()
));
} catch (Exception e) {
log.error("상태별 카운트 조회 실패", e);
return ResponseEntity.internalServerError()
.body(Map.of("error", "카운트 조회 실패: " + e.getMessage()));
}
}
}


@@ -0,0 +1,154 @@
package com.kamco.cd.kamcoback.geojson.controller;
import com.kamco.cd.kamcoback.geojson.service.GeoJsonFileMonitorService;
import com.kamco.cd.kamcoback.geojson.service.GeometryConversionService;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import java.util.List;
import java.util.Map;
/**
* GeoJSON file monitoring and processing API controller
*/
@Slf4j
@RestController
@RequestMapping("/api/geojson")
@RequiredArgsConstructor
public class GeoJsonMonitorController {
private final GeoJsonFileMonitorService monitorService;
private final GeometryConversionService geometryConversionService;
/**
* Get monitoring status
*/
@GetMapping("/monitor/status")
public Map<String, Object> getMonitorStatus() {
return monitorService.getMonitorStatus();
}
/**
* Get system statistics
*/
@GetMapping("/monitor/stats")
public ResponseEntity<Map<String, Object>> getSystemStats() {
try {
Map<String, Object> stats = monitorService.getSystemStats();
return ResponseEntity.ok(stats);
} catch (Exception e) {
log.error("시스템 통계 조회 실패", e);
return ResponseEntity.internalServerError()
.body(Map.of(
"error", "시스템 통계 조회 실패: " + e.getMessage(),
"status", "error"
));
}
}
/**
* Initialize directories (manual trigger)
*/
@PostMapping("/monitor/init-directories")
public ResponseEntity<Map<String, Object>> initializeDirectories() {
try {
log.info("디렉토리 초기화 수동 실행 요청");
monitorService.initializeDirectoriesManually();
return ResponseEntity.ok(Map.of(
"message", "디렉토리 초기화가 완료되었습니다.",
"status", "success"
));
} catch (Exception e) {
log.error("디렉토리 초기화 실패", e);
return ResponseEntity.internalServerError()
.body(Map.of(
"error", "디렉토리 초기화 실패: " + e.getMessage(),
"status", "error"
));
}
}
/**
* Manually process a specific file
*/
@PostMapping("/process/file")
public ResponseEntity<Map<String, Object>> processFileManually(@RequestParam String filePath) {
try {
log.info("수동 파일 처리 요청: {}", filePath);
monitorService.processFileManually(filePath);
return ResponseEntity.ok(Map.of(
"message", "파일 처리가 완료되었습니다.",
"filePath", filePath,
"status", "success"
));
} catch (Exception e) {
log.error("수동 파일 처리 실패: {}", filePath, e);
return ResponseEntity.internalServerError()
.body(Map.of(
"error", "파일 처리 실패: " + e.getMessage(),
"filePath", filePath,
"status", "error"
));
}
}
/**
* Manually convert unprocessed geometry data
*/
@PostMapping("/process/geometry")
public ResponseEntity<Map<String, Object>> processUnprocessedGeometry() {
try {
log.info("미처리 Geometry 변환 수동 실행 요청");
List<Long> processedIds = geometryConversionService.processUnprocessedLearnData();
return ResponseEntity.ok(Map.of(
"message", "Geometry 변환이 완료되었습니다.",
"processedCount", processedIds.size(),
"processedIds", processedIds,
"status", "success"
));
} catch (Exception e) {
log.error("Geometry 변환 실패", e);
return ResponseEntity.internalServerError()
.body(Map.of(
"error", "Geometry 변환 실패: " + e.getMessage(),
"status", "error"
));
}
}
/**
* Convert geometry for specific training data records
*/
@PostMapping("/process/geometry/convert")
public ResponseEntity<Map<String, Object>> convertSpecificGeometry(@RequestBody List<Long> learnDataIds) {
try {
if (learnDataIds == null || learnDataIds.isEmpty()) {
return ResponseEntity.badRequest()
.body(Map.of("error", "변환할 학습 데이터 ID가 없습니다."));
}
log.info("특정 학습 데이터 Geometry 변환 요청: {}", learnDataIds);
List<Long> geometryIds = geometryConversionService.convertToGeometryData(learnDataIds);
return ResponseEntity.ok(Map.of(
"message", "Geometry 변환이 완료되었습니다.",
"inputCount", learnDataIds.size(),
"outputCount", geometryIds.size(),
"geometryIds", geometryIds,
"status", "success"
));
} catch (Exception e) {
log.error("특정 Geometry 변환 실패: {}", learnDataIds, e);
return ResponseEntity.internalServerError()
.body(Map.of(
"error", "Geometry 변환 실패: " + e.getMessage(),
"status", "error"
));
}
}
}


@@ -0,0 +1,167 @@
package com.kamco.cd.kamcoback.geojson.service;
import com.kamco.cd.kamcoback.geojson.config.GeoJsonMonitorConfig;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.compress.archivers.ArchiveEntry;
import org.apache.commons.compress.archivers.tar.TarArchiveInputStream;
import org.apache.commons.compress.archivers.zip.ZipArchiveEntry;
import org.apache.commons.compress.archivers.zip.ZipFile;
import org.apache.commons.compress.compressors.gzip.GzipCompressorInputStream;
import org.springframework.stereotype.Service;
import java.io.*;
import java.nio.file.*;
import java.util.*;
import java.util.stream.Stream;
import java.util.zip.ZipInputStream;
/**
* Archive extraction service
*/
@Slf4j
@Service
@RequiredArgsConstructor
public class ArchiveExtractorService {
private final GeoJsonMonitorConfig config;
/**
* Extract GeoJSON files from an archive
*/
public Map<String, String> extractGeoJsonFiles(Path archiveFile) throws IOException {
Map<String, String> geoJsonContents = new HashMap<>();
String fileName = archiveFile.getFileName().toString().toLowerCase();
log.info("압축파일 추출 시작: {}", archiveFile);
try {
if (fileName.endsWith(".zip")) {
extractFromZip(archiveFile, geoJsonContents);
} else if (fileName.endsWith(".tar") || fileName.endsWith(".tar.gz") || fileName.endsWith(".tgz")) {
extractFromTar(archiveFile, geoJsonContents);
} else {
throw new IllegalArgumentException("지원하지 않는 압축파일 형식: " + fileName);
}
} catch (Exception e) {
log.error("압축파일 추출 실패: {}", archiveFile, e);
throw e;
}
log.info("압축파일에서 {}개의 GeoJSON 파일을 추출했습니다: {}", geoJsonContents.size(), archiveFile);
return geoJsonContents;
}
/**
* Extract GeoJSON files from a ZIP archive
*/
private void extractFromZip(Path zipFile, Map<String, String> geoJsonContents) throws IOException {
try (ZipFile zip = new ZipFile(zipFile.toFile())) {
Enumeration<ZipArchiveEntry> entries = zip.getEntries();
while (entries.hasMoreElements()) {
ZipArchiveEntry entry = entries.nextElement();
if (!entry.isDirectory() && isGeoJsonFile(entry.getName())) {
try (InputStream inputStream = zip.getInputStream(entry)) {
String content = readInputStream(inputStream);
geoJsonContents.put(entry.getName(), content);
log.debug("ZIP에서 추출: {}", entry.getName());
}
}
}
}
}
/**
* Extract GeoJSON files from a TAR archive
*/
private void extractFromTar(Path tarFile, Map<String, String> geoJsonContents) throws IOException {
String fileName = tarFile.getFileName().toString().toLowerCase();
InputStream fileInputStream = Files.newInputStream(tarFile);
try {
// check whether the TAR file is GZIP-compressed
if (fileName.endsWith(".gz") || fileName.endsWith(".tgz")) {
fileInputStream = new GzipCompressorInputStream(fileInputStream);
}
try (TarArchiveInputStream tarInputStream = new TarArchiveInputStream(fileInputStream)) {
ArchiveEntry entry;
while ((entry = tarInputStream.getNextEntry()) != null) {
if (!entry.isDirectory() && isGeoJsonFile(entry.getName())) {
String content = readInputStream(tarInputStream);
geoJsonContents.put(entry.getName(), content);
log.debug("TAR에서 추출: {}", entry.getName());
}
}
}
} finally {
try {
fileInputStream.close();
} catch (IOException e) {
log.warn("파일 스트림 종료 실패", e);
}
}
}
/**
* Read an InputStream into a String
*/
private String readInputStream(InputStream inputStream) throws IOException {
try (BufferedReader reader = new BufferedReader(new InputStreamReader(inputStream, "UTF-8"))) {
StringBuilder content = new StringBuilder();
String line;
while ((line = reader.readLine()) != null) {
content.append(line).append("\n");
}
return content.toString();
}
}
/**
* Check whether a file is a GeoJSON file
*/
private boolean isGeoJsonFile(String fileName) {
String lowerFileName = fileName.toLowerCase();
return lowerFileName.endsWith(".geojson") || lowerFileName.endsWith(".json");
}
/**
* Check whether a file is a supported archive type
*/
public boolean isSupportedArchive(Path file) {
String fileName = file.getFileName().toString().toLowerCase();
for (String extension : config.getSupportedExtensions()) {
if (fileName.endsWith("." + extension)) {
return true;
}
}
return false;
}
/**
* Check whether the file size is within the configured limit
*/
public boolean isFileSizeValid(Path file) {
try {
long fileSize = Files.size(file);
boolean isValid = fileSize <= config.getMaxFileSize();
if (!isValid) {
log.warn("파일 크기가 제한을 초과했습니다: {} ({}MB > {}MB)",
file, fileSize / 1024 / 1024, config.getMaxFileSize() / 1024 / 1024);
}
return isValid;
} catch (IOException e) {
log.error("파일 크기 확인 실패: {}", file, e);
return false;
}
}
}
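A side note on `readInputStream` above: on Java 9+ the same read can be done with `InputStream.readAllBytes()`, though unlike the line-by-line loop it does not normalize line endings to `\n`. A minimal sketch (`StreamRead` is a hypothetical name, not part of the diff):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class StreamRead {
    // read the whole stream into memory and decode it as UTF-8
    static String readAll(InputStream in) throws IOException {
        return new String(in.readAllBytes(), StandardCharsets.UTF_8);
    }
}
```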


@@ -0,0 +1,245 @@
package com.kamco.cd.kamcoback.geojson.service;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.kamco.cd.kamcoback.postgres.entity.MapSheetLearnDataEntity;
import com.kamco.cd.kamcoback.postgres.repository.MapSheetLearnDataRepository;
import java.time.ZonedDateTime;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.time.Instant;
import java.util.*;
/**
* GeoJSON data processing service
*/
@Slf4j
@Service
@RequiredArgsConstructor
public class GeoJsonDataService {
private final MapSheetLearnDataRepository mapSheetLearnDataRepository;
private final ObjectMapper objectMapper;
/**
* Persist GeoJSON files to the database
*/
@Transactional
public List<Long> processGeoJsonFiles(Map<String, String> geoJsonContents, String archiveFileName) {
List<Long> savedIds = new ArrayList<>();
log.info("GeoJSON 파일 처리 시작: {} ({}개 파일)", archiveFileName, geoJsonContents.size());
for (Map.Entry<String, String> entry : geoJsonContents.entrySet()) {
String fileName = entry.getKey();
String geoJsonContent = entry.getValue();
try {
Long savedId = processGeoJsonFile(fileName, geoJsonContent, archiveFileName);
if (savedId != null) {
savedIds.add(savedId);
log.debug("GeoJSON 파일 저장 성공: {} (ID: {})", fileName, savedId);
}
} catch (Exception e) {
log.error("GeoJSON 파일 처리 실패: {}", fileName, e);
// 개별 파일 처리 실패는 전체 처리를 중단시키지 않음
}
}
log.info("GeoJSON 파일 처리 완료: {} (성공: {}개, 전체: {}개)",
archiveFileName, savedIds.size(), geoJsonContents.size());
return savedIds;
}
/**
* 개별 GeoJSON 파일을 MapSheetLearnDataEntity로 변환하여 저장
*/
private Long processGeoJsonFile(String fileName, String geoJsonContent, String archiveFileName) {
try {
// GeoJSON 파싱 및 검증
JsonNode geoJsonNode = objectMapper.readTree(geoJsonContent);
validateGeoJsonStructure(geoJsonNode);
// 파일이 이미 처리되었는지 확인
String dataPath = generateDataPath(archiveFileName, fileName);
Optional<MapSheetLearnDataEntity> existingData = mapSheetLearnDataRepository.findByDataPath(dataPath);
if (existingData.isPresent()) {
log.warn("이미 처리된 파일입니다: {}", dataPath);
return existingData.get().getId();
}
// 새 엔티티 생성 및 저장
MapSheetLearnDataEntity entity = createMapSheetLearnDataEntity(fileName, geoJsonContent, archiveFileName, geoJsonNode);
MapSheetLearnDataEntity savedEntity = mapSheetLearnDataRepository.save(entity);
return savedEntity.getId();
} catch (Exception e) {
log.error("GeoJSON 파일 처리 중 오류 발생: {}", fileName, e);
throw new RuntimeException("GeoJSON 파일 처리 실패: " + fileName, e);
}
}
/**
* GeoJSON 구조 검증
*/
private void validateGeoJsonStructure(JsonNode geoJsonNode) {
if (!geoJsonNode.has("type")) {
throw new IllegalArgumentException("유효하지 않은 GeoJSON: 'type' 필드가 없습니다.");
}
// RFC 7946에 "Geometry"라는 type은 존재하지 않으므로, 변환이 지원되는 type만 허용
String type = geoJsonNode.get("type").asText();
Set<String> supportedTypes = Set.of(
"FeatureCollection", "Feature", "Point", "LineString", "Polygon",
"MultiPoint", "MultiLineString", "MultiPolygon");
if (!supportedTypes.contains(type)) {
throw new IllegalArgumentException("지원하지 않는 GeoJSON type: " + type);
}
}
/**
* MapSheetLearnDataEntity 생성
*/
private MapSheetLearnDataEntity createMapSheetLearnDataEntity(
String fileName, String geoJsonContent, String archiveFileName, JsonNode geoJsonNode) {
MapSheetLearnDataEntity entity = new MapSheetLearnDataEntity();
// 기본 정보 설정
entity.setDataName(fileName);
entity.setDataPath(generateDataPath(archiveFileName, fileName));
entity.setDataType("GeoJSON");
entity.setDataTitle(extractTitle(fileName, geoJsonNode));
// CRS 정보 추출 및 설정
setCrsInformation(entity, geoJsonNode);
// JSON 데이터 저장
try {
@SuppressWarnings("unchecked")
Map<String, Object> jsonMap = objectMapper.readValue(geoJsonContent, Map.class);
entity.setDataJson(jsonMap);
} catch (Exception e) {
log.warn("JSON 파싱 실패, 원본 텍스트로 저장: {}", fileName, e);
// JSON 파싱이 실패하면 원본을 Map 형태로 저장
Map<String, Object> fallbackMap = new HashMap<>();
fallbackMap.put("raw_content", geoJsonContent);
fallbackMap.put("parse_error", e.getMessage());
entity.setDataJson(fallbackMap);
}
// 연도 정보 추출 (파일명에서 추출 시도)
setYearInformation(entity, fileName);
// 상태 정보 설정
entity.setDataState("PROCESSED");
entity.setAnalState("PENDING");
// 시간 정보 설정
ZonedDateTime now = ZonedDateTime.now();
entity.setCreatedDttm(now);
entity.setUpdatedDttm(now);
entity.setDataStateDttm(now);
return entity;
}
/**
* CRS 정보 설정
*/
private void setCrsInformation(MapSheetLearnDataEntity entity, JsonNode geoJsonNode) {
if (geoJsonNode.has("crs")) {
JsonNode crsNode = geoJsonNode.get("crs");
if (crsNode.has("type") && crsNode.has("properties")) {
String crsType = crsNode.get("type").asText();
entity.setDataCrsType(crsType);
JsonNode propertiesNode = crsNode.get("properties");
if (propertiesNode.has("name")) {
String crsName = propertiesNode.get("name").asText();
entity.setDataCrsTypeName(crsName);
}
}
} else {
// CRS가 명시되지 않은 경우 기본값 설정 (WGS84)
entity.setDataCrsType("EPSG");
entity.setDataCrsTypeName("EPSG:4326");
}
}
/**
* 연도 정보 추출
*/
private void setYearInformation(MapSheetLearnDataEntity entity, String fileName) {
// 파일명에서 연도 추출 시도 (예: kamco_2021_2022_35813023.geojson)
String[] parts = fileName.split("_");
for (String part : parts) {
if (part.matches("\\d{4}")) { // 4자리 숫자 (연도)
try {
Integer year = Integer.parseInt(part);
if (year >= 1900 && year <= 2100) {
if (entity.getDataYyyy() == null) {
entity.setDataYyyy(year);
} else {
entity.setCompareYyyy(year);
break;
}
}
} catch (NumberFormatException ignored) {
// 무시
}
}
}
}
/**
* 제목 추출
*/
private String extractTitle(String fileName, JsonNode geoJsonNode) {
// GeoJSON 메타데이터에서 제목 추출 시도
if (geoJsonNode.has("properties")) {
JsonNode properties = geoJsonNode.get("properties");
if (properties.has("title")) {
return properties.get("title").asText();
}
if (properties.has("name")) {
return properties.get("name").asText();
}
}
// 파일명에서 확장자 제거하여 제목으로 사용
int lastDotIndex = fileName.lastIndexOf('.');
if (lastDotIndex > 0) {
return fileName.substring(0, lastDotIndex);
}
return fileName;
}
/**
* 데이터 경로 생성
*/
private String generateDataPath(String archiveFileName, String fileName) {
return archiveFileName + "/" + fileName;
}
/**
* 처리 가능한 파일 개수 확인
*/
public boolean isProcessable(Map<String, String> geoJsonContents) {
if (geoJsonContents == null || geoJsonContents.isEmpty()) {
return false;
}
// 최대 처리 가능한 파일 수 제한 (성능 고려)
int maxFiles = 50;
if (geoJsonContents.size() > maxFiles) {
log.warn("처리 가능한 최대 파일 수를 초과했습니다: {} > {}", geoJsonContents.size(), maxFiles);
return false;
}
return true;
}
}
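The year extraction in `setYearInformation` is worth sketching in isolation, since it silently ignores tokens outside 1900–2100 and anything that is not exactly four digits. A standalone version (class and method names are illustrative, not part of the codebase; the entity setters are replaced by a two-element array):

```java
public class YearExtraction {
    /**
     * Mirrors setYearInformation: the first four-digit underscore-delimited
     * token in [1900, 2100] becomes the base year, the second the comparison
     * year. Returns {baseYear, compareYear}; either slot may remain null.
     */
    static Integer[] extractYears(String fileName) {
        Integer[] years = new Integer[2]; // [dataYyyy, compareYyyy]
        for (String part : fileName.split("_")) {
            if (part.matches("\\d{4}")) { // full-token match: exactly 4 digits
                int year = Integer.parseInt(part);
                if (year >= 1900 && year <= 2100) {
                    if (years[0] == null) {
                        years[0] = year;
                    } else {
                        years[1] = year;
                        break;
                    }
                }
            }
        }
        return years;
    }

    public static void main(String[] args) {
        Integer[] y = extractYears("kamco_2021_2022_35813023.geojson");
        System.out.println(y[0] + " " + y[1]); // 2021 2022
    }
}
```

Note that `matches("\\d{4}")` requires the whole token to be digits, so a year glued to the file extension (e.g. `kamco_2021_2022.geojson`, whose last token is `2022.geojson`) is not picked up as the comparison year.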


@@ -0,0 +1,434 @@
package com.kamco.cd.kamcoback.geojson.service;
import com.kamco.cd.kamcoback.geojson.config.GeoJsonMonitorConfig;
import com.kamco.cd.kamcoback.postgres.repository.MapSheetLearnDataRepository;
import com.kamco.cd.kamcoback.postgres.repository.MapSheetLearnDataGeomRepository;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Service;
import jakarta.annotation.PostConstruct;
import java.io.IOException;
import java.nio.file.*;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Stream;
/**
* GeoJSON 파일 모니터링 서비스
* 지정된 폴더를 주기적으로 모니터링하여 압축파일을 자동으로 처리합니다.
*/
@Slf4j
@Service
@RequiredArgsConstructor
public class GeoJsonFileMonitorService {
private final GeoJsonMonitorConfig config;
private final ArchiveExtractorService archiveExtractorService;
private final GeoJsonDataService geoJsonDataService;
private final GeometryConversionService geometryConversionService;
private final MapSheetLearnDataRepository learnDataRepository;
private final MapSheetLearnDataGeomRepository geomRepository;
/**
* 애플리케이션 시작 시 필요한 디렉토리들을 미리 생성
*/
@PostConstruct
public void initializeDirectories() {
try {
log.info("GeoJSON 모니터링 시스템 초기화 중...");
log.info("설정된 경로 - Watch: {}, Processed: {}, Error: {}, Temp: {}",
config.getWatchDirectory(), config.getProcessedDirectory(),
config.getErrorDirectory(), config.getTempDirectory());
ensureDirectoriesExist();
log.info("GeoJSON 모니터링 시스템 초기화 완료");
} catch (Exception e) {
log.warn("GeoJSON 모니터링 시스템 초기화 실패 - 스케줄러 실행 시 재시도됩니다", e);
// 초기화 실패해도 애플리케이션은 시작되도록 함 (RuntimeException 던지지 않음)
}
}
/**
* 스케줄러를 통한 파일 모니터링
* 설정된 cron 표현식에 따라 주기적으로 실행
*/
// @Scheduled(cron = "#{@geoJsonMonitorConfig.cronExpression}")
public void monitorFiles() {
log.debug("파일 모니터링 시작");
try {
// 모니터링 폴더 존재 확인 및 생성
ensureDirectoriesExist();
// 압축파일 검색 및 처리
processArchiveFiles();
// 미처리된 Geometry 변환 작업 수행
processUnprocessedGeometryData();
} catch (RuntimeException e) {
log.error("파일 모니터링 중 치명적 오류 발생 - 이번 주기 건너뜀", e);
} catch (Exception e) {
log.error("파일 모니터링 중 오류 발생", e);
}
log.debug("파일 모니터링 완료");
}
/**
* 필요한 디렉토리들이 존재하는지 확인하고 생성
*/
private void ensureDirectoriesExist() {
boolean hasError = false;
try {
createDirectoryIfNotExists(config.getWatchDirectory());
} catch (IOException e) {
log.error("Watch 디렉토리 생성 실패: {} - {}", config.getWatchDirectory(), e.getMessage());
hasError = true;
}
try {
createDirectoryIfNotExists(config.getProcessedDirectory());
} catch (IOException e) {
log.error("Processed 디렉토리 생성 실패: {} - {}", config.getProcessedDirectory(), e.getMessage());
hasError = true;
}
try {
createDirectoryIfNotExists(config.getErrorDirectory());
} catch (IOException e) {
log.error("Error 디렉토리 생성 실패: {} - {}", config.getErrorDirectory(), e.getMessage());
hasError = true;
}
try {
createDirectoryIfNotExists(config.getTempDirectory());
} catch (IOException e) {
log.error("Temp 디렉토리 생성 실패: {} - {}", config.getTempDirectory(), e.getMessage());
hasError = true;
}
if (hasError) {
log.warn("일부 디렉토리 생성에 실패했습니다. 해당 기능은 제한될 수 있습니다.");
log.info("수동으로 다음 디렉토리들을 생성해주세요:");
log.info(" - {}", config.getWatchDirectory());
log.info(" - {}", config.getProcessedDirectory());
log.info(" - {}", config.getErrorDirectory());
log.info(" - {}", config.getTempDirectory());
} else {
log.info("모든 필요한 디렉토리가 준비되었습니다.");
}
}
/**
* 디렉토리가 존재하지 않으면 생성
*/
private void createDirectoryIfNotExists(String directory) throws IOException {
if (directory == null || directory.trim().isEmpty()) {
throw new IllegalArgumentException("디렉토리 경로가 비어있습니다.");
}
Path path = Paths.get(directory);
if (!Files.exists(path)) {
try {
Files.createDirectories(path);
log.info("디렉토리 생성 완료: {}", directory);
// 디렉토리 권한 설정 (Unix/Linux 환경에서)
try {
if (!System.getProperty("os.name").toLowerCase().contains("windows")) {
// rwxrwxr-x 권한 적용
Files.setPosixFilePermissions(path,
java.nio.file.attribute.PosixFilePermissions.fromString("rwxrwxr-x"));
}
} catch (Exception permissionException) {
log.debug("권한 설정 실패 (무시됨): {}", permissionException.getMessage());
}
} catch (IOException e) {
log.error("디렉토리 생성 실패: {} - {}", directory, e.getMessage());
throw new IOException("디렉토리를 생성할 수 없습니다: " + directory, e);
}
} else if (!Files.isDirectory(path)) {
throw new IOException("지정된 경로가 디렉토리가 아닙니다: " + directory);
} else if (!Files.isWritable(path)) {
log.warn("디렉토리에 쓰기 권한이 없습니다: {}", directory);
} else {
log.debug("디렉토리가 이미 존재합니다: {}", directory);
}
}
/**
* 모니터링 폴더에서 압축파일들을 찾아서 처리
*/
private void processArchiveFiles() {
Path watchDir = Paths.get(config.getWatchDirectory());
// 디렉토리 존재 확인
if (!Files.exists(watchDir)) {
log.debug("Watch 디렉토리가 존재하지 않습니다: {}", watchDir);
return;
}
if (!Files.isDirectory(watchDir)) {
log.warn("Watch 경로가 디렉토리가 아닙니다: {}", watchDir);
return;
}
if (!Files.isReadable(watchDir)) {
log.warn("Watch 디렉토리에 읽기 권한이 없습니다: {}", watchDir);
return;
}
try (Stream<Path> files = Files.list(watchDir)) {
files.filter(Files::isRegularFile)
.filter(archiveExtractorService::isSupportedArchive)
.filter(archiveExtractorService::isFileSizeValid)
.forEach(this::processArchiveFile);
} catch (IOException e) {
log.error("파일 목록 조회 실패: {}", watchDir, e);
}
}
/**
* 개별 압축파일 처리
*/
private void processArchiveFile(Path archiveFile) {
String fileName = archiveFile.getFileName().toString();
log.info("압축파일 처리 시작: {}", fileName);
try {
// 1. 압축파일에서 GeoJSON 파일들 추출
Map<String, String> geoJsonContents = archiveExtractorService.extractGeoJsonFiles(archiveFile);
if (geoJsonContents.isEmpty()) {
log.warn("압축파일에서 GeoJSON 파일을 찾을 수 없습니다: {}", fileName);
moveFileToError(archiveFile, "GeoJSON 파일 없음");
return;
}
// 2. 처리 가능한 파일 수인지 확인
if (!geoJsonDataService.isProcessable(geoJsonContents)) {
log.warn("처리할 수 없는 파일입니다: {}", fileName);
moveFileToError(archiveFile, "처리 불가능한 파일");
return;
}
// 3. GeoJSON 데이터를 데이터베이스에 저장
List<Long> savedLearnDataIds = geoJsonDataService.processGeoJsonFiles(geoJsonContents, fileName);
if (savedLearnDataIds.isEmpty()) {
log.warn("저장된 학습 데이터가 없습니다: {}", fileName);
moveFileToError(archiveFile, "데이터 저장 실패");
return;
}
// 4. Geometry 데이터로 변환
List<Long> geometryIds = geometryConversionService.convertToGeometryData(savedLearnDataIds);
// 5. 처리 완료된 파일을 처리된 폴더로 이동
moveFileToProcessed(archiveFile);
log.info("압축파일 처리 완료: {} (학습 데이터: {}개, Geometry: {}개)",
fileName, savedLearnDataIds.size(), geometryIds.size());
} catch (Exception e) {
log.error("압축파일 처리 실패: {}", fileName, e);
try {
moveFileToError(archiveFile, "처리 중 오류 발생: " + e.getMessage());
} catch (IOException moveError) {
log.error("오류 파일 이동 실패: {}", fileName, moveError);
}
}
}
/**
* 미처리된 Geometry 변환 작업 수행
*/
private void processUnprocessedGeometryData() {
try {
List<Long> processedIds = geometryConversionService.processUnprocessedLearnData();
if (!processedIds.isEmpty()) {
log.info("미처리 Geometry 변환 완료: {}개", processedIds.size());
}
} catch (Exception e) {
log.error("미처리 Geometry 변환 작업 실패", e);
}
}
/**
* 처리 완료된 파일을 processed 폴더로 이동
*/
private void moveFileToProcessed(Path sourceFile) throws IOException {
String fileName = sourceFile.getFileName().toString();
String timestampedFileName = addTimestamp(fileName);
Path targetPath = Paths.get(config.getProcessedDirectory(), timestampedFileName);
Files.move(sourceFile, targetPath, StandardCopyOption.REPLACE_EXISTING);
log.info("파일을 처리된 폴더로 이동: {} -> {}", fileName, timestampedFileName);
}
/**
* 오류가 발생한 파일을 error 폴더로 이동
*/
private void moveFileToError(Path sourceFile, String errorReason) throws IOException {
String fileName = sourceFile.getFileName().toString();
String errorFileName = addTimestamp(fileName) + ".error";
Path targetPath = Paths.get(config.getErrorDirectory(), errorFileName);
Files.move(sourceFile, targetPath, StandardCopyOption.REPLACE_EXISTING);
// 오류 정보를 별도 파일로 저장
String errorInfoFileName = errorFileName + ".info";
Path errorInfoPath = Paths.get(config.getErrorDirectory(), errorInfoFileName);
String errorInfo = String.format("파일: %s%n오류 시간: %s%n오류 원인: %s%n",
fileName, java.time.Instant.now(), errorReason);
Files.write(errorInfoPath, errorInfo.getBytes());
log.warn("파일을 오류 폴더로 이동: {} (원인: {})", fileName, errorReason);
}
/**
* 파일명에 타임스탬프 추가
*/
private String addTimestamp(String fileName) {
int lastDotIndex = fileName.lastIndexOf('.');
String name = (lastDotIndex > 0) ? fileName.substring(0, lastDotIndex) : fileName;
String extension = (lastDotIndex > 0) ? fileName.substring(lastDotIndex) : "";
return String.format("%s_%d%s", name, System.currentTimeMillis(), extension);
}
/**
* 수동으로 특정 파일 처리 (테스트/관리 목적)
*/
public void processFileManually(String filePath) {
Path archiveFile = Paths.get(filePath);
if (!Files.exists(archiveFile)) {
log.error("파일이 존재하지 않습니다: {}", filePath);
return;
}
if (!archiveExtractorService.isSupportedArchive(archiveFile)) {
log.error("지원하지 않는 압축파일 형식입니다: {}", filePath);
return;
}
log.info("수동 파일 처리 시작: {}", filePath);
processArchiveFile(archiveFile);
}
/**
* 디렉토리 초기화를 수동으로 실행 (API에서 호출 가능)
*/
public void initializeDirectoriesManually() {
log.info("디렉토리 수동 초기화 시작");
try {
ensureDirectoriesExist();
log.info("디렉토리 수동 초기화 완료");
} catch (Exception e) {
log.error("디렉토리 수동 초기화 실패", e);
throw new RuntimeException("디렉토리 초기화 실패", e);
}
}
/**
* 모니터링 상태 정보 반환
*/
public Map<String, Object> getMonitorStatus() {
return Map.of(
"watchDirectory", config.getWatchDirectory(),
"processedDirectory", config.getProcessedDirectory(),
"errorDirectory", config.getErrorDirectory(),
"cronExpression", config.getCronExpression(),
"supportedExtensions", config.getSupportedExtensions(),
"maxFileSize", config.getMaxFileSize(),
"maxFileSizeMB", config.getMaxFileSize() / 1024 / 1024
);
}
/**
* 시스템 통계 정보 조회
*/
public Map<String, Object> getSystemStats() {
Map<String, Object> stats = new HashMap<>();
try {
// 데이터베이스 통계
long totalLearnData = learnDataRepository.count();
long totalGeomData = geomRepository.count();
long pendingAnalysis = learnDataRepository.countByAnalState("PENDING");
stats.put("database", Map.of(
"totalLearnData", totalLearnData,
"totalGeomData", totalGeomData,
"pendingAnalysis", pendingAnalysis
));
// 파일 시스템 통계
stats.put("fileSystem", getFileSystemStats());
// 모니터링 설정
stats.put("monitoring", Map.of(
"isActive", true,
"cronExpression", "0/30 * * * * *",
"watchDirectory", config.getWatchDirectory(),
"processedDirectory", config.getProcessedDirectory(),
"errorDirectory", config.getErrorDirectory()
));
} catch (Exception e) {
log.error("통계 정보 조회 실패", e);
stats.put("error", e.getMessage());
}
return stats;
}
/**
* 파일 시스템 통계 조회
*/
private Map<String, Object> getFileSystemStats() {
Map<String, Object> fileStats = new HashMap<>();
try {
// 각 디렉토리의 파일 수 계산
Path watchDir = Paths.get(config.getWatchDirectory());
Path processedDir = Paths.get(config.getProcessedDirectory());
Path errorDir = Paths.get(config.getErrorDirectory());
fileStats.put("watchDirectoryCount", countFilesInDirectory(watchDir));
fileStats.put("processedDirectoryCount", countFilesInDirectory(processedDir));
fileStats.put("errorDirectoryCount", countFilesInDirectory(errorDir));
} catch (Exception e) {
log.warn("파일 시스템 통계 조회 실패: {}", e.getMessage());
fileStats.put("error", e.getMessage());
}
return fileStats;
}
/**
* 디렉토리 내 파일 개수 계산
*/
private long countFilesInDirectory(Path directory) {
if (!Files.exists(directory) || !Files.isDirectory(directory)) {
return 0;
}
try (Stream<Path> files = Files.list(directory)) {
return files.filter(Files::isRegularFile).count();
} catch (IOException e) {
log.warn("디렉토리 파일 계산 실패: {}", directory, e);
return 0;
}
}
}
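The `addTimestamp` helper splits on the last dot only, which has a small quirk for the compound archive extensions this monitor accepts: for `batch.tar.gz` the timestamp lands between `.tar` and `.gz`. A standalone sketch with the timestamp passed in as a parameter for determinism (names are illustrative, not part of the codebase):

```java
public class TimestampedName {
    /**
     * Inserts a millisecond timestamp before the LAST extension:
     * "data.zip" -> "data_1700000000000.zip". For compound extensions only
     * the final segment is preserved: "batch.tar.gz" -> "batch.tar_..._.gz".
     */
    static String addTimestamp(String fileName, long millis) {
        int lastDotIndex = fileName.lastIndexOf('.');
        String name = (lastDotIndex > 0) ? fileName.substring(0, lastDotIndex) : fileName;
        String extension = (lastDotIndex > 0) ? fileName.substring(lastDotIndex) : "";
        return String.format("%s_%d%s", name, millis, extension);
    }

    public static void main(String[] args) {
        System.out.println(addTimestamp("data.zip", 1700000000000L));
        // data_1700000000000.zip
        System.out.println(addTimestamp("batch.tar.gz", 1700000000000L));
        // batch.tar_1700000000000.gz
    }
}
```

The service calls this with `System.currentTimeMillis()`, so repeated uploads of the same archive name never collide in the `processed` or `error` folders.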


@@ -0,0 +1,448 @@
package com.kamco.cd.kamcoback.geojson.service;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.kamco.cd.kamcoback.postgres.entity.MapSheetLearnDataEntity;
import com.kamco.cd.kamcoback.postgres.entity.MapSheetLearnDataGeomEntity;
import com.kamco.cd.kamcoback.postgres.repository.MapSheetLearnDataGeomRepository;
import com.kamco.cd.kamcoback.postgres.repository.MapSheetLearnDataRepository;
import java.time.ZonedDateTime;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.locationtech.jts.geom.*;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.util.*;
/**
* Geometry 데이터 변환 서비스
*/
@Slf4j
@Service
@RequiredArgsConstructor
public class GeometryConversionService {
private final MapSheetLearnDataRepository mapSheetLearnDataRepository;
private final MapSheetLearnDataGeomRepository mapSheetLearnDataGeomRepository;
private final ObjectMapper objectMapper;
private final GeometryFactory geometryFactory = new GeometryFactory(new PrecisionModel(), 4326); // 기본 CRS(EPSG:4326)에 맞춰 SRID 지정
/**
* MapSheetLearnData의 JSON 데이터를 기반으로 Geometry 테이블에 저장
*/
@Transactional
public List<Long> convertToGeometryData(List<Long> learnDataIds) {
List<Long> processedIds = new ArrayList<>();
log.info("Geometry 변환 시작: {} 개의 학습 데이터", learnDataIds.size());
for (Long dataId : learnDataIds) {
try {
if (dataId != null) {
Optional<MapSheetLearnDataEntity> learnDataOpt = mapSheetLearnDataRepository.findById(dataId);
if (learnDataOpt.isPresent()) {
List<Long> geometryIds = processLearnDataToGeometry(learnDataOpt.get());
processedIds.addAll(geometryIds);
log.debug("학습 데이터 {} 에서 {} 개의 geometry 데이터 생성", dataId, geometryIds.size());
} else {
log.warn("학습 데이터를 찾을 수 없습니다: {}", dataId);
}
}
} catch (Exception e) {
log.error("Geometry 변환 실패 - 학습 데이터 ID: {}", dataId, e);
// 개별 변환 실패는 전체 처리를 중단시키지 않음
}
}
log.info("Geometry 변환 완료: {} 개 처리, {} 개의 geometry 생성", learnDataIds.size(), processedIds.size());
return processedIds;
}
/**
* 개별 학습 데이터를 Geometry 데이터로 변환
*/
private List<Long> processLearnDataToGeometry(MapSheetLearnDataEntity learnData) {
List<Long> geometryIds = new ArrayList<>();
try {
// 기존 geometry 데이터 삭제 (재생성)
mapSheetLearnDataGeomRepository.deleteByDataUid(learnData.getId());
// JSON 데이터에서 GeoJSON 추출
Map<String, Object> dataJson = learnData.getDataJson();
if (dataJson == null || dataJson.isEmpty()) {
log.warn("JSON 데이터가 없습니다: {}", learnData.getId());
return geometryIds;
}
// JSON을 GeoJSON으로 파싱
String geoJsonString = objectMapper.writeValueAsString(dataJson);
JsonNode geoJsonNode = objectMapper.readTree(geoJsonString);
// GeoJSON 타입에 따라 처리 (파싱 fallback으로 저장된 JSON에는 type이 없을 수 있음)
JsonNode typeNode = geoJsonNode.get("type");
if (typeNode == null) {
log.warn("GeoJSON type 필드가 없어 geometry 변환을 건너뜁니다: {}", learnData.getId());
return geometryIds;
}
String type = typeNode.asText();
switch (type) {
case "FeatureCollection":
geometryIds.addAll(processFeatureCollection(geoJsonNode, learnData));
break;
case "Feature":
Long geometryId = processFeature(geoJsonNode, learnData);
if (geometryId != null) {
geometryIds.add(geometryId);
}
break;
case "Point":
case "LineString":
case "Polygon":
case "MultiPoint":
case "MultiLineString":
case "MultiPolygon":
Long directGeometryId = processDirectGeometry(geoJsonNode, learnData);
if (directGeometryId != null) {
geometryIds.add(directGeometryId);
}
break;
default:
log.warn("지원하지 않는 GeoJSON type: {} (데이터 ID: {})", type, learnData.getId());
}
} catch (Exception e) {
log.error("Geometry 변환 실패: 학습 데이터 ID {}", learnData.getId(), e);
throw new RuntimeException("Geometry 변환 실패", e);
}
return geometryIds;
}
/**
* FeatureCollection 처리
*/
private List<Long> processFeatureCollection(JsonNode featureCollectionNode, MapSheetLearnDataEntity learnData) {
List<Long> geometryIds = new ArrayList<>();
if (!featureCollectionNode.has("features")) {
log.warn("FeatureCollection에 features 배열이 없습니다: {}", learnData.getId());
return geometryIds;
}
JsonNode featuresNode = featureCollectionNode.get("features");
if (featuresNode.isArray()) {
for (JsonNode featureNode : featuresNode) {
try {
Long geometryId = processFeature(featureNode, learnData);
if (geometryId != null) {
geometryIds.add(geometryId);
}
} catch (Exception e) {
log.error("Feature 처리 실패 (학습 데이터 ID: {})", learnData.getId(), e);
}
}
}
return geometryIds;
}
/**
* Feature 처리
*/
private Long processFeature(JsonNode featureNode, MapSheetLearnDataEntity learnData) {
try {
if (!featureNode.has("geometry")) {
log.warn("Feature에 geometry가 없습니다: {}", learnData.getId());
return null;
}
JsonNode geometryNode = featureNode.get("geometry");
JsonNode propertiesNode = featureNode.has("properties") ? featureNode.get("properties") : null;
return createGeometryEntity(geometryNode, propertiesNode, learnData);
} catch (Exception e) {
log.error("Feature 처리 중 오류 (학습 데이터 ID: {})", learnData.getId(), e);
return null;
}
}
/**
* 직접 Geometry 처리
*/
private Long processDirectGeometry(JsonNode geometryNode, MapSheetLearnDataEntity learnData) {
return createGeometryEntity(geometryNode, null, learnData);
}
/**
* GeometryEntity 생성 및 저장
*/
private Long createGeometryEntity(JsonNode geometryNode, JsonNode propertiesNode, MapSheetLearnDataEntity learnData) {
try {
MapSheetLearnDataGeomEntity geometryEntity = new MapSheetLearnDataGeomEntity();
// 기본 정보 설정
geometryEntity.setDataUid(learnData.getId());
geometryEntity.setBeforeYyyy(learnData.getDataYyyy());
geometryEntity.setAfterYyyy(learnData.getCompareYyyy());
// Geometry 변환 및 설정
Geometry geometry = parseGeometryFromGeoJson(geometryNode);
if (geometry != null) {
geometryEntity.setGeom(geometry);
geometryEntity.setGeoType(geometry.getGeometryType());
// 면적 계산 (Polygon인 경우)
if (geometry instanceof Polygon || geometry.getGeometryType().contains("Polygon")) {
double area = geometry.getArea();
geometryEntity.setArea(area);
}
} else {
log.warn("Geometry 변환 실패: {}", geometryNode);
return null;
}
// Properties에서 추가 정보 추출
if (propertiesNode != null) {
extractPropertiesData(geometryEntity, propertiesNode, learnData);
}
// 시간 정보 설정
ZonedDateTime now = ZonedDateTime.now();
geometryEntity.setCreatedDttm(now);
geometryEntity.setUpdatedDttm(now);
// 저장
MapSheetLearnDataGeomEntity savedEntity = mapSheetLearnDataGeomRepository.save(geometryEntity);
return savedEntity.getId();
} catch (Exception e) {
log.error("GeometryEntity 생성 실패 (학습 데이터 ID: {})", learnData.getId(), e);
return null;
}
}
/**
* GeoJSON 노드에서 JTS Geometry 객체 생성
*/
private Geometry parseGeometryFromGeoJson(JsonNode geometryNode) {
try {
if (!geometryNode.has("type") || !geometryNode.has("coordinates")) {
log.warn("유효하지 않은 Geometry 형식: type 또는 coordinates가 없습니다.");
return null;
}
String geometryType = geometryNode.get("type").asText();
JsonNode coordinatesNode = geometryNode.get("coordinates");
switch (geometryType.toLowerCase()) {
case "point":
return createPoint(coordinatesNode);
case "linestring":
return createLineString(coordinatesNode);
case "polygon":
return createPolygon(coordinatesNode);
case "multipoint":
return createMultiPoint(coordinatesNode);
case "multilinestring":
return createMultiLineString(coordinatesNode);
case "multipolygon":
return createMultiPolygon(coordinatesNode);
default:
log.warn("지원하지 않는 Geometry 타입: {}", geometryType);
return null;
}
} catch (Exception e) {
log.error("Geometry 파싱 실패", e);
return null;
}
}
private Point createPoint(JsonNode coordinatesNode) {
if (coordinatesNode.size() < 2) return null;
double x = coordinatesNode.get(0).asDouble();
double y = coordinatesNode.get(1).asDouble();
return geometryFactory.createPoint(new Coordinate(x, y));
}
private LineString createLineString(JsonNode coordinatesNode) {
List<Coordinate> coords = new ArrayList<>();
for (JsonNode coordNode : coordinatesNode) {
if (coordNode.size() >= 2) {
coords.add(new Coordinate(coordNode.get(0).asDouble(), coordNode.get(1).asDouble()));
}
}
return geometryFactory.createLineString(coords.toArray(new Coordinate[0]));
}
private Polygon createPolygon(JsonNode coordinatesNode) {
if (coordinatesNode.size() == 0) return null;
// Exterior ring
JsonNode exteriorRing = coordinatesNode.get(0);
List<Coordinate> coords = new ArrayList<>();
for (JsonNode coordNode : exteriorRing) {
if (coordNode.size() >= 2) {
coords.add(new Coordinate(coordNode.get(0).asDouble(), coordNode.get(1).asDouble()));
}
}
if (coords.size() < 3) return null;
// Close ring if not already closed
if (!coords.get(0).equals2D(coords.get(coords.size() - 1))) {
coords.add(new Coordinate(coords.get(0)));
}
LinearRing shell = geometryFactory.createLinearRing(coords.toArray(new Coordinate[0]));
// Interior rings (holes) - 유효하지 않은 hole은 건너뛰고 List로 수집하여 null 원소가 JTS로 전달되지 않도록 함
List<LinearRing> holes = new ArrayList<>();
for (int i = 1; i < coordinatesNode.size(); i++) {
JsonNode holeRing = coordinatesNode.get(i);
List<Coordinate> holeCoords = new ArrayList<>();
for (JsonNode coordNode : holeRing) {
if (coordNode.size() >= 2) {
holeCoords.add(new Coordinate(coordNode.get(0).asDouble(), coordNode.get(1).asDouble()));
}
}
if (holeCoords.size() >= 3) {
if (!holeCoords.get(0).equals2D(holeCoords.get(holeCoords.size() - 1))) {
holeCoords.add(new Coordinate(holeCoords.get(0)));
}
holes.add(geometryFactory.createLinearRing(holeCoords.toArray(new Coordinate[0])));
}
}
return geometryFactory.createPolygon(shell, holes.toArray(new LinearRing[0]));
}
private MultiPoint createMultiPoint(JsonNode coordinatesNode) {
List<Point> points = new ArrayList<>();
for (JsonNode pointNode : coordinatesNode) {
Point point = createPoint(pointNode);
if (point != null) {
points.add(point);
}
}
return geometryFactory.createMultiPoint(points.toArray(new Point[0]));
}
private MultiLineString createMultiLineString(JsonNode coordinatesNode) {
List<LineString> lineStrings = new ArrayList<>();
for (JsonNode lineNode : coordinatesNode) {
LineString line = createLineString(lineNode);
if (line != null) {
lineStrings.add(line);
}
}
return geometryFactory.createMultiLineString(lineStrings.toArray(new LineString[0]));
}
private MultiPolygon createMultiPolygon(JsonNode coordinatesNode) {
List<Polygon> polygons = new ArrayList<>();
for (JsonNode polygonNode : coordinatesNode) {
Polygon polygon = createPolygon(polygonNode);
if (polygon != null) {
polygons.add(polygon);
}
}
return geometryFactory.createMultiPolygon(polygons.toArray(new Polygon[0]));
}
/**
* Properties에서 추가 정보 추출
*/
private void extractPropertiesData(MapSheetLearnDataGeomEntity geometryEntity, JsonNode propertiesNode, MapSheetLearnDataEntity learnData) {
// CD 정확도 정보
if (propertiesNode.has("cd_prob")) {
try {
double cdProb = propertiesNode.get("cd_prob").asDouble();
geometryEntity.setCdProb(cdProb);
} catch (Exception e) {
log.debug("cd_prob 파싱 실패", e);
}
}
// Before class 정보
if (propertiesNode.has("class_before_name")) {
geometryEntity.setClassBeforeName(propertiesNode.get("class_before_name").asText());
}
if (propertiesNode.has("class_before_prob")) {
try {
double beforeProb = propertiesNode.get("class_before_prob").asDouble();
geometryEntity.setClassBeforeProb(beforeProb);
} catch (Exception e) {
log.debug("class_before_prob 파싱 실패", e);
}
}
// After class 정보
if (propertiesNode.has("class_after_name")) {
geometryEntity.setClassAfterName(propertiesNode.get("class_after_name").asText());
}
if (propertiesNode.has("class_after_prob")) {
try {
double afterProb = propertiesNode.get("class_after_prob").asDouble();
geometryEntity.setClassAfterProb(afterProb);
} catch (Exception e) {
log.debug("class_after_prob 파싱 실패", e);
}
}
// 도엽 번호
if (propertiesNode.has("map_sheet_num")) {
try {
long mapSheetNum = propertiesNode.get("map_sheet_num").asLong();
geometryEntity.setMapSheetNum(mapSheetNum);
} catch (Exception e) {
log.debug("map_sheet_num 파싱 실패", e);
}
}
// 면적 (properties에서 제공되는 경우)
if (propertiesNode.has("area")) {
try {
double area = propertiesNode.get("area").asDouble();
geometryEntity.setArea(area);
} catch (Exception e) {
log.debug("area 파싱 실패", e);
}
}
}
/**
* 미처리된 학습 데이터들을 찾아서 자동으로 geometry 변환 수행
*/
@Transactional
public List<Long> processUnprocessedLearnData() {
// 분석 상태가 PENDING인 학습 데이터 조회
List<MapSheetLearnDataEntity> unprocessedData = mapSheetLearnDataRepository.findByAnalState("PENDING");
if (unprocessedData.isEmpty()) {
log.debug("처리할 미완료 학습 데이터가 없습니다.");
return new ArrayList<>();
}
log.info("미처리 학습 데이터 {}개에 대해 geometry 변환을 수행합니다.", unprocessedData.size());
List<Long> processedIds = new ArrayList<>();
for (MapSheetLearnDataEntity data : unprocessedData) {
try {
List<Long> geometryIds = processLearnDataToGeometry(data);
processedIds.addAll(geometryIds);
// 처리 완료 상태로 업데이트
data.setAnalState("COMPLETED");
mapSheetLearnDataRepository.save(data);
} catch (Exception e) {
log.error("미처리 학습 데이터 처리 실패: {}", data.getId(), e);
// 실패한 경우 ERROR 상태로 설정
data.setAnalState("ERROR");
mapSheetLearnDataRepository.save(data);
}
}
return processedIds;
}
}
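Per RFC 7946, a GeoJSON linear ring must be explicitly closed (first and last positions identical), and `createPolygon` defensively closes unclosed rings before handing them to JTS. The closing step, sketched without the JTS dependency (names are illustrative; coordinates represented as `double[]{x, y}`):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class RingClosing {
    /**
     * Returns a copy of the ring with the first coordinate appended at the
     * end if the ring is not already closed (first position == last position).
     */
    static List<double[]> closeRing(List<double[]> coords) {
        List<double[]> ring = new ArrayList<>(coords);
        double[] first = ring.get(0);
        double[] last = ring.get(ring.size() - 1);
        if (first[0] != last[0] || first[1] != last[1]) {
            ring.add(new double[] {first[0], first[1]});
        }
        return ring;
    }

    public static void main(String[] args) {
        List<double[]> open = Arrays.asList(
            new double[] {0, 0}, new double[] {1, 0}, new double[] {1, 1});
        System.out.println(closeRing(open).size()); // 4
    }
}
```

JTS's `createLinearRing` throws on unclosed rings, so this normalization is what lets the service accept GeoJSON from producers that omit the closing coordinate.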


@@ -0,0 +1,160 @@
package com.kamco.cd.kamcoback.inference;
import com.kamco.cd.kamcoback.config.api.ApiResponseDto;
import com.kamco.cd.kamcoback.inference.dto.InferenceResultDto;
import com.kamco.cd.kamcoback.inference.dto.InferenceResultDto.Dashboard;
import com.kamco.cd.kamcoback.inference.dto.InferenceResultDto.Detail;
import com.kamco.cd.kamcoback.inference.dto.InferenceResultDto.SearchReq;
import com.kamco.cd.kamcoback.inference.service.InferenceResultService;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.Parameter;
import io.swagger.v3.oas.annotations.media.Content;
import io.swagger.v3.oas.annotations.media.Schema;
import io.swagger.v3.oas.annotations.responses.ApiResponse;
import io.swagger.v3.oas.annotations.responses.ApiResponses;
import io.swagger.v3.oas.annotations.tags.Tag;
import java.util.List;
import lombok.RequiredArgsConstructor;
import org.springframework.data.domain.Page;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
@Tag(name = "분석결과", description = "추론관리 분석결과")
@RequestMapping("/api/inf/res")
@RequiredArgsConstructor
@RestController
public class InferenceResultApiController {
private final InferenceResultService inferenceResultService;
@Operation(
summary = "추론관리 분석결과 목록 조회",
description =
"분석상태, 제목으로 분석결과를 조회 합니다.")
@ApiResponses(
value = {
@ApiResponse(
responseCode = "200",
description = "검색 성공",
content =
@Content(
mediaType = "application/json",
schema = @Schema(implementation = Page.class))),
@ApiResponse(responseCode = "400", description = "잘못된 검색 조건", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/list")
public ApiResponseDto<Page<InferenceResultDto.AnalResList>> getInferenceResultList(
@Parameter(description = "분석상태", example = "0000")
@RequestParam(required = false)
String statCode,
@Parameter(description = "제목", example = "2023_2024년도") @RequestParam(required = false)
String title,
@Parameter(description = "페이지 번호 (0부터 시작)", example = "0") @RequestParam(defaultValue = "0")
int page,
@Parameter(description = "페이지 크기", example = "20") @RequestParam(defaultValue = "20")
int size,
@Parameter(description = "정렬 조건 (형식: 필드명,방향)", example = "name,asc")
@RequestParam(required = false)
String sort
) {
InferenceResultDto.SearchReq searchReq = new InferenceResultDto.SearchReq(statCode, title, page, size, sort);
Page<InferenceResultDto.AnalResList> analResList = inferenceResultService.getInferenceResultList(searchReq);
return ApiResponseDto.ok(analResList);
}
@Operation(
summary = "추론관리 분석결과 요약정보",
description =
"분석결과 요약정보를 조회합니다.")
@ApiResponses(
value = {
@ApiResponse(
responseCode = "200",
description = "검색 성공",
content =
@Content(
mediaType = "application/json",
schema = @Schema(implementation = InferenceResultDto.AnalResSummary.class))),
@ApiResponse(responseCode = "400", description = "잘못된 검색 조건", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/summary")
public ApiResponseDto<InferenceResultDto.AnalResSummary> getInferenceResultSummary(
@Parameter(description = "목록 id", example = "1")
@RequestParam Long id) {
return ApiResponseDto.ok(inferenceResultService.getInferenceResultSummary(id));
}
@Operation(
summary = "추론관리 분석결과 상세",
        description = "분석결과 상세 정보 (Summary, Dashboard)")
@ApiResponses(
value = {
@ApiResponse(
responseCode = "200",
description = "검색 성공",
content =
@Content(
mediaType = "application/json",
schema = @Schema(implementation = InferenceResultDto.Detail.class))),
@ApiResponse(responseCode = "400", description = "잘못된 검색 조건", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/detail")
public ApiResponseDto<InferenceResultDto.Detail> getInferenceDetail(
@Parameter(description = "목록 id", example = "1")
@RequestParam Long id) {
// summary
InferenceResultDto.AnalResSummary summary = inferenceResultService.getInferenceResultSummary(id);
        // dashboard
List<InferenceResultDto.Dashboard> dashboardList = this.getInferenceResultDashboard(id);
return ApiResponseDto.ok(new Detail(summary, dashboardList));
}
@Operation(
summary = "추론관리 분석결과 상세 목록",
description =
"추론관리 분석결과 상세 목록 geojson 데이터 조회")
@ApiResponses(
value = {
@ApiResponse(
responseCode = "200",
description = "검색 성공",
content =
@Content(
mediaType = "application/json",
schema = @Schema(implementation = Page.class))),
@ApiResponse(responseCode = "400", description = "잘못된 검색 조건", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/geom")
public ApiResponseDto<Page<InferenceResultDto.Geom>> getInferenceResultGeomList(
@Parameter(description = "기준년도 분류", example = "0001") @RequestParam(required = false) String targetClass,
@Parameter(description = "비교년도 분류", example = "0002") @RequestParam(required = false) String compareClass,
        @Parameter(description = "1:5000 도엽번호", example = "37801011,37801012") @RequestParam(required = false) List<Long> mapSheetNum,
@Parameter(description = "페이지 번호 (0부터 시작)", example = "0") @RequestParam(defaultValue = "0") int page,
@Parameter(description = "페이지 크기", example = "20") @RequestParam(defaultValue = "20") int size,
@Parameter(description = "정렬 조건 (형식: 필드명,방향)", example = "name,asc") @RequestParam(required = false) String sort
) {
InferenceResultDto.SearchGeoReq searchGeoReq = new InferenceResultDto.SearchGeoReq(targetClass, compareClass, mapSheetNum, page, size, sort);
Page<InferenceResultDto.Geom> geomList = inferenceResultService.getInferenceResultGeomList(searchGeoReq);
return ApiResponseDto.ok(geomList);
}
    /**
     * Fetches the dashboard rows for the analysis-result detail view.
     *
     * @param id analysis result id
     * @return dashboard rows for the given result
     */
private List<Dashboard> getInferenceResultDashboard(Long id) {
return inferenceResultService.getInferenceResultBasic(id);
}
}

View File

@@ -0,0 +1,310 @@
package com.kamco.cd.kamcoback.inference.dto;
import com.kamco.cd.kamcoback.common.utils.interfaces.JsonFormatDttm;
import io.swagger.v3.oas.annotations.media.Schema;
import java.time.ZonedDateTime;
import java.util.List;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
public class InferenceResultDto {
@Schema(name = "InferenceResultBasic", description = "분석결과 기본 정보")
@Getter
    public static class Basic {
        private Long id;
        private String dataName;
        private Long mapSheetNum;
        private Long detectingCnt;
        @JsonFormatDttm
        private ZonedDateTime analStrtDttm;
        @JsonFormatDttm
        private ZonedDateTime analEndDttm;
        private Long analSec;
        private String analState;
        public Basic(
            Long id,
            String dataName,
            Long mapSheetNum,
            Long detectingCnt,
            ZonedDateTime analStrtDttm,
            ZonedDateTime analEndDttm,
            Long analSec,
            String analState
        ) {
            this.id = id;
            this.dataName = dataName;
            this.mapSheetNum = mapSheetNum;
            this.detectingCnt = detectingCnt;
            this.analStrtDttm = analStrtDttm;
            this.analEndDttm = analEndDttm;
            this.analSec = analSec;
            this.analState = analState;
        }
    }
@Schema(name = "AnalysisResultList", description = "분석결과 목록")
@Getter
public static class AnalResList {
private Long id;
private String analTitle;
private String analMapSheet;
private Long detectingCnt;
@JsonFormatDttm
private ZonedDateTime analStrtDttm;
@JsonFormatDttm
private ZonedDateTime analEndDttm;
private Long analSec;
private Long analPredSec;
private String analState;
private String analStateNm;
private String gukyuinUsed;
public AnalResList(
Long id,
String analTitle,
String analMapSheet,
Long detectingCnt,
ZonedDateTime analStrtDttm,
ZonedDateTime analEndDttm,
Long analSec,
Long analPredSec,
String analState,
String analStateNm,
String gukyuinUsed
) {
this.id = id;
this.analTitle = analTitle;
this.analMapSheet = analMapSheet;
this.detectingCnt = detectingCnt;
this.analStrtDttm = analStrtDttm;
this.analEndDttm = analEndDttm;
this.analSec = analSec;
this.analPredSec = analPredSec;
this.analState = analState;
this.analStateNm = analStateNm;
this.gukyuinUsed = gukyuinUsed;
}
}
@Schema(name = "AnalysisResultSummary", description = "분석결과 요약정보")
@Getter
public static class AnalResSummary {
private Long id;
private String modelInfo;
private Integer targetYyyy;
private Integer compareYyyy;
private String analMapSheet;
@JsonFormatDttm
private ZonedDateTime analStrtDttm;
@JsonFormatDttm
private ZonedDateTime analEndDttm;
private Long analSec;
private Long analPredSec;
private String resultUrl;
private Long detectingCnt;
private Double accuracy;
private String analState;
private String analStateNm;
public AnalResSummary(
Long id,
String modelInfo,
Integer targetYyyy,
Integer compareYyyy,
String analMapSheet,
ZonedDateTime analStrtDttm,
ZonedDateTime analEndDttm,
Long analSec,
Long analPredSec,
String resultUrl,
Long detectingCnt,
Double accuracy,
String analState,
String analStateNm
) {
this.id = id;
this.modelInfo = modelInfo;
this.targetYyyy = targetYyyy;
this.compareYyyy = compareYyyy;
this.analMapSheet = analMapSheet;
this.analStrtDttm = analStrtDttm;
this.analEndDttm = analEndDttm;
this.analSec = analSec;
this.analPredSec = analPredSec;
this.resultUrl = resultUrl;
this.detectingCnt = detectingCnt;
this.accuracy = accuracy;
this.analState = analState;
this.analStateNm = analStateNm;
}
}
@Getter
public static class Dashboard {
Integer compareYyyy;
Integer targetYyyy;
Long mapSheetNum;
String classBeforeName;
String classAfterName;
Long classBeforeCnt;
Long classAfterCnt;
@JsonFormatDttm
ZonedDateTime createdDttm;
Long createdUid;
@JsonFormatDttm
ZonedDateTime updatedDttm;
Long updatedUid;
Long refMapSheetNum;
Long dataUid;
public Dashboard(
Integer compareYyyy,
Integer targetYyyy,
Long mapSheetNum,
String classBeforeName,
String classAfterName,
Long classBeforeCnt,
Long classAfterCnt,
ZonedDateTime createdDttm,
Long createdUid,
ZonedDateTime updatedDttm,
Long updatedUid,
Long refMapSheetNum,
Long dataUid
) {
this.compareYyyy = compareYyyy;
this.targetYyyy = targetYyyy;
this.mapSheetNum = mapSheetNum;
this.classBeforeName = classBeforeName;
this.classAfterName = classAfterName;
this.classBeforeCnt = classBeforeCnt;
this.classAfterCnt = classAfterCnt;
this.createdDttm = createdDttm;
this.createdUid = createdUid;
this.updatedDttm = updatedDttm;
this.updatedUid = updatedUid;
this.refMapSheetNum = refMapSheetNum;
this.dataUid = dataUid;
}
}
@Getter
public static class Detail {
AnalResSummary summary;
List<Dashboard> dashboard;
public Detail(
AnalResSummary summary,
List<Dashboard> dashboard
) {
this.summary = summary;
this.dashboard = dashboard;
}
}
@Getter
public static class Geom {
Integer compareYyyy;
Integer targetYyyy;
String classBeforeCd;
String classBeforeName;
Double classBeforeProb;
String classAfterCd;
String classAfterName;
Double classAfterProb;
Long mapSheetNum;
public Geom(
Integer compareYyyy,
Integer targetYyyy,
String classBeforeCd,
String classBeforeName,
Double classBeforeProb,
String classAfterCd,
String classAfterName,
Double classAfterProb,
Long mapSheetNum
) {
this.compareYyyy = compareYyyy;
this.targetYyyy = targetYyyy;
this.classBeforeCd = classBeforeCd;
this.classBeforeName = classBeforeName;
this.classBeforeProb = classBeforeProb;
this.classAfterCd = classAfterCd;
this.classAfterName = classAfterName;
this.classAfterProb = classAfterProb;
this.mapSheetNum = mapSheetNum;
}
}
@Schema(name = "InferenceResultSearchReq", description = "분석결과 목록 요청 정보")
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
    public static class SearchReq {
        // search filters
        private String statCode;
        private String title;
        // paging parameters
        private int page = 0;
        private int size = 20;
        private String sort;
public Pageable toPageable() {
if (sort != null && !sort.isEmpty()) {
String[] sortParams = sort.split(",");
String property = sortParams[0];
Sort.Direction direction =
sortParams.length > 1 ? Sort.Direction.fromString(sortParams[1]) : Sort.Direction.ASC;
return PageRequest.of(page, size, Sort.by(direction, property));
}
return PageRequest.of(page, size);
}
}
    @Schema(name = "InferenceResultSearchGeoReq", description = "분석결과 상세 목록 요청 정보")
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
    public static class SearchGeoReq {
        // base-year classification code
        private String targetClass;
        // comparison-year classification code
        private String compareClass;
        // analysis map sheet numbers
        private List<Long> mapSheetNum;
        // paging parameters
        private int page = 0;
        private int size = 20;
        private String sort;
public Pageable toPageable() {
if (sort != null && !sort.isEmpty()) {
String[] sortParams = sort.split(",");
String property = sortParams[0];
Sort.Direction direction =
sortParams.length > 1 ? Sort.Direction.fromString(sortParams[1]) : Sort.Direction.ASC;
return PageRequest.of(page, size, Sort.by(direction, property));
}
return PageRequest.of(page, size);
}
}
}
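Both request DTOs above turn a `sort` string of the form `field,direction` into a Spring `Pageable` inside `toPageable()`, defaulting to ascending order when no direction is given. The parsing convention can be sketched framework-free; the `SortSpec` holder below is a hypothetical stand-in for Spring's `Sort`, not part of the codebase:

```java
// Minimal sketch of the "field,direction" sort-string convention used by
// SearchReq.toPageable() and SearchGeoReq.toPageable(). Direction defaults
// to ASC when the string carries no direction part; an empty or null sort
// yields null, mirroring the fallback to an unsorted PageRequest.
public class SortSpec {
    public final String property;
    public final String direction; // "ASC" or "DESC"

    private SortSpec(String property, String direction) {
        this.property = property;
        this.direction = direction;
    }

    public static SortSpec parse(String sort) {
        if (sort == null || sort.isEmpty()) {
            return null; // caller falls back to PageRequest.of(page, size)
        }
        String[] parts = sort.split(",");
        String dir = parts.length > 1 ? parts[1].trim().toUpperCase() : "ASC";
        return new SortSpec(parts[0].trim(), dir);
    }
}
```

Note that, like the original, this sketch does not validate the field name; an unknown property would only fail later, when the repository applies the sort.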

View File

@@ -0,0 +1,57 @@
package com.kamco.cd.kamcoback.inference.service;
import com.kamco.cd.kamcoback.inference.dto.InferenceResultDto;
import com.kamco.cd.kamcoback.inference.dto.InferenceResultDto.Basic;
import com.kamco.cd.kamcoback.inference.dto.InferenceResultDto.Dashboard;
import com.kamco.cd.kamcoback.postgres.core.InferenceResultCoreService;
import java.util.List;
import lombok.RequiredArgsConstructor;
import org.springframework.data.domain.Page;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
@Service
@RequiredArgsConstructor
@Transactional(readOnly = true)
public class InferenceResultService {
private final InferenceResultCoreService inferenceResultCoreService;
    /**
     * Inference management > analysis-result list.
     *
     * @param searchReq search filters and paging
     * @return paged analysis-result list
     */
public Page<InferenceResultDto.AnalResList> getInferenceResultList(InferenceResultDto.SearchReq searchReq) {
return inferenceResultCoreService.getInferenceResultList(searchReq);
}
    /**
     * Analysis-result summary.
     *
     * @param id analysis result id
     * @return summary for the given result
     */
public InferenceResultDto.AnalResSummary getInferenceResultSummary(Long id) {
return inferenceResultCoreService.getInferenceResultSummary(id);
}
    /**
     * Analysis-result dashboard rows.
     *
     * @param id analysis result id
     * @return dashboard rows for the given result
     */
public List<Dashboard> getInferenceResultBasic(Long id) {
return inferenceResultCoreService.getInferenceResultDashboard(id);
}
    /**
     * Analysis-result detail list (geometry rows).
     *
     * @param searchGeoReq search filters and paging
     * @return paged geometry rows
     */
public Page<InferenceResultDto.Geom> getInferenceResultGeomList(InferenceResultDto.SearchGeoReq searchGeoReq) {
return inferenceResultCoreService.getInferenceResultGeomList(searchGeoReq);
}
}

View File

@@ -2,6 +2,7 @@ package com.kamco.cd.kamcoback.log;
import com.kamco.cd.kamcoback.config.api.ApiResponseDto;
import com.kamco.cd.kamcoback.log.dto.AuditLogDto;
import com.kamco.cd.kamcoback.log.service.AuditLogService;
import com.kamco.cd.kamcoback.postgres.core.AuditLogCoreService;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.tags.Tag;
@@ -16,92 +17,92 @@ import org.springframework.web.bind.annotation.RestController;
@Tag(name = "감사 로그", description = "감사 로그 관리 API")
@RequiredArgsConstructor
@RestController
@RequestMapping({"/api/log/audit", "/v1/api/log/audit"})
@RequestMapping("/api/log/audit")
public class AuditLogApiController {
private final AuditLogCoreService auditLogCoreService;
private final AuditLogService auditLogService;
@Operation(summary = "일자별 로그 조회")
@GetMapping("/daily")
public ApiResponseDto<Page<AuditLogDto.AuditList>> getDailyLogs(
public ApiResponseDto<Page<AuditLogDto.DailyAuditList>> getDailyLogs(
@RequestParam(required = false) LocalDate startDate,
@RequestParam(required = false) LocalDate endDate,
@RequestParam int page,
@RequestParam(defaultValue = "20") int size) {
AuditLogDto.DailySearchReq searchReq =
new AuditLogDto.DailySearchReq(startDate, endDate, null, page, size, "created_dttm,desc");
AuditLogDto.searchReq searchReq =
new AuditLogDto.searchReq(page, size, "created_dttm,desc");
Page<AuditLogDto.AuditList> result =
auditLogCoreService.getLogByDaily(searchReq, startDate, endDate);
Page<AuditLogDto.DailyAuditList> result =
auditLogService.getLogByDaily(searchReq, startDate, endDate);
return ApiResponseDto.ok(result);
}
@Operation(summary = "일자별 로그 상세")
@GetMapping("/daily/result")
public ApiResponseDto<Page<AuditLogDto.AuditDetail>> getDailyResultLogs(
public ApiResponseDto<Page<AuditLogDto.DailyDetail>> getDailyResultLogs(
@RequestParam LocalDate logDate,
@RequestParam int page,
@RequestParam(defaultValue = "20") int size) {
AuditLogDto.DailySearchReq searchReq =
new AuditLogDto.DailySearchReq(null, null, logDate, page, size, "created_dttm,desc");
Page<AuditLogDto.AuditDetail> result =
auditLogCoreService.getLogByDailyResult(searchReq, logDate);
AuditLogDto.searchReq searchReq =
new AuditLogDto.searchReq(page, size, "created_dttm,desc");
Page<AuditLogDto.DailyDetail> result =
auditLogService.getLogByDailyResult(searchReq, logDate);
return ApiResponseDto.ok(result);
}
@Operation(summary = "메뉴별 로그 조회")
@GetMapping("/menu")
public ApiResponseDto<Page<AuditLogDto.AuditList>> getMenuLogs(
public ApiResponseDto<Page<AuditLogDto.MenuAuditList>> getMenuLogs(
@RequestParam(required = false) String searchValue,
@RequestParam int page,
@RequestParam(defaultValue = "20") int size) {
AuditLogDto.MenuUserSearchReq searchReq =
new AuditLogDto.MenuUserSearchReq(searchValue, null, null, page, size, "created_dttm,desc");
Page<AuditLogDto.AuditList> result = auditLogCoreService.getLogByMenu(searchReq, searchValue);
AuditLogDto.searchReq searchReq =
new AuditLogDto.searchReq(page, size, "created_dttm,desc");
Page<AuditLogDto.MenuAuditList> result = auditLogService.getLogByMenu(searchReq, searchValue);
return ApiResponseDto.ok(result);
}
@Operation(summary = "메뉴별 로그 상세")
@GetMapping("/menu/result")
public ApiResponseDto<Page<AuditLogDto.AuditDetail>> getMenuResultLogs(
public ApiResponseDto<Page<AuditLogDto.MenuDetail>> getMenuResultLogs(
@RequestParam String menuId,
@RequestParam int page,
@RequestParam(defaultValue = "20") int size) {
AuditLogDto.MenuUserSearchReq searchReq =
new AuditLogDto.MenuUserSearchReq(null, menuId, null, page, size, "created_dttm,desc");
Page<AuditLogDto.AuditDetail> result =
auditLogCoreService.getLogByMenuResult(searchReq, menuId);
AuditLogDto.searchReq searchReq =
new AuditLogDto.searchReq(page, size, "created_dttm,desc");
Page<AuditLogDto.MenuDetail> result =
auditLogService.getLogByMenuResult(searchReq, menuId);
return ApiResponseDto.ok(result);
}
@Operation(summary = "사용자별 로그 조회")
@GetMapping("/account")
public ApiResponseDto<Page<AuditLogDto.AuditList>> getAccountLogs(
public ApiResponseDto<Page<AuditLogDto.UserAuditList>> getAccountLogs(
@RequestParam(required = false) String searchValue,
@RequestParam int page,
@RequestParam(defaultValue = "20") int size) {
AuditLogDto.MenuUserSearchReq searchReq =
new AuditLogDto.MenuUserSearchReq(searchValue, null, null, page, size, "created_dttm,desc");
Page<AuditLogDto.AuditList> result =
auditLogCoreService.getLogByAccount(searchReq, searchValue);
AuditLogDto.searchReq searchReq =
new AuditLogDto.searchReq(page, size, "created_dttm,desc");
Page<AuditLogDto.UserAuditList> result =
auditLogService.getLogByAccount(searchReq, searchValue);
return ApiResponseDto.ok(result);
}
@Operation(summary = "사용자별 로그 상세")
@GetMapping("/account/result")
public ApiResponseDto<Page<AuditLogDto.AuditDetail>> getAccountResultLogs(
public ApiResponseDto<Page<AuditLogDto.UserDetail>> getAccountResultLogs(
@RequestParam Long userUid,
@RequestParam int page,
@RequestParam(defaultValue = "20") int size) {
AuditLogDto.MenuUserSearchReq searchReq =
new AuditLogDto.MenuUserSearchReq(null, null, userUid, page, size, "created_dttm,desc");
Page<AuditLogDto.AuditDetail> result =
auditLogCoreService.getLogByAccountResult(searchReq, userUid);
AuditLogDto.searchReq searchReq =
new AuditLogDto.searchReq(page, size, "created_dttm,desc");
Page<AuditLogDto.UserDetail> result =
auditLogService.getLogByAccountResult(searchReq, userUid);
return ApiResponseDto.ok(result);
}

View File

@@ -3,6 +3,7 @@ package com.kamco.cd.kamcoback.log;
import com.kamco.cd.kamcoback.config.api.ApiResponseDto;
import com.kamco.cd.kamcoback.log.dto.ErrorLogDto;
import com.kamco.cd.kamcoback.log.dto.EventType;
import com.kamco.cd.kamcoback.log.service.ErrorLogService;
import com.kamco.cd.kamcoback.postgres.core.ErrorLogCoreService;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.tags.Tag;
@@ -17,10 +18,10 @@ import org.springframework.web.bind.annotation.RestController;
@Tag(name = "에러 로그", description = "에러 로그 관리 API")
@RequiredArgsConstructor
@RestController
@RequestMapping({"/api/log/error", "/v1/api/log/error"})
@RequestMapping({"/api/log/error"})
public class ErrorLogApiController {
private final ErrorLogCoreService errorLogCoreService;
private final ErrorLogService errorLogService;
@Operation(summary = "에러로그 조회")
@GetMapping("/error")
@@ -34,7 +35,7 @@ public class ErrorLogApiController {
ErrorLogDto.ErrorSearchReq searchReq =
new ErrorLogDto.ErrorSearchReq(
logErrorLevel, eventType, startDate, endDate, page, size, "created_dttm,desc");
Page<ErrorLogDto.Basic> result = errorLogCoreService.findLogByError(searchReq);
Page<ErrorLogDto.Basic> result = errorLogService.findLogByError(searchReq);
return ApiResponseDto.ok(result);
}

View File

@@ -56,51 +56,104 @@ public class AuditLogDto {
}
}
@Schema(name = "AuditList", description = "감사 로그 목록")
@Schema(name = "AuditCommon", description = "목록 공통")
@Getter
@AllArgsConstructor
public static class AuditList {
public static class AuditCommon {
private int readCount;
private int cudCount;
private int printCount;
private int downloadCount;
private Long totalCount;
}
private Long accountId;
private String loginId;
private String username;
private LocalDateTime baseDate;
private Long menuId;
private String menuName;
@Schema(name = "DailyAuditList", description = "일자별 목록")
@Getter
public static class DailyAuditList extends AuditCommon {
private final String baseDate;
public AuditList(
LocalDateTime baseDate,
int readCount,
int cudCount,
int printCount,
int downloadCount,
Long totalCount) {
public DailyAuditList(int readCount, int cudCount, int printCount, int downloadCount, Long totalCount, String baseDate) {
super(readCount, cudCount, printCount, downloadCount, totalCount);
this.baseDate = baseDate;
this.readCount = readCount;
this.cudCount = cudCount;
this.printCount = printCount;
this.downloadCount = downloadCount;
this.totalCount = totalCount;
}
}
@Schema(name = "AuditDetail", description = "감사 로그 상세")
@Schema(name = "MenuAuditList", description = "메뉴별 목록")
@Getter
public static class MenuAuditList extends AuditCommon {
private final String menuId;
private final String menuName;
public MenuAuditList(String menuId, String menuName, int readCount, int cudCount, int printCount, int downloadCount, Long totalCount) {
super(readCount, cudCount, printCount, downloadCount, totalCount);
this.menuId = menuId;
this.menuName = menuName;
}
}
@Schema(name = "UserAuditList", description = "사용자별 목록")
@Getter
public static class UserAuditList extends AuditCommon {
private final Long accountId;
private final String loginId;
private final String username;
public UserAuditList(Long accountId, String loginId, String username, int readCount, int cudCount, int printCount, int downloadCount, Long totalCount) {
super(readCount, cudCount, printCount, downloadCount, totalCount);
this.accountId = accountId;
this.loginId = loginId;
this.username = username;
}
}
@Schema(name = "AuditDetail", description = "감사 로그 상세 공통")
@Getter
@AllArgsConstructor
public static class AuditDetail {
private Long logId;
private LocalDateTime logDateTime;
private EventType eventType;
private LogDetail detail;
}
private String userName;
private String loginId;
private String menuName;
@Schema(name = "DailyDetail", description = "일자별 로그 상세")
@Getter
public static class DailyDetail extends AuditDetail {
private final String userName;
private final String loginId;
private final String menuName;
public DailyDetail(Long logId, String userName, String loginId, String menuName, EventType eventType, LogDetail detail){
super(logId, eventType, detail);
this.userName = userName;
this.loginId = loginId;
this.menuName = menuName;
}
}
@Schema(name = "MenuDetail", description = "메뉴별 로그 상세")
@Getter
public static class MenuDetail extends AuditDetail {
private final String logDateTime;
private final String userName;
private final String loginId;
public MenuDetail(Long logId, String logDateTime, String userName, String loginId, EventType eventType, LogDetail detail){
super(logId, eventType, detail);
this.logDateTime = logDateTime;
this.userName = userName;
this.loginId = loginId;
}
}
@Schema(name = "UserDetail", description = "사용자별 로그 상세")
@Getter
public static class UserDetail extends AuditDetail {
private final String logDateTime;
private final String menuNm;
public UserDetail(Long logId, String logDateTime, String menuNm, EventType eventType, LogDetail detail){
super(logId, eventType, detail);
this.logDateTime = logDateTime;
this.menuNm = menuNm;
}
}
@Getter
@@ -112,22 +165,16 @@ public class AuditLogDto {
String menuName;
String menuUrl;
String menuDescription;
int sortOrder;
Long sortOrder;
boolean used;
}
@Schema(name = "LogDailySearchReq", description = "일자별 로그 검색 요청")
@Schema(name = "searchReq", description = "일자별 로그 검색 요청")
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
public static class DailySearchReq {
private LocalDate startDate;
private LocalDate endDate;
// 일자별 로그 검색 조건
private LocalDate logDate;
public static class searchReq {
// 페이징 파라미터
private int page = 0;
@@ -145,41 +192,4 @@ public class AuditLogDto {
return PageRequest.of(page, size);
}
}
@Schema(name = "MenuUserSearchReq", description = "메뉴별,사용자별 로그 검색 요청")
@Getter
@Setter
@NoArgsConstructor
public static class MenuUserSearchReq {
// 메뉴별, 사용자별 로그 검색 조건
private String searchValue;
private String menuUid;
private Long userUid; // menuId, userUid 조회
// 페이징 파라미터
private int page = 0;
private int size = 20;
private String sort;
public MenuUserSearchReq(
String searchValue, String menuUid, Long userUid, int page, int size, String sort) {
this.searchValue = searchValue;
this.menuUid = menuUid;
this.page = page;
this.size = size;
this.sort = sort;
}
public Pageable toPageable() {
if (sort != null && !sort.isEmpty()) {
String[] sortParams = sort.split(",");
String property = sortParams[0];
Sort.Direction direction =
sortParams.length > 1 ? Sort.Direction.fromString(sortParams[1]) : Sort.Direction.ASC;
return PageRequest.of(page, size, Sort.by(direction, property));
}
return PageRequest.of(page, size);
}
}
}
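The refactor above replaces the flat `AuditList`/`AuditDetail` DTOs with an `AuditCommon` base carrying the shared counters and per-view subclasses (`DailyAuditList`, `MenuAuditList`, `UserAuditList`) that chain through `super(...)`. A framework-free sketch of that constructor-chaining pattern, trimmed to two counters for brevity (class and field names here only mirror the DTOs, they are not the real types):

```java
// Sketch of the AuditCommon / DailyAuditList pattern from AuditLogDto:
// shared counters live in the base class, each list variant adds only
// its own key columns and forwards the rest to super().
public class AuditBaseSketch {
    public static class Common {
        public final int readCount;
        public final int cudCount;

        public Common(int readCount, int cudCount) {
            this.readCount = readCount;
            this.cudCount = cudCount;
        }
    }

    public static class Daily extends Common {
        public final String baseDate;

        public Daily(int readCount, int cudCount, String baseDate) {
            super(readCount, cudCount); // shared counters go to the base
            this.baseDate = baseDate;
        }
    }
}
```

The upside is that the counter columns are declared once; the cost is that every subclass constructor must list the base fields again, which is why the real constructors stay long.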

View File

@@ -0,0 +1,49 @@
package com.kamco.cd.kamcoback.log.service;
import com.kamco.cd.kamcoback.log.dto.AuditLogDto;
import com.kamco.cd.kamcoback.postgres.core.AuditLogCoreService;
import lombok.RequiredArgsConstructor;
import org.springframework.data.domain.Page;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.time.LocalDate;
@Service
@RequiredArgsConstructor
@Transactional(readOnly = true)
public class AuditLogService {
private final AuditLogCoreService auditLogCoreService;
public Page<AuditLogDto.DailyAuditList> getLogByDaily(
AuditLogDto.searchReq searchRange, LocalDate startDate, LocalDate endDate) {
return auditLogCoreService.getLogByDaily(searchRange, startDate, endDate);
}
public Page<AuditLogDto.MenuAuditList> getLogByMenu(
AuditLogDto.searchReq searchRange, String searchValue) {
return auditLogCoreService.getLogByMenu(searchRange, searchValue);
}
public Page<AuditLogDto.UserAuditList> getLogByAccount(
AuditLogDto.searchReq searchRange, String searchValue) {
return auditLogCoreService.getLogByAccount(searchRange, searchValue);
}
public Page<AuditLogDto.DailyDetail> getLogByDailyResult(
AuditLogDto.searchReq searchRange, LocalDate logDate) {
return auditLogCoreService.getLogByDailyResult(searchRange, logDate);
}
public Page<AuditLogDto.MenuDetail> getLogByMenuResult(
AuditLogDto.searchReq searchRange, String menuId) {
return auditLogCoreService.getLogByMenuResult(searchRange, menuId);
}
public Page<AuditLogDto.UserDetail> getLogByAccountResult(
AuditLogDto.searchReq searchRange, Long accountId) {
return auditLogCoreService.getLogByAccountResult(searchRange, accountId);
}
}

View File

@@ -0,0 +1,19 @@
package com.kamco.cd.kamcoback.log.service;
import com.kamco.cd.kamcoback.log.dto.ErrorLogDto;
import com.kamco.cd.kamcoback.postgres.core.ErrorLogCoreService;
import lombok.RequiredArgsConstructor;
import org.springframework.data.domain.Page;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
@Service
@RequiredArgsConstructor
@Transactional(readOnly = true)
public class ErrorLogService {
private final ErrorLogCoreService errorLogCoreService;
public Page<ErrorLogDto.Basic> findLogByError(ErrorLogDto.ErrorSearchReq searchReq) {
return errorLogCoreService.findLogByError(searchReq);
}
}

View File

@@ -0,0 +1,103 @@
package com.kamco.cd.kamcoback.model;
import com.kamco.cd.kamcoback.code.dto.CommonCodeDto;
import com.kamco.cd.kamcoback.config.api.ApiResponseDto;
import com.kamco.cd.kamcoback.model.dto.ModelMngDto;
import com.kamco.cd.kamcoback.model.dto.ModelVerDto;
import com.kamco.cd.kamcoback.model.service.ModelMngService;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.media.Content;
import io.swagger.v3.oas.annotations.media.Schema;
import io.swagger.v3.oas.annotations.responses.ApiResponse;
import io.swagger.v3.oas.annotations.responses.ApiResponses;
import io.swagger.v3.oas.annotations.tags.Tag;
import jakarta.transaction.Transactional;
import lombok.RequiredArgsConstructor;
import org.springframework.data.domain.Page;
import org.springframework.web.bind.annotation.*;
import java.time.LocalDate;
import java.util.List;
import java.util.Optional;
@Tag(name = "모델 관리", description = "모델 관리 API")
@RequiredArgsConstructor
@RestController
@RequestMapping("/api/model")
@Transactional
public class ModelMngApiController {
private final ModelMngService modelMngService;
@Operation(summary = "목록 조회", description = "모든 모델 조회")
@ApiResponses(
value = {
@ApiResponse(
responseCode = "200",
description = "조회 성공",
content =
@Content(
mediaType = "application/json",
                            schema = @Schema(implementation = ModelMngDto.Basic.class))),
                @ApiResponse(responseCode = "404", description = "모델을 찾을 수 없음", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping
    public ApiResponseDto<List<ModelMngDto.Basic>> findAll() {
return ApiResponseDto.ok(modelMngService.findModelMngAll());
}
    /**
     * Latest registered model info.
     *
     * @return ModelMngDto.FinalModelDto
     */
@Operation(summary = "최종 등록 모델 조회", description = "최종 등록 모델 조회")
@GetMapping("/final-model-info")
public ApiResponseDto<Optional<ModelMngDto.FinalModelDto>> getFinalModelInfo() {
return ApiResponseDto.ok(modelMngService.getFinalModelInfo());
}
    /**
     * Model registration => model and version are created together (the UI has no separate version screen).
     *
     * @param addReq model input values
     * @return ModelVerDto.Basic
     */
@Operation(summary = "모델 등록", description = "모델 등록")
@PostMapping
public ApiResponseDto<ModelVerDto.Basic> save(@RequestBody ModelMngDto.AddReq addReq) {
return ApiResponseDto.createOK(modelMngService.save(addReq));
}
@Operation(summary = "모델 수정", description = "모델 수정")
@PutMapping("/{id}")
public ApiResponseDto<Long> update(@PathVariable Long id, @RequestBody ModelMngDto.AddReq addReq) {
return ApiResponseDto.ok(modelMngService.update(id, addReq));
}
@Operation(summary = "모델 삭제", description = "모델 삭제")
@DeleteMapping("/{id}")
public ApiResponseDto<Long> delete(@PathVariable Long id) {
return ApiResponseDto.deleteOk(modelMngService.delete(id));
}
@Operation(summary = "모델 등록 이력", description = "모델 등록 이력")
@GetMapping("/reg-history")
public ApiResponseDto<Page<ModelMngDto.ModelRegHistory>> getRegHistoryList(
@RequestParam(required = false) LocalDate startDate,
@RequestParam(required = false) LocalDate endDate,
@RequestParam int page,
@RequestParam(defaultValue = "20") int size,
@RequestParam(required = false) String searchVal,
@RequestParam(required = false) String searchColumn
) {
ModelMngDto.searchReq searchReq =
new ModelMngDto.searchReq(page, size, Optional.ofNullable(searchColumn).orElse("createdDate") + ",desc");
        // searchColumn: an entity column name; defaults to createdDate (registration date), optionally deployDttm (deploy date)
Page<ModelMngDto.ModelRegHistory> result =
modelMngService.getRegHistoryList(searchReq, startDate, endDate, searchVal);
return ApiResponseDto.ok(result);
}
}
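`getRegHistoryList` builds its sort string as `Optional.ofNullable(searchColumn).orElse("createdDate") + ",desc"`, so a missing `searchColumn` falls back to registration date and the direction is always descending. The fallback can be sketched in isolation (the helper class below is illustrative only):

```java
import java.util.Optional;

// Sketch of the default-sort fallback in getRegHistoryList: the caller may
// pass an entity column name such as "deployDttm"; when absent, history is
// sorted by createdDate, always descending.
public class RegHistorySort {
    public static String sortExpression(String searchColumn) {
        return Optional.ofNullable(searchColumn).orElse("createdDate") + ",desc";
    }
}
```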

View File

@@ -0,0 +1,151 @@
package com.kamco.cd.kamcoback.model.dto;
import com.kamco.cd.kamcoback.common.utils.interfaces.JsonFormatDttm;
import io.swagger.v3.oas.annotations.media.Schema;
import jakarta.validation.constraints.NotEmpty;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import java.time.ZonedDateTime;
public class ModelMngDto {
    @Schema(name = "ModelMngBasic", description = "모델관리 엔티티 기본 정보")
@Getter
public static class Basic {
private final Long id;
private final String modelNm;
private final String modelCate;
private final String modelPath;
@JsonFormatDttm
private final ZonedDateTime createdDttm;
private final Long createdUid;
@JsonFormatDttm
private final ZonedDateTime updatedDttm;
private final Long updatedUid;
private final String modelCntnt;
public Basic(
Long id,
String modelNm,
String modelCate,
String modelPath,
ZonedDateTime createdDttm,
Long createdUid,
ZonedDateTime updatedDttm,
Long updatedUid,
String modelCntnt
) {
this.id = id;
this.modelNm = modelNm;
this.modelCate = modelCate;
this.modelPath = modelPath;
this.createdDttm = createdDttm;
this.createdUid = createdUid;
this.updatedDttm = updatedDttm;
this.updatedUid = updatedUid;
this.modelCntnt = modelCntnt;
}
}
@Schema(name = "FinalModelDto", description = "최종 등록 모델")
@Getter
public static class FinalModelDto {
private final Long modelUid;
private final String modelNm;
private final String modelCate;
private final Long modelVerUid;
private final String modelVer;
private final String usedState;
private final String modelState;
private final Double qualityProb;
private final String deployState;
private final String modelPath;
public FinalModelDto(Long modelUid, String modelNm, String modelCate, Long modelVerUid, String modelVer,
String usedState, String modelState, Double qualityProb, String deployState, String modelPath) {
this.modelUid = modelUid;
this.modelNm = modelNm;
this.modelCate = modelCate;
this.modelVerUid = modelVerUid;
this.modelVer = modelVer;
this.usedState = usedState;
this.deployState = deployState;
this.modelState = modelState;
this.qualityProb = qualityProb;
this.modelPath = modelPath;
}
}
@Schema(name = "ModelAddReq", description = "Model version registration request")
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
public static class AddReq {
private String modelNm;
private String modelCate;
private String modelPath;
@NotEmpty private String modelVer;
private String modelCntnt;
}
@Schema(name = "searchReq", description = "Registration-history search request")
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
public static class searchReq {
// Paging parameters
private int page = 0;
private int size = 20;
private String sort;
public Pageable toPageable() {
if (sort != null && !sort.isEmpty()) {
String[] sortParams = sort.split(",");
String property = sortParams[0];
Sort.Direction direction =
sortParams.length > 1 ? Sort.Direction.fromString(sortParams[1]) : Sort.Direction.ASC;
return PageRequest.of(page, size, Sort.by(direction, property));
}
return PageRequest.of(page, size);
}
}
@Schema(name = "ModelRegHistory", description = "Model registration history")
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
public static class ModelRegHistory {
private String modelNm;
private String modelCate;
private String modelVer;
private String strCreatedDttm;
private String usedState;
private String deployState;
private String strDeployDttm;
}
@Schema(name = "ModelDmlReturn", description = "Return value for model create/update/delete")
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
public static class ModelDmlReturn {
private String execStatus;
private String message;
}
}
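The `searchReq.toPageable()` helper above turns a `"property,direction"` string (e.g. `"createdDate,desc"`) into a Spring `PageRequest`, falling back to ascending order when no direction is given. A minimal dependency-free sketch of that parsing — the class name `SortParamDemo` is hypothetical, and Spring's `PageRequest`/`Sort` are left out so the snippet stays self-contained:

```java
public class SortParamDemo {
    // Returns {property, direction}; direction falls back to ASC when absent,
    // mirroring the ASC default in searchReq.toPageable().
    static String[] parseSort(String sort) {
        String[] parts = sort.split(",");
        String property = parts[0];
        String direction = parts.length > 1 ? parts[1].toUpperCase() : "ASC";
        return new String[] {property, direction};
    }

    public static void main(String[] args) {
        String[] withDir = parseSort("createdDate,desc");
        String[] noDir = parseSort("deployDttm");
        System.out.println(withDir[0] + " " + withDir[1]); // createdDate DESC
        System.out.println(noDir[0] + " " + noDir[1]);     // deployDttm ASC
    }
}
```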

View File

@@ -0,0 +1,70 @@
package com.kamco.cd.kamcoback.model.dto;
import com.kamco.cd.kamcoback.common.utils.interfaces.JsonFormatDttm;
import io.swagger.v3.oas.annotations.media.Schema;
import jakarta.validation.constraints.NotEmpty;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
import java.time.ZonedDateTime;
public class ModelVerDto {
@Schema(name = "modelVer Basic", description = "Basic information of the model-version entity")
@Getter
public static class Basic {
private final Long id;
private final Long modelUid;
private final String modelCate;
private final String modelVer;
private final String usedState;
private final String modelState;
private final Double qualityProb;
private final String deployState;
private final String modelPath;
@JsonFormatDttm
private final ZonedDateTime createdDttm;
private final Long createdUid;
@JsonFormatDttm
private final ZonedDateTime updatedDttm;
private final Long updatedUid;
public Basic(
Long id,
Long modelUid,
String modelCate,
String modelVer,
String usedState,
String modelState,
Double qualityProb,
String deployState,
String modelPath,
ZonedDateTime createdDttm,
Long createdUid,
ZonedDateTime updatedDttm,
Long updatedUid
) {
this.id = id;
this.modelUid = modelUid;
this.modelCate = modelCate;
this.modelVer = modelVer;
this.usedState = usedState;
this.modelState = modelState;
this.qualityProb = qualityProb;
this.deployState = deployState;
this.modelPath = modelPath;
this.createdDttm = createdDttm;
this.createdUid = createdUid;
this.updatedDttm = updatedDttm;
this.updatedUid = updatedUid;
}
}
}

View File

@@ -0,0 +1,46 @@
package com.kamco.cd.kamcoback.model.service;
import com.kamco.cd.kamcoback.log.dto.AuditLogDto;
import com.kamco.cd.kamcoback.model.dto.ModelMngDto;
import com.kamco.cd.kamcoback.model.dto.ModelVerDto;
import com.kamco.cd.kamcoback.postgres.core.ModelMngCoreService;
import lombok.RequiredArgsConstructor;
import org.springframework.data.domain.Page;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import java.time.LocalDate;
import java.util.List;
import java.util.Optional;
@Service
@RequiredArgsConstructor
@Transactional(readOnly = true)
public class ModelMngService {
private final ModelMngCoreService modelMngCoreService;
public List<ModelMngDto.Basic> findModelMngAll(){
return modelMngCoreService.findModelMngAll();
}
public Optional<ModelMngDto.FinalModelDto> getFinalModelInfo() {
return modelMngCoreService.getFinalModelInfo();
}
public ModelVerDto.Basic save(ModelMngDto.AddReq addReq) {
return modelMngCoreService.save(addReq);
}
public Long update(Long id, ModelMngDto.AddReq addReq) {
return modelMngCoreService.update(id, addReq);
}
public Long delete(Long id) {
return modelMngCoreService.delete(id);
}
public Page<ModelMngDto.ModelRegHistory> getRegHistoryList(ModelMngDto.searchReq searchReq, LocalDate startDate, LocalDate endDate, String searchVal) {
return modelMngCoreService.getRegHistoryList(searchReq, startDate, endDate, searchVal);
}
}

View File

@@ -0,0 +1,27 @@
package com.kamco.cd.kamcoback.postgres;
import com.querydsl.core.types.Order;
import com.querydsl.core.types.OrderSpecifier;
import com.querydsl.core.types.dsl.PathBuilder;
import org.springframework.data.domain.Pageable;
public class QuerydslOrderUtil {
/**
 * Converts the Sort information of a Pageable into an array of QueryDSL OrderSpecifiers.
 * @param pageable Spring Pageable
 * @param entityClass entity class (e.g. User.class)
 * @param alias Q-entity alias (e.g. "user")
 */
public static <T> OrderSpecifier<?>[] getOrderSpecifiers(Pageable pageable, Class<T> entityClass, String alias) {
PathBuilder<T> entityPath = new PathBuilder<>(entityClass, alias);
return pageable.getSort()
.stream()
.map(sort -> {
Order order = sort.isAscending() ? Order.ASC : Order.DESC;
// PathBuilder.get() turns a column name (String) into a dynamic Path
return new OrderSpecifier<>(order, entityPath.get(sort.getProperty(), String.class));
})
.toArray(OrderSpecifier[]::new);
}
}
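QueryDSL itself is not needed to see what the utility produces: each `Sort.Order` becomes exactly one ordering term, ascending or descending. A dependency-free sketch of that mapping — the `SortOrder` record and `toOrderBy` helper are hypothetical stand-ins for Spring's `Sort.Order` and the `OrderSpecifier` array:

```java
import java.util.List;
import java.util.stream.Collectors;

public class OrderByDemo {
    // Hypothetical stand-in for Spring's Sort.Order (property + direction).
    record SortOrder(String property, boolean ascending) {}

    // Mirrors QuerydslOrderUtil: one ordering term per Sort.Order.
    static String toOrderBy(List<SortOrder> orders) {
        return orders.stream()
            .map(o -> o.property() + (o.ascending() ? " ASC" : " DESC"))
            .collect(Collectors.joining(", "));
    }

    public static void main(String[] args) {
        System.out.println(toOrderBy(List.of(
            new SortOrder("createdDate", false),
            new SortOrder("modelNm", true))));
        // createdDate DESC, modelNm ASC
    }
}
```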

View File

@@ -13,7 +13,7 @@ import org.springframework.transaction.annotation.Transactional;
@RequiredArgsConstructor
@Transactional(readOnly = true)
public class AuditLogCoreService
-implements BaseCoreService<AuditLogDto.AuditList, Long, AuditLogDto.DailySearchReq> {
+implements BaseCoreService<AuditLogDto.DailyAuditList, Long, AuditLogDto.searchReq> {
private final AuditLogRepository auditLogRepository;
@@ -21,42 +21,42 @@ public class AuditLogCoreService
public void remove(Long aLong) {}
@Override
-public AuditLogDto.AuditList getOneById(Long aLong) {
+public AuditLogDto.DailyAuditList getOneById(Long aLong) {
return null;
}
@Override
-public Page<AuditLogDto.AuditList> search(AuditLogDto.DailySearchReq searchReq) {
+public Page<AuditLogDto.DailyAuditList> search(AuditLogDto.searchReq searchReq) {
return null;
}
-public Page<AuditLogDto.AuditList> getLogByDaily(
-    AuditLogDto.DailySearchReq searchRange, LocalDate startDate, LocalDate endDate) {
+public Page<AuditLogDto.DailyAuditList> getLogByDaily(
+    AuditLogDto.searchReq searchRange, LocalDate startDate, LocalDate endDate) {
return auditLogRepository.findLogByDaily(searchRange, startDate, endDate);
}
-public Page<AuditLogDto.AuditList> getLogByMenu(
-    AuditLogDto.MenuUserSearchReq searchRange, String searchValue) {
+public Page<AuditLogDto.MenuAuditList> getLogByMenu(
+    AuditLogDto.searchReq searchRange, String searchValue) {
return auditLogRepository.findLogByMenu(searchRange, searchValue);
}
-public Page<AuditLogDto.AuditList> getLogByAccount(
-    AuditLogDto.MenuUserSearchReq searchRange, String searchValue) {
+public Page<AuditLogDto.UserAuditList> getLogByAccount(
+    AuditLogDto.searchReq searchRange, String searchValue) {
return auditLogRepository.findLogByAccount(searchRange, searchValue);
}
-public Page<AuditLogDto.AuditDetail> getLogByDailyResult(
-    AuditLogDto.DailySearchReq searchRange, LocalDate logDate) {
+public Page<AuditLogDto.DailyDetail> getLogByDailyResult(
+    AuditLogDto.searchReq searchRange, LocalDate logDate) {
return auditLogRepository.findLogByDailyResult(searchRange, logDate);
}
-public Page<AuditLogDto.AuditDetail> getLogByMenuResult(
-    AuditLogDto.MenuUserSearchReq searchRange, String menuId) {
+public Page<AuditLogDto.MenuDetail> getLogByMenuResult(
+    AuditLogDto.searchReq searchRange, String menuId) {
return auditLogRepository.findLogByMenuResult(searchRange, menuId);
}
-public Page<AuditLogDto.AuditDetail> getLogByAccountResult(
-    AuditLogDto.MenuUserSearchReq searchRange, Long accountId) {
+public Page<AuditLogDto.UserDetail> getLogByAccountResult(
+    AuditLogDto.searchReq searchRange, Long accountId) {
return auditLogRepository.findLogByAccountResult(searchRange, accountId);
}
}

View File

@@ -0,0 +1,38 @@
package com.kamco.cd.kamcoback.postgres.core;
import com.kamco.cd.kamcoback.changedetection.dto.ChangeDetectionDto;
import com.kamco.cd.kamcoback.postgres.entity.MapSheetAnalDataGeomEntity;
import com.kamco.cd.kamcoback.postgres.repository.changedetection.ChangeDetectionRepository;
import lombok.RequiredArgsConstructor;
import org.locationtech.jts.geom.Geometry;
import org.locationtech.jts.geom.Point;
import org.springframework.stereotype.Service;
import java.util.List;
import java.util.stream.Collectors;
@Service
@RequiredArgsConstructor
public class ChangeDetectionCoreService {
private final ChangeDetectionRepository changeDetectionRepository;
public List<ChangeDetectionDto> getPolygonToPoint() {
List<MapSheetAnalDataGeomEntity> list = changeDetectionRepository.findAll();
return list.stream().map(p -> {
Geometry polygon = p.getGeom();
// Compute the centroid coordinates
Point centroid = polygon.getCentroid();
return new ChangeDetectionDto(
p.getId(),
polygon,
centroid.getX(),
centroid.getY()
);
})
.collect(Collectors.toList());
}
}
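The `getCentroid()` call above relies on JTS, but for a plain (non-self-intersecting) polygon it returns the standard area-weighted centroid. A dependency-free sketch of that formula — `polygonCentroid` and the ring layout are hypothetical illustrations, not the JTS implementation:

```java
public class CentroidDemo {
    // Area-weighted centroid of a simple closed ring (first point repeated last),
    // computed with the shoelace formula; matches what JTS returns for a plain polygon.
    static double[] polygonCentroid(double[][] ring) {
        double a = 0, cx = 0, cy = 0;
        for (int i = 0; i < ring.length - 1; i++) {
            double cross = ring[i][0] * ring[i + 1][1] - ring[i + 1][0] * ring[i][1];
            a += cross;
            cx += (ring[i][0] + ring[i + 1][0]) * cross;
            cy += (ring[i][1] + ring[i + 1][1]) * cross;
        }
        a *= 0.5; // signed area
        return new double[] {cx / (6 * a), cy / (6 * a)};
    }

    public static void main(String[] args) {
        // Unit square: centroid is (0.5, 0.5).
        double[][] square = {{0, 0}, {1, 0}, {1, 1}, {0, 1}, {0, 0}};
        double[] c = polygonCentroid(square);
        System.out.println(c[0] + ", " + c[1]); // 0.5, 0.5
    }
}
```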

View File

@@ -4,10 +4,11 @@ import com.kamco.cd.kamcoback.code.dto.CommonCodeDto;
import com.kamco.cd.kamcoback.code.dto.CommonCodeDto.Basic;
import com.kamco.cd.kamcoback.common.service.BaseCoreService;
import com.kamco.cd.kamcoback.postgres.entity.CommonCodeEntity;
-import com.kamco.cd.kamcoback.postgres.repository.CommonCodeRepository;
+import com.kamco.cd.kamcoback.postgres.repository.code.CommonCodeRepository;
import com.kamco.cd.kamcoback.zoo.dto.AnimalDto.SearchReq;
import jakarta.persistence.EntityNotFoundException;
import java.util.List;
import java.util.Optional;
import lombok.RequiredArgsConstructor;
import org.springframework.data.domain.Page;
import org.springframework.stereotype.Service;
@@ -72,6 +73,16 @@ public class CommonCodeCoreService
return commonCodeRepository.findByCode(code).stream().map(CommonCodeEntity::toDto).toList();
}
/**
 * Looks up a common-code name.
 * @param parentCodeCd
 * @param childCodeCd
 * @return
 */
public Optional<String> getCode(String parentCodeCd, String childCodeCd) {
return commonCodeRepository.getCode(parentCodeCd, childCodeCd);
}
@Override
public void remove(Long id) {
CommonCodeEntity entity =

View File

@@ -0,0 +1,59 @@
package com.kamco.cd.kamcoback.postgres.core;
import com.kamco.cd.kamcoback.inference.dto.InferenceResultDto;
import com.kamco.cd.kamcoback.inference.dto.InferenceResultDto.Dashboard;
import com.kamco.cd.kamcoback.postgres.entity.MapSheetAnalSttcEntity;
import com.kamco.cd.kamcoback.postgres.repository.Inference.InferenceResultRepository;
import jakarta.persistence.EntityNotFoundException;
import java.util.List;
import java.util.stream.Collectors;
import lombok.RequiredArgsConstructor;
import org.springframework.data.domain.Page;
import org.springframework.stereotype.Service;
@Service
@RequiredArgsConstructor
public class InferenceResultCoreService {
private final InferenceResultRepository inferenceResultRepository;
/**
 * Inference management > analysis-result list lookup
 * @param searchReq
 * @return
 */
public Page<InferenceResultDto.AnalResList> getInferenceResultList(InferenceResultDto.SearchReq searchReq) {
return inferenceResultRepository.getInferenceResultList(searchReq);
}
/**
 * Analysis-result summary information
 * @param id
 * @return
 */
public InferenceResultDto.AnalResSummary getInferenceResultSummary(Long id) {
    return inferenceResultRepository.getInferenceResultSummary(id)
        .orElseThrow(() -> new EntityNotFoundException("Summary information not found. id: " + id));
}
/**
 * Analysis-result dashboard lookup
 * @param id
 * @return
 */
public List<Dashboard> getInferenceResultDashboard(Long id) {
return inferenceResultRepository.getInferenceResultDashboard(id)
.stream()
.map(MapSheetAnalSttcEntity::toDto)
.toList();
}
/**
 * Analysis-result detail list
 * @param searchGeoReq
 * @return
 */
public Page<InferenceResultDto.Geom> getInferenceResultGeomList(InferenceResultDto.SearchGeoReq searchGeoReq) {
return inferenceResultRepository.getInferenceGeomList(searchGeoReq);
}
}

View File

@@ -0,0 +1,70 @@
package com.kamco.cd.kamcoback.postgres.core;
import com.kamco.cd.kamcoback.model.dto.ModelMngDto;
import com.kamco.cd.kamcoback.model.dto.ModelVerDto;
import com.kamco.cd.kamcoback.postgres.entity.ModelMngEntity;
import com.kamco.cd.kamcoback.postgres.entity.ModelVerEntity;
import com.kamco.cd.kamcoback.postgres.repository.model.ModelMngRepository;
import com.kamco.cd.kamcoback.postgres.repository.model.ModelVerRepository;
import jakarta.persistence.EntityNotFoundException;
import lombok.RequiredArgsConstructor;
import org.springframework.data.domain.Page;
import org.springframework.stereotype.Service;
import java.time.LocalDate;
import java.util.List;
import java.util.Optional;
@Service
@RequiredArgsConstructor
public class ModelMngCoreService {
private final ModelMngRepository modelMngRepository;
private final ModelVerRepository modelVerRepository;
public List<ModelMngDto.Basic> findModelMngAll() {
return modelMngRepository.findModelMngAll().stream().map(ModelMngEntity::toDto).toList();
}
public Optional<ModelMngDto.FinalModelDto> getFinalModelInfo(){
return modelMngRepository.getFinalModelInfo();
}
public ModelVerDto.Basic save(ModelMngDto.AddReq addReq) {
ModelMngEntity modelMngEntity = new ModelMngEntity(addReq.getModelNm(), addReq.getModelCate(), addReq.getModelPath(),
1L, 1L, addReq.getModelCntnt()); //TODO: use the logged-in user's uid once authentication is added
ModelMngEntity saved = modelMngRepository.save(modelMngEntity);
ModelVerEntity modelVerEntity = new ModelVerEntity(saved.getId(), addReq.getModelCate(), addReq.getModelVer(), "NONE", "NONE",
0.0, "NONE", addReq.getModelPath(), 1L, 1L);
return modelVerRepository.save(modelVerEntity).toDto();
}
public Long update(Long id, ModelMngDto.AddReq addReq) {
// Look up the existing version
ModelVerEntity existData = modelVerRepository.findModelVerById(id)
    .orElseThrow(EntityNotFoundException::new); // throws when no data exists
existData.update(addReq);
//TODO: later, bump the version when a Docker file is uploaded during edit
return existData.getId();
}
public Long delete(Long id) {
// Look up the version
ModelVerEntity verEntity = modelVerRepository.findModelVerById(id)
    .orElseThrow(() -> new EntityNotFoundException("No model version found for id: " + id));
// Deletion is not allowed when usedState is USED or the version is already deleted
if ("USED".equals(verEntity.getUsedState()) || Boolean.TRUE.equals(verEntity.isDeleted())) {
    throw new IllegalStateException("Cannot delete: the model version is in use or already deleted."); //TODO: adjust once the deletion rules are defined
}
// Soft delete: set deleted = true for this id
verEntity.deleted();
return verEntity.getId();
}
public Page<ModelMngDto.ModelRegHistory> getRegHistoryList(ModelMngDto.searchReq searchReq, LocalDate startDate, LocalDate endDate, String searchVal) {
return modelMngRepository.getRegHistoryList(searchReq, startDate, endDate, searchVal);
}
}

View File

@@ -50,6 +50,9 @@ public class CommonCodeEntity extends CommonDateEntity {
@Column(name = "used")
private Boolean used;
@Column(name = "misc_cd")
private String miscCd;
@NotNull
@Column(name = "deleted", nullable = false)
private Boolean deleted = false;

View File

@@ -31,7 +31,10 @@ public class ErrorLogEntity extends CommonCreateEntity {
private String errorCode;
private String errorMessage;
@Column(columnDefinition = "TEXT")
private String stackTrace;
private Long handlerUid;
private ZonedDateTime handledDttm;

View File

@@ -0,0 +1,107 @@
package com.kamco.cd.kamcoback.postgres.entity;
import jakarta.persistence.Column;
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;
import jakarta.persistence.SequenceGenerator;
import jakarta.persistence.Table;
import jakarta.validation.constraints.Size;
import java.time.ZonedDateTime;
import java.util.Map;
import lombok.Getter;
import lombok.Setter;
import org.hibernate.annotations.ColumnDefault;
import org.hibernate.annotations.JdbcTypeCode;
import org.hibernate.type.SqlTypes;
@Getter
@Setter
@Entity
@Table(name = "tb_map_sheet_anal_data")
public class MapSheetAnalDataEntity {
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "tb_map_sheet_anal_data_id_gen")
@SequenceGenerator(name = "tb_map_sheet_anal_data_id_gen", sequenceName = "tb_map_sheet_learn_data_data_uid", allocationSize = 1)
@Column(name = "data_uid", nullable = false)
private Long id;
@Size(max = 128)
@Column(name = "data_name", length = 128)
private String dataName;
@Size(max = 255)
@Column(name = "data_path")
private String dataPath;
@Size(max = 128)
@Column(name = "data_type", length = 128)
private String dataType;
@Size(max = 128)
@Column(name = "data_crs_type", length = 128)
private String dataCrsType;
@Size(max = 255)
@Column(name = "data_crs_type_name")
private String dataCrsTypeName;
@ColumnDefault("now()")
@Column(name = "created_dttm", columnDefinition = "TIMESTAMP WITH TIME ZONE DEFAULT now()")
private ZonedDateTime createdDttm;
@Column(name = "created_uid")
private Long createdUid;
@ColumnDefault("now()")
@Column(name = "updated_dttm", columnDefinition = "TIMESTAMP WITH TIME ZONE DEFAULT now()")
private ZonedDateTime updatedDttm;
@Column(name = "updated_uid")
private Long updatedUid;
@Column(name = "compare_yyyy")
private Integer compareYyyy;
@Column(name = "target_yyyy")
private Integer targetYyyy;
@Column(name = "data_json")
@JdbcTypeCode(SqlTypes.JSON)
private Map<String, Object> dataJson;
@Size(max = 20)
@Column(name = "data_state", length = 20)
private String dataState;
@ColumnDefault("now()")
@Column(name = "data_state_dttm", columnDefinition = "TIMESTAMP WITH TIME ZONE DEFAULT now()")
private ZonedDateTime dataStateDttm;
@Column(name = "anal_strt_dttm", columnDefinition = "TIMESTAMP WITH TIME ZONE")
private ZonedDateTime analStrtDttm;
@Column(name = "anal_end_dttm", columnDefinition = "TIMESTAMP WITH TIME ZONE")
private ZonedDateTime analEndDttm;
@Column(name = "anal_sec")
private Long analSec;
@Size(max = 20)
@Column(name = "anal_state", length = 20)
private String analState;
@Column(name = "anal_uid")
private Long analUid;
@Column(name = "map_sheep_num")
private Long mapSheepNum;
@Column(name = "detecting_cnt")
private Long detectingCnt;
}

View File

@@ -0,0 +1,71 @@
package com.kamco.cd.kamcoback.postgres.entity;
import jakarta.persistence.*;
import jakarta.validation.constraints.Size;
import lombok.Getter;
import lombok.Setter;
import org.locationtech.jts.geom.Geometry;
import java.time.ZonedDateTime;
@Getter
@Setter
@Entity
@Table(name = "tb_map_sheet_anal_data_geom")
public class MapSheetAnalDataGeomEntity {
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "tb_map_sheet_anal_data_geom_id_gen")
@SequenceGenerator(name = "tb_map_sheet_anal_data_geom_id_gen", sequenceName = "tb_map_sheet_learn_data_geom_geom_uid", allocationSize = 1)
@Column(name = "geo_uid", nullable = false)
private Long id;
@Column(name = "cd_prob")
private Double cdProb;
@Column(name = "class_before_cd")
private String classBeforeCd;
@Column(name = "class_before_prob")
private Double classBeforeProb;
@Column(name = "class_after_cd")
private String classAfterCd;
@Column(name = "class_after_prob")
private Double classAfterProb;
@Column(name = "map_sheet_num")
private Long mapSheetNum;
@Column(name = "compare_yyyy")
private Integer compareYyyy;
@Column(name = "target_yyyy")
private Integer targetYyyy;
@Column(name = "area")
private Double area;
@Size(max = 100)
@Column(name = "geo_type", length = 100)
private String geoType;
@Column(name = "data_uid")
private Long dataUid;
@Column(name = "created_dttm")
private ZonedDateTime createdDttm;
@Column(name = "created_uid")
private Long createdUid;
@Column(name = "updated_dttm")
private ZonedDateTime updatedDttm;
@Column(name = "updated_uid")
private Long updatedUid;
@Column(name = "geom", columnDefinition = "geometry")
private Geometry geom;
}

View File

@@ -0,0 +1,96 @@
package com.kamco.cd.kamcoback.postgres.entity;
import jakarta.persistence.Column;
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;
import jakarta.persistence.SequenceGenerator;
import jakarta.persistence.Table;
import jakarta.validation.constraints.Size;
import java.time.ZonedDateTime;
import lombok.Getter;
import lombok.Setter;
import org.hibernate.annotations.ColumnDefault;
@Getter
@Setter
@Entity
@Table(name = "tb_map_sheet_anal")
public class MapSheetAnalEntity {
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "tb_map_sheet_anal_id_gen")
@SequenceGenerator(name = "tb_map_sheet_anal_id_gen", sequenceName = "tb_map_sheet_anal_anal_uid", allocationSize = 1)
@Column(name = "anal_uid", nullable = false)
private Long id;
@Column(name = "compare_yyyy")
private Integer compareYyyy;
@Column(name = "target_yyyy")
private Integer targetYyyy;
@Column(name = "model_uid")
private Long modelUid;
@Size(max = 100)
@Column(name = "server_ids", length = 100)
private String serverIds;
@Column(name = "anal_map_sheet", length = Integer.MAX_VALUE)
private String analMapSheet;
@Column(name = "anal_strt_dttm")
private ZonedDateTime analStrtDttm;
@Column(name = "anal_end_dttm")
private ZonedDateTime analEndDttm;
@Column(name = "anal_sec")
private Long analSec;
@Column(name = "anal_pred_sec")
private Long analPredSec;
@Size(max = 20)
@Column(name = "anal_state", length = 20)
private String analState;
@Size(max = 20)
@Column(name = "gukyuin_used", length = 20)
private String gukyuinUsed;
@Column(name = "accuracy")
private Double accuracy;
@Size(max = 255)
@Column(name = "result_url")
private String resultUrl;
@ColumnDefault("now()")
@Column(name = "created_dttm")
private ZonedDateTime createdDttm;
@Column(name = "created_uid")
private Long createdUid;
@ColumnDefault("now()")
@Column(name = "updated_dttm")
private ZonedDateTime updatedDttm;
@Column(name = "updated_uid")
private Long updatedUid;
@Size(max = 255)
@Column(name = "anal_title")
private String analTitle;
@Column(name = "detecting_cnt")
private Long detectingCnt;
}

View File

@@ -0,0 +1,70 @@
package com.kamco.cd.kamcoback.postgres.entity;
import com.kamco.cd.kamcoback.code.dto.CommonCodeDto;
import com.kamco.cd.kamcoback.inference.dto.InferenceResultDto;
import jakarta.persistence.Column;
import jakarta.persistence.EmbeddedId;
import jakarta.persistence.Entity;
import jakarta.persistence.Table;
import jakarta.validation.constraints.NotNull;
import java.time.ZonedDateTime;
import lombok.Getter;
import lombok.Setter;
import org.hibernate.annotations.ColumnDefault;
@Getter
@Setter
@Entity
@Table(name = "tb_map_sheet_anal_sttc")
public class MapSheetAnalSttcEntity {
@EmbeddedId
private MapSheetAnalSttcEntityId id;
@Column(name = "class_before_cnt")
private Long classBeforeCnt;
@Column(name = "class_after_cnt")
private Long classAfterCnt;
@ColumnDefault("now()")
@Column(name = "created_dttm")
private ZonedDateTime createdDttm;
@Column(name = "created_uid")
private Long createdUid;
@ColumnDefault("now()")
@Column(name = "updated_dttm")
private ZonedDateTime updatedDttm;
@Column(name = "updated_uid", length = Integer.MAX_VALUE)
private Long updatedUid;
@NotNull
@Column(name = "ref_map_sheet_num", nullable = false)
private Long refMapSheetNum;
@NotNull
@Column(name = "data_uid", nullable = false)
private Long dataUid;
public InferenceResultDto.Dashboard toDto() {
return new InferenceResultDto.Dashboard(
id.getCompareYyyy(),
id.getTargetYyyy(),
id.getMapSheetNum(),
id.getClassBeforeName(),
id.getClassAfterName(),
this.classBeforeCnt,
this.classAfterCnt,
this.createdDttm,
this.createdUid,
this.updatedDttm,
this.updatedUid,
this.refMapSheetNum,
this.dataUid
);
}
}

View File

@@ -0,0 +1,62 @@
package com.kamco.cd.kamcoback.postgres.entity;
import jakarta.persistence.Column;
import jakarta.persistence.Embeddable;
import jakarta.validation.constraints.NotNull;
import jakarta.validation.constraints.Size;
import java.io.Serializable;
import java.util.Objects;
import lombok.Getter;
import lombok.Setter;
import org.hibernate.Hibernate;
@Getter
@Setter
@Embeddable
public class MapSheetAnalSttcEntityId implements Serializable {
private static final long serialVersionUID = -8630519290255405042L;
@NotNull
@Column(name = "compare_yyyy", nullable = false)
private Integer compareYyyy;
@NotNull
@Column(name = "target_yyyy", nullable = false)
private Integer targetYyyy;
@NotNull
@Column(name = "map_sheet_num", nullable = false)
private Long mapSheetNum;
@Size(max = 64)
@NotNull
@Column(name = "class_before_name", nullable = false, length = 64)
private String classBeforeName;
@Size(max = 64)
@NotNull
@Column(name = "class_after_name", nullable = false, length = 64)
private String classAfterName;
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || Hibernate.getClass(this) != Hibernate.getClass(o)) {
return false;
}
MapSheetAnalSttcEntityId entity = (MapSheetAnalSttcEntityId) o;
return Objects.equals(this.targetYyyy, entity.targetYyyy) &&
Objects.equals(this.classBeforeName, entity.classBeforeName) &&
Objects.equals(this.classAfterName, entity.classAfterName) &&
Objects.equals(this.compareYyyy, entity.compareYyyy) &&
Objects.equals(this.mapSheetNum, entity.mapSheetNum);
}
@Override
public int hashCode() {
return Objects.hash(targetYyyy, classBeforeName, classAfterName, compareYyyy, mapSheetNum);
}
}

View File

@@ -0,0 +1,108 @@
package com.kamco.cd.kamcoback.postgres.entity;
import jakarta.persistence.Column;
import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import jakarta.persistence.Table;
import jakarta.validation.constraints.Size;
import java.time.ZonedDateTime;
import java.util.Map;
import lombok.Getter;
import lombok.Setter;
import org.hibernate.annotations.ColumnDefault;
import org.hibernate.annotations.JdbcTypeCode;
import org.hibernate.type.SqlTypes;
@Getter
@Setter
@Entity
@Table(name = "tb_map_sheet_learn_data")
public class MapSheetLearnDataEntity {
@Id
@Column(name = "data_uid", nullable = false)
private Long id;
@Column(name = "anal_end_dttm", columnDefinition = "TIMESTAMP WITH TIME ZONE")
private ZonedDateTime analEndDttm;
@Size(max = 255)
@Column(name = "anal_map_sheet")
private String analMapSheet;
@Column(name = "anal_sec")
private Long analSec;
@Size(max = 20)
@Column(name = "anal_state", length = 20)
private String analState;
@Column(name = "anal_strt_dttm", columnDefinition = "TIMESTAMP WITH TIME ZONE")
private ZonedDateTime analStrtDttm;
@Column(name = "compare_yyyy")
private Integer compareYyyy;
@ColumnDefault("now()")
@Column(name = "created_dttm", columnDefinition = "TIMESTAMP WITH TIME ZONE DEFAULT now()")
private ZonedDateTime createdDttm;
@Column(name = "created_uid")
private Long createdUid;
@Size(max = 128)
@Column(name = "data_crs_type", length = 128)
private String dataCrsType;
@Size(max = 255)
@Column(name = "data_crs_type_name")
private String dataCrsTypeName;
@Column(name = "data_json")
@JdbcTypeCode(SqlTypes.JSON)
private Map<String, Object> dataJson;
@Size(max = 128)
@Column(name = "data_name", length = 128)
private String dataName;
@Size(max = 255)
@Column(name = "data_path")
private String dataPath;
@Size(max = 20)
@Column(name = "data_state", length = 20)
private String dataState;
@ColumnDefault("now()")
@Column(name = "data_state_dttm", columnDefinition = "TIMESTAMP WITH TIME ZONE DEFAULT now()")
private ZonedDateTime dataStateDttm;
@Size(max = 255)
@Column(name = "data_title")
private String dataTitle;
@Size(max = 128)
@Column(name = "data_type", length = 128)
private String dataType;
@Column(name = "data_yyyy")
private Integer dataYyyy;
@Size(max = 20)
@Column(name = "gukuin_used", length = 20)
private String gukuinUsed;
@Column(name = "gukuin_used_dttm", columnDefinition = "TIMESTAMP WITH TIME ZONE")
private ZonedDateTime gukuinUsedDttm;
@ColumnDefault("now()")
@Column(name = "updated_dttm", columnDefinition = "TIMESTAMP WITH TIME ZONE DEFAULT now()")
private ZonedDateTime updatedDttm;
@Column(name = "updated_uid")
private Long updatedUid;
}

View File

@@ -0,0 +1,79 @@
package com.kamco.cd.kamcoback.postgres.entity;
import jakarta.persistence.Column;
import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import jakarta.persistence.Table;
import jakarta.validation.constraints.Size;
import java.time.ZonedDateTime;
import lombok.Getter;
import lombok.Setter;
import org.locationtech.jts.geom.Geometry;
@Getter
@Setter
@Entity
@Table(name = "tb_map_sheet_learn_data_geom")
public class MapSheetLearnDataGeomEntity {
@Id
@Column(name = "geo_uid", nullable = false)
private Long id;
@Column(name = "after_yyyy")
private Integer afterYyyy;
@Column(name = "area")
private Double area;
@Column(name = "before_yyyy")
private Integer beforeYyyy;
@Column(name = "cd_prob")
private Double cdProb;
@Size(max = 100)
@Column(name = "class_after_name", length = 100)
private String classAfterName;
@Column(name = "class_after_prob")
private Double classAfterProb;
@Size(max = 100)
@Column(name = "class_before_name", length = 100)
private String classBeforeName;
@Column(name = "class_before_prob")
private Double classBeforeProb;
@Column(name = "created_dttm", columnDefinition = "TIMESTAMP WITH TIME ZONE")
private ZonedDateTime createdDttm;
@Column(name = "created_uid")
private Long createdUid;
private Long dataUid;
@Size(max = 100)
@Column(name = "geo_type", length = 100)
private String geoType;
@Column(name = "geom")
private Geometry geom;
@Column(name = "map_sheet_num")
private Long mapSheetNum;
@Column(name = "updated_dttm", columnDefinition = "TIMESTAMP WITH TIME ZONE")
private ZonedDateTime updatedDttm;
@Column(name = "updated_uid")
private Long updatedUid;
}

View File

@@ -0,0 +1,41 @@
package com.kamco.cd.kamcoback.postgres.entity;
import com.kamco.cd.kamcoback.postgres.CommonDateEntity;
import jakarta.persistence.*;
import lombok.Getter;
import lombok.Setter;
import java.time.ZonedDateTime;
@Getter
@Setter
@Entity
@Table(name = "tb_model_deploy_hst")
public class ModelDeployHstEntity extends CommonDateEntity {
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "tb_model_deploy_hst_id_gen")
@SequenceGenerator(name = "tb_model_deploy_hst_id_gen", sequenceName = "tb_model_deploy_hst_deploy_uid", allocationSize = 1)
@Column(name = "deploy_uid", nullable = false)
private Long id;
@Column(name = "model_uid")
private Long modelUid;
@Column(name = "server_id")
private Long serverId;
@Column(name = "deploy_state")
private String deployState;
@Column(name = "deploy_dttm", columnDefinition = "TIMESTAMP WITH TIME ZONE")
private ZonedDateTime deployDttm;
@Column(name = "created_uid")
private Long createdUid;
@Column(name = "updated_uid")
private Long updatedUid;
@Column(name = "model_ver_uid")
private Long modelVerUid;
}

View File

@@ -0,0 +1,72 @@
package com.kamco.cd.kamcoback.postgres.entity;
import com.kamco.cd.kamcoback.model.dto.ModelMngDto;
import com.kamco.cd.kamcoback.postgres.CommonDateEntity;
import jakarta.persistence.*;
import jakarta.validation.constraints.Size;
import lombok.Getter;
import lombok.Setter;
import org.hibernate.annotations.ColumnDefault;
import java.time.ZonedDateTime;
@Getter
@Setter
@Entity
@Table(name = "tb_model_mng")
public class ModelMngEntity extends CommonDateEntity {
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "tb_model_mng_id_gen")
@SequenceGenerator(name = "tb_model_mng_id_gen", sequenceName = "tb_model_mng_model_uid", allocationSize = 1)
@Column(name = "model_uid", nullable = false)
private Long id;
@Size(max = 100)
@ColumnDefault("'NULL::character varying'")
@Column(name = "model_nm", length = 100)
private String modelNm;
@Size(max = 64)
@ColumnDefault("'NULL::character varying'")
@Column(name = "model_cate", length = 64)
private String modelCate;
@Size(max = 255)
@ColumnDefault("'NULL::character varying'")
@Column(name = "model_path")
private String modelPath;
@Column(name = "created_uid")
private Long createdUid;
@Column(name = "updated_uid")
private Long updatedUid;
@Column(name = "model_cntnt", columnDefinition = "TEXT")
private String modelCntnt;
public ModelMngEntity(String modelNm, String modelCate, String modelPath,
Long createdUid, Long updatedUid, String modelCntnt) {
this.modelNm = modelNm;
this.modelCate = modelCate;
this.modelPath = modelPath;
this.modelCntnt = modelCntnt;
this.createdUid = createdUid;
this.updatedUid = updatedUid;
}
public ModelMngDto.Basic toDto() {
return new ModelMngDto.Basic(
this.id,
this.modelNm,
this.modelCate,
this.modelPath,
super.getCreatedDate(),
this.createdUid,
super.getModifiedDate(),
this.updatedUid,
this.modelCntnt);
}
}

View File

@@ -0,0 +1,125 @@
package com.kamco.cd.kamcoback.postgres.entity;
import com.kamco.cd.kamcoback.model.dto.ModelMngDto;
import com.kamco.cd.kamcoback.model.dto.ModelVerDto;
import com.kamco.cd.kamcoback.postgres.CommonDateEntity;
import jakarta.persistence.*;
import jakarta.validation.constraints.NotNull;
import jakarta.validation.constraints.Size;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
@Getter
@Setter
@Entity
@Table(name = "tb_model_ver")
@NoArgsConstructor
public class ModelVerEntity extends CommonDateEntity {
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "tb_model_ver_id_gen")
@SequenceGenerator(name = "tb_model_ver_id_gen", sequenceName = "tb_model_ver_model_ver_uid", allocationSize = 1)
@Column(name = "model_ver_uid", nullable = false)
private Long id;
@NotNull
@Column(name = "model_uid", nullable = false)
private Long modelUid;
@Size(max = 64)
@Column(name = "model_cate", length = 64)
private String modelCate;
@Size(max = 64)
@Column(name = "model_ver", length = 64)
private String modelVer;
@Size(max = 20)
@Column(name = "used_state", length = 20)
private String usedState;
@Size(max = 20)
@Column(name = "model_state", length = 20)
private String modelState;
@Column(name = "quality_prob")
private Double qualityProb;
@Size(max = 20)
@Column(name = "deploy_state", length = 20)
private String deployState;
@Size(max = 255)
@Column(name = "model_path")
private String modelPath;
@Column(name = "created_uid")
private Long createdUid;
@Column(name = "updated_uid")
private Long updatedUid;
private Boolean deleted = false;
public ModelVerEntity(Long id, Long modelUid, String modelCate, String modelVer, String usedState, String modelState,
Double qualityProb, String deployState, String modelPath, Long createdUid, Long updatedUid, Boolean deleted) {
this.id = id;
this.modelUid = modelUid;
this.modelCate = modelCate;
this.modelVer = modelVer;
this.usedState = usedState;
this.modelState = modelState;
this.qualityProb = qualityProb;
this.deployState = deployState;
this.modelPath = modelPath;
this.createdUid = createdUid;
this.updatedUid = updatedUid;
this.deleted = deleted;
}
public ModelVerEntity(Long modelUid, String modelCate, String modelVer, String usedState, String modelState,
Double qualityProb, String deployState, String modelPath, Long createdUid, Long updatedUid) {
this.modelUid = modelUid;
this.modelCate = modelCate;
this.modelVer = modelVer;
this.usedState = usedState;
this.modelState = modelState;
this.qualityProb = qualityProb;
this.deployState = deployState;
this.modelPath = modelPath;
this.createdUid = createdUid;
this.updatedUid = updatedUid;
}
public ModelVerDto.Basic toDto() {
return new ModelVerDto.Basic(
this.id,
this.modelUid,
this.modelCate,
this.modelVer,
this.usedState,
this.modelState,
this.qualityProb,
this.deployState,
this.modelPath,
super.getCreatedDate(),
this.createdUid,
super.getModifiedDate(),
this.updatedUid);
}
public void update(ModelMngDto.AddReq addReq) {
this.modelCate = addReq.getModelCate();
this.modelVer = addReq.getModelVer();
this.modelPath = addReq.getModelPath();
}
public Boolean isDeleted() {
return deleted;
}
public void deleted(){
this.deleted = true;
}
}

View File

@@ -0,0 +1,8 @@
package com.kamco.cd.kamcoback.postgres.repository.Inference;
import com.kamco.cd.kamcoback.postgres.entity.MapSheetAnalEntity;
import org.springframework.data.jpa.repository.JpaRepository;
public interface InferenceResultRepository extends JpaRepository<MapSheetAnalEntity, Long>, InferenceResultRepositoryCustom {
}

View File

@@ -0,0 +1,17 @@
package com.kamco.cd.kamcoback.postgres.repository.Inference;
import com.kamco.cd.kamcoback.inference.dto.InferenceResultDto;
import com.kamco.cd.kamcoback.postgres.entity.MapSheetAnalSttcEntity;
import java.util.List;
import java.util.Optional;
import org.springframework.data.domain.Page;
public interface InferenceResultRepositoryCustom {
Page<InferenceResultDto.AnalResList> getInferenceResultList(InferenceResultDto.SearchReq searchReq);
Optional<InferenceResultDto.AnalResSummary> getInferenceResultSummary(Long id);
List<MapSheetAnalSttcEntity> getInferenceResultDashboard(Long id);
Page<InferenceResultDto.Geom> getInferenceGeomList(InferenceResultDto.SearchGeoReq searchGeoReq);
}

View File

@@ -0,0 +1,205 @@
package com.kamco.cd.kamcoback.postgres.repository.Inference;
import com.kamco.cd.kamcoback.inference.dto.InferenceResultDto;
import com.kamco.cd.kamcoback.inference.dto.InferenceResultDto.SearchGeoReq;
import com.kamco.cd.kamcoback.postgres.entity.MapSheetAnalDataGeomEntity;
import com.kamco.cd.kamcoback.postgres.entity.MapSheetAnalSttcEntity;
import com.kamco.cd.kamcoback.postgres.entity.QMapSheetAnalDataGeomEntity;
import com.kamco.cd.kamcoback.postgres.entity.QMapSheetAnalEntity;
import com.kamco.cd.kamcoback.postgres.entity.QMapSheetAnalSttcEntity;
import com.kamco.cd.kamcoback.postgres.entity.QModelMngEntity;
import com.kamco.cd.kamcoback.postgres.entity.QModelVerEntity;
import com.querydsl.core.types.Projections;
import com.querydsl.core.types.dsl.Expressions;
import com.querydsl.jpa.JPAExpressions;
import com.querydsl.jpa.JPQLQuery;
import com.querydsl.jpa.impl.JPAQueryFactory;
import java.util.List;
import java.util.Optional;
import lombok.RequiredArgsConstructor;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.Pageable;
import org.springframework.stereotype.Repository;
import com.querydsl.core.BooleanBuilder;
@Repository
@RequiredArgsConstructor
public class InferenceResultRepositoryImpl implements InferenceResultRepositoryCustom {
private final JPAQueryFactory queryFactory;
private final QMapSheetAnalEntity mapSheetAnal = QMapSheetAnalEntity.mapSheetAnalEntity;
private final QModelMngEntity tmm = QModelMngEntity.modelMngEntity;
private final QModelVerEntity tmv = QModelVerEntity.modelVerEntity;
private final QMapSheetAnalSttcEntity mapSheetAnalSttc = QMapSheetAnalSttcEntity.mapSheetAnalSttcEntity;
private final QMapSheetAnalDataGeomEntity mapSheetAnalDataGeom = QMapSheetAnalDataGeomEntity.mapSheetAnalDataGeomEntity;
/**
* Retrieve the paged list of analysis results.
* @param searchReq search conditions (state code, title)
* @return paged analysis result list
*/
@Override
public Page<InferenceResultDto.AnalResList> getInferenceResultList(InferenceResultDto.SearchReq searchReq) {
Pageable pageable = searchReq.toPageable();
// "0000" means retrieve all states
BooleanBuilder builder = new BooleanBuilder();
if (searchReq.getStatCode() != null && !"0000".equals(searchReq.getStatCode())) {
builder.and(mapSheetAnal.analState.eq(searchReq.getStatCode()));
}
// Title filter
if (searchReq.getTitle() != null) {
builder.and(mapSheetAnal.analTitle.like("%" + searchReq.getTitle() + "%"));
}
List<InferenceResultDto.AnalResList> content = queryFactory
.select(Projections.constructor(InferenceResultDto.AnalResList.class,
mapSheetAnal.id,
mapSheetAnal.analTitle,
mapSheetAnal.analMapSheet,
mapSheetAnal.detectingCnt,
mapSheetAnal.analStrtDttm,
mapSheetAnal.analEndDttm,
mapSheetAnal.analSec,
mapSheetAnal.analPredSec,
mapSheetAnal.analState,
Expressions.stringTemplate("fn_code_name({0}, {1})", "0002", mapSheetAnal.analState),
mapSheetAnal.gukyuinUsed
))
.from(mapSheetAnal)
.where(
builder
)
.offset(pageable.getOffset())
.limit(pageable.getPageSize())
.orderBy(mapSheetAnal.createdDttm.desc())
.fetch();
Long total = queryFactory
.select(mapSheetAnal.id.count())
.from(mapSheetAnal)
.where(builder)
.fetchOne();
return new PageImpl<>(content, pageable, total == null ? 0L : total);
}
/**
* Retrieve the analysis result summary.
* @param id analysis UID
* @return summary, if present
*/
@Override
public Optional<InferenceResultDto.AnalResSummary> getInferenceResultSummary(Long id) {
// 1. Subquery that fetches the latest version UID
JPQLQuery<Long> latestVerUidSub = JPAExpressions
.select(tmv.id.max())
.from(tmv)
.where(tmv.modelUid.eq(tmm.id));
Optional<InferenceResultDto.AnalResSummary> content = Optional.ofNullable(queryFactory
.select(Projections.constructor(InferenceResultDto.AnalResSummary.class,
mapSheetAnal.id,
tmm.modelNm.concat(" ").concat(tmv.modelVer).as("modelInfo"),
mapSheetAnal.targetYyyy,
mapSheetAnal.compareYyyy,
mapSheetAnal.analMapSheet,
mapSheetAnal.analStrtDttm,
mapSheetAnal.analEndDttm,
mapSheetAnal.analSec,
mapSheetAnal.analPredSec,
mapSheetAnal.resultUrl,
mapSheetAnal.detectingCnt,
mapSheetAnal.accuracy,
mapSheetAnal.analState,
Expressions.stringTemplate("fn_code_name({0}, {1})", "0002", mapSheetAnal.analState)
))
.from(mapSheetAnal)
.leftJoin(tmm).on(mapSheetAnal.modelUid.eq(tmm.id))
.leftJoin(tmv).on(
tmv.modelUid.eq(tmm.id)
.and(tmv.id.eq(latestVerUidSub))
)
.where(mapSheetAnal.id.eq(id))
.fetchOne()
);
return content;
}
/**
* Retrieve dashboard statistics for an analysis result.
* @param id analysis UID
* @return statistics rows
*/
@Override
public List<MapSheetAnalSttcEntity> getInferenceResultDashboard(Long id) {
return queryFactory
.select(mapSheetAnalSttc)
.from(mapSheetAnalSttc)
.where(mapSheetAnalSttc.dataUid.eq(id))
.fetch();
}
/**
* Retrieve the detailed geometry list of an analysis result.
* @param searchGeoReq search conditions
* @return paged geometry list
*/
@Override
public Page<InferenceResultDto.Geom> getInferenceGeomList(SearchGeoReq searchGeoReq) {
Pageable pageable = searchGeoReq.toPageable();
BooleanBuilder builder = new BooleanBuilder();
// Target-year classification
if(searchGeoReq.getTargetClass() != null && !searchGeoReq.getTargetClass().equals("")){
builder.and(mapSheetAnalDataGeom.classAfterCd.eq(searchGeoReq.getTargetClass()));
}
// Compare-year classification
if(searchGeoReq.getCompareClass() != null && !searchGeoReq.getCompareClass().equals("")){
builder.and(mapSheetAnalDataGeom.classBeforeCd.eq(searchGeoReq.getCompareClass()));
}
// Analysis map sheets
if(searchGeoReq.getMapSheetNum() != null && !searchGeoReq.getMapSheetNum().isEmpty()){
List<Long> mapSheetNum = searchGeoReq.getMapSheetNum();
builder.and(mapSheetAnalDataGeom.mapSheetNum.in(mapSheetNum));
}
List<InferenceResultDto.Geom> content = queryFactory
.select(Projections.constructor(InferenceResultDto.Geom.class,
mapSheetAnalDataGeom.compareYyyy,
mapSheetAnalDataGeom.targetYyyy,
mapSheetAnalDataGeom.classBeforeCd,
Expressions.stringTemplate("fn_code_name({0}, {1})", "0000", mapSheetAnalDataGeom.classBeforeCd),
mapSheetAnalDataGeom.classBeforeProb,
mapSheetAnalDataGeom.classAfterCd,
Expressions.stringTemplate("fn_code_name({0}, {1})", "0000", mapSheetAnalDataGeom.classAfterCd),
mapSheetAnalDataGeom.classAfterProb,
mapSheetAnalDataGeom.mapSheetNum))
.from(mapSheetAnalDataGeom)
.where(builder)
.offset(pageable.getOffset())
.limit(pageable.getPageSize())
.fetch();
Long total = queryFactory
.select(mapSheetAnalDataGeom.id.count())
.from(mapSheetAnalDataGeom)
.where(builder)
.fetchOne();
return new PageImpl<>(content, pageable, total == null ? 0L : total);
}
}
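The dynamic where-clauses above accumulate optional filters in a `BooleanBuilder`: each search field contributes a predicate only when it is present, and absent fields leave the query unconstrained. A minimal plain-Java analogy of that pattern (not QueryDSL; the `Row` record and sample values are hypothetical):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

public class FilterSketch {
    // Hypothetical stand-in for a result row
    record Row(String state, String title) {}

    public static void main(String[] args) {
        String statCode = "0001"; // hypothetical search input; "0000" would mean "all"
        String title = null;      // no title filter supplied

        // An "empty builder" matches everything; filters are and-ed in only when set.
        Predicate<Row> filter = r -> true;
        if (statCode != null && !"0000".equals(statCode)) {
            filter = filter.and(r -> statCode.equals(r.state()));
        }
        if (title != null) {
            filter = filter.and(r -> r.title().contains(title));
        }

        List<Row> rows = List.of(new Row("0001", "a"), new Row("0002", "b"));
        List<Row> result = new ArrayList<>();
        for (Row r : rows) if (filter.test(r)) result.add(r);
        System.out.println(result.size()); // 1
    }
}
```

Only the first row passes the state filter, so the count is 1; the unset title filter imposes no constraint.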

View File

@@ -0,0 +1,36 @@
package com.kamco.cd.kamcoback.postgres.repository;
import com.kamco.cd.kamcoback.postgres.entity.MapSheetLearnDataGeomEntity;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Repository;
import java.util.List;
@Repository
public interface MapSheetLearnDataGeomRepository extends JpaRepository<MapSheetLearnDataGeomEntity, Long> {
/**
* Find geometry records by data UID.
*/
List<MapSheetLearnDataGeomEntity> findByDataUid(Long dataUid);
/**
* Find geometry records by map sheet number.
*/
List<MapSheetLearnDataGeomEntity> findByMapSheetNum(Long mapSheetNum);
/**
* Find geometry records by year range.
*/
List<MapSheetLearnDataGeomEntity> findByBeforeYyyyAndAfterYyyy(Integer beforeYyyy, Integer afterYyyy);
/**
* Find geometry records by geometry type.
*/
List<MapSheetLearnDataGeomEntity> findByGeoType(String geoType);
/**
* Delete existing geometry rows for a data UID (used before regeneration).
*/
void deleteByDataUid(Long dataUid);
}

View File

@@ -0,0 +1,47 @@
package com.kamco.cd.kamcoback.postgres.repository;
import com.kamco.cd.kamcoback.postgres.entity.MapSheetLearnDataEntity;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Repository;
import java.util.List;
import java.util.Optional;
@Repository
public interface MapSheetLearnDataRepository extends JpaRepository<MapSheetLearnDataEntity, Long> {
/**
* Find by data name.
*/
Optional<MapSheetLearnDataEntity> findByDataName(String dataName);
/**
* Find by data path.
*/
Optional<MapSheetLearnDataEntity> findByDataPath(String dataPath);
/**
* Find by processing state.
*/
List<MapSheetLearnDataEntity> findByDataState(String dataState);
/**
* Find by data type.
*/
List<MapSheetLearnDataEntity> findByDataType(String dataType);
/**
* Find by analysis state.
*/
List<MapSheetLearnDataEntity> findByAnalState(String analState);
/**
* Count by analysis state.
*/
long countByAnalState(String analState);
/**
* Find unprocessed data (rows whose data_state is 'PENDING' or null).
*/
List<MapSheetLearnDataEntity> findByDataStateIsNullOrDataState(String dataState);
}

View File

@@ -0,0 +1,6 @@
package com.kamco.cd.kamcoback.postgres.repository.changedetection;
import com.kamco.cd.kamcoback.postgres.entity.MapSheetAnalDataGeomEntity;
import org.springframework.data.jpa.repository.JpaRepository;
public interface ChangeDetectionRepository extends JpaRepository<MapSheetAnalDataGeomEntity, Long>, ChangeDetectionRepositoryCustom {}

View File

@@ -0,0 +1,9 @@
package com.kamco.cd.kamcoback.postgres.repository.changedetection;
public interface ChangeDetectionRepositoryCustom {
String getPolygonToPoint();
}

View File

@@ -0,0 +1,33 @@
package com.kamco.cd.kamcoback.postgres.repository.changedetection;
import com.kamco.cd.kamcoback.postgres.entity.MapSheetAnalDataGeomEntity;
import com.querydsl.jpa.impl.JPAQueryFactory;
import org.springframework.data.jpa.repository.support.QuerydslRepositorySupport;
import java.util.List;
import static com.kamco.cd.kamcoback.postgres.entity.QMapSheetAnalDataGeomEntity.mapSheetAnalDataGeomEntity;
public class ChangeDetectionRepositoryImpl extends QuerydslRepositorySupport
implements ChangeDetectionRepositoryCustom {
private final JPAQueryFactory queryFactory;
public ChangeDetectionRepositoryImpl(JPAQueryFactory queryFactory) {
super(MapSheetAnalDataGeomEntity.class);
this.queryFactory = queryFactory;
}
@Override
public String getPolygonToPoint() {
return null;
}
public List<MapSheetAnalDataGeomEntity> findAll() {
return queryFactory
.selectFrom(mapSheetAnalDataGeomEntity)
.orderBy(mapSheetAnalDataGeomEntity.id.desc())
.fetch();
}
}

View File

@@ -1,4 +1,4 @@
package com.kamco.cd.kamcoback.postgres.repository;
package com.kamco.cd.kamcoback.postgres.repository.code;
import com.kamco.cd.kamcoback.postgres.entity.CommonCodeEntity;
import org.springframework.data.jpa.repository.JpaRepository;

View File

@@ -1,4 +1,4 @@
package com.kamco.cd.kamcoback.postgres.repository;
package com.kamco.cd.kamcoback.postgres.repository.code;
import com.kamco.cd.kamcoback.code.dto.CommonCodeDto;
import com.kamco.cd.kamcoback.postgres.entity.CommonCodeEntity;
@@ -13,4 +13,6 @@ public interface CommonCodeRepositoryCustom {
List<CommonCodeEntity> findByAll();
void updateOrder(CommonCodeDto.OrderReq req);
Optional<String> getCode(String parentCodeCd, String childCodeCd);
}

View File

@@ -1,4 +1,4 @@
package com.kamco.cd.kamcoback.postgres.repository;
package com.kamco.cd.kamcoback.postgres.repository.code;
import static com.kamco.cd.kamcoback.postgres.entity.QCommonCodeEntity.commonCodeEntity;
@@ -83,6 +83,22 @@ public class CommonCodeRepositoryImpl extends QuerydslRepositorySupport
});
}
@Override
public Optional<String> getCode(String parentCodeCd, String childCodeCd) {
QCommonCodeEntity parent = QCommonCodeEntity.commonCodeEntity;
QCommonCodeEntity child = new QCommonCodeEntity("child");
String result = queryFactory
.select(child.name)
.from(child)
.join(child.parent, parent)
.where(parent.code.eq(parentCodeCd)
.and(child.code.eq(childCodeCd)))
.fetchFirst(); // single result only
return Optional.ofNullable(result);
}
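`getCode` resolves a child code's display name under a given parent code and wraps the possibly-null query result in an `Optional`. The same shape in plain Java, over a hypothetical in-memory code table (the `"0002"` group and its entries are illustrative only):

```java
import java.util.Map;
import java.util.Optional;

public class CodeLookupSketch {
    // Hypothetical two-level code table: parent code -> (child code -> name)
    static final Map<String, Map<String, String>> CODES = Map.of(
        "0002", Map.of("RUN", "Running", "DONE", "Completed"));

    // Mirrors the repository method: empty Optional when either level is missing.
    static Optional<String> getCode(String parentCd, String childCd) {
        return Optional.ofNullable(CODES.get(parentCd))
                       .map(children -> children.get(childCd));
    }

    public static void main(String[] args) {
        System.out.println(getCode("0002", "DONE").orElse("n/a")); // Completed
        System.out.println(getCode("0002", "XXX").orElse("n/a"));  // n/a
    }
}
```

As in the QueryDSL version, `Optional.ofNullable` keeps a missing code from surfacing as a `NullPointerException` at the call site.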
private List<CommonCodeEntity> findAllByIds(Set<Long> ids) {
return queryFactory
.selectFrom(commonCodeEntity)

View File

@@ -6,21 +6,21 @@ import org.springframework.data.domain.Page;
public interface AuditLogRepositoryCustom {
Page<AuditLogDto.AuditList> findLogByDaily(
AuditLogDto.DailySearchReq searchReq, LocalDate startDate, LocalDate endDate);
Page<AuditLogDto.DailyAuditList> findLogByDaily(
AuditLogDto.searchReq searchReq, LocalDate startDate, LocalDate endDate);
Page<AuditLogDto.AuditList> findLogByMenu(
AuditLogDto.MenuUserSearchReq searchReq, String searchValue);
Page<AuditLogDto.MenuAuditList> findLogByMenu(
AuditLogDto.searchReq searchReq, String searchValue);
Page<AuditLogDto.AuditList> findLogByAccount(
AuditLogDto.MenuUserSearchReq searchReq, String searchValue);
Page<AuditLogDto.UserAuditList> findLogByAccount(
AuditLogDto.searchReq searchReq, String searchValue);
Page<AuditLogDto.AuditDetail> findLogByDailyResult(
AuditLogDto.DailySearchReq searchReq, LocalDate logDate);
Page<AuditLogDto.DailyDetail> findLogByDailyResult(
AuditLogDto.searchReq searchReq, LocalDate logDate);
Page<AuditLogDto.AuditDetail> findLogByMenuResult(
AuditLogDto.MenuUserSearchReq searchReq, String menuId);
Page<AuditLogDto.MenuDetail> findLogByMenuResult(
AuditLogDto.searchReq searchReq, String menuId);
Page<AuditLogDto.AuditDetail> findLogByAccountResult(
AuditLogDto.MenuUserSearchReq searchReq, Long accountId);
Page<AuditLogDto.UserDetail> findLogByAccountResult(
AuditLogDto.searchReq searchReq, Long accountId);
}

View File

@@ -36,24 +36,25 @@ public class AuditLogRepositoryImpl extends QuerydslRepositorySupport
}
@Override
public Page<AuditLogDto.AuditList> findLogByDaily(
AuditLogDto.DailySearchReq searchReq, LocalDate startDate, LocalDate endDate) {
DateTimeExpression<LocalDateTime> groupDateTime =
Expressions.dateTimeTemplate(
LocalDateTime.class, "date_trunc('day', {0})", auditLogEntity.createdDate);
public Page<AuditLogDto.DailyAuditList> findLogByDaily(
AuditLogDto.searchReq searchReq, LocalDate startDate, LocalDate endDate) {
StringExpression groupDateTime =
Expressions.stringTemplate("to_char({0}, 'YYYY-MM-DD')", auditLogEntity.createdDate);
Pageable pageable = searchReq.toPageable();
List<AuditLogDto.AuditList> foundContent =
List<AuditLogDto.DailyAuditList> foundContent =
queryFactory
.select(
Projections.constructor(
AuditLogDto.AuditList.class,
groupDateTime.as("baseDate"),
AuditLogDto.DailyAuditList.class,
readCount().as("readCount"),
cudCount().as("cudCount"),
printCount().as("printCount"),
downloadCount().as("downloadCount"),
auditLogEntity.count().as("totalCount")))
auditLogEntity.count().as("totalCount"),
groupDateTime.as("baseDate")
)
)
.from(auditLogEntity)
.where(eventEndedAtBetween(startDate, endDate))
.groupBy(groupDateTime)
@@ -73,14 +74,14 @@ public class AuditLogRepositoryImpl extends QuerydslRepositorySupport
}
@Override
public Page<AuditLogDto.AuditList> findLogByMenu(
AuditLogDto.MenuUserSearchReq searchReq, String searchValue) {
public Page<AuditLogDto.MenuAuditList> findLogByMenu(
AuditLogDto.searchReq searchReq, String searchValue) {
Pageable pageable = searchReq.toPageable();
List<AuditLogDto.AuditList> foundContent =
List<AuditLogDto.MenuAuditList> foundContent =
queryFactory
.select(
Projections.constructor(
AuditLogDto.AuditList.class,
AuditLogDto.MenuAuditList.class,
auditLogEntity.menuUid.as("menuId"),
menuEntity.menuNm.max().as("menuName"),
readCount().as("readCount"),
@@ -113,14 +114,14 @@ public class AuditLogRepositoryImpl extends QuerydslRepositorySupport
}
@Override
public Page<AuditLogDto.AuditList> findLogByAccount(
AuditLogDto.MenuUserSearchReq searchReq, String searchValue) {
public Page<AuditLogDto.UserAuditList> findLogByAccount(
AuditLogDto.searchReq searchReq, String searchValue) {
Pageable pageable = searchReq.toPageable();
List<AuditLogDto.AuditList> foundContent =
List<AuditLogDto.UserAuditList> foundContent =
queryFactory
.select(
Projections.constructor(
AuditLogDto.AuditList.class,
AuditLogDto.UserAuditList.class,
auditLogEntity.userUid.as("accountId"),
userEntity.userId.as("loginId"),
userEntity.userNm.as("username"),
@@ -152,8 +153,8 @@ public class AuditLogRepositoryImpl extends QuerydslRepositorySupport
}
@Override
public Page<AuditLogDto.AuditDetail> findLogByDailyResult(
AuditLogDto.DailySearchReq searchReq, LocalDate logDate) {
public Page<AuditLogDto.DailyDetail> findLogByDailyResult(
AuditLogDto.searchReq searchReq, LocalDate logDate) {
Pageable pageable = searchReq.toPageable();
QMenuEntity parent = new QMenuEntity("parent");
// 1depth menu name
@@ -170,11 +171,11 @@ public class AuditLogRepositoryImpl extends QuerydslRepositorySupport
.then(NULL_STRING)
.otherwise(menuEntity.menuNm);
List<AuditLogDto.AuditDetail> foundContent =
List<AuditLogDto.DailyDetail> foundContent =
queryFactory
.select(
Projections.constructor(
AuditLogDto.AuditDetail.class,
AuditLogDto.DailyDetail.class,
auditLogEntity.id.as("logId"),
userEntity.userNm.as("userName"),
userEntity.userId.as("loginId"),
@@ -217,8 +218,8 @@ public class AuditLogRepositoryImpl extends QuerydslRepositorySupport
}
@Override
public Page<AuditLogDto.AuditDetail> findLogByMenuResult(
AuditLogDto.MenuUserSearchReq searchReq, String menuUid) {
public Page<AuditLogDto.MenuDetail> findLogByMenuResult(
AuditLogDto.searchReq searchReq, String menuUid) {
Pageable pageable = searchReq.toPageable();
QMenuEntity parent = new QMenuEntity("parent");
// 1depth menu name
@@ -235,13 +236,13 @@ public class AuditLogRepositoryImpl extends QuerydslRepositorySupport
.then(NULL_STRING)
.otherwise(menuEntity.menuNm);
List<AuditLogDto.AuditDetail> foundContent =
List<AuditLogDto.MenuDetail> foundContent =
queryFactory
.select(
Projections.constructor(
AuditLogDto.AuditDetail.class,
AuditLogDto.MenuDetail.class,
auditLogEntity.id.as("logId"),
auditLogEntity.createdDate.as("logDateTime"),
Expressions.stringTemplate("to_char({0}, 'YYYY-MM-DD')", auditLogEntity.createdDate).as("logDateTime"),
userEntity.userNm.as("userName"),
userEntity.userId.as("loginId"),
auditLogEntity.eventType.as("eventType"),
@@ -282,8 +283,8 @@ public class AuditLogRepositoryImpl extends QuerydslRepositorySupport
}
@Override
public Page<AuditLogDto.AuditDetail> findLogByAccountResult(
AuditLogDto.MenuUserSearchReq searchReq, Long userUid) {
public Page<AuditLogDto.UserDetail> findLogByAccountResult(
AuditLogDto.searchReq searchReq, Long userUid) {
Pageable pageable = searchReq.toPageable();
QMenuEntity parent = new QMenuEntity("parent");
// 1depth menu name
@@ -300,13 +301,13 @@ public class AuditLogRepositoryImpl extends QuerydslRepositorySupport
.then(NULL_STRING)
.otherwise(menuEntity.menuNm);
List<AuditLogDto.AuditDetail> foundContent =
List<AuditLogDto.UserDetail> foundContent =
queryFactory
.select(
Projections.constructor(
AuditLogDto.AuditDetail.class,
AuditLogDto.UserDetail.class,
auditLogEntity.id.as("logId"),
auditLogEntity.createdDate.as("logDateTime"),
Expressions.stringTemplate("to_char({0}, 'YYYY-MM-DD')", auditLogEntity.createdDate).as("logDateTime"),
menuEntity.menuNm.as("menuName"),
auditLogEntity.eventType.as("eventType"),
Projections.constructor(
@@ -390,12 +391,11 @@ public class AuditLogRepositoryImpl extends QuerydslRepositorySupport
}
private BooleanExpression eventEndedAtEqDate(LocalDate logDate) {
DateTimeExpression<LocalDateTime> eventEndedDate =
Expressions.dateTimeTemplate(
LocalDateTime.class, "date_trunc('day', {0})", auditLogEntity.createdDate);
StringExpression eventEndedDate =
Expressions.stringTemplate("to_char({0}, 'YYYY-MM-DD')", auditLogEntity.createdDate);
LocalDateTime comparisonDate = logDate.atStartOfDay();
return eventEndedDate.eq(comparisonDate);
return eventEndedDate.eq(comparisonDate.toString());
}
private BooleanExpression menuUidEq(String menuUid) {

View File

@@ -0,0 +1,6 @@
package com.kamco.cd.kamcoback.postgres.repository.model;
import com.kamco.cd.kamcoback.postgres.entity.ModelMngEntity;
import org.springframework.data.jpa.repository.JpaRepository;
public interface ModelMngRepository extends JpaRepository<ModelMngEntity, Long>, ModelMngRepositoryCustom {}

View File

@@ -0,0 +1,18 @@
package com.kamco.cd.kamcoback.postgres.repository.model;
import com.kamco.cd.kamcoback.model.dto.ModelMngDto;
import com.kamco.cd.kamcoback.postgres.entity.ModelMngEntity;
import org.springframework.data.domain.Page;
import java.time.LocalDate;
import java.util.List;
import java.util.Optional;
public interface ModelMngRepositoryCustom {
List<ModelMngEntity> findModelMngAll();
Optional<ModelMngDto.FinalModelDto> getFinalModelInfo();
Page<ModelMngDto.ModelRegHistory> getRegHistoryList(ModelMngDto.searchReq searchReq, LocalDate startDate, LocalDate endDate, String searchVal);
}

View File

@@ -0,0 +1,141 @@
package com.kamco.cd.kamcoback.postgres.repository.model;
import com.kamco.cd.kamcoback.model.dto.ModelMngDto;
import com.kamco.cd.kamcoback.postgres.QuerydslOrderUtil;
import com.kamco.cd.kamcoback.postgres.entity.ModelMngEntity;
import com.kamco.cd.kamcoback.postgres.entity.ModelVerEntity;
import com.querydsl.core.types.Projections;
import com.querydsl.core.types.dsl.BooleanExpression;
import com.querydsl.core.types.dsl.Expressions;
import com.querydsl.core.types.dsl.StringExpression;
import com.querydsl.jpa.impl.JPAQueryFactory;
import io.micrometer.common.util.StringUtils;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.Pageable;
import org.springframework.data.jpa.repository.support.QuerydslRepositorySupport;
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.ZonedDateTime;
import java.util.List;
import java.util.Objects;
import java.util.Optional;
import static com.kamco.cd.kamcoback.postgres.entity.QModelDeployHstEntity.modelDeployHstEntity;
import static com.kamco.cd.kamcoback.postgres.entity.QModelMngEntity.modelMngEntity;
import static com.kamco.cd.kamcoback.postgres.entity.QModelVerEntity.modelVerEntity;
public class ModelMngRepositoryImpl extends QuerydslRepositorySupport
implements ModelMngRepositoryCustom {
private final JPAQueryFactory queryFactory;
private final StringExpression NULL_STRING = Expressions.stringTemplate("cast(null as text)");
public ModelMngRepositoryImpl(JPAQueryFactory queryFactory) {
super(ModelMngEntity.class);
this.queryFactory = queryFactory;
}
@Override
public List<ModelMngEntity> findModelMngAll() {
return queryFactory
.selectFrom(modelMngEntity)
.orderBy(modelMngEntity.id.desc())
.fetch();
}
@Override
public Optional<ModelMngDto.FinalModelDto> getFinalModelInfo(){
return queryFactory
.select(
Projections.constructor(
ModelMngDto.FinalModelDto.class,
modelMngEntity.id.as("modelUid"),
modelMngEntity.modelNm,
modelMngEntity.modelCate,
modelVerEntity.id.as("modelVerUid"),
modelVerEntity.modelVer,
modelVerEntity.usedState,
modelVerEntity.modelState,
modelVerEntity.qualityProb,
modelVerEntity.deployState,
modelVerEntity.modelPath
)
)
.from(modelMngEntity)
.innerJoin(modelVerEntity)
.on(modelMngEntity.id.eq(modelVerEntity.modelUid))
.where(modelVerEntity.usedState.eq("USED")) // among versions marked USED
.orderBy(modelVerEntity.modelVer.desc()) // highest version first
.limit(1)
.stream()
.findFirst();
}
@Override
public Page<ModelMngDto.ModelRegHistory> getRegHistoryList(ModelMngDto.searchReq searchReq, LocalDate startDate, LocalDate endDate, String searchVal) {
Pageable pageable = searchReq.toPageable();
List<ModelMngDto.ModelRegHistory> foundContent =
queryFactory
.select(
Projections.constructor(
ModelMngDto.ModelRegHistory.class,
modelMngEntity.modelNm,
modelMngEntity.modelCate,
modelVerEntity.modelVer,
Expressions.stringTemplate("to_char({0}, 'YYYY-MM-DD')", modelVerEntity.createdDate).as("createdDttm"),
modelVerEntity.usedState,
modelVerEntity.deployState,
Expressions.stringTemplate("to_char({0}, 'YYYY-MM-DD')", modelDeployHstEntity.deployDttm).as("deployDttm")
)
)
.from(modelMngEntity)
.innerJoin(modelVerEntity)
.on(modelMngEntity.id.eq(modelVerEntity.modelUid))
.leftJoin(modelDeployHstEntity)
.on(
modelVerEntity.id.eq(modelDeployHstEntity.modelVerUid)
.and(modelDeployHstEntity.serverId.eq(1L)) // restrict to server 1 so the join yields a single row
)
.where(
eventEndedAtBetween(startDate, endDate),
searchModelVerLike(searchVal)
)
.offset(pageable.getOffset())
.limit(pageable.getPageSize())
.orderBy(QuerydslOrderUtil.getOrderSpecifiers(pageable, ModelVerEntity.class, "modelVerEntity"))
.fetch();
Long countQuery =
queryFactory
.select(modelVerEntity.id.count())
.from(modelMngEntity)
.innerJoin(modelVerEntity)
.on(modelMngEntity.id.eq(modelVerEntity.modelUid))
.where(
eventEndedAtBetween(startDate, endDate),
searchModelVerLike(searchVal)
)
.fetchOne();
return new PageImpl<>(foundContent, pageable, countQuery);
}
private BooleanExpression eventEndedAtBetween(LocalDate startDate, LocalDate endDate) {
if (Objects.isNull(startDate) || Objects.isNull(endDate)) {
return null;
}
// LocalDateTime carries no zone, so ZonedDateTime.from(...) would throw DateTimeException at runtime;
// attach a zone explicitly instead.
ZonedDateTime startDateTime = startDate.atStartOfDay(java.time.ZoneId.systemDefault());
ZonedDateTime endDateTime = endDate.plusDays(1).atStartOfDay(java.time.ZoneId.systemDefault());
return modelMngEntity.createdDate.goe(startDateTime)
.and(modelMngEntity.modifiedDate.lt(endDateTime));
}
private BooleanExpression searchModelVerLike(String searchVal){
if (StringUtils.isBlank(searchVal)) {
return null;
}
return modelVerEntity.modelVer.contains(searchVal);
}
}
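The date-range helper above has to turn `LocalDate` bounds into `ZonedDateTime` values for comparison against the timestamptz-backed columns. `ZonedDateTime.from(aLocalDateTime)` cannot do this: a `LocalDateTime` carries no zone, so the conversion fails at runtime. A standalone sketch of the pitfall and the safe conversion (the `Asia/Seoul` zone is chosen only for illustration):

```java
import java.time.DateTimeException;
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class DateRangeSketch {
    public static void main(String[] args) {
        LocalDateTime start = LocalDate.of(2025, 1, 1).atStartOfDay();

        boolean threw = false;
        try {
            // Fails: a LocalDateTime has no zone for from() to extract.
            ZonedDateTime.from(start);
        } catch (DateTimeException e) {
            threw = true;
        }
        System.out.println(threw);

        // Safe: attach an explicit zone to the local date-time.
        ZonedDateTime zoned = start.atZone(ZoneId.of("Asia/Seoul"));
        System.out.println(zoned.toLocalDate());
    }
}
```

`LocalDate.atStartOfDay(ZoneId)` gives the same result in one step and is the shortest path from a date bound to a zoned timestamp.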

View File

@@ -0,0 +1,7 @@
package com.kamco.cd.kamcoback.postgres.repository.model;
import com.kamco.cd.kamcoback.postgres.entity.ModelVerEntity;
import org.springframework.data.jpa.repository.JpaRepository;
public interface ModelVerRepository extends JpaRepository<ModelVerEntity, Long>, ModelVerRepositoryCustom {}

View File

@@ -0,0 +1,13 @@
package com.kamco.cd.kamcoback.postgres.repository.model;
import com.kamco.cd.kamcoback.postgres.entity.ModelVerEntity;
import java.util.Optional;
public interface ModelVerRepositoryCustom {
Optional<ModelVerEntity> findModelVerById(Long id);
}

View File

@@ -0,0 +1,33 @@
package com.kamco.cd.kamcoback.postgres.repository.model;
import com.kamco.cd.kamcoback.postgres.entity.ModelVerEntity;
import com.querydsl.jpa.impl.JPAQueryFactory;
import org.springframework.data.jpa.repository.support.QuerydslRepositorySupport;

import java.util.Optional;

import static com.kamco.cd.kamcoback.postgres.entity.QModelVerEntity.modelVerEntity;

public class ModelVerRepositoryImpl extends QuerydslRepositorySupport
        implements ModelVerRepositoryCustom {

    private final JPAQueryFactory queryFactory;

    public ModelVerRepositoryImpl(JPAQueryFactory queryFactory) {
        // This repository queries ModelVerEntity; passing ModelMngEntity here was a copy-paste slip.
        super(ModelVerEntity.class);
        this.queryFactory = queryFactory;
    }

    @Override
    public Optional<ModelVerEntity> findModelVerById(Long id) {
        return Optional.ofNullable(queryFactory
            .selectFrom(modelVerEntity)
            .where(modelVerEntity.id.eq(id)) // model_ver_uid
            .fetchOne());
    }
}

View File

@@ -6,6 +6,7 @@ import com.kamco.cd.kamcoback.zoo.dto.AnimalDto.Basic;
import com.kamco.cd.kamcoback.zoo.dto.AnimalDto.Category;
import com.kamco.cd.kamcoback.zoo.dto.AnimalDto.Species;
import com.kamco.cd.kamcoback.zoo.service.AnimalService;
import io.swagger.v3.oas.annotations.Hidden;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.Parameter;
import io.swagger.v3.oas.annotations.media.Content;
@@ -24,6 +25,7 @@ import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
@Hidden
@Tag(name = "Animal", description = "동물 관리 API")
@RequiredArgsConstructor
@RestController

View File

@@ -4,6 +4,7 @@ import com.kamco.cd.kamcoback.config.api.ApiResponseDto;
import com.kamco.cd.kamcoback.zoo.dto.ZooDto;
import com.kamco.cd.kamcoback.zoo.dto.ZooDto.Detail;
import com.kamco.cd.kamcoback.zoo.service.ZooService;
import io.swagger.v3.oas.annotations.Hidden;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.Parameter;
import io.swagger.v3.oas.annotations.media.Content;
@@ -22,6 +23,7 @@ import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
@Hidden
@Tag(name = "Zoo", description = "동물원 관리 API")
@RequiredArgsConstructor
@RestController

View File

@@ -6,7 +6,7 @@ spring:
jpa:
show-sql: true
hibernate:
ddl-auto: validate
ddl-auto: validate # validates the schema only; never alters it automatically
properties:
hibernate:
default_batch_fetch_size: 100 # ✅ performance - prevents N+1 queries

View File

@@ -22,7 +22,7 @@ spring:
leak-detection-threshold: 60000
jpa:
hibernate:
ddl-auto: validate
ddl-auto: update # creates missing tables, updates existing ones; use with care outside local dev
properties:
hibernate:
jdbc:
@@ -57,3 +57,18 @@ management:
include:
- "health"
# GeoJSON file monitoring settings
geojson:
  monitor:
    watch-directory: ~/geojson/upload   # note: "~" is not expanded by Java file APIs; prefer an absolute path
    processed-directory: ~/geojson/processed
    error-directory: ~/geojson/error
    temp-directory: /tmp/geojson_extract
    cron-expression: "0/30 * * * * *" # runs every 30 seconds
    supported-extensions:
      - zip
      - tar
      - tar.gz
      - tgz
    max-file-size: 104857600 # 100MB
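The settings above drive a polling pass that filters uploads by extension and size before extraction. A minimal, framework-free sketch of that filtering step (class and method names such as `UploadScanner` and `scan` are hypothetical, not from the repo; the real system runs this on the configured cron):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Stream;

public class UploadScanner {
    // Mirrors supported-extensions and max-file-size from the YAML above.
    static final List<String> SUPPORTED = List.of(".zip", ".tar", ".tar.gz", ".tgz");
    static final long MAX_BYTES = 104_857_600L; // 100MB

    static boolean eligible(Path p) throws IOException {
        String name = p.getFileName().toString().toLowerCase();
        boolean supported = SUPPORTED.stream().anyMatch(name::endsWith);
        return supported && Files.size(p) <= MAX_BYTES;
    }

    // One polling pass over the watch directory: keep only regular files
    // with a supported extension under the size limit.
    static List<Path> scan(Path watchDir) throws IOException {
        try (Stream<Path> s = Files.list(watchDir)) {
            return s.filter(Files::isRegularFile)
                    .filter(p -> {
                        try { return eligible(p); } catch (IOException e) { return false; }
                    })
                    .toList();
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("upload");
        Files.writeString(dir.resolve("a.zip"), "x");
        Files.writeString(dir.resolve("b.txt"), "x");
        System.out.println(scan(dir).size()); // 1
    }
}
```

Checking `.tar.gz` via `endsWith` rather than splitting on the last dot matters here, since a plain extension split would misclassify it as `.gz`.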

View File

@@ -0,0 +1,32 @@
-- Verify the PostGIS extension and basic setup
-- Run this script against PostgreSQL to confirm PostGIS is installed
-- 1. Install the PostGIS extensions (no-op if already installed)
CREATE EXTENSION IF NOT EXISTS postgis;
CREATE EXTENSION IF NOT EXISTS postgis_topology;
-- 2. Check the currently installed extensions
SELECT name, default_version, installed_version
FROM pg_available_extensions
WHERE name LIKE '%postgis%';
-- 3. Check that the geometry type is available
SELECT typname
FROM pg_type
WHERE typname = 'geometry';
-- 4. Smoke-test creating a geometry column
DO $$
BEGIN
-- verify the geometry type with a temporary test table
DROP TABLE IF EXISTS temp_geom_test;
CREATE TEMP TABLE temp_geom_test (
id serial,
test_geom geometry(Point, 4326)
);
RAISE NOTICE 'PostGIS geometry type works correctly.';
EXCEPTION
WHEN OTHERS THEN
RAISE NOTICE 'Problem with the PostGIS setup: %', SQLERRM;
END
$$;

View File

@@ -0,0 +1,97 @@
-- Tables required by the GeoJSON monitoring system
-- Extracted from dump-kamco_cds-202511201730.sql
-- 1. Create the sequence
CREATE SEQUENCE IF NOT EXISTS public.tb_map_sheet_learn_data_data_uid
START WITH 1
INCREMENT BY 1
NO MINVALUE
NO MAXVALUE
CACHE 1;
-- 2. Create the tb_map_sheet_learn_data table
CREATE TABLE IF NOT EXISTS public.tb_map_sheet_learn_data (
data_uid bigint DEFAULT nextval('public.tb_map_sheet_learn_data_data_uid'::regclass) NOT NULL,
data_name character varying(128),
data_path character varying(255),
data_type character varying(128),
data_crs_type character varying(128),
data_crs_type_name character varying(255),
created_dttm timestamp without time zone DEFAULT now(),
created_uid bigint,
updated_dttm timestamp without time zone DEFAULT now(),
updated_uid bigint,
compare_yyyy integer,
data_yyyy integer,
data_json json,
data_state character varying(20),
data_state_dttm timestamp without time zone DEFAULT now(),
data_title character varying(255),
anal_map_sheet character varying(255),
anal_strt_dttm timestamp without time zone,
anal_end_dttm timestamp without time zone, -- "time without time zone" in the dump looks like a typo; timestamp matches anal_strt_dttm and the later migration
anal_sec bigint,
gukuin_used character varying(20),
gukuin_used_dttm timestamp without time zone,
anal_state character varying(20),
CONSTRAINT tb_map_sheet_learn_data_pkey PRIMARY KEY (data_uid)
);
-- 3. Create the sequence (for geometry)
CREATE SEQUENCE IF NOT EXISTS public.tb_map_sheet_learn_data_geom_geom_uid
START WITH 1
INCREMENT BY 1
NO MINVALUE
NO MAXVALUE
CACHE 1;
-- 4. Create the tb_map_sheet_learn_data_geom table
CREATE TABLE IF NOT EXISTS public.tb_map_sheet_learn_data_geom (
geo_uid bigint DEFAULT nextval('public.tb_map_sheet_learn_data_geom_geom_uid'::regclass) NOT NULL,
cd_prob double precision,
class_before_name character varying(100),
class_before_prob double precision,
class_after_name character varying(100),
class_after_prob double precision,
map_sheet_num bigint,
before_yyyy integer,
after_yyyy integer,
area double precision,
geom public.geometry,
geo_type character varying(100),
data_uid bigint,
created_dttm timestamp without time zone,
created_uid bigint,
updated_dttm timestamp without time zone,
updated_uid bigint,
CONSTRAINT tb_map_sheet_learn_data_geom_pkey PRIMARY KEY (geo_uid)
);
-- 5. Foreign-key constraint
ALTER TABLE ONLY public.tb_map_sheet_learn_data_geom
ADD CONSTRAINT fk_learn_data_geom_data_uid
FOREIGN KEY (data_uid) REFERENCES public.tb_map_sheet_learn_data(data_uid) ON DELETE CASCADE;
-- 6. Create indexes
CREATE INDEX IF NOT EXISTS idx_tb_map_sheet_learn_data_data_state ON public.tb_map_sheet_learn_data(data_state);
CREATE INDEX IF NOT EXISTS idx_tb_map_sheet_learn_data_anal_state ON public.tb_map_sheet_learn_data(anal_state);
CREATE INDEX IF NOT EXISTS idx_tb_map_sheet_learn_data_data_path ON public.tb_map_sheet_learn_data(data_path);
CREATE INDEX IF NOT EXISTS idx_tb_map_sheet_learn_data_geom_data_uid ON public.tb_map_sheet_learn_data_geom(data_uid);
CREATE INDEX IF NOT EXISTS idx_tb_map_sheet_learn_data_geom_geo_type ON public.tb_map_sheet_learn_data_geom(geo_type);
-- 7. Table comments
COMMENT ON TABLE public.tb_map_sheet_learn_data IS '학습데이터';
COMMENT ON COLUMN public.tb_map_sheet_learn_data.data_uid IS '식별키';
COMMENT ON COLUMN public.tb_map_sheet_learn_data.data_name IS '데이타명';
COMMENT ON COLUMN public.tb_map_sheet_learn_data.data_path IS '경로';
COMMENT ON COLUMN public.tb_map_sheet_learn_data.data_type IS '타입';
COMMENT ON COLUMN public.tb_map_sheet_learn_data.data_state IS '처리상태';
COMMENT ON COLUMN public.tb_map_sheet_learn_data.anal_state IS '분석상태';
COMMENT ON TABLE public.tb_map_sheet_learn_data_geom IS '학습데이터GEOM정보';
COMMENT ON COLUMN public.tb_map_sheet_learn_data_geom.geo_uid IS '식별키';
COMMENT ON COLUMN public.tb_map_sheet_learn_data_geom.geom IS 'geometry정보';
COMMENT ON COLUMN public.tb_map_sheet_learn_data_geom.data_uid IS '데이터식별키';
-- Completion message
SELECT 'GeoJSON monitoring system tables created' as message;

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,66 @@
-- Fix timestamp column type conversion issue
-- Run this if the Hibernate automatic schema update still fails
-- For tb_map_sheet_anal_data
ALTER TABLE tb_map_sheet_anal_data
ALTER COLUMN anal_end_dttm SET DATA TYPE TIMESTAMP WITH TIME ZONE
USING anal_end_dttm::TIMESTAMP WITH TIME ZONE;
ALTER TABLE tb_map_sheet_anal_data
ALTER COLUMN anal_strt_dttm SET DATA TYPE TIMESTAMP WITH TIME ZONE
USING anal_strt_dttm::TIMESTAMP WITH TIME ZONE;
ALTER TABLE tb_map_sheet_anal_data
ALTER COLUMN created_dttm SET DATA TYPE TIMESTAMP WITH TIME ZONE
USING created_dttm::TIMESTAMP WITH TIME ZONE;
ALTER TABLE tb_map_sheet_anal_data
ALTER COLUMN updated_dttm SET DATA TYPE TIMESTAMP WITH TIME ZONE
USING updated_dttm::TIMESTAMP WITH TIME ZONE;
ALTER TABLE tb_map_sheet_anal_data
ALTER COLUMN data_state_dttm SET DATA TYPE TIMESTAMP WITH TIME ZONE
USING data_state_dttm::TIMESTAMP WITH TIME ZONE;
-- For tb_map_sheet_learn_data
ALTER TABLE tb_map_sheet_learn_data
ALTER COLUMN anal_end_dttm SET DATA TYPE TIMESTAMP WITH TIME ZONE
USING anal_end_dttm::TIMESTAMP WITH TIME ZONE;
ALTER TABLE tb_map_sheet_learn_data
ALTER COLUMN anal_strt_dttm SET DATA TYPE TIMESTAMP WITH TIME ZONE
USING anal_strt_dttm::TIMESTAMP WITH TIME ZONE;
ALTER TABLE tb_map_sheet_learn_data
ALTER COLUMN created_dttm SET DATA TYPE TIMESTAMP WITH TIME ZONE
USING created_dttm::TIMESTAMP WITH TIME ZONE;
ALTER TABLE tb_map_sheet_learn_data
ALTER COLUMN updated_dttm SET DATA TYPE TIMESTAMP WITH TIME ZONE
USING updated_dttm::TIMESTAMP WITH TIME ZONE;
ALTER TABLE tb_map_sheet_learn_data
ALTER COLUMN data_state_dttm SET DATA TYPE TIMESTAMP WITH TIME ZONE
USING data_state_dttm::TIMESTAMP WITH TIME ZONE;
ALTER TABLE tb_map_sheet_learn_data
ALTER COLUMN gukuin_used_dttm SET DATA TYPE TIMESTAMP WITH TIME ZONE
USING gukuin_used_dttm::TIMESTAMP WITH TIME ZONE;
-- For tb_map_sheet_learn_data_geom
ALTER TABLE tb_map_sheet_learn_data_geom
ALTER COLUMN created_dttm SET DATA TYPE TIMESTAMP WITH TIME ZONE
USING created_dttm::TIMESTAMP WITH TIME ZONE;
ALTER TABLE tb_map_sheet_learn_data_geom
ALTER COLUMN updated_dttm SET DATA TYPE TIMESTAMP WITH TIME ZONE
USING updated_dttm::TIMESTAMP WITH TIME ZONE;
-- For tb_map_sheet_anal_data_geom
ALTER TABLE tb_map_sheet_anal_data_geom
ALTER COLUMN created_dttm SET DATA TYPE TIMESTAMP WITH TIME ZONE
USING created_dttm::TIMESTAMP WITH TIME ZONE;
ALTER TABLE tb_map_sheet_anal_data_geom
ALTER COLUMN updated_dttm SET DATA TYPE TIMESTAMP WITH TIME ZONE
USING updated_dttm::TIMESTAMP WITH TIME ZONE;
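The `USING ...::TIMESTAMP WITH TIME ZONE` casts above interpret each existing naive value in the session `TimeZone` when converting the column. The same interpretation can be reproduced on the Java side with plain `java.time` (this is an illustrative sketch, not repo code; `Asia/Seoul` is assumed only for the demo):

```java
import java.time.LocalDateTime;
import java.time.OffsetDateTime;
import java.time.ZoneId;

public class TimestampMigrationDemo {
    public static void main(String[] args) {
        // A naive value as stored in a "timestamp without time zone" column.
        LocalDateTime naive = LocalDateTime.of(2025, 11, 20, 17, 30);
        // The USING cast interprets it in the session TimeZone; here we
        // assume Asia/Seoul (+09:00) to get the timezone-aware value.
        OffsetDateTime aware = naive.atZone(ZoneId.of("Asia/Seoul")).toOffsetDateTime();
        System.out.println(aware); // 2025-11-20T17:30+09:00
    }
}
```

After the migration, JPA entities can map these columns to `OffsetDateTime` or `ZonedDateTime` instead of `LocalDateTime`, so the offset survives the round trip.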