255 Commits

Author SHA1 Message Date
3d2a4049d3 Fix logging for system usage monitoring feature 2026-03-19 17:38:30 +09:00
0cbaf53e86 Add system usage monitoring feature 2026-03-19 17:08:37 +09:00
80fd2bda3e Test system usage monitoring feature 2026-03-19 16:52:26 +09:00
fb647e5991 Test system usage monitoring feature 2026-03-19 16:36:39 +09:00
87575a62f7 Test system usage monitoring feature 2026-03-19 16:27:05 +09:00
246c11f8b0 Test system usage monitoring feature 2026-03-19 16:12:27 +09:00
2c1f9bdf5c Test system usage monitoring feature 2026-03-19 15:46:23 +09:00
5799f7dfb2 Test system usage monitoring feature 2026-03-19 15:41:18 +09:00
9f428e9572 Test system usage monitoring feature 2026-03-19 15:37:26 +09:00
904968a1be Test system usage monitoring feature 2026-03-19 15:25:20 +09:00
4b44be6a29 Test system usage monitoring feature 2026-03-19 15:25:11 +09:00
f9f0662f8e Test system usage monitoring feature 2026-03-19 15:24:03 +09:00
7416327cc3 Fix AI training run command 2026-03-11 10:18:33 +09:00
da31bd9d99 Fix AI training run command 2026-03-10 23:09:39 +09:00
f3e5347335 Fix AI training run command 2026-03-10 22:48:10 +09:00
7d5581f60c Add dataset deletion flag 2026-03-10 19:09:00 +09:00
b4428217ea hello 2026-03-10 18:01:03 +09:00
8a63fdacdd conflict 2026-03-10 17:27:14 +09:00
cb2e42143a conflict 2026-03-10 17:23:27 +09:00
997e85c0cc Production environment handling 2026-03-10 17:22:27 +09:00
731ca59475 Change to symbolic link 2026-03-10 17:16:54 +09:00
fe6d37456d Fall back to symbolic link when hard link fails 2026-03-10 16:56:35 +09:00
6c98a48a5d Fix error 2026-03-10 16:34:02 +09:00
81b69caa99 Production environment handling 2026-03-10 16:00:23 +09:00
7fce070686 spotless 2026-03-10 15:25:32 +09:00
8d83505ee7 Production environment handling 2026-03-10 15:00:31 +09:00
0ff38b24d4 Merge branch 'feat/training_260202' into develop
# Conflicts:
#	src/main/java/com/kamco/cd/training/train/service/JobRecoveryOnStartupService.java
2026-03-10 14:22:14 +09:00
265813e6f7 Merge branch 'feat/training_260202' of https://kamco.git.gs.dabeeo.com/MVPTeam/kamco-train-api into feat/training_260202 2026-03-10 14:19:40 +09:00
8190a6e9c8 unzip 2026-03-10 14:19:22 +09:00
5c082f7c9d Production environment handling 2026-03-10 08:39:42 +09:00
43d0e55cb7 Merge branch 'develop' of https://kamco.git.gs.dabeeo.com/MVPTeam/kamco-train-api into develop 2026-03-04 06:12:27 +09:00
df3bedfbda prod 2026-03-04 05:35:21 +09:00
c26a48d07d prod 2026-03-04 05:25:04 +09:00
1c7c213977 Merge pull request 'Fix training failure handling' (#155) from feat/training_260303 into develop
Reviewed-on: #155
2026-03-04 01:43:56 +09:00
6583a45abd Fix training failure handling 2026-03-04 01:43:34 +09:00
b15f77d894 Merge pull request 'Fix training failure handling' (#154) from feat/training_260303 into develop
Reviewed-on: #154
2026-03-04 01:34:40 +09:00
3bcd99f0db Fix training failure handling 2026-03-04 01:34:16 +09:00
5513cd60a0 Merge pull request 'Fix recovery' (#153) from feat/training_260303 into develop
Reviewed-on: #153
2026-03-04 01:18:19 +09:00
7b35d26a13 Fix recovery 2026-03-04 01:18:04 +09:00
d92ff88ef7 Merge pull request 'Fix recovery' (#152) from feat/training_260303 into develop
Reviewed-on: #152
2026-03-04 01:15:04 +09:00
bfcddd0327 Fix recovery 2026-03-04 01:14:35 +09:00
e193330f99 Merge pull request 'Fix recovery' (#151) from feat/training_260303 into develop
Reviewed-on: #151
2026-03-04 01:00:45 +09:00
6c0184597d Fix recovery 2026-03-04 01:00:26 +09:00
1e8a8d8dad Merge pull request 'Add recovery fix test logs' (#150) from feat/training_260303 into develop
Reviewed-on: #150
2026-03-04 00:44:03 +09:00
eb7680b952 Add recovery fix test logs 2026-03-04 00:43:20 +09:00
2c357ebf27 Merge pull request 'Test recovery fix' (#149) from feat/training_260303 into develop
Reviewed-on: #149
2026-03-04 00:30:36 +09:00
0daaa1c8cb Test recovery fix 2026-03-04 00:30:17 +09:00
96383595df Merge pull request 'Fix recovery' (#148) from feat/training_260303 into develop
Reviewed-on: #148
2026-03-04 00:22:09 +09:00
c5e03f7ca8 Fix recovery 2026-03-04 00:21:51 +09:00
df2acc4dfb Merge pull request 'Fix recovery' (#147) from feat/training_260303 into develop
Reviewed-on: #147
2026-03-04 00:06:21 +09:00
693de354d2 Fix recovery 2026-03-04 00:06:04 +09:00
62cdc5015e Merge pull request 'Test training recovery' (#146) from feat/training_260303 into develop
Reviewed-on: #146
2026-03-03 23:45:46 +09:00
5c11263d55 Log 2026-03-03 23:45:10 +09:00
dd5031ae3a Merge pull request 'Add log.info' (#145) from feat/training_260303 into develop
Reviewed-on: #145
2026-03-03 23:40:44 +09:00
c369e01ada Add log.info 2026-03-03 23:40:18 +09:00
54991622f1 Add log.info 2026-03-03 23:31:25 +09:00
e9f8bb37fa Apply spotless 2026-03-03 23:06:45 +09:00
17d69486ec Apply spotless 2026-03-03 23:04:55 +09:00
f3c822587f Apply spotless 2026-03-03 23:02:53 +09:00
948f1061da Merge pull request 'Apply spotless' (#144) from feat/training_260202 into develop
Reviewed-on: #144
2026-03-03 23:01:39 +09:00
335f0dbb9b Apply spotless 2026-03-03 23:01:22 +09:00
42438b3cd5 Merge pull request 'Fix hard link' (#143) from feat/training_260202 into develop
Reviewed-on: #143
2026-03-03 22:51:50 +09:00
69eaba1a83 Fix hard link 2026-03-03 22:51:10 +09:00
524ae200b0 prod 2026-03-03 08:44:31 +09:00
4f763d3c2e prod 2026-03-03 08:44:01 +09:00
9ebf525387 prod 2026-03-03 08:42:16 +09:00
4ab672a96e Duplicate warning 2026-02-28 01:56:48 +09:00
7d2a367e3f Merge pull request 'Remove recovery' (#142) from feat/training_260202 into develop
Reviewed-on: #142
2026-02-28 01:24:49 +09:00
365ad81cad Remove recovery 2026-02-28 01:24:34 +09:00
67a67749c3 Merge pull request 'Add recovery' (#141) from feat/training_260202 into develop
Reviewed-on: #141
2026-02-28 01:02:10 +09:00
9dfa54fbf9 Add recovery 2026-02-28 01:01:38 +09:00
251307b5c9 Merge pull request 'Fix hard link' (#140) from feat/training_260202 into develop
Reviewed-on: #140
2026-02-27 23:31:24 +09:00
12f6bb7154 Fix hard link 2026-02-27 23:31:04 +09:00
8423a03d31 Merge pull request 'Add hard link logging' (#139) from feat/training_260202 into develop
Reviewed-on: #139
2026-02-27 23:12:17 +09:00
aa3af4e9d0 Add hard link logging 2026-02-27 23:12:00 +09:00
d6cdf6b690 Merge pull request 'Add hard link logging' (#138) from feat/training_260202 into develop
Reviewed-on: #138
2026-02-27 22:51:51 +09:00
7ca37bf1e4 Add hard link logging 2026-02-27 22:51:27 +09:00
9cfa299e58 Merge pull request 'feat/training_260202' (#137) from feat/training_260202 into develop
Reviewed-on: #137
2026-02-24 16:55:35 +09:00
901dde066d Fix hyperparameter detail lookup 2026-02-24 16:54:58 +09:00
cb0a38274a Change val_interval default to 1 2026-02-24 16:24:16 +09:00
b8194df9ae Add training failure check feature 2026-02-24 15:43:35 +09:00
a137e71420 Merge pull request 'feat/training_260202' (#136) from feat/training_260202 into develop
Reviewed-on: #136
2026-02-24 15:11:12 +09:00
7c5f07683e Add training failure check feature 2026-02-24 15:10:48 +09:00
159fb281d4 Update last-used timestamp 2026-02-23 15:40:24 +09:00
97192ff811 Update last-used timestamp 2026-02-23 15:39:25 +09:00
4f3fb675be Add hyperparameter usage count feature and fix lookup 2026-02-23 15:37:28 +09:00
e6caea05b3 Add hyperparameter usage count feature and fix lookup 2026-02-23 15:19:40 +09:00
f08f80622f Merge pull request 'feat/training_260202' (#135) from feat/training_260202 into develop
Reviewed-on: #135
2026-02-23 14:31:11 +09:00
fd63824edc Remove unused hyperparameter columns, add usage count column 2026-02-23 14:30:29 +09:00
8a44df26b8 Remove deleted-flag condition from hyperparameter detail lookup 2026-02-23 14:17:34 +09:00
cb97c5e59e Fix hyperparameter DTO comments 2026-02-23 14:13:25 +09:00
8f75b16dc6 Add comments to training execution 2026-02-23 12:30:54 +09:00
e565fd7a34 Merge pull request 'Fix transfer learning detail' (#134) from feat/training_260202 into develop
Reviewed-on: #134
2026-02-20 18:34:58 +09:00
c2978e41c2 Fix transfer learning detail 2026-02-20 18:34:32 +09:00
8c45b39dcc Merge pull request 'feat/training_260202' (#133) from feat/training_260202 into develop
Reviewed-on: #133
2026-02-20 18:22:45 +09:00
07429dbe8e Fix transfer learning detail 2026-02-20 18:22:19 +09:00
83859bb9fe Transfer learning detail - add before dataset 2026-02-20 16:05:29 +09:00
b119f333ac Merge pull request 'Fix best epoch file selection' (#132) from feat/training_260202 into develop
Reviewed-on: #132
2026-02-20 15:41:58 +09:00
564a99448c Fix best epoch file selection 2026-02-20 15:41:34 +09:00
bbe04ee458 Merge pull request 'Fix best epoch file selection' (#131) from feat/training_260202 into develop
Reviewed-on: #131
2026-02-20 15:31:50 +09:00
38ae6e5575 Fix best epoch file selection 2026-02-20 15:31:33 +09:00
fab3c83a69 Merge pull request 'Fix best epoch file selection' (#130) from feat/training_260202 into develop
Reviewed-on: #130
2026-02-20 15:15:39 +09:00
40fe98ae0c Fix best epoch file selection 2026-02-20 15:15:12 +09:00
fb87a0f32f Merge pull request 'Remove duplicate fix' (#129) from feat/training_260202 into develop
Reviewed-on: #129
2026-02-20 14:31:15 +09:00
255ff10a56 Merge remote-tracking branch 'origin/feat/training_260202' into feat/training_260202 2026-02-20 14:30:42 +09:00
f674f73330 Remove duplicate fix 2026-02-20 14:30:34 +09:00
63794ec4ec Merge pull request 'feat/training_260202' (#128) from feat/training_260202 into develop
Reviewed-on: #128
2026-02-20 14:25:13 +09:00
db2bc32e7d Fix test metrics insert logic 2026-02-20 14:24:23 +09:00
37786a1e44 Add log for selected test epoch 2026-02-20 14:13:49 +09:00
901ea83fb7 Add log check for test's selected epoch 2026-02-20 14:13:22 +09:00
bf6dc9740f Merge pull request 'Fix tmp hard link' (#127) from feat/training_260202 into develop
Reviewed-on: #127
2026-02-20 13:37:04 +09:00
832e1b5681 Fix tmp hard link 2026-02-20 13:36:48 +09:00
a23bc8dd67 Merge pull request 'Fix tmp hard link' (#126) from feat/training_260202 into develop
Reviewed-on: #126
2026-02-20 12:30:23 +09:00
4f16355cda Fix tmp hard link 2026-02-20 12:29:57 +09:00
13023a06cc Merge pull request 'feat/training_260202' (#125) from feat/training_260202 into develop
Reviewed-on: #125
2026-02-20 12:23:37 +09:00
df46a8f79f Merge remote-tracking branch 'origin/feat/training_260202' into feat/training_260202 2026-02-20 12:21:50 +09:00
fcd48831c5 Fix tmp hard link 2026-02-20 12:21:41 +09:00
28b50bd949 Merge pull request 'Fix test json' (#124) from feat/training_260202 into develop
Reviewed-on: #124
2026-02-20 12:20:36 +09:00
62c9d73b94 Fix test json 2026-02-20 12:20:15 +09:00
78ab928459 Merge pull request 'Add step2 to ing-cnt logic, transactional' (#123) from feat/training_260202 into develop
Reviewed-on: #123
2026-02-20 12:05:40 +09:00
68c0e634c5 Add step2 to ing-cnt logic, transactional 2026-02-20 12:05:20 +09:00
ae3601cff5 Merge pull request 'Fix security logic for password change' (#122) from feat/training_260202 into develop
Reviewed-on: #122
2026-02-20 11:37:08 +09:00
ad421e3c74 Fix security logic for password change 2026-02-20 11:36:21 +09:00
5f62f4a209 Merge pull request 'Store per-run data when running test' (#121) from feat/training_260202 into develop
Reviewed-on: #121
2026-02-19 18:19:40 +09:00
46db1512a6 Store per-run data when running test 2026-02-19 18:18:12 +09:00
29bf155b4f Merge pull request 'Add LogErrorLevel -> CodeExpose' (#120) from feat/training_260202 into develop
Reviewed-on: #120
2026-02-19 17:35:45 +09:00
2034a8fcb2 Add LogErrorLevel -> CodeExpose 2026-02-19 17:35:15 +09:00
da03f8b749 Merge pull request 'Model training management > add per-model progress API' (#119) from feat/training_260202 into develop
Reviewed-on: #119
2026-02-19 17:17:46 +09:00
bf212842d8 Model training management > add per-model progress API 2026-02-19 17:17:11 +09:00
6a2deff93b Merge pull request 'Model training management > update list API for added memo and author fields' (#118) from feat/training_260202 into develop
Reviewed-on: #118
2026-02-19 15:35:03 +09:00
d2ca94ea55 Model training management > update list API for added memo and author fields 2026-02-19 15:34:18 +09:00
b0a99afcd3 Merge pull request 'Add start/end timestamps and status logic for model training step-2 packaging' (#117) from feat/training_260202 into develop
Reviewed-on: #117
2026-02-19 14:44:12 +09:00
5ddf6dfeeb Add start/end timestamps and status logic for model training step-2 packaging 2026-02-19 14:43:14 +09:00
eedf72d7aa Merge pull request 'Change common code prefix to common-code' (#116) from feat/training_260202 into develop
Reviewed-on: #116
2026-02-19 11:39:23 +09:00
5e13c0b396 Change common code prefix to common-code 2026-02-19 11:38:56 +09:00
25e9941464 Merge pull request 'Commit log management logic' (#115) from feat/training_260202 into develop
Reviewed-on: #115
2026-02-19 11:16:40 +09:00
435f60dcac Commit log management logic 2026-02-19 11:13:40 +09:00
a0da0392cf Merge pull request 'On unzip, delete any existing folder of the same name before re-uploading' (#114) from feat/training_260202 into develop
Reviewed-on: #114
2026-02-18 16:37:13 +09:00
5f5eabca19 On unzip, delete any existing folder of the same name before re-uploading 2026-02-18 16:36:46 +09:00
a3ebee12b5 Merge pull request 'feat/training_260202' (#113) from feat/training_260202 into develop
Reviewed-on: #113
2026-02-18 16:29:55 +09:00
413631840f Exclude training data download from security 2026-02-18 16:29:28 +09:00
c7f63d1ad1 Log whether the unzipped folder count matches + return exception if it does not 2026-02-18 16:22:32 +09:00
c5b14ca09d Merge pull request 'Duplicate check by uid on upload -> exclude deleted rows' (#112) from feat/training_260202 into develop
Reviewed-on: #112
2026-02-18 15:40:38 +09:00
7529d23488 Duplicate check by uid on upload -> exclude deleted rows 2026-02-18 15:40:14 +09:00
44b3b857b1 Merge pull request 'feat/training_260202' (#111) from feat/training_260202 into develop
Reviewed-on: #111
2026-02-18 15:29:25 +09:00
cb3e51d712 Handle exception message on upload, run with epoch 10 or higher 2026-02-18 15:28:29 +09:00
99a4597b5f Remove the +1 applied to train results 2026-02-18 14:50:53 +09:00
63124455fd Merge pull request 'Update start time when running step 1' (#110) from feat/training_260202 into develop
Reviewed-on: #110
2026-02-18 13:06:33 +09:00
d9da0d4610 Update start time when running step 1 2026-02-18 13:05:59 +09:00
e75ea8d8a5 Merge pull request 'Fix hyperparameters' (#109) from feat/training_260202 into develop
Reviewed-on: #109
2026-02-13 15:00:50 +09:00
22c481556c Fix hyperparameters 2026-02-13 15:00:30 +09:00
31ac4209c3 Merge pull request 'Fix hyperparameters' (#108) from feat/training_260202 into develop
Reviewed-on: #108
2026-02-13 14:47:19 +09:00
0798b352c7 Fix hyperparameters 2026-02-13 14:46:59 +09:00
df09935789 Merge pull request 'Fix hyperparameters' (#107) from feat/training_260202 into develop
Reviewed-on: #107
2026-02-13 14:42:54 +09:00
5b074bdb81 Fix hyperparameters 2026-02-13 14:42:39 +09:00
bb15b1b0f2 Merge pull request 'Fix hyperparameters' (#106) from feat/training_260202 into develop
Reviewed-on: #106
2026-02-13 14:38:01 +09:00
28919345c2 Fix hyperparameters 2026-02-13 14:37:44 +09:00
f4d491ed94 Merge pull request 'Fix hyperparameters' (#105) from feat/training_260202 into develop
Reviewed-on: #105
2026-02-13 14:31:13 +09:00
aa0552aaa7 Fix hyperparameters 2026-02-13 14:30:45 +09:00
96cb7d2f23 Merge pull request 'Fix available capacity API' (#104) from feat/training_260202 into develop
Reviewed-on: #104
2026-02-13 14:19:26 +09:00
5d0aca14a6 Fix available capacity API 2026-02-13 14:19:09 +09:00
cc6305b0df Merge pull request 'Fix resume' (#103) from feat/training_260202 into develop
Reviewed-on: #103
2026-02-13 14:08:51 +09:00
af8d59ddfa Fix resume 2026-02-13 14:08:35 +09:00
3916b13876 Merge pull request 'Fix resume' (#102) from feat/training_260202 into develop
Reviewed-on: #102
2026-02-13 14:04:52 +09:00
4f24e09c57 Fix resume 2026-02-13 14:04:33 +09:00
ee4a06df30 Merge pull request 'Fix resume' (#101) from feat/training_260202 into develop
Reviewed-on: #101
2026-02-13 13:57:48 +09:00
4da477706f Fix resume 2026-02-13 13:57:01 +09:00
bb5ff7c3cd Merge pull request 'Fix resume logging' (#100) from feat/training_260202 into develop
Reviewed-on: #100
2026-02-13 13:27:08 +09:00
a070566048 Fix resume logging 2026-02-13 13:26:54 +09:00
312a96dda1 Merge pull request 'Transaction handling, update temp folder uid' (#99) from feat/training_260202 into develop
Reviewed-on: #99
2026-02-13 13:24:10 +09:00
a5b3ae613f Transaction handling, update temp folder uid 2026-02-13 13:23:53 +09:00
e38231e06d Merge pull request 'Transaction handling, update temp folder uid' (#98) from feat/training_260202 into develop
Reviewed-on: #98
2026-02-13 13:17:41 +09:00
979af088be Transaction handling, update temp folder uid 2026-02-13 13:17:27 +09:00
bf6e45d706 Merge pull request 'feat/training_260202' (#97) from feat/training_260202 into develop
Reviewed-on: #97
2026-02-13 12:53:32 +09:00
e5a1cab36b Merge branch 'feat/training_260202' of https://kamco.git.gs.dabeeo.com/MVPTeam/kamco-train-api into feat/training_260202 2026-02-13 12:53:04 +09:00
bb67996742 text 2026-02-13 12:53:00 +09:00
1981d6d1ce Merge remote-tracking branch 'origin/feat/training_260202' into feat/training_260202 2026-02-13 12:53:00 +09:00
47f4ffd4db Transaction handling, update temp folder uid 2026-02-13 12:52:11 +09:00
7fa8921a25 Merge pull request 'Try adding flush' (#96) from feat/training_260202 into develop
Reviewed-on: #96
2026-02-13 12:47:11 +09:00
195856b846 Try adding flush 2026-02-13 12:46:54 +09:00
7c940351d9 Merge pull request 'Transaction handling, update temp folder uid' (#95) from feat/training_260202 into develop
Reviewed-on: #95
2026-02-13 12:39:21 +09:00
124da48e51 Transaction handling, update temp folder uid 2026-02-13 12:38:55 +09:00
6b834da912 Merge pull request 'feat/training_260202' (#94) from feat/training_260202 into develop
Reviewed-on: #94
2026-02-13 12:25:26 +09:00
02724e9508 Restore commented-out code 2026-02-13 12:24:50 +09:00
7ed91ccab9 Merge branch 'feat/training_260202' of https://kamco.git.gs.dabeeo.com/MVPTeam/kamco-train-api into feat/training_260202 2026-02-13 12:23:12 +09:00
a7c13b985d Remove responsePath setting 2026-02-13 12:23:01 +09:00
25aaa97d65 Merge pull request 'Transaction handling, update temp folder uid' (#93) from feat/training_260202 into develop
Reviewed-on: #93
2026-02-13 12:21:11 +09:00
352a28b87f Transaction handling, update temp folder uid 2026-02-13 12:20:48 +09:00
da9d47ae4a Merge pull request 'Comment out' (#92) from feat/training_260202 into develop
Reviewed-on: #92
2026-02-13 12:10:46 +09:00
bf8515163c Comment out 2026-02-13 12:10:08 +09:00
7d6a77bf2a Merge pull request 'Transaction handling, update temp folder uid' (#91) from feat/training_260202 into develop
Reviewed-on: #91
2026-02-13 12:01:01 +09:00
26828d0968 add log 2026-02-13 12:00:54 +09:00
2691f6ce16 Transaction handling, update temp folder uid 2026-02-13 12:00:42 +09:00
e2dbae15c0 Merge pull request 'Transaction handling, update temp folder uid' (#90) from feat/training_260202 into develop
Reviewed-on: #90
2026-02-13 11:58:55 +09:00
b246034632 Merge pull request 'Transaction handling, update temp folder uid' (#89) from feat/training_260202 into develop
Reviewed-on: #89
2026-02-13 11:58:25 +09:00
7e5aa5e713 Transaction handling, update temp folder uid 2026-02-13 11:57:48 +09:00
060a815e1c Transaction handling, update temp folder uid 2026-02-13 11:55:35 +09:00
687ea82d78 Merge pull request 'feat/training_260202' (#88) from feat/training_260202 into develop
Reviewed-on: #88
2026-02-13 10:51:07 +09:00
1eb4d04779 Merge branch 'feat/training_260202' of https://kamco.git.gs.dabeeo.com/MVPTeam/kamco-train-api into feat/training_260202 2026-02-13 10:50:35 +09:00
f30c0c6d45 Add Access-Control-Expose-Headers on download 2026-02-13 10:50:28 +09:00
12994aab60 Add file count feature 2026-02-13 10:44:49 +09:00
4ac0f19908 Merge pull request 'Add file count feature' (#87) from feat/training_260202 into develop
Reviewed-on: #87
2026-02-13 10:38:48 +09:00
11d3afe295 Add file count feature 2026-02-13 10:38:24 +09:00
9e5e7595eb Merge pull request 'Update best epoch when running training step1' (#86) from feat/training_260202 into develop
Reviewed-on: #86
2026-02-13 10:18:26 +09:00
1e62a8b097 Update best epoch when running training step1 2026-02-13 10:15:04 +09:00
9cd9274e99 Merge pull request 'Show training data list file sizes in MB' (#85) from feat/training_260202 into develop
Reviewed-on: #85
2026-02-13 09:48:08 +09:00
26a4623aa8 Show training data list file sizes in MB 2026-02-13 09:42:36 +09:00
5d82f3ecfe Merge pull request 'Fix tmp file link' (#84) from feat/training_260202 into develop
Reviewed-on: #84
2026-02-13 09:11:05 +09:00
ce6e4f5aea Fix tmp file link 2026-02-13 09:10:45 +09:00
2ce249ab33 Merge pull request 'Fix tmp file link' (#83) from feat/training_260202 into develop
Reviewed-on: #83
2026-02-13 08:44:37 +09:00
c2215836c0 Fix tmp file link 2026-02-13 08:44:17 +09:00
e34bf68de0 Merge pull request 'Fix tmp file link' (#82) from feat/training_260202 into develop
Reviewed-on: #82
2026-02-13 08:33:58 +09:00
8c19c996f7 Fix tmp file link 2026-02-13 08:33:36 +09:00
862bda0cb9 Merge pull request 'Fix resume' (#81) from feat/training_260202 into develop
Reviewed-on: #81
2026-02-12 23:02:12 +09:00
b5ce3ab1fb Fix resume 2026-02-12 23:01:56 +09:00
90f7b17d07 Merge pull request 'Add training data download file info API' (#80) from feat/training_260202 into develop
Reviewed-on: #80
2026-02-12 22:58:47 +09:00
e1ceb769dd Add training data download file info API 2026-02-12 22:47:13 +09:00
2128baa46a Merge pull request 'feat/training_260202' (#79) from feat/training_260202 into develop
Reviewed-on: #79
2026-02-12 22:26:26 +09:00
4219b88fb3 Add training data download API 2026-02-12 22:25:55 +09:00
4f94c99b64 Fix resume 2026-02-12 22:09:54 +09:00
875c30f467 Merge pull request 'feat/training_260202' (#78) from feat/training_260202 into develop
Reviewed-on: #78
2026-02-12 21:52:16 +09:00
d42e1afbd4 Manually invoke scheduler API 2026-02-12 21:51:53 +09:00
b3b8016673 Change how csv results are fetched 2026-02-12 21:45:38 +09:00
2b29cd1ac6 Merge pull request 'Change parameters' (#77) from feat/training_260202 into develop
Reviewed-on: #77
2026-02-12 21:30:18 +09:00
79e8259f28 Change parameters 2026-02-12 21:30:03 +09:00
9206fff5d0 Merge pull request 'feat/training_260202' (#76) from feat/training_260202 into develop
Reviewed-on: #76
2026-02-12 21:17:33 +09:00
032c82c2f0 Insert file path 2026-02-12 21:17:10 +09:00
6204a6e5fa Merge remote-tracking branch 'origin/develop' into feat/training_260202
# Conflicts:
#	src/main/resources/application-prod.yml
2026-02-12 21:15:21 +09:00
4d9c9a86b4 Commit packaging zip file creation 2026-02-12 21:09:40 +09:00
83204abfe9 Merge pull request 'On success, save csv file to table' (#74) from feat/training_260202 into develop
Reviewed-on: #74
2026-02-12 21:01:48 +09:00
5b682c1386 On success, save csv file to table 2026-02-12 21:01:24 +09:00
452494d44d Merge pull request 'Fix test execution path' (#73) from feat/training_260202 into develop
Reviewed-on: #73
2026-02-12 20:49:30 +09:00
8ada26448b Fix test execution path 2026-02-12 20:49:14 +09:00
e442f105bc Merge pull request 'Change Docker name' (#72) from feat/training_260202 into develop
Reviewed-on: #72
2026-02-12 20:37:00 +09:00
5e0a771848 Change Docker name 2026-02-12 20:36:38 +09:00
b4c2685059 Merge pull request 'Add Docker config' (#71) from feat/training_260202 into develop
Reviewed-on: #71
2026-02-12 20:25:16 +09:00
e238f3ca88 Add Docker config 2026-02-12 20:24:57 +09:00
97b06eb3b3 Merge pull request 'Fix temp file creation path' (#70) from feat/training_260202 into develop
Reviewed-on: #70
2026-02-12 20:03:32 +09:00
ad32ca18ca Fix temp file creation path 2026-02-12 20:03:03 +09:00
98a1283ebe Merge pull request 'Fix temp file creation path' (#69) from feat/training_260202 into develop
Reviewed-on: #69
2026-02-12 19:36:06 +09:00
a10fccaae3 Fix temp file creation path 2026-02-12 19:35:47 +09:00
c3c9191d9d Merge pull request 'hyperparam_with_modeltype' (#68) from feat/dean/hyperparam_with_modelType-bug into develop
Reviewed-on: #68
2026-02-12 19:30:29 +09:00
9fd5a15a72 hyperparam_with_modeltype 2026-02-12 19:30:08 +09:00
12f9de7367 hyperparam_with_modeltype 2026-02-12 19:16:24 +09:00
5455da1e96 hyperparam_with_modeltype 2026-02-12 19:16:13 +09:00
9e803661cd Merge pull request 'feat/training_260202' (#67) from feat/training_260202 into develop
Reviewed-on: #67
2026-02-12 19:14:39 +09:00
b0cf9e77ec Merge branch 'develop' of https://kamco.git.gs.dabeeo.com/MVPTeam/kamco-train-api into develop 2026-02-12 19:14:10 +09:00
c92426aefc Merge branch 'feat/training_260202' of https://kamco.git.gs.dabeeo.com/MVPTeam/kamco-train-api into feat/training_260202 2026-02-12 19:14:09 +09:00
d5b2b8ecec hyperparam_with_modeltype 2026-02-12 19:14:01 +09:00
6185a18a7c Change status value in model list search condition 2026-02-12 19:13:56 +09:00
49d3e37458 Merge pull request 'Fix temp file creation path' (#66) from feat/training_260202 into develop
Reviewed-on: #66
2026-02-12 19:12:37 +09:00
1fb10830b9 Fix temp file creation path 2026-02-12 19:11:51 +09:00
d7766edd24 Merge pull request 'Fix return format' (#65) from feat/training_260202 into develop
Reviewed-on: #65
2026-02-12 18:59:37 +09:00
0bc4453c9c hyperparam_with_modeltype 2026-02-12 18:56:32 +09:00
ae0d30e5da Fix return format 2026-02-12 18:55:42 +09:00
100 changed files with 6340 additions and 861 deletions

.gitignore vendored (+6 lines)

@@ -72,3 +72,9 @@ docker-compose.override.yml
*.swo
*~
!/CLAUDE.md
### SSL Certificates ###
nginx/ssl/
*.crt
*.key
*.pem

DEPLOY.md (new file, +415 lines)

@@ -0,0 +1,415 @@
# KAMCO Training API Deployment Guide (RedHat 9.6)
This document describes how to deploy the KAMCO Training API over HTTPS in a RedHat 9.6 environment.
## Quick Start
**Access URLs**:
- `https://api.train-kamco.com`
- `https://train-kamco.com`
## Prerequisites
- [x] Docker & Docker Compose installed
- [x] Git installed
- [x] sudo privileges
- [x] Ports 80 and 443 available
## Step 1: Configure /etc/hosts
```bash
# Add the domains as root
echo "127.0.0.1 api.train-kamco.com train-kamco.com" | sudo tee -a /etc/hosts
# Verify
cat /etc/hosts | grep train-kamco
```
**Expected result**:
```
127.0.0.1 api.train-kamco.com train-kamco.com
```
## Step 2: Configure the Firewall (if needed)
```bash
# Check firewall status
sudo firewall-cmd --state
# Open HTTP/HTTPS ports
sudo firewall-cmd --permanent --add-port=80/tcp
sudo firewall-cmd --permanent --add-port=443/tcp
# Reload the firewall
sudo firewall-cmd --reload
# Verify
sudo firewall-cmd --list-ports
```
**Expected result**: `80/tcp 443/tcp`
## Step 3: Move to the Project Directory
```bash
cd /path/to/kamco-train-api
# Check current location
pwd
# Expected: /home/username/kamco-train-api
```
## Step 4: Verify the File Layout
Before deploying, confirm that all required files are present:
```bash
# Check the SSL certificates
ls -la nginx/ssl/
# Expected result:
# train-kamco.com.crt (certificate)
# train-kamco.com.key (private key)
# openssl.cnf (config file)
```
```bash
# Check the Docker Compose files
ls -la docker-compose-prod.yml nginx/nginx.conf
# Expected: both files exist
```
## Step 5: Create the Docker Network (first time only)
```bash
# Check whether the kamco-cds network exists
docker network ls | grep kamco-cds
# Create it if missing
docker network create kamco-cds
```
## Step 6: Deploy with Docker Compose
```bash
# Stop existing containers (if any)
docker-compose -f docker-compose-prod.yml down
# Build and start the new images
docker-compose -f docker-compose-prod.yml up -d --build
# Check container status
docker-compose -f docker-compose-prod.yml ps
```
**Expected result**:
```
NAME STATUS
kamco-cd-nginx Up (healthy)
kamco-cd-training-api Up (healthy)
```
## Step 7: Verify the Deployment
### Check Container Logs
```bash
# Nginx logs
docker logs kamco-cd-nginx --tail 50
# API logs
docker logs kamco-cd-training-api --tail 50
# Live logs (Ctrl+C to stop)
docker-compose -f docker-compose-prod.yml logs -f
```
### Test the HTTP → HTTPS Redirect
```bash
# Confirm HTTP requests redirect to HTTPS
curl -I http://api.train-kamco.com
curl -I http://train-kamco.com
# Expected result: 301 Moved Permanently
# Location: https://api.train-kamco.com/ or https://train-kamco.com/
```
### HTTPS Health Check
```bash
# -k flag: ignore the self-signed certificate warning
curl -k https://api.train-kamco.com/monitor/health
curl -k https://train-kamco.com/monitor/health
# Expected result: {"status":"UP","components":{...}}
```
### Browser Test
Open the following URLs in a browser:
- `https://api.train-kamco.com/monitor/health`
- `https://train-kamco.com/monitor/health`
**Self-signed certificate warning**:
- If a "Not secure" warning appears, click **"Advanced"** → **"Proceed"**
## Step 8: Verify the SSL Certificate (optional)
```bash
# Inspect the certificate
openssl x509 -in nginx/ssl/train-kamco.com.crt -text -noout | head -30
# Check the validity period (100 years)
openssl x509 -in nginx/ssl/train-kamco.com.crt -noout -dates
# Check the SAN (multi-domain) entries
openssl x509 -in nginx/ssl/train-kamco.com.crt -text -noout | grep -A1 "Subject Alternative Name"
# Expected result:
# X509v3 Subject Alternative Name:
# DNS:api.train-kamco.com, DNS:train-kamco.com
```
## Troubleshooting
### Issue 1: "Connection refused"
**Cause**: the containers are not running
**Fix**:
```bash
# Check container status
docker ps -a | grep kamco-cd
# Restart the containers
docker-compose -f docker-compose-prod.yml restart
# Check the logs
docker logs kamco-cd-nginx
docker logs kamco-cd-training-api
```
### Issue 2: "502 Bad Gateway"
**Cause**: Nginx is running but the API container is not ready
**Fix**:
```bash
# Check the API container
docker logs kamco-cd-training-api
# API health check (from inside the container)
docker exec kamco-cd-nginx wget -qO- http://kamco-changedetection-api:8080/monitor/health
# Restart the API container
docker-compose -f docker-compose-prod.yml restart kamco-changedetection-api
```
### Issue 3: "Name or service not known"
**Cause**: the domains are not configured in /etc/hosts
**Fix**:
```bash
# Check /etc/hosts
cat /etc/hosts | grep train-kamco
# Add the entry if it is missing
echo "127.0.0.1 api.train-kamco.com train-kamco.com" | sudo tee -a /etc/hosts
```
### Issue 4: Port 80 or 443 already in use
**Cause**: another process is using the port
**Fix**:
```bash
# Check port usage
sudo lsof -i :80
sudo lsof -i :443
# Stop the conflicting process (e.g. httpd, nginx)
sudo systemctl stop httpd
sudo systemctl stop nginx
# Restart Docker Compose
docker-compose -f docker-compose-prod.yml restart
```
### Issue 5: SELinux permission errors
**Cause**: SELinux is blocking the Docker volume mounts
**Fix**:
```bash
# Check SELinux status
getenforce
# Switch to permissive mode temporarily (resets on reboot)
sudo setenforce 0
# Permanent change (not recommended)
sudo vi /etc/selinux/config
# Set SELINUX=permissive or SELINUX=disabled
```
## Container Management Commands
### Start/Stop/Restart
```bash
# Start
docker-compose -f docker-compose-prod.yml up -d
# Stop
docker-compose -f docker-compose-prod.yml down
# Restart
docker-compose -f docker-compose-prod.yml restart
# Restart a single service
docker-compose -f docker-compose-prod.yml restart nginx
docker-compose -f docker-compose-prod.yml restart kamco-changedetection-api
```
### Viewing Logs
```bash
# All logs
docker-compose -f docker-compose-prod.yml logs
# Logs for a specific service
docker-compose -f docker-compose-prod.yml logs nginx
docker-compose -f docker-compose-prod.yml logs kamco-changedetection-api
# Live logs
docker-compose -f docker-compose-prod.yml logs -f
# Only the last N lines
docker logs kamco-cd-nginx --tail 100
```
### Checking Container Status
```bash
# Running containers
docker-compose -f docker-compose-prod.yml ps
# Detailed information
docker inspect kamco-cd-nginx
docker inspect kamco-cd-training-api
# Resource usage
docker stats kamco-cd-nginx kamco-cd-training-api
```
### Opening a Shell in a Container
```bash
# Nginx container
docker exec -it kamco-cd-nginx sh
# API container
docker exec -it kamco-cd-training-api sh
# Leave the shell
exit
```
## Updating and Redeploying
### Redeploy After a Code Update
```bash
# 1. Git pull (update the code)
git pull origin develop
# 2. Build the JAR (skip if Jenkins does this)
./gradlew clean build -x test
# 3. Rebuild and restart the containers
docker-compose -f docker-compose-prod.yml down
docker-compose -f docker-compose-prod.yml up -d --build
# 4. Check the logs
docker-compose -f docker-compose-prod.yml logs -f
```
### When Only Configuration Changed
```bash
# After changing nginx.conf or docker-compose-prod.yml
docker-compose -f docker-compose-prod.yml down
docker-compose -f docker-compose-prod.yml up -d
# or
docker-compose -f docker-compose-prod.yml restart nginx
```
## Monitoring
### Health Check Endpoint
```bash
# API health check
curl -k https://api.train-kamco.com/monitor/health
# Expected result:
# {
# "status": "UP",
# "components": {
# "db": {"status": "UP"},
# "diskSpace": {"status": "UP"}
# }
# }
```
### System Resources
```bash
# Disk usage
df -h
# Memory usage
free -h
# Docker image and container disk usage
docker system df
```
## Security Recommendations
1. **Self-signed certificate**: a self-signed certificate is currently in use. For production, a publicly trusted certificate such as **Let's Encrypt** or **GlobalSign** is recommended.
2. **Firewall**: open only the required ports (80, 443) and block everything else.
3. **Regular updates**: keep Docker images and system packages up to date.
4. **Log monitoring**: review the logs regularly to detect abnormal activity.
5. **Backups**: back up the SSL private key (`train-kamco.com.key`) and the database regularly.
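The backup recommendation can be scripted. The sketch below is an assumption: the key path follows this guide, but the `backups/` directory, the `backup_secrets` helper name, and the archive naming scheme are hypothetical choices, not part of the project.

```shell
# Hedged sketch: bundle the SSL private key and a DB dump into a
# date-stamped tar archive under ./backups. The backups/ location
# and naming scheme are assumptions.
backup_secrets() {
  mkdir -p backups
  stamp=$(date +%Y%m%d)
  # "$@": the files to back up
  tar -czf "backups/kamco-backup-${stamp}.tar.gz" "$@"
}

# Example:
# backup_secrets nginx/ssl/train-kamco.com.key db-dump.sql
```

Run it from a cron job (or systemd timer) and prune old archives so the backup directory does not grow without bound.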
## Related Documents
- **SSL certificate setup**: [nginx/SSL_SETUP.md](nginx/SSL_SETUP.md)
- **Project overview**: [README.md](README.md)
- **CLAUDE.md**: [CLAUDE.md](CLAUDE.md)
## Support
If you run into problems, check the following:
1. Container logs: `docker-compose -f docker-compose-prod.yml logs`
2. Container status: `docker-compose -f docker-compose-prod.yml ps`
3. /etc/hosts entries: `cat /etc/hosts | grep train-kamco`
4. Firewall status: `sudo firewall-cmd --list-ports`
---
**Deployment complete!** 🎉
Access URLs:
- `https://api.train-kamco.com`
- `https://train-kamco.com`

DEPLOYMENT.md (new file, +443 lines)

@@ -0,0 +1,443 @@
# KAMCO Train API - Production Deployment Guide
Production environment deployment guide
## Table of Contents
- [Prerequisites](#prerequisites)
- [Initial Setup](#initial-setup)
- [Deployment Order](#deployment-order)
- [Managing Individual Services](#managing-individual-services)
- [Rollback Procedure](#rollback-procedure)
- [Monitoring and Health Checks](#monitoring-and-health-checks)
- [Troubleshooting](#troubleshooting)
---
## Prerequisites
### System Requirements
- Docker Engine 20.10+
- Docker Compose 2.0+
- Minimum memory: 4GB
- Disk space: 20GB or more
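The minimums above can be checked mechanically before deploying. A sketch, assuming a Linux host (it reads `/proc/meminfo`) and that the Docker data lives on the root filesystem; adjust the mount point if yours differs.

```shell
# Compare host resources against this guide's minimums (4GB RAM, 20GB free disk).
min_mem_kb=$((4 * 1024 * 1024))
min_disk_kb=$((20 * 1024 * 1024))

# Total memory and free space on /, both in kB
mem_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
disk_kb=$(df -Pk / | awk 'NR==2 {print $4}')

[ "$mem_kb" -ge "$min_mem_kb" ] && echo "memory: OK (${mem_kb} kB)" || echo "memory: below 4GB"
[ "$disk_kb" -ge "$min_disk_kb" ] && echo "disk: OK (${disk_kb} kB free)" || echo "disk: below 20GB"
```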
### Network Requirements
- Domain configuration:
- `train-kamco.com` → server IP (Web UI)
- `api.train-kamco.com` → server IP (API)
- Ports:
- 80 (HTTP)
- 443 (HTTPS)
- 8080 (API - internal)
- 3002 (Web - internal)
### Required Files
- SSL certificates:
- `nginx/ssl/train-kamco.com.crt`
- `nginx/ssl/train-kamco.com.key`
- Environment configuration:
- `application-prod.yml` (API settings)
- `.env` file (IMAGE_TAG, etc.)
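The presence of these files can be verified with a small check before deploying. The `check_files` helper below is a hypothetical convenience, not part of the project; pass it whichever paths apply on your host.

```shell
# Print each missing file and return non-zero if any required
# deployment file is absent.
check_files() {
  missing=0
  for f in "$@"; do
    if [ ! -e "$f" ]; then
      echo "missing: $f"
      missing=1
    fi
  done
  return "$missing"
}

# Example (paths from the list above):
# check_files nginx/ssl/train-kamco.com.crt nginx/ssl/train-kamco.com.key .env
```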
---
## Initial Setup
### 1. Create the Docker Network
```bash
# Create the kamco-cds network (first time only)
docker network create kamco-cds
# Verify
docker network ls | grep kamco-cds
```
### 2. Install the SSL Certificates
```bash
# Create the certificate directory
mkdir -p nginx/ssl
# Copy the certificate files
cp /path/to/train-kamco.com.crt nginx/ssl/
cp /path/to/train-kamco.com.key nginx/ssl/
# Set permissions
chmod 600 nginx/ssl/train-kamco.com.key
chmod 644 nginx/ssl/train-kamco.com.crt
```
### 3. Configure Environment Variables
```bash
# Create the .env file
cat > .env << EOF
IMAGE_TAG=latest
SPRING_PROFILES_ACTIVE=prod
TZ=Asia/Seoul
EOF
```
### 4. Create the Volume Directories
```bash
# Create the data directories
mkdir -p ./app/model_output
mkdir -p ./app/train_dataset
# Set permissions
chmod -R 755 ./app
```
---
## Deployment Order
### Initial Full-Stack Deployment
**Important**: the steps must be run in the order below.
```bash
# 1. Start Nginx
docker-compose -f docker-compose-nginx.yml up -d
# 2. Check Nginx status
docker ps | grep kamco-train-nginx
# 3. Start the API service
docker-compose -f docker-compose-prod.yml up -d
# 4. Wait for the API health check (up to 40 seconds)
sleep 40
curl -f http://localhost:8080/monitor/health
# 5. Start the Web service (from the kamco-train-web project)
# cd ../kamco-train-web
# docker-compose -f docker-compose-prod.yml up -d
# 6. Check overall status
docker ps -a | grep kamco
```
### Deployment Verification
```bash
# Per-service health checks
curl -f http://localhost:8080/monitor/health # API (internal)
curl -kf https://api.train-kamco.com/monitor/health # API (external)
curl -kf https://train-kamco.com # Web (external)
# Validate the Nginx configuration
docker exec kamco-train-nginx nginx -t
# Check the logs
docker-compose -f docker-compose-nginx.yml logs --tail=50
docker-compose -f docker-compose-prod.yml logs --tail=50
```
---
## Managing Individual Services
### Managing Nginx
```bash
# Reload after config changes (no downtime)
docker exec kamco-train-nginx nginx -s reload
# Restart
docker-compose -f docker-compose-nginx.yml restart
# Check logs
docker-compose -f docker-compose-nginx.yml logs -f
# Shell into the container
docker exec -it kamco-train-nginx sh
```
### Managing the API Service
```bash
# Redeploy (build a new image)
docker-compose -f docker-compose-prod.yml up -d --build
# Restart (without changing the image)
docker-compose -f docker-compose-prod.yml restart
# Stop
docker-compose -f docker-compose-prod.yml down
# Check logs
docker-compose -f docker-compose-prod.yml logs -f kamco-train-api
# Shell into the container
docker exec -it kamco-train-api bash
```
### Managing the Web Service
```bash
# Run from the kamco-train-web project
cd ../kamco-train-web
# Redeploy
docker-compose -f docker-compose-prod.yml up -d --build
# Restart
docker-compose -f docker-compose-prod.yml restart
# Check logs
docker-compose -f docker-compose-prod.yml logs -f
```
---
## Rollback Procedure
### Image-Based Rollback
```bash
# 1. Check the available images
docker images | grep kamco-train-api
# 2. Roll back to a previous image tag
export IMAGE_TAG=previous-commit-hash
docker-compose -f docker-compose-prod.yml up -d
# 3. Verify the health check
curl -f http://localhost:8080/monitor/health
```
### Git-Based Rollback
```bash
# 1. Check out the previous commit
git log --oneline -10
git checkout <previous-commit-hash>
# 2. Rebuild and deploy
docker-compose -f docker-compose-prod.yml up -d --build
# 3. After verifying, update the branch (if needed)
# git checkout develop
# git reset --hard <previous-commit-hash>
# git push -f origin develop
```
---
## Monitoring and Health Checks
### Health Check Endpoints
```bash
# API health checks
curl http://localhost:8080/monitor/health
curl http://localhost:8080/monitor/health/readiness
curl http://localhost:8080/monitor/health/liveness
# Health check through Nginx
curl -k https://api.train-kamco.com/monitor/health
```
### Container Status Monitoring
```bash
# All container statuses
docker ps -a | grep kamco
# Real-time resource usage
docker stats kamco-train-nginx kamco-train-api
# Health check status
docker inspect kamco-train-api | grep -A 10 Health
```
### Log Monitoring
```bash
# Real-time logs (all services)
docker-compose -f docker-compose-nginx.yml logs -f &
docker-compose -f docker-compose-prod.yml logs -f &
# Filter for error logs only
docker-compose -f docker-compose-prod.yml logs | grep -i error
# Last 100 lines
docker-compose -f docker-compose-prod.yml logs --tail=100
```
---
## Troubleshooting
### 1. Nginx 502 Bad Gateway
**Cause**: the API service is not ready
```bash
# Check the API container
docker ps | grep kamco-train-api
# Check API logs
docker logs kamco-train-api --tail=100
# Check the network connection
docker network inspect kamco-cds | grep kamco-train-api
# Fix: restart the API
docker-compose -f docker-compose-prod.yml restart
```
### 2. SSL Certificate Errors
**Cause**: missing certificate files or wrong permissions
```bash
# Check the certificate files
ls -la nginx/ssl/
# Validate the Nginx configuration
docker exec kamco-train-nginx nginx -t
# Fix: re-place the certificates and set permissions
chmod 600 nginx/ssl/train-kamco.com.key
chmod 644 nginx/ssl/train-kamco.com.crt
docker-compose -f docker-compose-nginx.yml restart
```
### 3. Container Fails to Start
**Cause**: port conflicts, volume permissions, or insufficient memory
```bash
# Check port usage
netstat -tulpn | grep -E '80|443|8080'
# Check volume permissions
ls -la ./app/
# Check memory usage
free -h
docker system df
# Fix: kill the conflicting process or change the port
# Reclaim disk space
docker system prune -a
```
### 4. Network Connectivity Problems
**Cause**: the kamco-cds network does not exist, or containers are not attached to it
```bash
# Check the network
docker network ls | grep kamco-cds
# Network details
docker network inspect kamco-cds
# Fix: create the network
docker network create kamco-cds
# Attach the containers to the network
docker network connect kamco-cds kamco-train-api
docker network connect kamco-cds kamco-train-nginx
```
### 5. Database Connection Failures
**Cause**: wrong DB settings in application-prod.yml
```bash
# Look for DB connection errors in the API logs
docker logs kamco-train-api | grep -i "connection"
# Test connectivity to the DB host
docker exec kamco-train-api ping <db-host>
# Fix: edit application-prod.yml and redeploy
vim src/main/resources/application-prod.yml
docker-compose -f docker-compose-prod.yml up -d --build
```
---
## Jenkins CI/CD Integration
The project is deployed automatically by a Jenkins pipeline.
### Key Stages in Jenkinsfile-dev
1. **Checkout**: check out the develop branch
2. **Build**: `./gradlew clean build -x test`
3. **Extract Commit**: use the commit hash as IMAGE_TAG
4. **Transfer**: copy files to the deployment server
5. **Deploy**: build and deploy via Docker Compose
6. **Health Check**: wait 30 seconds, then run the health check
7. **Cleanup**: remove old images (keep the 5 most recent)
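The steps above can be sketched in shell. The exact Jenkinsfile logic is not shown here, so this is an illustrative reconstruction of stage 3 (tagging images with a short commit hash); `short_tag` is a hypothetical helper name:

```shell
# Derive a short image tag from a full commit hash.
short_tag() {
  printf '%s' "$1" | cut -c1-7
}

# In Jenkins this would typically be fed from `git rev-parse HEAD`, e.g.:
# IMAGE_TAG=$(short_tag "$GIT_COMMIT")
```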
### Deployment Server Details
- **Server**: 192.168.2.109
- **User**: space
- **Deploy path**: `/home/space/kamco-training-api`
- **Health check**: `http://localhost:7200/monitor/health`
---
## Backup and Recovery
### Data Backup
```bash
# Back up volume data
tar -czf backup-$(date +%Y%m%d).tar.gz ./app/model_output ./app/train_dataset
# Back up configuration files
tar -czf config-backup-$(date +%Y%m%d).tar.gz \
  nginx/nginx.conf \
  nginx/ssl/ \
  src/main/resources/application-prod.yml
```
### Image Backup
```bash
# Save the current image
docker save kamco-train-api:latest | gzip > kamco-train-api-latest.tar.gz
# Restore the image
gunzip -c kamco-train-api-latest.tar.gz | docker load
```
---
## Security Checklist
- [ ] SSL certificate validity period checked
- [ ] nginx/ssl/ directory and key permissions restricted (700 dir / 600 key)
- [ ] DB password in application-prod.yml encrypted
- [ ] JWT secret key managed via environment variables
- [ ] Docker socket permissions minimized
- [ ] Firewall rules configured (only 80 and 443 exposed externally)
- [ ] Regular security updates (docker images)
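For the first checklist item, the date arithmetic can be factored into a tiny helper. `days_between` is illustrative, and the commented pipeline feeding it assumes GNU `date` and the certificate path used elsewhere in this guide:

```shell
# Whole days between two epoch-second timestamps.
days_between() {
  echo $(( ($2 - $1) / 86400 ))
}

# Typical usage for the certificate expiry check:
# end=$(openssl x509 -in nginx/ssl/train-kamco.com.crt -noout -enddate | cut -d= -f2)
# days_between "$(date +%s)" "$(date -d "$end" +%s)"
```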
---
## Reference Documents
- [CLAUDE.md](./CLAUDE.md) - project development guide
- [README.md](./README.md) - project overview
- [Jenkinsfile-dev](./Jenkinsfile-dev) - CI/CD pipeline
- [nginx/nginx.conf](./nginx/nginx.conf) - Nginx configuration
---
## Contact
When a problem occurs:
1. Collect logs: the output of `docker-compose logs`
2. Gather system info: `docker ps -a`, `docker network ls`
3. File a report: GitHub Issues or the internal issue tracker

Dockerfile Normal file

@@ -0,0 +1,20 @@
# Single-stage runtime image (the gradle build is already done in Jenkins)
FROM eclipse-temurin:21-jre-jammy
# Install the docker CLI (to control the host Docker daemon from the container); added 260212
RUN apt-get update && \
    apt-get install -y --no-install-recommends docker.io ca-certificates && \
    rm -rf /var/lib/apt/lists/*
# Set the working directory
WORKDIR /app
# Copy the JAR (ROOT.jar built by Jenkins)
COPY build/libs/ROOT.jar app.jar
# Expose the port
EXPOSE 8080
# Run the application with the prod profile
ENTRYPOINT ["java", "-jar", "-Dspring.profiles.active=prod", "app.jar"]


@@ -1,6 +1,11 @@
# Single-stage runtime image (the gradle build is already done in Jenkins)
FROM eclipse-temurin:21-jre-jammy
# Install the docker CLI (to control the host Docker daemon from the container); added 260212
RUN apt-get update && \
    apt-get install -y --no-install-recommends docker.io ca-certificates && \
    rm -rf /var/lib/apt/lists/*
# Set the working directory
WORKDIR /app


@@ -5,6 +5,13 @@ services:
      dockerfile: Dockerfile-dev
    image: kamco-cd-training-api:${IMAGE_TAG:-latest}
    container_name: kamco-cd-training-api
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
    ports:
      - "7200:8080"
    environment:
@@ -15,6 +22,7 @@ services:
      - /mnt/nfs_share/model_output:/app/model-outputs
      - /mnt/nfs_share/train_dataset:/app/train-dataset
      - /home/kcomu/data:/home/kcomu/data
      - /var/run/docker.sock:/var/run/docker.sock
    networks:
      - kamco-cds
    restart: unless-stopped

docker-compose-nginx.yml Normal file

@@ -0,0 +1,28 @@
services:
  nginx:
    image: nginx:alpine
    container_name: kamco-train-nginx
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./nginx/nginx.conf:/etc/nginx/nginx.conf:ro
      - ./nginx/ssl:/etc/nginx/ssl:ro
      - nginx-logs:/var/log/nginx
    networks:
      - kamco-cds
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "wget", "--quiet", "--tries=1", "--spider", "--no-check-certificate", "https://localhost/monitor/health"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 10s

networks:
  kamco-cds:
    external: true

volumes:
  nginx-logs:
    driver: local

docker-compose-prod.yml Normal file

@@ -0,0 +1,36 @@
services:
  kamco-train-api:
    build:
      context: .
      dockerfile: Dockerfile
    image: kamco-train-api:${IMAGE_TAG:-latest}
    container_name: kamco-train-api
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
    expose:
      - "8080"
    environment:
      - SPRING_PROFILES_ACTIVE=prod
      - TZ=Asia/Seoul
    volumes:
      - ./app/model_output:/app/model-outputs
      - ./app/train_dataset:/app/train-dataset
      - /var/run/docker.sock:/var/run/docker.sock
    networks:
      - kamco-cds
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8080/monitor/health"]
      interval: 10s
      timeout: 5s
      retries: 5
      start_period: 40s

networks:
  kamco-cds:
    external: true

nginx/SSL_SETUP.md Normal file

@@ -0,0 +1,414 @@
# SSL Self-Signed Certificate Setup Guide (RedHat 9.6)
## Overview
This document explains how to set up a self-signed SSL certificate, valid for 100 years, for the `https://api.train-kamco.com` and `https://train-kamco.com` domains on RedHat 9.6.
## Directory Layout
```
nginx/
├── nginx.conf              # Nginx configuration file
├── ssl/
│   ├── openssl.cnf         # OpenSSL configuration file (includes SAN)
│   ├── train-kamco.com.crt # Self-signed SSL certificate (multi-domain)
│   └── train-kamco.com.key # Private key (keep secret)
└── SSL_SETUP.md            # This document
```
## Certificate Details
- **Domains**: api.train-kamco.com, train-kamco.com (multi-domain)
- **Validity**: 100 years (36,500 days)
- **Algorithm**: RSA 4096-bit
- **CN (Common Name)**: api.train-kamco.com
- **SAN (Subject Alternative Names)**: api.train-kamco.com, train-kamco.com
## Generating the Self-Signed SSL Certificate (already generated)
The certificate has already been generated. If you need to regenerate it, follow the steps below.
### 1. Create the OpenSSL Configuration File
```bash
cd /path/to/kamco-train-api
cat > nginx/ssl/openssl.cnf << 'EOF'
[req]
default_bits = 4096
prompt = no
default_md = sha256
distinguished_name = dn
req_extensions = v3_req
[dn]
C=KR
ST=Seoul
L=Seoul
O=KAMCO
OU=Training
CN=api.train-kamco.com
[v3_req]
subjectAltName = @alt_names
[alt_names]
DNS.1 = api.train-kamco.com
DNS.2 = train-kamco.com
EOF
```
### 2. Generate the SSL Certificate and Private Key
```bash
# Create the nginx/ssl directory (if it does not exist)
mkdir -p nginx/ssl
chmod 700 nginx/ssl
# Generate the certificate and private key (valid for 100 years)
openssl req -new -x509 -newkey rsa:4096 -sha256 -nodes \
  -keyout nginx/ssl/train-kamco.com.key \
  -out nginx/ssl/train-kamco.com.crt \
  -days 36500 \
  -config nginx/ssl/openssl.cnf \
  -extensions v3_req
# Set file permissions
chmod 600 nginx/ssl/train-kamco.com.key
chmod 644 nginx/ssl/train-kamco.com.crt
```
### 3. Verify the Certificate
```bash
# Show certificate details
openssl x509 -in nginx/ssl/train-kamco.com.crt -text -noout
# Check the validity period
openssl x509 -in nginx/ssl/train-kamco.com.crt -text -noout | grep -A2 "Validity"
# Check the SAN (multi-domain) entries
openssl x509 -in nginx/ssl/train-kamco.com.crt -text -noout | grep -A1 "Subject Alternative Name"
# Check the CN
openssl x509 -in nginx/ssl/train-kamco.com.crt -noout -subject
# Check the private key
openssl rsa -in nginx/ssl/train-kamco.com.key -check
```
**Expected output**:
```
X509v3 Subject Alternative Name:
    DNS:api.train-kamco.com, DNS:train-kamco.com
Validity
    Not Before: Mar  2 23:39:XX 2026 GMT
    Not After : Feb  6 23:39:XX 2126 GMT
```
## /etc/hosts Setup (RedHat 9.6)
### 1. Add the Domains to the hosts File
```bash
# Run as root
echo "127.0.0.1 api.train-kamco.com train-kamco.com" | sudo tee -a /etc/hosts
# Verify
cat /etc/hosts | grep train-kamco
```
**Expected output**:
```
127.0.0.1 api.train-kamco.com train-kamco.com
```
### 2. Verify the Domains
```bash
# ping test
ping -c 2 api.train-kamco.com
ping -c 2 train-kamco.com
```
## Docker Compose Deployment
### 1. Stop Existing Containers (if running)
```bash
cd /path/to/kamco-train-api
docker-compose -f docker-compose-prod.yml down
```
### 2. Run the Production Environment
```bash
# Set the IMAGE_TAG environment variable (optional)
export IMAGE_TAG=latest
# Run Docker Compose
docker-compose -f docker-compose-prod.yml up -d
# Check container status
docker-compose -f docker-compose-prod.yml ps
```
### 3. Check the Logs
```bash
# Nginx logs
docker logs kamco-cd-nginx
# API logs
docker logs kamco-cd-training-api
# Follow logs in real time
docker-compose -f docker-compose-prod.yml logs -f
```
## HTTPS Access Tests
### 1. HTTP → HTTPS Redirect Test
```bash
# api.train-kamco.com
curl -I http://api.train-kamco.com
# train-kamco.com
curl -I http://train-kamco.com
# Expected: 301 Moved Permanently
# Location: https://api.train-kamco.com/ or https://train-kamco.com/
```
### 2. HTTPS Health Check (-k: skip self-signed certificate verification)
```bash
# api.train-kamco.com
curl -k https://api.train-kamco.com/monitor/health
# train-kamco.com
curl -k https://train-kamco.com/monitor/health
# Expected: {"status":"UP","components":{...}}
```
### 3. Inspect the SSL Certificate
```bash
# api.train-kamco.com
openssl s_client -connect api.train-kamco.com:443 -showcerts
# train-kamco.com
openssl s_client -connect train-kamco.com:443 -showcerts
# Check the CN and SAN entries
```
### 4. Browser Test
Open the following URLs in a browser:
- `https://api.train-kamco.com/monitor/health`
- `https://train-kamco.com/monitor/health`
**Note**: because the certificate is self-signed, the browser will show a "Not secure" warning.
- **Chrome/Edge**: click "Advanced" → "Proceed"
- **Firefox**: click "Accept the Risk and Continue"
## Trusting the Self-Signed Certificate in a Browser (optional)
Registering the certificate with the browser removes the warning.
### Chrome/Edge (RedHat Desktop)
1. Open `chrome://settings/certificates`
2. Select the **Authorities** tab
3. Click **Import**
4. Select `nginx/ssl/train-kamco.com.crt`
5. Check **Trust this certificate for identifying websites**
6. Click **OK**
### Firefox
1. Open `about:preferences#privacy`
2. Click **Certificates** → **View Certificates**
3. Select the **Authorities** tab
4. Click **Import**
5. Select `nginx/ssl/train-kamco.com.crt`
6. Check **Trust this CA to identify websites**
7. Click **OK**
## Firewall Setup (RedHat 9.6)
### 1. Check Firewall Status
```bash
sudo firewall-cmd --state
```
### 2. Open HTTP (80) and HTTPS (443) Ports
```bash
# Open the HTTP port
sudo firewall-cmd --permanent --add-port=80/tcp
# Open the HTTPS port
sudo firewall-cmd --permanent --add-port=443/tcp
# Reload the firewall
sudo firewall-cmd --reload
# Verify
sudo firewall-cmd --list-ports
```
**Expected output**:
```
80/tcp 443/tcp
```
## Security Checklist
- [x] `train-kamco.com.key` file permissions set to 600
- [x] ssl directory excluded from version control (check .gitignore)
- [x] Both domains (api.train-kamco.com, train-kamco.com) included in the SAN
- [ ] Ports 80 and 443 open in the firewall
- [x] HSTS header enabled
- [x] Only TLS 1.2 and above allowed
- [ ] Domain mappings present in /etc/hosts
## Troubleshooting
### Certificate Errors
**"certificate verify failed"**
```bash
# Fix: use the -k flag (skip self-signed certificate verification)
curl -k https://api.train-kamco.com/monitor/health
```
**"NET::ERR_CERT_AUTHORITY_INVALID" (browser)**
- Expected behavior: the browser warning is normal for a self-signed certificate
- Fix: register the certificate with the browser (see the browser trust-setup section above)
### Connection Errors
**"Connection refused"**
```bash
# Check container status
docker ps | grep kamco-cd
# Check port bindings
docker port kamco-cd-nginx
# Expected:
# 80/tcp -> 0.0.0.0:80
# 443/tcp -> 0.0.0.0:443
```
**"502 Bad Gateway"**
```bash
# Check the API container
docker logs kamco-cd-training-api
# Check the nginx → API connection
docker exec kamco-cd-nginx wget -qO- http://kamco-changedetection-api:8080/monitor/health
```
**"Name or service not known" (domain resolution failure)**
```bash
# Check /etc/hosts
cat /etc/hosts | grep train-kamco
# Add the entry if missing
echo "127.0.0.1 api.train-kamco.com train-kamco.com" | sudo tee -a /etc/hosts
```
### Firewall Errors
**Not reachable from outside**
```bash
# Check the firewall
sudo firewall-cmd --list-ports
# Open the ports
sudo firewall-cmd --permanent --add-port=80/tcp
sudo firewall-cmd --permanent --add-port=443/tcp
sudo firewall-cmd --reload
```
## Certificate Expiry and Renewal
### Checking Expiry
```bash
# Check the certificate expiry date
openssl x509 -in nginx/ssl/train-kamco.com.crt -noout -enddate
# Expected: notAfter=Feb  6 23:39:XX 2126 GMT (100 years out)
```
### Renewal (if needed)
The certificate is valid for 100 years, so renewal is not normally required. To regenerate it:
```bash
# Back up the existing certificate
cp nginx/ssl/train-kamco.com.crt nginx/ssl/train-kamco.com.crt.bak
cp nginx/ssl/train-kamco.com.key nginx/ssl/train-kamco.com.key.bak
# Re-run the "Generating the Self-Signed SSL Certificate" steps above
# Restart nginx
docker-compose -f docker-compose-prod.yml restart nginx
```
## Notes and Caveats
1. **Self-signed certificate warnings**: browsers will show a "Not secure" warning. For production, **a publicly trusted certificate (Let's Encrypt, GlobalSign, etc.) is recommended**.
2. **Ports 80/443**: Docker binds these automatically, but an existing process on either port will cause a conflict.
```bash
# Check port usage
sudo lsof -i :80
sudo lsof -i :443
```
3. **Large file uploads**: `client_max_body_size` is set to 10 GB, so make sure the server has enough memory and disk space.
4. **Certificate backup**: the `train-kamco.com.key` file is critical. Back it up somewhere safe.
5. **SELinux**: with SELinux enabled on RedHat 9.6, Docker volume mounts can hit permission problems.
```bash
# Check SELinux status
getenforce
# Switch to permissive mode if necessary
sudo setenforce 0
```
### Step 2: Copy the Certificate into the System Trust Store
Open a terminal and copy the certificate into the system folder with administrator privileges (sudo).
```
sudo cp mycert.crt /etc/pki/ca-trust/source/anchors/
```
### Step 3: Update the System Trust List
Run the command below so the system picks up the added certificate.
```
sudo update-ca-trust
```
## References
- [OpenSSL Documentation](https://www.openssl.org/docs/)
- [Nginx SSL Configuration](https://nginx.org/en/docs/http/configuring_https_servers.html)
- [Docker Compose Documentation](https://docs.docker.com/compose/)
- [Let's Encrypt (publicly trusted certificates)](https://letsencrypt.org/)

nginx/nginx.conf Normal file

@@ -0,0 +1,179 @@
events {
    worker_connections 1024;
}

http {
    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    # Logging
    log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                    '$status $body_bytes_sent "$http_referer" '
                    '"$http_user_agent" "$http_x_forwarded_for"';
    access_log /var/log/nginx/access.log main;
    error_log /var/log/nginx/error.log warn;

    sendfile on;
    keepalive_timeout 65;

    # Upload size limit (10 GB)
    client_max_body_size 10G;

    # Upstreams
    upstream api_backend {
        server kamco-train-api:8080;
    }
    upstream web_backend {
        server kamco-train-web:3002;
    }

    # HTTP → HTTPS redirect server
    server {
        listen 80;
        server_name api.train-kamco.com train-kamco.com;
        # Redirect all HTTP requests to HTTPS
        return 301 https://$server_name$request_uri;
    }

    # HTTPS server - API
    server {
        listen 443 ssl http2;
        server_name api.train-kamco.com;

        # SSL certificate (self-signed, multi-domain)
        ssl_certificate /etc/nginx/ssl/train-kamco.com.crt;
        ssl_certificate_key /etc/nginx/ssl/train-kamco.com.key;

        # SSL protocols and ciphers
        ssl_protocols TLSv1.2 TLSv1.3;
        ssl_ciphers 'ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384';
        ssl_prefer_server_ciphers off;

        # SSL session cache
        ssl_session_cache shared:SSL:10m;
        ssl_session_timeout 10m;

        # HSTS (HTTP Strict Transport Security)
        add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;

        # Security headers
        add_header X-Frame-Options "SAMEORIGIN" always;
        add_header X-Content-Type-Options "nosniff" always;
        add_header X-XSS-Protection "1; mode=block" always;

        # Proxy
        location / {
            proxy_pass http://api_backend;
            proxy_http_version 1.1;
            # Proxy headers
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
            proxy_set_header X-Forwarded-Host $server_name;
            # Forward auth headers and cookies (ensures JWT tokens pass through)
            proxy_pass_request_headers on;
            proxy_set_header Cookie $http_cookie;
            proxy_set_header Authorization $http_authorization;
            # Timeouts (supports large file uploads)
            proxy_connect_timeout 300s;
            proxy_send_timeout 300s;
            proxy_read_timeout 300s;
            # Buffers
            proxy_buffering on;
            proxy_buffer_size 4k;
            proxy_buffers 8 4k;
            proxy_busy_buffers_size 8k;
        }

        # Health check endpoint
        location /monitor/health {
            proxy_pass http://api_backend/monitor/health;
            access_log off;
        }
    }

    # HTTPS server - Next.js web application
    server {
        listen 443 ssl http2;
        server_name train-kamco.com;

        # SSL certificate (self-signed, multi-domain)
        ssl_certificate /etc/nginx/ssl/train-kamco.com.crt;
        ssl_certificate_key /etc/nginx/ssl/train-kamco.com.key;

        # SSL protocols and ciphers
        ssl_protocols TLSv1.2 TLSv1.3;
        ssl_ciphers 'ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384';
        ssl_prefer_server_ciphers off;

        # SSL session cache
        ssl_session_cache shared:SSL:10m;
        ssl_session_timeout 10m;

        # HSTS (HTTP Strict Transport Security)
        add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;

        # Security headers
        add_header X-Frame-Options "SAMEORIGIN" always;
        add_header X-Content-Type-Options "nosniff" always;
        add_header X-XSS-Protection "1; mode=block" always;

        # API proxy (for API calls from the Web app)
        location /api/ {
            proxy_pass http://api_backend/api/;
            proxy_http_version 1.1;
            # Proxy headers
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
            proxy_set_header X-Forwarded-Host $server_name;
            # Forward auth headers and cookies
            proxy_pass_request_headers on;
            proxy_set_header Cookie $http_cookie;
            # Timeouts
            proxy_connect_timeout 300s;
            proxy_send_timeout 300s;
            proxy_read_timeout 300s;
        }

        # Proxy to the web frontend
        location / {
            proxy_pass http://web_backend;
            proxy_http_version 1.1;
            # Proxy headers
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
            proxy_set_header X-Forwarded-Host $server_name;
            # Upgrade headers for Next.js WebSocket support
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
            # Timeouts
            proxy_connect_timeout 600s;
            proxy_send_timeout 600s;
            proxy_read_timeout 600s;
            # Buffers
            proxy_buffering on;
            proxy_buffer_size 4k;
            proxy_buffers 8 4k;
            proxy_busy_buffers_size 8k;
        }
    }
}


@@ -2,10 +2,8 @@ package com.kamco.cd.training;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.scheduling.annotation.EnableScheduling;
@EnableAsync
@SpringBootApplication
@EnableScheduling
public class KamcoTrainingApplication {


@@ -23,7 +23,8 @@ public class JwtAuthenticationFilter extends OncePerRequestFilter {
private final UserDetailsService userDetailsService;
private static final AntPathMatcher PATH_MATCHER = new AntPathMatcher();
private static final String[] EXCLUDE_PATHS = {
"/api/auth/signin", "/api/auth/refresh", "/api/auth/logout", "/api/members/*/password"
// "/api/auth/signin", "/api/auth/refresh", "/api/auth/logout", "/api/members/*/password"
"/api/auth/signin", "/api/auth/refresh", "/api/auth/logout"
};
@Override


@@ -20,7 +20,7 @@ import org.springframework.web.bind.annotation.*;
@Tag(name = "공통코드 관리", description = "공통코드 관리 API")
@RestController
@RequiredArgsConstructor
@RequestMapping("/api/code")
@RequestMapping("/api/common-code")
public class CommonCodeApiController {
private final CommonCodeService commonCodeService;


@@ -0,0 +1,48 @@
package com.kamco.cd.training.common.download;

import com.kamco.cd.training.common.download.dto.DownloadSpec;
import com.kamco.cd.training.common.utils.UserUtil;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import lombok.RequiredArgsConstructor;
import org.springframework.http.HttpHeaders;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.stereotype.Service;
import org.springframework.web.servlet.mvc.method.annotation.StreamingResponseBody;

@Service
@RequiredArgsConstructor
public class DownloadExecutor {
    private final UserUtil userUtil;

    public ResponseEntity<StreamingResponseBody> stream(DownloadSpec spec) throws IOException {
        if (!Files.isReadable(spec.filePath())) {
            return ResponseEntity.notFound().build();
        }
        StreamingResponseBody body =
            os -> {
                try (InputStream in = Files.newInputStream(spec.filePath())) {
                    in.transferTo(os);
                    os.flush();
                } catch (Exception e) {
                    // Large downloads are often interrupted mid-stream, so do not rethrow
                }
            };
        String fileName =
            spec.downloadName() != null
                ? spec.downloadName()
                : spec.filePath().getFileName().toString();
        return ResponseEntity.ok()
            .contentType(
                spec.contentType() != null ? spec.contentType() : MediaType.APPLICATION_OCTET_STREAM)
            .header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"" + fileName + "\"")
            .body(body);
    }
}


@@ -0,0 +1,19 @@
package com.kamco.cd.training.common.download;

import org.springframework.util.AntPathMatcher;

public final class DownloadPaths {
    private DownloadPaths() {}

    public static final String[] PATTERNS = {
        "/api/inference/download/**", "/api/training-data/stage/download/**"
    };

    public static boolean matches(String uri) {
        AntPathMatcher m = new AntPathMatcher();
        for (String p : PATTERNS) {
            if (m.match(p, uri)) return true;
        }
        return false;
    }
}


@@ -0,0 +1,81 @@
package com.kamco.cd.training.common.download;

import jakarta.servlet.http.HttpServletRequest;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import org.springframework.core.io.FileSystemResource;
import org.springframework.core.io.Resource;
import org.springframework.core.io.support.ResourceRegion;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpRange;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.stereotype.Component;

@Component
public class RangeDownloadResponder {
    public ResponseEntity<?> buildZipResponse(
            Path filePath, String downloadFileName, HttpServletRequest request) throws IOException {
        if (!Files.isRegularFile(filePath)) {
            return ResponseEntity.notFound().build();
        }
        long totalSize = Files.size(filePath);
        Resource resource = new FileSystemResource(filePath);
        String disposition = "attachment; filename=\"" + downloadFileName + "\"";
        String rangeHeader = request.getHeader(HttpHeaders.RANGE);
        // Common headers (fixed here)
        ResponseEntity.BodyBuilder base =
            ResponseEntity.ok()
                .contentType(MediaType.APPLICATION_OCTET_STREAM)
                .header(HttpHeaders.CONTENT_DISPOSITION, disposition)
                .header(HttpHeaders.ACCEPT_RANGES, "bytes")
                .header("Access-Control-Expose-Headers", "Content-Disposition")
                .header("X-Accel-Buffering", "no");
        if (rangeHeader == null || rangeHeader.isBlank()) {
            return base.contentLength(totalSize).body(resource);
        }
        List<HttpRange> ranges;
        try {
            ranges = HttpRange.parseRanges(rangeHeader);
        } catch (IllegalArgumentException ex) {
            return ResponseEntity.status(416)
                .header(HttpHeaders.CONTENT_RANGE, "bytes */" + totalSize)
                .header("X-Accel-Buffering", "no")
                .build();
        }
        HttpRange range = ranges.get(0);
        long start = range.getRangeStart(totalSize);
        long end = range.getRangeEnd(totalSize);
        if (start >= totalSize) {
            return ResponseEntity.status(416)
                .header(HttpHeaders.CONTENT_RANGE, "bytes */" + totalSize)
                .header("X-Accel-Buffering", "no")
                .build();
        }
        long regionLength = end - start + 1;
        ResourceRegion region = new ResourceRegion(resource, start, regionLength);
        return ResponseEntity.status(206)
            .contentType(MediaType.APPLICATION_OCTET_STREAM)
            .header(HttpHeaders.CONTENT_DISPOSITION, disposition)
            .header(HttpHeaders.ACCEPT_RANGES, "bytes")
            .header("Access-Control-Expose-Headers", "Content-Disposition")
            .header("X-Accel-Buffering", "no")
            .header(HttpHeaders.CONTENT_RANGE, "bytes " + start + "-" + end + "/" + totalSize)
            .contentLength(regionLength)
            .body(region);
    }
}


@@ -0,0 +1,12 @@
package com.kamco.cd.training.common.download.dto;

import java.nio.file.Path;
import java.util.UUID;
import org.springframework.http.MediaType;

public record DownloadSpec(
    UUID uuid,             // download identifier (for logging/policy)
    Path filePath,         // actual file path
    String downloadName,   // file name shown to the user
    MediaType contentType  // usually OCTET_STREAM
) {}


@@ -12,13 +12,14 @@ import lombok.Setter;
@AllArgsConstructor
@NoArgsConstructor
public class HyperParam {
@Schema(description = "모델", example = "G1")
private ModelType model; // G1, G2, G3
// -------------------------
// Important
// -------------------------
@Schema(description = "모델", example = "large")
private ModelType model; // backbone
@Schema(description = "백본 네트워크", example = "large")
private String backbone; // backbone


@@ -0,0 +1,8 @@
package com.kamco.cd.training.common.dto;

public class MonitorDto {
    public int cpu;       // CPU utilization (%)
    public long[] memory; // [used, total]
    public int gpu;       // average utilization across all GPUs (%)
}


@@ -0,0 +1,27 @@
package com.kamco.cd.training.common.enums;

import com.kamco.cd.training.common.utils.enums.EnumType;
import lombok.AllArgsConstructor;
import lombok.Getter;

@Getter
@AllArgsConstructor
public enum JobStatusType implements EnumType {
    QUEUED("대기중"),
    RUNNING("실행중"),
    SUCCESS("성공"),
    FAILED("실패"),
    CANCELED("취소");

    private final String desc;

    @Override
    public String getId() {
        return name();
    }

    @Override
    public String getText() {
        return desc;
    }
}


@@ -0,0 +1,24 @@
package com.kamco.cd.training.common.enums;

import com.kamco.cd.training.common.utils.enums.EnumType;
import lombok.AllArgsConstructor;
import lombok.Getter;

@Getter
@AllArgsConstructor
public enum JobType implements EnumType {
    TRAIN("학습"),
    TEST("테스트");

    private final String desc;

    @Override
    public String getId() {
        return name();
    }

    @Override
    public String getText() {
        return desc;
    }
}


@@ -17,7 +17,10 @@ public enum ModelType implements EnumType {
private String desc;
public static ModelType getValueData(String modelNo) {
return Arrays.stream(ModelType.values()).filter(m -> m.getId().equals(modelNo)).findFirst().orElse(G1);
return Arrays.stream(ModelType.values())
.filter(m -> m.getId().equals(modelNo))
.findFirst()
.orElse(G1);
}
@Override


@@ -0,0 +1,142 @@
package com.kamco.cd.training.common.service;

import jakarta.annotation.PostConstruct;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import lombok.extern.log4j.Log4j2;
import org.springframework.stereotype.Component;

@Component
@Log4j2
public class GpuDmonReader {
    // =========================
    // GPU utilization store
    //   key:   GPU index (0, 1, 2, ...)
    //   value: current GPU utilization (%)
    // ConcurrentHashMap → safe across threads
    // =========================
    private final Map<Integer, Integer> gpuUtilMap = new ConcurrentHashMap<>();

    // =========================
    // External accessor
    // Called by SystemMonitorService
    // =========================
    public Map<Integer, Integer> getGpuUtilMap() {
        return gpuUtilMap;
    }

    // =========================
    // Runs on bean initialization
    // - starts GPU monitoring on a separate thread
    // - avoids blocking the main thread
    // =========================
    @PostConstruct
    public void start() {
        // Disable GPU monitoring when nvidia-smi is not available
        if (!isNvidiaAvailable()) {
            log.warn("nvidia-smi not found. GPU monitoring disabled.");
            return;
        }
        // Run as a daemon thread (exits automatically on server shutdown)
        Thread t = new Thread(this::runLoop, "gpu-dmon-thread");
        t.setDaemon(true);
        t.start();
    }

    // =========================
    // Infinite loop
    // - runs dmon
    // - restarts it automatically if it dies
    // =========================
    private void runLoop() {
        while (true) {
            try {
                runDmon(); // start collecting GPU utilization
            } catch (Exception e) {
                // Reached when the dmon process exits
                log.warn("dmon restart: {}", e.getMessage());
            }
            // Wait 5 seconds, then restart
            sleep(5000);
        }
    }

    // =========================
    // Run nvidia-smi dmon
    // - receives GPU utilization as a continuous stream
    // =========================
    private void runDmon() throws Exception {
        // -s u → output GPU utilization only
        ProcessBuilder pb = new ProcessBuilder("nvidia-smi", "dmon", "-s", "u");
        // Start the process and read its stdout
        try (BufferedReader br =
                new BufferedReader(new InputStreamReader(pb.start().getInputStream()))) {
            String line;
            // dmon keeps printing (streaming)
            while ((line = br.readLine()) != null) {
                // Skip headers (lines starting with #)
                if (line.startsWith("#")) continue;
                line = line.trim();
                if (line.isEmpty()) continue;
                // Split on whitespace
                String[] parts = line.split("\\s+");
                // Make sure the first column is a GPU index
                if (!parts[0].matches("\\d+")) continue;
                int index = Integer.parseInt(parts[0]);
                try {
                    // The second column is the GPU utilization (sm)
                    int util = Integer.parseInt(parts[1]);
                    // Store the latest value
                    gpuUtilMap.put(index, util);
                } catch (Exception ignored) {
                    // Ignore parse failures
                }
            }
        }
        // Reaching here means the dmon process exited
        // → throw so runLoop restarts it
        throw new IllegalStateException("dmon stopped");
    }

    // =========================
    // Check whether nvidia-smi exists
    // =========================
    private boolean isNvidiaAvailable() {
        try {
            Process p = new ProcessBuilder("which", "nvidia-smi").start();
            return p.waitFor() == 0;
        } catch (Exception e) {
            return false;
        }
    }

    // =========================
    // sleep helper
    // =========================
    private void sleep(long ms) {
        try {
            Thread.sleep(ms);
        } catch (InterruptedException ignored) {
        }
    }
}
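The parsing loop above reads index and sm% columns from the `nvidia-smi dmon -s u` stream. A shell mirror of that parse, over sample output whose column layout is assumed for illustration:

```shell
# Sample `nvidia-smi dmon -s u` output (layout assumed, for illustration only):
sample='# gpu    sm   mem   enc   dec
# Idx     %     %     %     %
    0    35    20     0     0
    1    80    60     0     0'

# Mirror the Java parser: skip "#" header lines, read column 1 as the GPU
# index and column 2 as the sm (utilization) percentage.
printf '%s\n' "$sample" | awk '!/^#/ && NF >= 2 { print "gpu=" $1 " util=" $2 }'
```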


@@ -0,0 +1,224 @@
package com.kamco.cd.training.common.service;
import com.kamco.cd.training.common.dto.MonitorDto;
import java.io.BufferedReader;
import java.io.FileReader;
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Service;
@Service
@RequiredArgsConstructor
@Log4j2
public class SystemMonitorService {
// =========================
// previous CPU sample (for delta calculation)
// - /proc/stat values are cumulative,
// - so usage is computed against the previous sample
// =========================
private long prevIdle = 0;
private long prevTotal = 0;
// =========================
// last 30 seconds of history
// - CPU: 30 samples (1 per second)
// - GPU: 30 samples per GPU
// =========================
private final Deque<Double> cpuHistory = new ArrayDeque<>();
// key: GPU index
// value: the 30 most recent utilization values
private final Map<Integer, Deque<Integer>> gpuHistory = new ConcurrentHashMap<>();
// =========================
// GPU data source (dmon reader)
// =========================
private final GpuDmonReader gpuReader;
// =========================
// cache (for API responses)
// - avoids recomputing on every request
// - volatile -> the latest value stays visible across threads
// =========================
private volatile MonitorDto cached = new MonitorDto();
// =========================
// collect once per second
// =========================
@Scheduled(fixedRate = 1000)
public void collect() {
try {
// =====================
// 1. collect CPU
// =====================
double cpu = readCpu();
cpuHistory.add(cpu);
// keep a rolling window of 30 samples
if (cpuHistory.size() > 30) cpuHistory.poll();
// =====================
// 2. collect GPU
// =====================
Map<Integer, Integer> gpuMap = gpuReader.getGpuUtilMap();
for (Map.Entry<Integer, Integer> entry : gpuMap.entrySet()) {
int index = entry.getKey();
int util = entry.getValue();
// create the per-GPU history on first use, then append
gpuHistory.computeIfAbsent(index, k -> new ArrayDeque<>()).add(util);
// keep 30 samples
Deque<Integer> q = gpuHistory.get(index);
if (q.size() > 30) q.poll();
}
// =====================
// 3. update the cache
// =====================
updateCache();
} catch (Exception e) {
log.error("collect error", e);
}
}
// =========================
// CPU usage calculation
// - reads /proc/stat
// - computed from the difference with the previous sample (delta scheme)
// =========================
private double readCpu() throws Exception {
if (!isLinux()) return 0;
try (BufferedReader br = new BufferedReader(new FileReader("/proc/stat"))) {
String[] p = br.readLine().split("\\s+");
long user = Long.parseLong(p[1]);
long nice = Long.parseLong(p[2]);
long system = Long.parseLong(p[3]);
long idle = Long.parseLong(p[4]);
long iowait = Long.parseLong(p[5]);
long irq = Long.parseLong(p[6]);
long softirq = Long.parseLong(p[7]);
long total = user + nice + system + idle + iowait + irq + softirq;
long idleAll = idle + iowait;
// on the first run, only record the baseline
if (prevTotal == 0) {
prevTotal = total;
prevIdle = idleAll;
return 0;
}
long totalDiff = total - prevTotal;
long idleDiff = idleAll - prevIdle;
prevTotal = total;
prevIdle = idleAll;
if (totalDiff == 0) return 0;
// CPU usage (%)
return (1.0 - (double) idleDiff / totalDiff) * 100;
}
}
// =========================
// Linux environment check
// =========================
private boolean isLinux() {
return System.getProperty("os.name").toLowerCase().contains("linux");
}
// =========================
// memory lookup (/proc/meminfo)
// - uses the OS values as-is (kB)
// - returns [used, total]
// =========================
private long[] readMemory() throws Exception {
if (!isLinux()) return new long[] {0, 0};
try (BufferedReader br = new BufferedReader(new FileReader("/proc/meminfo"))) {
long total = 0;
long available = 0;
String line;
while ((line = br.readLine()) != null) {
if (line.startsWith("MemTotal")) {
total = Long.parseLong(line.replaceAll("\\D+", ""));
} else if (line.startsWith("MemAvailable")) {
available = Long.parseLong(line.replaceAll("\\D+", ""));
}
}
long used = total - available;
return new long[] {used, total};
}
}
// =========================
// cache update
// - CPU: 30-second average
// - GPU: average over all samples
// - Memory: current values
// =========================
private void updateCache() throws Exception {
MonitorDto dto = new MonitorDto();
// =====================
// CPU average (30 s)
// =====================
dto.cpu = (int) cpuHistory.stream().mapToDouble(Double::doubleValue).average().orElse(0);
// =====================
// memory (raw kB)
// =====================
dto.memory = readMemory();
// =====================
// GPU average (across all samples)
// =====================
int sum = 0;
int count = 0;
for (Deque<Integer> q : gpuHistory.values()) {
for (int v : q) {
sum += v;
count++;
}
}
dto.gpu = (count == 0) ? 0 : sum / count;
// =====================
// swap the cache (single volatile write)
// =====================
this.cached = dto;
}
// =========================
// external accessor
// - called from the Controller
// =========================
public MonitorDto get() {
return cached;
}
}
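readCpu() above derives usage from the difference between two cumulative /proc/stat snapshots. The same arithmetic, isolated as a standalone sketch with hypothetical sample values (the class name is illustrative, not part of the service):

```java
public class CpuDelta {

    // Computes CPU usage % between two cumulative /proc/stat snapshots.
    // Fields: user nice system idle iowait irq softirq (indices 0..6).
    // total = sum of all fields, idleAll = idle + iowait,
    // usage = (1 - deltaIdle / deltaTotal) * 100 — the delta scheme used above.
    public static double usagePercent(long[] prev, long[] curr) {
        long totalDiff = sum(curr) - sum(prev);
        long idleDiff = (curr[3] + curr[4]) - (prev[3] + prev[4]);
        if (totalDiff == 0) return 0;
        return (1.0 - (double) idleDiff / totalDiff) * 100;
    }

    private static long sum(long[] fields) {
        long t = 0;
        for (long v : fields) t += v;
        return t;
    }

    public static void main(String[] args) {
        // hypothetical snapshots: totals 1000 -> 1200, idleAll 850 -> 910
        long[] t0 = {100, 0, 50, 800, 50, 0, 0};
        long[] t1 = {160, 0, 70, 850, 60, 10, 50};
        // (1 - 60/200) * 100
        System.out.println(usagePercent(t0, t1));
    }
}
```

Because the counters are cumulative, only the deltas matter; the first sample can never yield a meaningful percentage, which is why the service above records a baseline and returns 0 on the first run.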


@@ -3,9 +3,10 @@ package com.kamco.cd.training.common.utils;
import static java.lang.String.CASE_INSENSITIVE_ORDER;
import com.jcraft.jsch.ChannelExec;
import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;
import com.kamco.cd.training.common.exception.CustomApiException;
import com.kamco.cd.training.config.api.ApiResponseDto.ApiResponseCode;
import io.swagger.v3.oas.annotations.media.Schema;
import java.io.BufferedReader;
import java.io.File;
@@ -15,6 +16,7 @@ import java.io.FileReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
@@ -39,6 +41,7 @@ import lombok.extern.slf4j.Slf4j;
import org.apache.commons.io.FilenameUtils;
import org.geotools.coverage.grid.GridCoverage2D;
import org.geotools.gce.geotiff.GeoTiffReader;
import org.springframework.http.HttpStatus;
import org.springframework.util.FileSystemUtils;
import org.springframework.web.multipart.MultipartFile;
@@ -716,13 +719,26 @@ public class FIleChecker {
public static void unzip(String fileName, String destDirectory) throws IOException {
String zipFilePath = destDirectory + File.separator + fileName;
log.info("fileName : {}", fileName);
log.info("destDirectory : {}", destDirectory);
log.info("zipFilePath : {}", zipFilePath);
// create a folder named after the zip file (extension removed)
String folderName =
fileName.endsWith(".zip") ? fileName.substring(0, fileName.length() - 4) : fileName;
log.info("folderName : {}", folderName);
File destDir = new File(destDirectory, folderName);
log.info("destDir : {}", destDir);
// if a folder with the same name already exists, delete it first
log.info("destDir.exists() before delete : {}", destDir.exists());
if (destDir.exists()) {
deleteDirectoryRecursively(destDir.toPath());
}
log.info("destDir.exists() after delete : {}", destDir.exists());
if (!destDir.exists()) {
boolean created = destDir.mkdirs();
log.info("mkdirs : {}", created);
}
@@ -757,6 +773,11 @@ public class FIleChecker {
zipEntry = zis.getNextEntry();
}
zis.closeEntry();
} catch (IOException e) {
throw new CustomApiException(
ApiResponseCode.INTERNAL_SERVER_ERROR.getId(),
HttpStatus.INTERNAL_SERVER_ERROR,
"압축 해제 중 오류가 발생했습니다: " + e.getMessage());
}
}
@@ -773,92 +794,6 @@ public class FIleChecker {
return destFile;
}
public static void uploadTo86(Path localFile) {
String host = "192.168.2.86";
int port = 22;
String username = "kcomu";
String password = "Kamco2025!";
String remoteDir = "/home/kcomu/data/request";
Session session = null;
ChannelSftp channel = null;
try {
JSch jsch = new JSch();
session = jsch.getSession(username, host, port);
session.setPassword(password);
Properties config = new Properties();
config.put("StrictHostKeyChecking", "no");
session.setConfig(config);
session.connect(10_000);
channel = (ChannelSftp) session.openChannel("sftp");
channel.connect(10_000);
// change to the destination directory
channel.cd(remoteDir);
// upload the file
channel.put(localFile.toString(), localFile.getFileName().toString());
} catch (Exception e) {
throw new RuntimeException("SFTP upload failed", e);
} finally {
if (channel != null) channel.disconnect();
if (session != null) session.disconnect();
}
}
public static void unzipOn86Server(String zipPath, String targetDir) {
String host = "192.168.2.86";
String user = "kcomu";
String password = "Kamco2025!";
Session session = null;
ChannelExec channel = null;
try {
JSch jsch = new JSch();
session = jsch.getSession(user, host, 22);
session.setPassword(password);
Properties config = new Properties();
config.put("StrictHostKeyChecking", "no");
session.setConfig(config);
session.connect(10_000);
String command = "unzip -o " + zipPath + " -d " + targetDir;
channel = (ChannelExec) session.openChannel("exec");
channel.setCommand(command);
channel.setErrStream(System.err);
InputStream in = channel.getInputStream();
channel.connect();
// drain the output (optional)
try (BufferedReader br = new BufferedReader(new InputStreamReader(in))) {
while (br.readLine() != null) {
// log here if needed
}
}
} catch (Exception e) {
throw new RuntimeException(e);
} finally {
if (channel != null) channel.disconnect();
if (session != null) session.disconnect();
}
}
public static List<String> execCommandAndReadLines(String command) {
List<String> result = new ArrayList<>();
@@ -906,4 +841,22 @@ public class FIleChecker {
if (session != null) session.disconnect();
}
}
/** Recursively delete a directory. */
private static void deleteDirectoryRecursively(Path path) throws IOException {
if (!Files.exists(path)) return;
// children must be deleted first, hence reverse order
Files.walk(path)
.sorted(Comparator.reverseOrder())
.forEach(
p -> {
try {
Files.deleteIfExists(p);
} catch (IOException e) {
// wrap as UncheckedIOException so the failure propagates out of the lambda
throw new UncheckedIOException("폴더 삭제 실패: " + p.toAbsolutePath(), e);
}
});
}
}


@@ -0,0 +1,23 @@
package com.kamco.cd.training.common.utils;
import jakarta.servlet.http.HttpServletRequest;
public final class HeaderUtil {
private HeaderUtil() {}
/** Look up a specific header value. */
public static String get(HttpServletRequest request, String headerName) {
if (request == null || headerName == null) {
return null;
}
String value = request.getHeader(headerName);
return (value != null && !value.isBlank()) ? value : null;
}
/** Look up a required header (still returns null when absent). */
public static String getRequired(HttpServletRequest request, String headerName) {
return get(request, headerName);
}
}


@@ -0,0 +1,23 @@
package com.kamco.cd.training.config;
import java.util.concurrent.Executor;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;
@Configuration
@EnableAsync
public class AsyncConfig {
@Bean(name = "trainJobExecutor")
public Executor trainJobExecutor() {
ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
executor.setCorePoolSize(4); // 4 jobs run concurrently
executor.setMaxPoolSize(8); // at most 8 threads
executor.setQueueCapacity(200); // waiting queue
executor.setThreadNamePrefix("train-job-");
executor.initialize();
return executor;
}
}
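For context on the sizing above: `ThreadPoolTaskExecutor` wraps a `java.util.concurrent.ThreadPoolExecutor`, which only grows past the core size once the queue is full. A plain-JDK sketch of the same pool shape (hypothetical class, not project code):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class TrainJobPool {

    // Same shape as the Spring executor above: 4 core threads,
    // growing to 8 only after the 200-slot queue fills up.
    public static ThreadPoolExecutor create() {
        AtomicInteger seq = new AtomicInteger();
        return new ThreadPoolExecutor(
                4, 8, 60L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<>(200),
                r -> new Thread(r, "train-job-" + seq.incrementAndGet()));
    }
}
```

In practice this means up to 204 jobs (4 running + 200 queued) are accepted before any extra threads are created, so a short queue is the knob to turn if jobs should scale out sooner.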


@@ -76,11 +76,13 @@ public class SecurityConfig {
"/api/auth/logout",
"/swagger-ui/**",
"/v3/api-docs/**",
"/api/members/*/password",
"/api/upload/chunk-upload-dataset",
"/api/upload/chunk-upload-complete")
"/api/upload/chunk-upload-complete",
"/download_progress_test.html",
"/api/models/download/**")
.permitAll()
.requestMatchers("/api/members/*/password")
.authenticated()
// default
.anyRequest()
.authenticated())
@@ -102,15 +104,19 @@ public class SecurityConfig {
return new BCryptPasswordEncoder();
}
/** CORS configuration */
/** CORS configuration - managed per environment in application.yml */
@Bean
public CorsConfigurationSource corsConfigurationSource() {
CorsConfiguration config = new CorsConfiguration(); // create the CORS configuration object
// use the domains configured per environment in application.yml
config.setAllowedOriginPatterns(List.of("*")); // allowed origin patterns
config.setAllowedMethods(List.of("GET", "POST", "PUT", "PATCH", "DELETE", "OPTIONS"));
config.setAllowedHeaders(List.of("*")); // request headers such as Authorization, Content-Type, X-Custom-Header
config.setAllowCredentials(true); // allow credentialed requests (cookies, Authorization header, bearer tokens)
config.setExposedHeaders(List.of("Content-Disposition"));
config.setExposedHeaders(List.of("Content-Disposition", "Authorization"));
config.setMaxAge(3600L); // cache preflight responses for 1 hour
UrlBasedCorsConfigurationSource source = new UrlBasedCorsConfigurationSource();
/** "/**" applies this CORS rule to every path; a narrower pattern such as /api/** can also be used. */


@@ -57,7 +57,7 @@ public class StartupLogger {
"""
╔════════════════════════════════════════════════════════════════════════════════╗
║ 🚀 APPLICATION STARTUP INFORMATION
║ 🚀 APPLICATION STARTUP INFORMATION 2
╠════════════════════════════════════════════════════════════════════════════════╣
║ PROFILE CONFIGURATION ║
╠────────────────────────────────────────────────────────────────────────────────╣


@@ -5,11 +5,14 @@ import com.kamco.cd.training.log.dto.EventType;
import com.kamco.cd.training.menu.dto.MenuDto;
import jakarta.servlet.http.HttpServletRequest;
import java.io.UnsupportedEncodingException;
import java.util.Comparator;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import lombok.extern.slf4j.Slf4j;
import org.springframework.web.util.ContentCachingRequestWrapper;
@Slf4j
public class ApiLogFunction {
// extract the client IP
@@ -34,6 +37,14 @@ public class ApiLogFunction {
return ip;
}
public static String getXFowardedForIp(HttpServletRequest request) {
String ip = request.getHeader("X-Forwarded-For");
if (ip != null) {
ip = ip.split(",")[0].trim();
}
return ip;
}
// example of extracting the user ID (based on Spring Security)
public static String getUserId(HttpServletRequest request) {
try {
@@ -47,20 +58,20 @@ public class ApiLogFunction {
String method = request.getMethod().toUpperCase();
String uri = request.getRequestURI().toLowerCase();
// classify DOWNLOAD/PRINT based on the URL
// classify DOWNLOAD/PRINT based on the URL -> /download moved to FileDownloadInterceptor
if (uri.contains("/download") || uri.contains("/export")) {
return EventType.DOWNLOAD;
}
if (uri.contains("/print")) {
return EventType.PRINT;
return EventType.OTHER;
}
// plain CRUD
return switch (method) {
case "POST" -> EventType.CREATE;
case "GET" -> EventType.READ;
case "DELETE" -> EventType.DELETE;
case "PUT", "PATCH" -> EventType.UPDATE;
case "POST" -> EventType.ADDED;
case "GET" -> EventType.LIST;
case "DELETE" -> EventType.REMOVE;
case "PUT", "PATCH" -> EventType.MODIFIED;
default -> EventType.OTHER;
};
}
@@ -121,12 +132,22 @@ public class ApiLogFunction {
public static String getUriMenuInfo(List<MenuDto.Basic> menuList, String uri) {
MenuDto.Basic m =
String normalizedUri = uri.replace("/api", "");
MenuDto.Basic basic =
menuList.stream()
.filter(menu -> menu.getMenuApiUrl() != null && uri.contains(menu.getMenuApiUrl()))
.findFirst()
.filter(
menu -> menu.getMenuUrl() != null && normalizedUri.startsWith(menu.getMenuUrl()))
.max(Comparator.comparingInt(m -> m.getMenuUrl().length()))
.orElse(null);
return m != null ? m.getMenuUid() : "SYSTEM";
return basic != null ? basic.getMenuUid() : "SYSTEM";
}
public static String cutRequestBody(String value) {
int MAX_LEN = 255;
if (value == null) {
return null;
}
return value.length() <= MAX_LEN ? value : value.substring(0, MAX_LEN);
}
}


@@ -2,10 +2,17 @@ package com.kamco.cd.training.config.api;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.kamco.cd.training.auth.CustomUserDetails;
import com.kamco.cd.training.common.utils.HeaderUtil;
import com.kamco.cd.training.log.dto.EventType;
import com.kamco.cd.training.menu.dto.MenuDto;
import com.kamco.cd.training.menu.service.MenuService;
import com.kamco.cd.training.postgres.entity.AuditLogEntity;
import com.kamco.cd.training.postgres.repository.log.AuditLogRepository;
import jakarta.servlet.http.HttpServletRequest;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Optional;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.MethodParameter;
import org.springframework.http.MediaType;
@@ -23,6 +30,7 @@ import org.springframework.web.util.ContentCachingRequestWrapper;
*
* <p>createOK() → 201 CREATED ok() → 200 OK deleteOk() → 204 NO_CONTENT
*/
@Slf4j
@RestControllerAdvice
public class ApiResponseAdvice implements ResponseBodyAdvice<Object> {
@@ -61,13 +69,28 @@ public class ApiResponseAdvice implements ResponseBodyAdvice<Object> {
if (body instanceof ApiResponseDto<?> apiResponse) {
response.setStatusCode(apiResponse.getHttpStatus());
String ip = ApiLogFunction.getClientIp(servletRequest);
Long userid = null;
String actionType = HeaderUtil.get(servletRequest, "kamco-action-type");
// skip logging when actionType is absent || downloads are handled by FileDownloadInterceptor
// (file download URL prefixes are registered in WebConfig.java)
if (actionType == null || actionType.equalsIgnoreCase("download")) {
return body;
}
String ip =
Optional.ofNullable(HeaderUtil.get(servletRequest, "kamco-user-ip"))
.orElseGet(() -> ApiLogFunction.getXFowardedForIp(servletRequest));
Long userid = null;
String loginAttemptId = null;
// during a sign-in attempt
if (servletRequest.getRequestURI().contains("/api/auth/signin")) {
loginAttemptId = HeaderUtil.get(servletRequest, "kamco-login-attempt-id");
} else {
if (servletRequest.getUserPrincipal() instanceof UsernamePasswordAuthenticationToken auth
&& auth.getPrincipal() instanceof CustomUserDetails customUserDetails) {
userid = customUserDetails.getMember().getId();
}
}
String requestBody;
// skip body logging for multipart requests (avoids issues with binary file content)
@@ -84,17 +107,33 @@ public class ApiResponseAdvice implements ResponseBodyAdvice<Object> {
requestBody = maskSensitiveFields(requestBody);
}
List<?> list = menuService.getFindAll();
List<MenuDto.Basic> result =
list.stream()
.map(
item -> {
if (item instanceof LinkedHashMap<?, ?> map) {
return objectMapper.convertValue(map, MenuDto.Basic.class);
} else if (item instanceof MenuDto.Basic dto) {
return dto;
} else {
throw new IllegalStateException("Unsupported cache type: " + item.getClass());
}
})
.toList();
AuditLogEntity log =
new AuditLogEntity(
userid,
ApiLogFunction.getEventType(servletRequest),
EventType.fromName(actionType),
ApiLogFunction.isSuccessFail(apiResponse),
ApiLogFunction.getUriMenuInfo(
menuService.getFindAll(), servletRequest.getRequestURI()),
ApiLogFunction.getUriMenuInfo(result, servletRequest.getRequestURI()),
ip,
servletRequest.getRequestURI(),
requestBody,
apiResponse.getErrorLogUid());
ApiLogFunction.cutRequestBody(requestBody),
apiResponse.getErrorLogUid(),
null,
loginAttemptId);
auditLogRepository.save(log);
}


@@ -14,6 +14,10 @@ import io.swagger.v3.oas.annotations.responses.ApiResponse;
import io.swagger.v3.oas.annotations.responses.ApiResponses;
import io.swagger.v3.oas.annotations.tags.Tag;
import jakarta.validation.Valid;
import java.io.IOException;
import java.nio.file.FileStore;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.UUID;
import lombok.RequiredArgsConstructor;
@@ -208,8 +212,15 @@ public class DatasetApiController {
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/usable-bytes")
public ApiResponseDto<DatasetStorage> getUsableBytes() {
return ApiResponseDto.ok(datasetService.getUsableBytes());
public ApiResponseDto<DatasetStorage> getUsableBytes() throws IOException {
FileStore store = Files.getFileStore(Path.of("."));
long usable = store.getUsableSpace();
DatasetStorage storage = new DatasetStorage();
storage.setUsableBytes(String.valueOf(usable));
// datasetService.getUsableBytes();
return ApiResponseDto.ok(storage);
}
@Operation(summary = "학습데이터 zip파일 등록", description = "학습데이터 zip파일 등록 합니다.")
@@ -217,7 +228,7 @@ public class DatasetApiController {
public ApiResponseDto<ApiResponseDto.ResponseObj> insertDataset(
@RequestBody @Valid DatasetDto.AddReq addReq) {
return ApiResponseDto.ok(datasetService.insertDataset(addReq));
return ApiResponseDto.okObject(datasetService.insertDataset(addReq));
}
@Operation(summary = "객체별 파일 Path 조회", description = "파일 Path 조회")


@@ -1,7 +1,6 @@
package com.kamco.cd.training.dataset.dto;
import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.kamco.cd.training.common.enums.LearnDataRegister;
import com.kamco.cd.training.common.enums.LearnDataType;
import com.kamco.cd.training.common.enums.ModelType;
@@ -77,9 +76,16 @@ public class DatasetDto {
}
public String getTotalSize(Long totalSize) {
if (totalSize == null) return "0G";
if (totalSize == null || totalSize <= 0) return "0M";
double giga = totalSize / (1024.0 * 1024 * 1024);
if (giga >= 1) {
return String.format("%.2fG", giga);
} else {
double mega = totalSize / (1024.0 * 1024);
return String.format("%.2fM", mega);
}
}
public String getStatus(String status) {
@@ -227,7 +233,6 @@ public class DatasetDto {
@Getter
@Setter
@NoArgsConstructor
@JsonInclude(JsonInclude.Include.NON_NULL)
public static class SelectDataSet {
private String modelNo; // model type: G1, G2, G3
@@ -310,6 +315,183 @@ public class DatasetDto {
}
}
@Schema(name = "SelectTransferDataSet", description = "전이학습 데이터셋 선택 리스트")
@Getter
@Setter
@NoArgsConstructor
public static class SelectTransferDataSet {
private String modelNo; // model type: G1, G2, G3
private Long datasetId;
private UUID uuid;
private String dataType;
private String title;
private Long roundNo;
private Integer compareYyyy;
private Integer targetYyyy;
private String memo;
@JsonIgnore private Long classCount;
private Integer buildingCnt;
private Integer containerCnt;
private String dataTypeName;
private Long wasteCnt;
private Long landCoverCnt;
private String beforeModelNo; // model type: G1, G2, G3
private Long beforeDatasetId;
private UUID beforeUuid;
private String beforeDataType;
private String beforeTitle;
private Long beforeRoundNo;
private Integer beforeCompareYyyy;
private Integer beforeTargetYyyy;
private String beforeMemo;
@JsonIgnore private Long beforeClassCount;
private Integer beforeBuildingCnt;
private Integer beforeContainerCnt;
private String beforeDataTypeName;
private Long beforeWasteCnt;
private Long beforeLandCoverCnt;
public SelectTransferDataSet(
// current
String modelNo,
Long datasetId,
UUID uuid,
String dataType,
String title,
Long roundNo,
Integer compareYyyy,
Integer targetYyyy,
String memo,
Long classCount,
// previous (before)
String beforeModelNo,
Long beforeDatasetId,
UUID beforeUuid,
String beforeDataType,
String beforeTitle,
Long beforeRoundNo,
Integer beforeCompareYyyy,
Integer beforeTargetYyyy,
String beforeMemo,
Long beforeClassCount) {
// current
this.modelNo = modelNo;
this.datasetId = datasetId;
this.uuid = uuid;
this.dataType = dataType;
this.dataTypeName = getDataTypeName(dataType);
this.title = title;
this.roundNo = roundNo;
this.compareYyyy = compareYyyy;
this.targetYyyy = targetYyyy;
this.memo = memo;
this.classCount = classCount;
if (modelNo != null && modelNo.equals(ModelType.G2.getId())) {
this.wasteCnt = classCount;
} else if (modelNo != null && modelNo.equals(ModelType.G3.getId())) {
this.landCoverCnt = classCount;
}
// previous (before)
this.beforeModelNo = beforeModelNo;
this.beforeDatasetId = beforeDatasetId;
this.beforeUuid = beforeUuid;
this.beforeDataType = beforeDataType;
this.beforeDataTypeName = getDataTypeName(beforeDataType);
this.beforeTitle = beforeTitle;
this.beforeRoundNo = beforeRoundNo;
this.beforeCompareYyyy = beforeCompareYyyy;
this.beforeTargetYyyy = beforeTargetYyyy;
this.beforeMemo = beforeMemo;
this.beforeClassCount = beforeClassCount;
if (beforeModelNo != null && beforeModelNo.equals(ModelType.G2.getId())) {
this.beforeWasteCnt = beforeClassCount;
} else if (beforeModelNo != null && beforeModelNo.equals(ModelType.G3.getId())) {
this.beforeLandCoverCnt = beforeClassCount;
}
}
public SelectTransferDataSet(
// current
String modelNo,
Long datasetId,
UUID uuid,
String dataType,
String title,
Long roundNo,
Integer compareYyyy,
Integer targetYyyy,
String memo,
Integer buildingCnt,
Integer containerCnt,
// previous (before)
String beforeModelNo,
Long beforeDatasetId,
UUID beforeUuid,
String beforeDataType,
String beforeTitle,
Long beforeRoundNo,
Integer beforeCompareYyyy,
Integer beforeTargetYyyy,
String beforeMemo,
Integer beforeBuildingCnt,
Integer beforeContainerCnt) {
// current
this.modelNo = modelNo;
this.datasetId = datasetId;
this.uuid = uuid;
this.dataType = dataType;
this.dataTypeName = getDataTypeName(dataType);
this.title = title;
this.roundNo = roundNo;
this.compareYyyy = compareYyyy;
this.targetYyyy = targetYyyy;
this.memo = memo;
this.buildingCnt = buildingCnt;
this.containerCnt = containerCnt;
// previous (before)
this.beforeModelNo = beforeModelNo;
this.beforeDatasetId = beforeDatasetId;
this.beforeUuid = beforeUuid;
this.beforeDataType = beforeDataType;
this.beforeDataTypeName = getDataTypeName(beforeDataType);
this.beforeTitle = beforeTitle;
this.beforeRoundNo = beforeRoundNo;
this.beforeCompareYyyy = beforeCompareYyyy;
this.beforeTargetYyyy = beforeTargetYyyy;
this.beforeMemo = beforeMemo;
this.beforeBuildingCnt = beforeBuildingCnt;
this.beforeContainerCnt = beforeContainerCnt;
}
public String getDataTypeName(String groupTitleCd) {
LearnDataType type = Enums.fromId(LearnDataType.class, groupTitleCd);
return type == null ? null : type.getText();
}
public String getYear() {
return this.compareYyyy + "-" + this.targetYyyy;
}
public String getBeforeYear() {
if (this.beforeCompareYyyy == null || this.beforeTargetYyyy == null) {
return null;
}
return this.beforeCompareYyyy + "-" + this.beforeTargetYyyy;
}
}
@Getter
@Setter
@NoArgsConstructor


@@ -26,10 +26,14 @@ import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.UUID;
import java.util.stream.Collectors;
import java.util.stream.Stream;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
@@ -56,6 +60,8 @@ public class DatasetService {
private String datasetDir;
private static final List<String> LABEL_DIRS = List.of("label-json", "label", "input1", "input2");
private static final List<String> REQUIRED_DIRS = Arrays.asList("train", "val", "test");
private static final List<String> CHECK_DIRS = List.of("label", "input1", "input2");
/**
* Fetch the dataset list
@@ -164,44 +170,6 @@ public class DatasetService {
}
}
@Deprecated
@Transactional
public ResponseObj insertDatasetTo86(@Valid AddReq addReq) {
Long datasetUid = null; // master id, populated during insert
// unzip
FIleChecker.unzipOn86Server(
addReq.getFilePath() + addReq.getFileName(),
addReq.getFilePath() + addReq.getFileName().replace(".zip", ""));
// read the extracted folder and store the data
List<Map<String, Object>> list =
getUnzipDatasetFilesTo86(
addReq.getFilePath() + addReq.getFileName().replace(".zip", ""), "train");
int idx = 0;
for (Map<String, Object> map : list) {
datasetUid =
this.insertTrainTestData(map, addReq, idx, datasetUid, "train"); // insert train data
idx++;
}
List<Map<String, Object>> testList =
getUnzipDatasetFilesTo86(
addReq.getFilePath() + addReq.getFileName().replace(".zip", ""), "test");
int testIdx = 0;
for (Map<String, Object> test : testList) {
datasetUid =
this.insertTrainTestData(test, addReq, testIdx, datasetUid, "test"); // insert test data
testIdx++;
}
datasetCoreService.updateDatasetUploadStatus(datasetUid);
return new ResponseObj(ApiResponseCode.OK, "업로드 성공하였습니다.");
}
@Transactional
public ResponseObj insertDataset(@Valid AddReq addReq) {
@@ -218,6 +186,12 @@ public class DatasetService {
// unzip
FIleChecker.unzip(addReq.getFileName(), addReq.getFilePath());
// verify that train, val and test folders all exist under the extracted folder
validateTrainValTestDirs(addReq.getFilePath() + addReq.getFileName().replace(".zip", ""));
// log whether the file counts in the extracted folders match
validateDirFileCount(addReq.getFilePath() + addReq.getFileName().replace(".zip", ""));
// read the extracted folder and store the data
List<Map<String, Object>> list =
getUnzipDatasetFiles(
@@ -367,7 +341,10 @@ public class DatasetService {
Path dir = root.resolve(dirName);
if (!Files.isDirectory(dir)) {
throw new IllegalStateException("폴더가 존재하지 않습니다 : " + dir);
throw new CustomApiException(
ApiResponseCode.NOT_FOUND_DATA.getId(),
HttpStatus.CONFLICT,
"폴더가 존재하지 않습니다. 업로드 된 파일을 확인하세요. : " + dir);
}
try (Stream<Path> stream = Files.list(dir)) {
@@ -421,81 +398,8 @@ public class DatasetService {
return datasetCoreService.getFilePathByUUIDPathType(uuid, pathType);
}
@Deprecated
private List<Map<String, Object>> getUnzipDatasetFilesTo86(String unzipRootPath, String subDir) {
// String root = Paths.get(unzipRootPath)
// .resolve(subDir)
// .toString();
//
String root = normalizeLinuxPath(unzipRootPath + "/" + subDir);
Map<String, Map<String, Object>> grouped = new HashMap<>();
for (String dirName : LABEL_DIRS) {
String remoteDir = root + "/" + dirName;
// 1. list the files of that directory on the 86 server
List<String> files = listFilesOn86Server(remoteDir);
if (files.isEmpty()) {
throw new IllegalStateException("폴더가 존재하지 않거나 파일이 없습니다 : " + remoteDir);
}
for (String fullPath : files) {
String fileName = Paths.get(fullPath).getFileName().toString();
String baseName = getBaseName(fileName);
Map<String, Object> data = grouped.computeIfAbsent(baseName, k -> new HashMap<>());
data.put("baseName", baseName);
if ("label-json".equals(dirName)) {
// 2. the json content must also be read from the 86 server
String json = readRemoteFileAsString(fullPath);
data.put("label-json", parseJson(json));
data.put("geojson_path", fullPath);
} else {
data.put(dirName, fullPath);
}
}
}
return new ArrayList<>(grouped.values());
}
private List<String> listFilesOn86Server(String remoteDir) {
String command = "find " + escape(remoteDir) + " -maxdepth 1 -type f";
return FIleChecker.execCommandAndReadLines(command);
}
private String readRemoteFileAsString(String remoteFilePath) {
String command = "cat " + escape(remoteFilePath);
List<String> lines = FIleChecker.execCommandAndReadLines(command);
return String.join("\n", lines);
}
private JsonNode parseJson(String json) {
try {
ObjectMapper mapper = new ObjectMapper();
return mapper.readTree(json);
} catch (IOException e) {
throw new RuntimeException("JSON 파싱 실패", e);
}
}
private String escape(String path) {
// wrap in single quotes for safe use in a shell command, escaping embedded single quotes
return "'" + path.replace("'", "'\"'\"'") + "'";
}
@@ -528,4 +432,78 @@ public class DatasetService {
throw new RuntimeException(e);
}
}
/** unzipRootDir: path of the extracted folder (e.g. /data/xxx/myzipname) */
public static void validateTrainValTestDirs(String unzipRootDir) {
Path root = Paths.get(unzipRootDir);
// verify the root folder itself exists
if (!Files.exists(root) || !Files.isDirectory(root)) {
throw new CustomApiException(
ApiResponseCode.NOT_FOUND_DATA.getId(),
HttpStatus.CONFLICT,
"압축 해제 폴더가 존재하지 않습니다: " + unzipRootDir);
}
// check that each required folder exists and is a directory
List<String> missing =
REQUIRED_DIRS.stream()
.filter(d -> !Files.isDirectory(root.resolve(d)))
.collect(Collectors.toList());
if (!missing.isEmpty()) {
throw new CustomApiException(
ApiResponseCode.NOT_FOUND_DATA.getId(),
HttpStatus.CONFLICT,
"데이터 폴더 구조가 올바르지 않습니다. 누락된 폴더: "
+ String.join(", ", missing)
+ " (필수: train, val, test)");
}
}
public static void validateDirFileCount(String unzipRootDir) {
Path root = Paths.get(unzipRootDir);
for (String split : REQUIRED_DIRS) {
Path splitPath = root.resolve(split);
Map<String, Long> fileCountMap = new HashMap<>();
for (String subDir : CHECK_DIRS) { // only check the label, input1 and input2 folders
Path subDirPath = splitPath.resolve(subDir);
if (!Files.isDirectory(subDirPath)) {
throw new CustomApiException(
ApiResponseCode.NOT_FOUND_DATA.getId(),
HttpStatus.CONFLICT,
split + " 폴더 하위에 " + subDir + " 폴더가 존재하지 않습니다.");
}
long count;
try (Stream<Path> files = Files.list(subDirPath)) {
count = files.filter(Files::isRegularFile).count();
log.info("dir: " + subDirPath + ", count: " + count);
} catch (IOException e) {
throw new CustomApiException(
ApiResponseCode.NOT_FOUND_DATA.getId(),
HttpStatus.CONFLICT,
split + "/" + subDir + " 파일 개수 확인 중 오류 발생");
}
fileCountMap.put(subDir, count);
}
// verify that every folder contains the same number of files
Set<Long> uniqueCounts = new HashSet<>(fileCountMap.values());
if (uniqueCounts.size() != 1) {
throw new CustomApiException(
ApiResponseCode.NOT_FOUND_DATA.getId(),
HttpStatus.CONFLICT,
split + " 데이터 파일 개수가 일치하지 않습니다. " + fileCountMap.toString());
}
}
}
}
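The equal-count rule enforced by validateDirFileCount reduces to a set-cardinality check on the collected counts. Isolated as a sketch (class name hypothetical):

```java
import java.util.HashSet;
import java.util.Map;

public class FileCountCheck {

    // True when every checked sub-directory reports the same file count
    // (mirrors the uniqueCounts.size() != 1 test above; an empty map also
    // fails, just as a dataset with no checked folders would).
    public static boolean countsMatch(Map<String, Long> fileCountMap) {
        return new HashSet<>(fileCountMap.values()).size() == 1;
    }

    public static void main(String[] args) {
        System.out.println(countsMatch(Map.of("label", 3L, "input1", 3L, "input2", 3L))); // true
        System.out.println(countsMatch(Map.of("label", 3L, "input1", 2L, "input2", 3L))); // false
    }
}
```

Collapsing the counts into a set is what lets the service report the whole fileCountMap in the error message rather than only the first mismatching pair.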


@@ -88,9 +88,8 @@ public class HyperParamApiController {
@ApiResponse(responseCode = "404", description = "하이퍼파라미터를 찾을 수 없음", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("{model}/list")
@GetMapping("/list")
public ApiResponseDto<Page<List>> getHyperParam(
@PathVariable ModelType model,
@Parameter(
description = "구분 CREATE_DATE(생성일), LAST_USED_DATE(최근사용일)",
example = "CREATE_DATE")
@@ -98,10 +97,13 @@ public class HyperParamApiController {
String type,
@Parameter(description = "시작일", example = "2026-02-01") @RequestParam(required = false)
LocalDate startDate,
@Parameter(description = "종료일", example = "2026-02-28") @RequestParam(required = false)
@Parameter(description = "종료일", example = "2026-03-31") @RequestParam(required = false)
LocalDate endDate,
@Parameter(description = "버전명", example = "G_000001") @RequestParam(required = false)
@Parameter(description = "버전명", example = "G1_000019") @RequestParam(required = false)
String hyperVer,
@Parameter(description = "모델 타입 (G1, G2, G3 중 하나)", example = "G1")
@RequestParam(required = false)
ModelType model,
@Parameter(
description = "정렬",
example = "createdDttm desc",
@@ -140,7 +142,7 @@ public class HyperParamApiController {
})
@DeleteMapping("/{uuid}")
public ApiResponseDto<Void> deleteHyperParam(
@Parameter(description = "하이퍼파라미터 uuid", example = "c3b5a285-8f68-42af-84f0-e6d09162deb5")
@Parameter(description = "하이퍼파라미터 uuid", example = "57fc9170-64c1-4128-aa7b-0657f08d6d10")
@PathVariable
UUID uuid) {
hyperParamService.deleteHyperParam(uuid);
@@ -162,7 +164,7 @@ public class HyperParamApiController {
})
@GetMapping("/{uuid}")
public ApiResponseDto<HyperParamDto.Basic> getHyperParam(
@Parameter(description = "하이퍼파라미터 uuid", example = "c3b5a285-8f68-42af-84f0-e6d09162deb5")
@Parameter(description = "하이퍼파라미터 uuid", example = "57fc9170-64c1-4128-aa7b-0657f08d6d10")
@PathVariable
UUID uuid) {
return ApiResponseDto.ok(hyperParamService.getHyperParam(uuid));
@@ -182,10 +184,8 @@ public class HyperParamApiController {
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/init/{model}")
public ApiResponseDto<HyperParamDto.Basic> getInitHyperParam(
@PathVariable ModelType model
) {
public ApiResponseDto<HyperParamDto.Basic> getInitHyperParam(@PathVariable ModelType model) {
return ApiResponseDto.ok(hyperParamService.getInitHyperParam(model));
}
}

View File

@@ -29,6 +29,8 @@ public class HyperParamDto {
private UUID uuid;
private String hyperVer;
@JsonFormatDttm private ZonedDateTime createdDttm;
@JsonFormatDttm private ZonedDateTime lastUsedDttm;
private Integer totalUseCnt;
// -------------------------
// Important
@@ -110,13 +112,12 @@ public class HyperParamDto {
@AllArgsConstructor
public static class List {
private UUID uuid;
private ModelType model;
private String hyperVer;
@JsonFormatDttm private ZonedDateTime createDttm;
@JsonFormatDttm private ZonedDateTime lastUsedDttm;
private Long m1UseCnt;
private Long m2UseCnt;
private Long m3UseCnt;
private Long totalCnt;
private String memo;
private Integer totalUseCnt;
}
@Getter

View File

@@ -0,0 +1,99 @@
package com.kamco.cd.training.log;
import com.kamco.cd.training.config.api.ApiResponseDto;
import com.kamco.cd.training.log.dto.AuditLogDto;
import com.kamco.cd.training.log.service.AuditLogService;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.tags.Tag;
import java.time.LocalDate;
import lombok.RequiredArgsConstructor;
import org.springframework.data.domain.Page;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
@Tag(name = "감사 로그", description = "감사 로그 관리 API")
@RequiredArgsConstructor
@RestController
@RequestMapping("/api/logs/audit")
public class AuditLogApiController {
private final AuditLogService auditLogService;
@Operation(summary = "일자별 로그 조회")
@GetMapping("/daily")
public ApiResponseDto<Page<AuditLogDto.DailyAuditList>> getDailyLogs(
@RequestParam(required = false) LocalDate startDate,
@RequestParam(required = false) LocalDate endDate,
@RequestParam int page,
@RequestParam(defaultValue = "20") int size) {
AuditLogDto.searchReq searchReq = new AuditLogDto.searchReq(page, size, "created_dttm,desc");
Page<AuditLogDto.DailyAuditList> result =
auditLogService.getLogByDaily(searchReq, startDate, endDate);
return ApiResponseDto.ok(result);
}
@Operation(summary = "일자별 로그 상세")
@GetMapping("/daily/result")
public ApiResponseDto<Page<AuditLogDto.DailyDetail>> getDailyResultLogs(
@RequestParam LocalDate logDate,
@RequestParam int page,
@RequestParam(defaultValue = "20") int size) {
AuditLogDto.searchReq searchReq = new AuditLogDto.searchReq(page, size, "created_dttm,desc");
Page<AuditLogDto.DailyDetail> result = auditLogService.getLogByDailyResult(searchReq, logDate);
return ApiResponseDto.ok(result);
}
@Operation(summary = "메뉴별 로그 조회")
@GetMapping("/menu")
public ApiResponseDto<Page<AuditLogDto.MenuAuditList>> getMenuLogs(
@RequestParam(required = false) String searchValue,
@RequestParam int page,
@RequestParam(defaultValue = "20") int size) {
AuditLogDto.searchReq searchReq = new AuditLogDto.searchReq(page, size, "created_dttm,desc");
Page<AuditLogDto.MenuAuditList> result = auditLogService.getLogByMenu(searchReq, searchValue);
return ApiResponseDto.ok(result);
}
@Operation(summary = "메뉴별 로그 상세")
@GetMapping("/menu/result")
public ApiResponseDto<Page<AuditLogDto.MenuDetail>> getMenuResultLogs(
@RequestParam String menuId,
@RequestParam int page,
@RequestParam(defaultValue = "20") int size) {
AuditLogDto.searchReq searchReq = new AuditLogDto.searchReq(page, size, "created_dttm,desc");
Page<AuditLogDto.MenuDetail> result = auditLogService.getLogByMenuResult(searchReq, menuId);
return ApiResponseDto.ok(result);
}
@Operation(summary = "사용자별 로그 조회")
@GetMapping("/account")
public ApiResponseDto<Page<AuditLogDto.UserAuditList>> getAccountLogs(
@RequestParam(required = false) String searchValue,
@RequestParam int page,
@RequestParam(defaultValue = "20") int size) {
AuditLogDto.searchReq searchReq = new AuditLogDto.searchReq(page, size, "created_dttm,desc");
Page<AuditLogDto.UserAuditList> result =
auditLogService.getLogByAccount(searchReq, searchValue);
return ApiResponseDto.ok(result);
}
@Operation(summary = "사용자별 로그 상세")
@GetMapping("/account/result")
public ApiResponseDto<Page<AuditLogDto.UserDetail>> getAccountResultLogs(
@RequestParam Long userUid,
@RequestParam int page,
@RequestParam(defaultValue = "20") int size) {
AuditLogDto.searchReq searchReq = new AuditLogDto.searchReq(page, size, "created_dttm,desc");
Page<AuditLogDto.UserDetail> result = auditLogService.getLogByAccountResult(searchReq, userUid);
return ApiResponseDto.ok(result);
}
}

View File

@@ -0,0 +1,40 @@
package com.kamco.cd.training.log;
import com.kamco.cd.training.config.api.ApiResponseDto;
import com.kamco.cd.training.log.dto.ErrorLogDto;
import com.kamco.cd.training.log.dto.EventType;
import com.kamco.cd.training.log.service.ErrorLogService;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.tags.Tag;
import java.time.LocalDate;
import lombok.RequiredArgsConstructor;
import org.springframework.data.domain.Page;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
@Tag(name = "에러 로그", description = "에러 로그 관리 API")
@RequiredArgsConstructor
@RestController
@RequestMapping({"/api/logs/system"})
public class ErrorLogApiController {
private final ErrorLogService errorLogService;
@Operation(summary = "에러로그 조회")
@GetMapping("/error")
public ApiResponseDto<Page<ErrorLogDto.Basic>> getErrorLogs(
@RequestParam(required = false) ErrorLogDto.LogErrorLevel logErrorLevel,
@RequestParam(required = false) EventType eventType,
@RequestParam(required = false) LocalDate startDate,
@RequestParam(required = false) LocalDate endDate,
@RequestParam int page,
@RequestParam(defaultValue = "20") int size) {
ErrorLogDto.ErrorSearchReq searchReq =
new ErrorLogDto.ErrorSearchReq(
logErrorLevel, eventType, startDate, endDate, page, size, "created_dttm,desc");
Page<ErrorLogDto.Basic> result = errorLogService.findLogByError(searchReq);
return ApiResponseDto.ok(result);
}
}

View File

@@ -3,7 +3,9 @@ package com.kamco.cd.training.log.dto;
import com.fasterxml.jackson.annotation.JsonIgnore;
import com.kamco.cd.training.common.utils.interfaces.JsonFormatDttm;
import io.swagger.v3.oas.annotations.media.Schema;
import java.time.LocalDate;
import java.time.ZonedDateTime;
import java.util.UUID;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.NoArgsConstructor;
@@ -58,6 +60,7 @@ public class AuditLogDto {
@Getter
@AllArgsConstructor
public static class AuditCommon {
private int readCount;
private int cudCount;
private int printCount;
@@ -68,6 +71,7 @@ public class AuditLogDto {
@Schema(name = "DailyAuditList", description = "일자별 목록")
@Getter
public static class DailyAuditList extends AuditCommon {
private final String baseDate;
public DailyAuditList(
@@ -85,6 +89,7 @@ public class AuditLogDto {
@Schema(name = "MenuAuditList", description = "메뉴별 목록")
@Getter
public static class MenuAuditList extends AuditCommon {
private final String menuId;
private final String menuName;
@@ -105,6 +110,7 @@ public class AuditLogDto {
@Schema(name = "UserAuditList", description = "사용자별 목록")
@Getter
public static class UserAuditList extends AuditCommon {
private final Long accountId;
private final String loginId;
private final String username;
@@ -129,6 +135,7 @@ public class AuditLogDto {
@Getter
@AllArgsConstructor
public static class AuditDetail {
private Long logId;
private EventType eventType;
private LogDetail detail;
@@ -137,9 +144,11 @@ public class AuditLogDto {
@Schema(name = "DailyDetail", description = "일자별 로그 상세")
@Getter
public static class DailyDetail extends AuditDetail {
private final String userName;
private final String loginId;
private final String menuName;
private final String logDateTime;
public DailyDetail(
Long logId,
@@ -147,17 +156,20 @@ public class AuditLogDto {
String loginId,
String menuName,
EventType eventType,
String logDateTime,
LogDetail detail) {
super(logId, eventType, detail);
this.userName = userName;
this.loginId = loginId;
this.menuName = menuName;
this.logDateTime = logDateTime;
}
}
@Schema(name = "MenuDetail", description = "메뉴별 로그 상세")
@Getter
public static class MenuDetail extends AuditDetail {
private final String logDateTime;
private final String userName;
private final String loginId;
@@ -179,6 +191,7 @@ public class AuditLogDto {
@Schema(name = "UserDetail", description = "사용자별 로그 상세")
@Getter
public static class UserDetail extends AuditDetail {
private final String logDateTime;
private final String menuNm;
@@ -194,6 +207,7 @@ public class AuditLogDto {
@Setter
@AllArgsConstructor
public static class LogDetail {
String serviceName;
String parentMenuName;
String menuName;
@@ -226,4 +240,26 @@ public class AuditLogDto {
return PageRequest.of(page, size);
}
}
@Getter
@Setter
public static class DownloadReq {
UUID uuid;
LocalDate startDate;
LocalDate endDate;
String searchValue;
String menuId;
String requestUri;
}
@Getter
@Setter
@AllArgsConstructor
public static class DownloadRes {
String name;
String employeeNo;
@JsonFormatDttm ZonedDateTime downloadDttm;
}
}

View File

@@ -1,5 +1,6 @@
package com.kamco.cd.training.log.dto;
import com.kamco.cd.training.common.utils.enums.CodeExpose;
import com.kamco.cd.training.common.utils.enums.EnumType;
import io.swagger.v3.oas.annotations.media.Schema;
import java.time.LocalDate;
@@ -77,6 +78,7 @@ public class ErrorLogDto {
}
}
@CodeExpose
public enum LogErrorLevel implements EnumType {
WARNING("Warning"),
ERROR("Error"),

View File

@@ -1,22 +1,35 @@
package com.kamco.cd.training.log.dto;
import com.kamco.cd.training.common.utils.enums.CodeExpose;
import com.kamco.cd.training.common.utils.enums.EnumType;
import lombok.AllArgsConstructor;
import lombok.Getter;
@CodeExpose
@Getter
@AllArgsConstructor
public enum EventType implements EnumType {
CREATE("생성"),
READ("조회"),
UPDATE("수정"),
DELETE("삭제"),
LIST("목록"),
DETAIL("상세"),
POPUP("팝업"),
STATUS("상태"),
ADDED("추가"),
MODIFIED("수정"),
REMOVE("삭제"),
DOWNLOAD("다운로드"),
PRINT("출력"),
LOGIN("로그인"),
OTHER("기타");
private final String desc;
public static EventType fromName(String name) {
try {
return EventType.valueOf(name.toUpperCase());
} catch (Exception e) {
return OTHER;
}
}
@Override
public String getId() {
return name();
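The fromName fallback above (a case-insensitive valueOf that degrades to OTHER instead of throwing) is a common tolerant-parse pattern. Below is a minimal self-contained sketch of the same idea; the MiniEvent enum is hypothetical, not the project's EventType.

```java
public class EnumFallbackSketch {
    enum MiniEvent {
        CREATE, READ, OTHER;

        // Case-insensitive parse that falls back to OTHER on any bad input,
        // mirroring EventType.fromName above. The NPE catch covers null input,
        // since toUpperCase() is called before valueOf().
        static MiniEvent fromName(String name) {
            try {
                return MiniEvent.valueOf(name.toUpperCase());
            } catch (IllegalArgumentException | NullPointerException e) {
                return OTHER;
            }
        }
    }

    public static void main(String[] args) {
        System.out.println(MiniEvent.fromName("create"));  // CREATE
        System.out.println(MiniEvent.fromName("unknown")); // OTHER
    }
}
```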

View File

@@ -1,26 +1,39 @@
package com.kamco.cd.training.model;
import com.kamco.cd.training.common.download.RangeDownloadResponder;
import com.kamco.cd.training.config.api.ApiResponseDto;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.MappingDataset;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelBestEpoch;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelFileInfo;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelTestMetrics;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelTrainMetrics;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelValidationMetrics;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.TransferDetailDto;
import com.kamco.cd.training.model.dto.ModelTrainMngDto.Basic;
import com.kamco.cd.training.model.dto.ModelTrainMngDto.ModelProgressStepDto;
import com.kamco.cd.training.model.service.ModelTrainDetailService;
import com.kamco.cd.training.model.service.ModelTrainMngService;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.Parameter;
import io.swagger.v3.oas.annotations.enums.ParameterIn;
import io.swagger.v3.oas.annotations.media.ArraySchema;
import io.swagger.v3.oas.annotations.media.Content;
import io.swagger.v3.oas.annotations.media.Schema;
import io.swagger.v3.oas.annotations.responses.ApiResponse;
import io.swagger.v3.oas.annotations.responses.ApiResponses;
import io.swagger.v3.oas.annotations.tags.Tag;
import jakarta.servlet.http.HttpServletRequest;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.UUID;
import lombok.RequiredArgsConstructor;
import org.apache.coyote.BadRequestException;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
@@ -32,6 +45,11 @@ import org.springframework.web.bind.annotation.RestController;
@RequestMapping("/api/models")
public class ModelTrainDetailApiController {
private final ModelTrainDetailService modelTrainDetailService;
private final ModelTrainMngService modelTrainMngService;
private final RangeDownloadResponder rangeDownloadResponder;
@Value("${train.docker.responseDir}")
private String responseDir;
@Operation(summary = "모델학습관리> 모델관리 > 상세정보탭 > 학습 진행정보", description = "학습 진행정보, 모델학습 정보 API")
@ApiResponses(
@@ -116,26 +134,26 @@ public class ModelTrainDetailApiController {
return ApiResponseDto.ok(modelTrainDetailService.getByModelMappingDataset(uuid));
}
@Operation(summary = "모델관리 > 전이 학습 실행설정 > 모델선택", description = "모델선택 정보 API")
@ApiResponses(
value = {
@ApiResponse(
responseCode = "200",
description = "조회 성공",
content =
@Content(
mediaType = "application/json",
schema = @Schema(implementation = TransferDetailDto.class))),
@ApiResponse(responseCode = "404", description = "데이터셋을 찾을 수 없음", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/transfer/detail/{uuid}")
public ApiResponseDto<TransferDetailDto> getTransferDetail(
@Parameter(description = "모델 uuid", example = "7fbdff54-ea87-4b02-90d1-955fa2a3457e")
@PathVariable
UUID uuid) {
return ApiResponseDto.ok(modelTrainDetailService.getTransferDetail(uuid));
}
// @Operation(summary = "모델관리 > 전이 학습 실행설정 > 모델선택", description = "모델선택 정보 API")
// @ApiResponses(
// value = {
// @ApiResponse(
// responseCode = "200",
// description = "조회 성공",
// content =
// @Content(
// mediaType = "application/json",
// schema = @Schema(implementation = TransferDetailDto.class))),
// @ApiResponse(responseCode = "404", description = "데이터셋을 찾을 수 없음", content = @Content),
// @ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
// })
// @GetMapping("/transfer/detail/{uuid}")
// public ApiResponseDto<TransferDetailDto> getTransferDetail(
// @Parameter(description = "모델 uuid", example = "7fbdff54-ea87-4b02-90d1-955fa2a3457e")
// @PathVariable
// UUID uuid) {
// return ApiResponseDto.ok(modelTrainDetailService.getTransferDetail(uuid));
// }
@Operation(summary = "모델관리 > 모델 상세 > 성능 정보 (Train)", description = "모델 상세 > 성능 정보 (Train) API")
@ApiResponses(
@@ -222,4 +240,90 @@ public class ModelTrainDetailApiController {
UUID uuid) {
return ApiResponseDto.ok(modelTrainDetailService.getModelTrainBestEpoch(uuid));
}
@Operation(
summary = "학습데이터 파일 다운로드",
description = "학습데이터 파일 다운로드",
parameters = {
@Parameter(
name = "kamco-download-uuid",
in = ParameterIn.HEADER,
required = true,
description = "다운로드 요청 UUID",
schema =
@Schema(
type = "string",
format = "uuid",
example = "6d8d49dc-0c9d-4124-adc7-b9ca610cc394"))
})
@ApiResponses(
value = {
@ApiResponse(
responseCode = "200",
description = "학습데이터 zip파일 다운로드",
content =
@Content(
mediaType = "application/octet-stream",
schema = @Schema(type = "string", format = "binary"))),
@ApiResponse(responseCode = "404", description = "파일 없음", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/download/{uuid}")
public ResponseEntity<?> download(@PathVariable UUID uuid, HttpServletRequest request)
throws IOException {
Basic info = modelTrainDetailService.findByModelByUUID(uuid);
Path zipPath =
Paths.get(responseDir)
.resolve(String.valueOf(info.getUuid()))
.resolve(info.getModelVer() + ".zip");
if (!Files.isRegularFile(zipPath)) {
throw new BadRequestException("training data zip not found: " + zipPath);
}
return rangeDownloadResponder.buildZipResponse(zipPath, info.getModelVer() + ".zip", request);
}
@Operation(summary = "모델관리 > 모델 상세 > 파일 정보", description = "모델 상세 > 파일 정보 API")
@ApiResponses(
value = {
@ApiResponse(
responseCode = "200",
description = "조회 성공",
content =
@Content(
mediaType = "application/json",
schema = @Schema(implementation = ModelFileInfo.class)))
@ApiResponse(responseCode = "404", description = "데이터셋을 찾을 수 없음", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/file-info/{uuid}")
public ApiResponseDto<ModelFileInfo> getModelTrainFileInfo(
@Parameter(description = "모델 uuid", example = "95cb116c-380a-41c0-98d8-4d1142f15bbf")
@PathVariable
UUID uuid) {
return ApiResponseDto.ok(modelTrainDetailService.getModelTrainFileInfo(uuid));
}
@Operation(summary = "모델관리 > 모델별 진행 상황", description = "모델관리 > 모델별 진행 상황 API")
@ApiResponses(
value = {
@ApiResponse(
responseCode = "200",
description = "조회 성공",
content =
@Content(
mediaType = "application/json",
schema = @Schema(implementation = ModelProgressStepDto.class))),
@ApiResponse(responseCode = "404", description = "데이터셋을 찾을 수 없음", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/progress/{uuid}")
public ApiResponseDto<List<ModelProgressStepDto>> findModelTrainProgressInfo(
@Parameter(description = "모델 uuid", example = "95cb116c-380a-41c0-98d8-4d1142f15bbf")
@PathVariable
UUID uuid) {
return ApiResponseDto.ok(modelTrainDetailService.findModelTrainProgressInfo(uuid));
}
}

View File

@@ -1,13 +1,17 @@
package com.kamco.cd.training.model;
import com.kamco.cd.training.common.dto.MonitorDto;
import com.kamco.cd.training.common.service.SystemMonitorService;
import com.kamco.cd.training.config.api.ApiResponseDto;
import com.kamco.cd.training.dataset.dto.DatasetDto;
import com.kamco.cd.training.dataset.dto.DatasetDto.DatasetReq;
import com.kamco.cd.training.dataset.dto.DatasetDto.SelectDataSet;
import com.kamco.cd.training.model.dto.ModelConfigDto;
import com.kamco.cd.training.model.dto.ModelTrainMngDto;
import com.kamco.cd.training.model.dto.ModelTrainMngDto.Basic;
import com.kamco.cd.training.model.dto.ModelTrainMngDto.ListDto;
import com.kamco.cd.training.model.service.ModelTrainMngService;
import com.kamco.cd.training.train.service.ModelTestMetricsJobService;
import com.kamco.cd.training.train.service.ModelTrainMetricsJobService;
import io.swagger.v3.oas.annotations.Operation;
import io.swagger.v3.oas.annotations.Parameter;
import io.swagger.v3.oas.annotations.media.Content;
@@ -16,6 +20,7 @@ import io.swagger.v3.oas.annotations.responses.ApiResponse;
import io.swagger.v3.oas.annotations.responses.ApiResponses;
import io.swagger.v3.oas.annotations.tags.Tag;
import jakarta.validation.Valid;
import java.io.IOException;
import java.util.List;
import java.util.UUID;
import lombok.RequiredArgsConstructor;
@@ -35,6 +40,9 @@ import org.springframework.web.bind.annotation.RestController;
@RequestMapping("/api/models")
public class ModelTrainMngApiController {
private final ModelTrainMngService modelTrainMngService;
private final ModelTrainMetricsJobService modelTrainMetricsJobService;
private final ModelTestMetricsJobService modelTestMetricsJobService;
private final SystemMonitorService systemMonitorService;
@Operation(summary = "모델학습 목록 조회", description = "모델학습 목록 조회 API")
@ApiResponses(
@@ -50,7 +58,7 @@ public class ModelTrainMngApiController {
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/list")
public ApiResponseDto<Page<Basic>> findByModelList(
public ApiResponseDto<Page<ListDto>> findByModelList(
@Parameter(
description = "상태코드",
example = "IN_PROGRESS",
@@ -79,7 +87,8 @@ public class ModelTrainMngApiController {
@DeleteMapping("/{uuid}")
public ApiResponseDto<Void> deleteModelTrain(
@Parameter(description = "학습 모델 uuid", example = "f2b02229-90f2-45f5-92ea-c56cf1c29f79")
@PathVariable UUID uuid) {
@PathVariable
UUID uuid) {
modelTrainMngService.deleteModelTrain(uuid);
return ApiResponseDto.ok(null);
}
@@ -149,7 +158,9 @@ public class ModelTrainMngApiController {
return ApiResponseDto.ok(modelTrainMngService.getDatasetSelectList(req));
}
@Operation(summary = "모델학습 1단계 실행중인 것이 있는지 count", description = "모델학습 1단계 실행중인 것이 있는지 count")
@Operation(
summary = "모델학습 1단계/2단계 실행중인 것이 있는지 count",
description = "모델학습 1단계/2단계 실행중인 것이 있는지 count")
@ApiResponses(
value = {
@ApiResponse(
@@ -166,4 +177,62 @@ public class ModelTrainMngApiController {
public ApiResponseDto<Long> findModelStep1InProgressCnt() {
return ApiResponseDto.ok(modelTrainMngService.findModelStep1InProgressCnt());
}
@Operation(
summary = "Scheduler findTrainValidMetricCsvFiles",
description = "Scheduler findTrainValidMetricCsvFiles")
@ApiResponses(
value = {
@ApiResponse(
responseCode = "200",
description = "검색 성공",
content =
@Content(
mediaType = "application/json",
schema = @Schema(implementation = Long.class))),
@ApiResponse(responseCode = "400", description = "잘못된 검색 조건", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/schedule-trainvalid")
public ApiResponseDto<Long> findTrainValidMetricCsvFiles() {
modelTrainMetricsJobService.findTrainValidMetricCsvFiles();
return ApiResponseDto.ok(null);
}
@Operation(summary = "스케줄러 findTestMetricCsvFiles", description = "스케줄러 findTestMetricCsvFiles")
@ApiResponses(
value = {
@ApiResponse(
responseCode = "200",
description = "검색 성공",
content =
@Content(
mediaType = "application/json",
schema = @Schema(implementation = Long.class))),
@ApiResponse(responseCode = "400", description = "잘못된 검색 조건", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/schedule-test")
public ApiResponseDto<Long> findTestValidMetricCsvFiles() throws IOException {
modelTestMetricsJobService.findTestValidMetricCsvFiles();
return ApiResponseDto.ok(null);
}
@Operation(summary = "학습서버 시스템 사용율 조회", description = "cpu, gpu, memory 사용율 조회")
@ApiResponses(
value = {
@ApiResponse(
responseCode = "200",
description = "검색 성공",
content =
@Content(
mediaType = "application/json",
schema = @Schema(implementation = Long.class))),
@ApiResponse(responseCode = "400", description = "잘못된 검색 조건", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping("/monitor")
public ApiResponseDto<MonitorDto> getSystem() throws IOException {
return ApiResponseDto.ok(systemMonitorService.get());
}
}
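The new /monitor endpoint returns a MonitorDto built by SystemMonitorService, whose implementation is not part of this diff. Below is a hedged sketch of one way such a service might read CPU and memory utilization from inside the JVM, using the JDK 14+ com.sun.management API; the class name is hypothetical, and GPU utilization (which would require an external tool such as nvidia-smi) is omitted.

```java
import java.lang.management.ManagementFactory;

public class UtilizationSketch {
    public static void main(String[] args) {
        // The platform MX bean can be downcast to the com.sun.management
        // variant to expose system-wide (not just JVM) metrics.
        com.sun.management.OperatingSystemMXBean os =
                (com.sun.management.OperatingSystemMXBean)
                        ManagementFactory.getOperatingSystemMXBean();

        // getCpuLoad() returns a fraction in [0, 1]; -1 means "not yet available".
        double cpu = os.getCpuLoad();
        long totalMem = os.getTotalMemorySize();
        long freeMem = os.getFreeMemorySize();
        double memUsedPct = 100.0 * (totalMem - freeMem) / totalMem;

        System.out.printf("cpu=%.1f%% mem=%.1f%%%n", cpu * 100, memUsedPct);
    }
}
```

On JDKs older than 14 the equivalent calls are getSystemCpuLoad() and getTotalPhysicalMemorySize()/getFreePhysicalMemorySize().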

View File

@@ -20,4 +20,25 @@ public class ModelConfigDto {
private Float testPercent;
private String memo;
}
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
public static class TransferBasic {
private Long configId;
private Long modelId;
private Integer epochCount;
private Float trainPercent;
private Float validationPercent;
private Float testPercent;
private String memo;
private Long beforeConfigId;
private Long beforeModelId;
private Integer beforeEpochCount;
private Float beforeTrainPercent;
private Float beforeValidationPercent;
private Float beforeTestPercent;
private String beforeMemo;
}
}

View File

@@ -6,7 +6,7 @@ import com.kamco.cd.training.common.enums.TrainStatusType;
import com.kamco.cd.training.common.enums.TrainType;
import com.kamco.cd.training.common.utils.enums.Enums;
import com.kamco.cd.training.common.utils.interfaces.JsonFormatDttm;
import com.kamco.cd.training.dataset.dto.DatasetDto.SelectDataSet;
import com.kamco.cd.training.dataset.dto.DatasetDto.SelectTransferDataSet;
import io.swagger.v3.oas.annotations.media.Schema;
import java.time.Duration;
import java.time.ZonedDateTime;
@@ -35,6 +35,7 @@ public class ModelTrainDetailDto {
@JsonFormatDttm private ZonedDateTime step2EndDttm;
private String statusCd;
private String trainType;
private UUID beforeUuid;
public String getStatusName() {
if (this.statusCd == null || this.statusCd.isBlank()) return null;
@@ -176,9 +177,10 @@ public class ModelTrainDetailDto {
@NoArgsConstructor
@AllArgsConstructor
public static class TransferDetailDto {
private ModelConfigDto.Basic etcConfig;
private ModelConfigDto.TransferBasic etcConfig;
private TransferHyperSummary modelTrainHyper;
private List<SelectDataSet> modelTrainDataset;
private List<SelectTransferDataSet> modelTrainDataset;
// private List<SelectDataSet> beforeTrainDataset;
}
@Getter
@@ -245,4 +247,13 @@ public class ModelTrainDetailDto {
private Float iou;
private Float accuracy;
}
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
public static class ModelFileInfo {
private Boolean fileExistsYn;
private String fileName;
}
}

View File

@@ -41,6 +41,13 @@ public class ModelTrainMngDto {
private String trainType;
private String modelNo;
private Long currentAttemptId;
private String requestPath;
private String packingState;
private ZonedDateTime packingStrtDttm;
private ZonedDateTime packingEndDttm;
private Long beforeModelId;
public String getStatusName() {
if (this.statusCd == null || this.statusCd.isBlank()) return null;
@@ -99,6 +106,10 @@ public class ModelTrainMngDto {
public String getStep2Duration() {
return formatDuration(this.step2StrtDttm, this.step2EndDttm);
}
public String getPackingDuration() {
return formatDuration(this.packingStrtDttm, this.packingEndDttm);
}
}
@Schema(name = "searchReq", description = "모델학습 관리 목록조회 파라미터")
@@ -209,4 +220,111 @@ public class ModelTrainMngDto {
@Schema(description = "메모", example = "메모 입니다.")
private String memo;
}
@Schema(name = "모델학습관리 목록", description = "모델학습관리 목록")
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
@Builder
public static class ListDto {
private Long id;
private UUID uuid;
private String modelVer;
@JsonFormatDttm private ZonedDateTime startDttm;
@JsonFormatDttm private ZonedDateTime step1StrtDttm;
@JsonFormatDttm private ZonedDateTime step1EndDttm;
@JsonFormatDttm private ZonedDateTime step2StrtDttm;
@JsonFormatDttm private ZonedDateTime step2EndDttm;
private String step1Status;
private String step2Status;
private String statusCd;
private String trainType;
private String modelNo;
private Long currentAttemptId;
private String requestPath;
private String packingState;
private ZonedDateTime packingStrtDttm;
private ZonedDateTime packingEndDttm;
private String memo;
private String userNm;
private UUID beforeUuid;
public String getStatusName() {
if (this.statusCd == null || this.statusCd.isBlank()) return null;
try {
return TrainStatusType.valueOf(this.statusCd).getText(); // or getName()
} catch (IllegalArgumentException e) {
return this.statusCd; // return the raw code if mapping fails (or null, if preferred)
}
}
public String getStep1StatusName() {
if (this.step1Status == null || this.step1Status.isBlank()) return null;
try {
return TrainStatusType.valueOf(this.step1Status).getText(); // or getName()
} catch (IllegalArgumentException e) {
return this.step1Status; // return the raw code if mapping fails (or null, if preferred)
}
}
public String getStep2StatusName() {
if (this.step2Status == null || this.step2Status.isBlank()) return null;
try {
return TrainStatusType.valueOf(this.step2Status).getText(); // or getName()
} catch (IllegalArgumentException e) {
return this.step2Status; // return the raw code if mapping fails (or null, if preferred)
}
}
public String getTrainTypeName() {
if (this.trainType == null || this.trainType.isBlank()) return null;
try {
return TrainType.valueOf(this.trainType).getText(); // or getName()
} catch (IllegalArgumentException e) {
return this.trainType; // return the raw code if mapping fails (or null, if preferred)
}
}
private String formatDuration(ZonedDateTime start, ZonedDateTime end) {
if (start == null || end == null) {
return null;
}
long totalSeconds = Math.abs(Duration.between(start, end).getSeconds());
long hours = totalSeconds / 3600;
long minutes = (totalSeconds % 3600) / 60;
long seconds = totalSeconds % 60;
return String.format("%d시간 %d분 %d초", hours, minutes, seconds);
}
public String getStep1Duration() {
return formatDuration(this.step1StrtDttm, this.step1EndDttm);
}
public String getStep2Duration() {
return formatDuration(this.step2StrtDttm, this.step2EndDttm);
}
public String getPackingDuration() {
return formatDuration(this.packingStrtDttm, this.packingEndDttm);
}
}
@Getter
@Builder
@AllArgsConstructor
public static class ModelProgressStepDto {
private int step;
private String status;
@JsonFormatDttm private ZonedDateTime startTime;
@JsonFormatDttm private ZonedDateTime endTime;
private boolean isError;
}
}
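formatDuration is now duplicated in Basic and ListDto; its arithmetic can be checked standalone. Below is a minimal sketch with the same absolute-span hour/minute/second split (the class name is hypothetical, and the output units are rendered in English here for illustration).

```java
import java.time.Duration;
import java.time.ZonedDateTime;

public class DurationFormatSketch {
    // Same arithmetic as formatDuration in ModelTrainMngDto: null-safe,
    // absolute span, split into hours / minutes / seconds.
    static String format(ZonedDateTime start, ZonedDateTime end) {
        if (start == null || end == null) {
            return null;
        }
        long total = Math.abs(Duration.between(start, end).getSeconds());
        return String.format("%dh %dm %ds", total / 3600, (total % 3600) / 60, total % 60);
    }

    public static void main(String[] args) {
        ZonedDateTime t0 = ZonedDateTime.parse("2026-03-19T15:00:00+09:00");
        ZonedDateTime t1 = t0.plusHours(1).plusMinutes(2).plusSeconds(3);
        System.out.println(format(t0, t1)); // 1h 2m 3s
    }
}
```

Since the method appears twice verbatim, pulling it into a shared helper (or a default method on a small interface) would remove the duplication without changing behavior.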

View File

@@ -1,18 +1,21 @@
package com.kamco.cd.training.model.service;
import com.kamco.cd.training.common.enums.ModelType;
import com.kamco.cd.training.dataset.dto.DatasetDto.DatasetReq;
import com.kamco.cd.training.dataset.dto.DatasetDto.SelectDataSet;
import com.kamco.cd.training.dataset.dto.DatasetDto.SelectTransferDataSet;
import com.kamco.cd.training.model.dto.ModelConfigDto;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.DetailSummary;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.HyperSummary;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.MappingDataset;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelBestEpoch;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelFileInfo;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelTestMetrics;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelTrainMetrics;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelValidationMetrics;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.TransferDetailDto;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.TransferHyperSummary;
import com.kamco.cd.training.model.dto.ModelTrainMngDto.Basic;
import com.kamco.cd.training.model.dto.ModelTrainMngDto.ModelProgressStepDto;
import com.kamco.cd.training.postgres.core.ModelTrainDetailCoreService;
import com.kamco.cd.training.postgres.core.ModelTrainMngCoreService;
import java.util.ArrayList;
@@ -70,11 +73,11 @@ public class ModelTrainDetailService {
Basic modelInfo = modelTrainDetailCoreService.findByModelByUUID(uuid);
// look up config info
ModelConfigDto.Basic configInfo = mngCoreService.findModelConfigByModelId(uuid);
ModelConfigDto.TransferBasic configInfo = mngCoreService.findModelTransferConfigByModelId(uuid);
// fetch hyperparameter info
TransferHyperSummary hyperSummary = modelTrainDetailCoreService.getTransferHyperSummary(uuid);
List<SelectDataSet> dataSets = new ArrayList<>();
List<SelectTransferDataSet> dataSets = new ArrayList<>();
DatasetReq datasetReq = new DatasetReq();
List<Long> datasetIds = new ArrayList<>();
@@ -87,12 +90,37 @@ public class ModelTrainDetailService {
datasetReq.setIds(datasetIds);
datasetReq.setModelNo(modelInfo.getModelNo());
if (modelInfo.getModelNo().equals("G1")) {
dataSets = mngCoreService.getDatasetSelectG1List(datasetReq);
if (modelInfo.getModelNo().equals(ModelType.G1.getId())) {
dataSets = mngCoreService.getDatasetTransferSelectG1List(modelInfo.getId());
} else {
dataSets = mngCoreService.getDatasetSelectG2G3List(datasetReq);
dataSets =
mngCoreService.getDatasetTransferSelectG2G3List(
modelInfo.getId(), modelInfo.getModelNo());
}
// DatasetReq beforeDatasetReq = new DatasetReq();
// List<Long> beforeDatasetIds = new ArrayList<>();
// List<SelectDataSet> beforeDataSets = new ArrayList<>();
//
// Long beforeModelId = modelInfo.getBeforeModelId();
// if (beforeModelId != null) {
// Basic beforeInfo = modelTrainDetailCoreService.findByModelBeforeId(beforeModelId);
// List<MappingDataset> beforeDatasets =
// modelTrainDetailCoreService.getByModelMappingDataset(beforeInfo.getUuid());
//
// for (MappingDataset before : beforeDatasets) {
// beforeDatasetIds.add(before.getDatasetId());
// }
// beforeDatasetReq.setIds(beforeDatasetIds);
// beforeDatasetReq.setModelNo(modelInfo.getModelNo());
//
// if (beforeInfo.getModelNo().equals(ModelType.G1.getId())) {
// beforeDataSets = mngCoreService.getDatasetSelectG1List(beforeDatasetReq);
// } else {
// beforeDataSets = mngCoreService.getDatasetSelectG2G3List(beforeDatasetReq);
// }
// }
TransferDetailDto transferDetailDto = new TransferDetailDto();
transferDetailDto.setEtcConfig(configInfo);
transferDetailDto.setModelTrainHyper(hyperSummary);
@@ -116,4 +144,12 @@ public class ModelTrainDetailService {
public ModelBestEpoch getModelTrainBestEpoch(UUID uuid) {
return modelTrainDetailCoreService.getModelTrainBestEpoch(uuid);
}
public ModelFileInfo getModelTrainFileInfo(UUID uuid) {
return modelTrainDetailCoreService.getModelTrainFileInfo(uuid);
}
public List<ModelProgressStepDto> findModelTrainProgressInfo(UUID uuid) {
return modelTrainDetailCoreService.findModelTrainProgressInfo(uuid);
}
}

View File

@@ -2,6 +2,7 @@ package com.kamco.cd.training.model.service;
import com.kamco.cd.training.common.dto.HyperParam;
import com.kamco.cd.training.common.enums.HyperParamSelectType;
import com.kamco.cd.training.common.enums.ModelType;
import com.kamco.cd.training.common.enums.TrainType;
import com.kamco.cd.training.common.exception.CustomApiException;
import com.kamco.cd.training.dataset.dto.DatasetDto.DatasetReq;
@@ -12,7 +13,7 @@ import com.kamco.cd.training.model.dto.ModelTrainMngDto;
import com.kamco.cd.training.model.dto.ModelTrainMngDto.SearchReq;
import com.kamco.cd.training.postgres.core.HyperParamCoreService;
import com.kamco.cd.training.postgres.core.ModelTrainMngCoreService;
import java.io.IOException;
import com.kamco.cd.training.train.service.TrainJobService;
import java.util.List;
import java.util.UUID;
import lombok.RequiredArgsConstructor;
@@ -30,7 +31,7 @@ public class ModelTrainMngService {
private final ModelTrainMngCoreService modelTrainMngCoreService;
private final HyperParamCoreService hyperParamCoreService;
private final TmpDatasetService tmpDatasetService;
private final TrainJobService trainJobService;
/**
* Model training list lookup
@@ -38,7 +39,7 @@ public class ModelTrainMngService {
* @param searchReq search conditions
* @return paged model list
*/
public Page<ModelTrainMngDto.Basic> getModelList(SearchReq searchReq) {
public Page<ModelTrainMngDto.ListDto> getModelList(SearchReq searchReq) {
return modelTrainMngCoreService.findByModelList(searchReq);
}
@@ -93,22 +94,8 @@ public class ModelTrainMngService {
// save model config
modelTrainMngCoreService.saveModelConfig(modelId, req.getModelConfig());
UUID tmpUuid = UUID.randomUUID();
String raw = tmpUuid.toString().toUpperCase().replace("-", "");
List<String> uids =
modelTrainMngCoreService.findDatasetUid(req.getTrainingDataset().getDatasetList());
try {
// create dataset symlinks
String tmpUid = tmpDatasetService.buildTmpDatasetSymlink(raw, uids);
ModelTrainMngDto.UpdateReq updateReq = new ModelTrainMngDto.UpdateReq();
updateReq.setRequestPath(tmpUid);
modelTrainMngCoreService.updateModelMaster(modelId, updateReq);
} catch (IOException e) {
throw new RuntimeException(e);
}
// create temporary dataset files
trainJobService.createTmpFile(modelUuid);
return modelUuid;
}
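The removed branch above derived a temporary directory name from a random UUID by uppercasing it and stripping the dashes. A minimal sketch of that normalization, isolated as a helper (the class name is illustrative):

```java
import java.util.UUID;

public class TmpUid {
    // Mirrors the removed inline logic: 32 uppercase hex chars, dashes stripped.
    static String raw(UUID id) {
        return id.toString().toUpperCase().replace("-", "");
    }

    public static void main(String[] args) {
        System.out.println(raw(UUID.fromString("123e4567-e89b-12d3-a456-426614174000")));
        // 123E4567E89B12D3A456426614174000
    }
}
```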
@@ -129,7 +116,7 @@ public class ModelTrainMngService {
* @return
*/
public List<SelectDataSet> getDatasetSelectList(DatasetReq req) {
if (req.getModelNo().equals("G1")) {
if (req.getModelNo().equals(ModelType.G1.getId())) {
return modelTrainMngCoreService.getDatasetSelectG1List(req);
} else {
return modelTrainMngCoreService.getDatasetSelectG2G3List(req);

View File

@@ -1,144 +0,0 @@
package com.kamco.cd.training.model.service;
import java.io.IOException;
import java.nio.file.*;
import java.util.List;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;
@Slf4j
@Service
@RequiredArgsConstructor
public class TmpDatasetService {
@Value("${train.docker.requestDir}")
private String requestDir;
public String buildTmpDatasetSymlink(String uid, List<String> datasetUids) throws IOException {
log.info("========== buildTmpDatasetSymlink START ==========");
log.info("uid={}", uid);
log.info("datasetUids={}", datasetUids);
log.info("requestDir(raw)={}", requestDir);
Path BASE = toPath(requestDir);
Path tmp = BASE.resolve("tmp").resolve(uid);
log.info("BASE={}", BASE);
log.info("BASE exists? {}", Files.isDirectory(BASE));
log.info("tmp={}", tmp);
long noDir = 0, scannedDirs = 0, regularFiles = 0, hardlinksMade = 0;
// prepare tmp directories
for (String type : List.of("train", "val")) {
for (String part : List.of("input1", "input2", "label")) {
Path dir = tmp.resolve(type).resolve(part);
Files.createDirectories(dir);
log.info("createDirectories: {}", dir);
}
}
// hard links only work within the same filesystem, so pre-check that BASE and BASE/tmp share one (recommended)
try {
var baseStore = Files.getFileStore(BASE);
var tmpStore = Files.getFileStore(tmp.getParent()); // BASE/tmp
if (!baseStore.name().equals(tmpStore.name()) || !baseStore.type().equals(tmpStore.type())) {
throw new IOException(
"Hardlink requires same filesystem. baseStore="
+ baseStore.name()
+ "("
+ baseStore.type()
+ "), tmpStore="
+ tmpStore.name()
+ "("
+ tmpStore.type()
+ ")");
}
} catch (Exception e) {
// FileStore comparison can be unreliable across environments, so just warn here and let the actual createLink make the final call.
log.warn("FileStore check skipped/failed (will rely on createLink): {}", e.toString());
}
for (String id : datasetUids) {
Path srcRoot = BASE.resolve(id);
log.info("---- dataset id={} srcRoot={} exists? {}", id, srcRoot, Files.isDirectory(srcRoot));
for (String type : List.of("train", "val")) {
for (String part : List.of("input1", "input2", "label")) {
Path srcDir = srcRoot.resolve(type).resolve(part);
if (!Files.isDirectory(srcDir)) {
log.warn("SKIP (not directory): {}", srcDir);
noDir++;
continue;
}
scannedDirs++;
log.info("SCAN dir={}", srcDir);
try (DirectoryStream<Path> stream = Files.newDirectoryStream(srcDir)) {
for (Path f : stream) {
if (!Files.isRegularFile(f)) {
log.debug("skip non-regular file: {}", f);
continue;
}
regularFiles++;
String dstName = id + "__" + f.getFileName();
Path dst = tmp.resolve(type).resolve(part).resolve(dstName);
// delete any leftover dst (whether symlink or regular file)
if (Files.exists(dst) || Files.isSymbolicLink(dst)) {
Files.delete(dst);
log.debug("deleted existing: {}", dst);
}
try {
// create hard link (dst appears as a new file but shares f's inode)
Files.createLink(dst, f);
hardlinksMade++;
log.debug("created hardlink: {} => {}", dst, f);
} catch (IOException e) {
// failing fast here prevents states like "tmp was created but contains zero files"
log.error("FAILED create hardlink: {} => {}", dst, f, e);
throw e;
}
}
}
}
}
}
if (hardlinksMade == 0) {
throw new IOException(
"No hardlinks created. regularFiles="
+ regularFiles
+ ", scannedDirs="
+ scannedDirs
+ ", noDir="
+ noDir);
}
log.info("tmp dataset created: {}", tmp);
log.info(
"summary: scannedDirs={}, noDir={}, regularFiles={}, hardlinksMade={}",
scannedDirs,
noDir,
regularFiles,
hardlinksMade);
return uid;
}
private static Path toPath(String p) {
if (p.startsWith("~/")) {
return Paths.get(System.getProperty("user.home")).resolve(p.substring(2)).normalize();
}
return Paths.get(p).toAbsolutePath().normalize();
}
}
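The deleted `TmpDatasetService` fails hard when `Files.createLink` throws. The commit history ("하드링크 실패 시 심볼릭 링크로 만들어보기") hints at a fallback variant; a sketch of that idea, under the assumption that a symlink is an acceptable substitute when source and destination sit on different filesystems:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class LinkOrFallback {
    // Try a hard link first (same-filesystem only); fall back to a symlink.
    static void link(Path dst, Path src) throws IOException {
        Files.createDirectories(dst.getParent());
        Files.deleteIfExists(dst);
        try {
            Files.createLink(dst, src);          // dst shares src's inode
        } catch (IOException | UnsupportedOperationException e) {
            Files.createSymbolicLink(dst, src);  // cross-filesystem fallback
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("linkdemo");
        Path src = Files.writeString(dir.resolve("src.txt"), "data");
        Path dst = dir.resolve("tmp").resolve("dst.txt");
        link(dst, src);
        System.out.println(Files.readString(dst)); // data
    }
}
```

Note the trade-off the original code's comments call out: a symlink fallback keeps the tmp tree buildable, but symlinked inputs behave differently from hard links if the source files are later deleted or the tree is bind-mounted into a container.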

View File

@@ -2,6 +2,7 @@ package com.kamco.cd.training.postgres.core;
import com.kamco.cd.training.common.service.BaseCoreService;
import com.kamco.cd.training.log.dto.AuditLogDto;
import com.kamco.cd.training.log.dto.AuditLogDto.DownloadReq;
import com.kamco.cd.training.postgres.repository.log.AuditLogRepository;
import java.time.LocalDate;
import lombok.RequiredArgsConstructor;
@@ -45,6 +46,11 @@ public class AuditLogCoreService
return auditLogRepository.findLogByAccount(searchRange, searchValue);
}
public Page<AuditLogDto.DownloadRes> findLogByAccount(
AuditLogDto.searchReq searchReq, DownloadReq downloadReq) {
return auditLogRepository.findDownloadLog(searchReq, downloadReq);
}
public Page<AuditLogDto.DailyDetail> getLogByDailyResult(
AuditLogDto.searchReq searchRange, LocalDate logDate) {
return auditLogRepository.findLogByDailyResult(searchRange, logDate);

View File

@@ -34,7 +34,6 @@ public class HyperParamCoreService {
ModelHyperParamEntity entity = new ModelHyperParamEntity();
entity.setHyperVer(firstVersion);
applyHyperParam(entity, createReq);
// user
@@ -72,7 +71,6 @@ public class HyperParamCoreService {
return entity.getHyperVer();
}
/**
* Delete hyperparameter
*
@@ -105,8 +103,9 @@ public class HyperParamCoreService {
*/
public HyperParamDto.Basic getInitHyperParam(ModelType model) {
ModelHyperParamEntity entity =
hyperParamRepository
.getHyperparamByType(model)
hyperParamRepository.getHyperParamByType(model).stream()
.filter(e -> Boolean.TRUE.equals(e.getIsDefault()))
.findFirst()
.orElseThrow(() -> new CustomApiException("NOT_FOUND_DATA", HttpStatus.NOT_FOUND));
return entity.toDto();
}
@@ -172,7 +171,7 @@ public class HyperParamCoreService {
} else {
entity.setCropSize("256,256");
}
// entity.setCropSize(src.getCropSize());
entity.setCropSize(src.getCropSize());
// Important
entity.setModelType(model); // 2025-02-12: added modelType
@@ -214,5 +213,4 @@ public class HyperParamCoreService {
// memo
entity.setMemo(src.getMemo());
}
}
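The rewritten `getInitHyperParam` picks the row flagged as default from the list returned by `getHyperParamByType`. A self-contained sketch of that selection, with a minimal record standing in for `ModelHyperParamEntity` (names are illustrative):

```java
import java.util.List;
import java.util.NoSuchElementException;

public class DefaultPick {
    // Minimal stand-in for ModelHyperParamEntity: version + default flag.
    record Hyper(String ver, Boolean isDefault) {}

    // Same shape as getInitHyperParam: first row flagged as default, else throw.
    static Hyper pickDefault(List<Hyper> rows) {
        return rows.stream()
            .filter(h -> Boolean.TRUE.equals(h.isDefault())) // null-safe flag check
            .findFirst()
            .orElseThrow(() -> new NoSuchElementException("NOT_FOUND_DATA"));
    }

    public static void main(String[] args) {
        List<Hyper> rows = List.of(new Hyper("HPs_0001", null), new Hyper("HPs_0002", true));
        System.out.println(pickDefault(rows).ver()); // HPs_0002
    }
}
```

`Boolean.TRUE.equals(...)` is used instead of `== Boolean.TRUE` so a null or non-canonical boxed flag cannot slip through or throw.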

View File

@@ -1,7 +1,10 @@
package com.kamco.cd.training.postgres.core;
import com.kamco.cd.training.postgres.repository.train.ModelTestMetricsJobRepository;
import com.kamco.cd.training.train.dto.ModelTrainMetricsDto.ModelMetricJsonDto;
import com.kamco.cd.training.train.dto.ModelTrainMetricsDto.ModelTestFileName;
import com.kamco.cd.training.train.dto.ModelTrainMetricsDto.ResponsePathDto;
import java.time.ZonedDateTime;
import java.util.List;
import lombok.RequiredArgsConstructor;
import org.springframework.stereotype.Service;
@@ -26,4 +29,22 @@ public class ModelTestMetricsJobCoreService {
public void insertModelMetricsTest(List<Object[]> batchArgs) {
modelTestMetricsJobRepository.insertModelMetricsTest(batchArgs);
}
public ModelMetricJsonDto getTestMetricPackingInfo(Long modelId) {
return modelTestMetricsJobRepository.getTestMetricPackingInfo(modelId);
}
public ModelTestFileName findModelTestFileNames(Long modelId) {
return modelTestMetricsJobRepository.findModelTestFileNames(modelId);
}
@Transactional
public void updatePackingStart(Long modelId, ZonedDateTime now) {
modelTestMetricsJobRepository.updatePackingStart(modelId, now);
}
@Transactional
public void updatePackingEnd(Long modelId, ZonedDateTime now, String failSuccState) {
modelTestMetricsJobRepository.updatePackingEnd(modelId, now, failSuccState);
}
}

View File

@@ -8,11 +8,13 @@ import com.kamco.cd.training.model.dto.ModelTrainDetailDto.DetailSummary;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.HyperSummary;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.MappingDataset;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelBestEpoch;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelFileInfo;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelTestMetrics;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelTrainMetrics;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelValidationMetrics;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.TransferHyperSummary;
import com.kamco.cd.training.model.dto.ModelTrainMngDto.Basic;
import com.kamco.cd.training.model.dto.ModelTrainMngDto.ModelProgressStepDto;
import com.kamco.cd.training.postgres.entity.ModelMasterEntity;
import com.kamco.cd.training.postgres.repository.model.ModelConfigRepository;
import com.kamco.cd.training.postgres.repository.model.ModelDetailRepository;
@@ -55,6 +57,12 @@ public class ModelTrainDetailCoreService {
return modelDetailRepository.getModelDetailSummary(uuid);
}
/**
* Hyperparameter summary info
*
* @param uuid model master uuid
* @return
*/
public HyperSummary getByModelHyperParamSummary(UUID uuid) {
return modelDetailRepository.getByModelHyperParamSummary(uuid);
}
@@ -97,4 +105,17 @@ public class ModelTrainDetailCoreService {
public ModelBestEpoch getModelTrainBestEpoch(UUID uuid) {
return modelDetailRepository.getModelTrainBestEpoch(uuid);
}
public ModelFileInfo getModelTrainFileInfo(UUID uuid) {
return modelDetailRepository.getModelTrainFileInfo(uuid);
}
public List<ModelProgressStepDto> findModelTrainProgressInfo(UUID uuid) {
return modelDetailRepository.findModelTrainProgressInfo(uuid);
}
public Basic findByModelBeforeId(Long beforeModelId) {
ModelMasterEntity entity = modelDetailRepository.findByModelBeforeId(beforeModelId);
return entity.toDto();
}
}

View File

@@ -1,15 +1,22 @@
package com.kamco.cd.training.postgres.core;
import com.kamco.cd.training.common.exception.CustomApiException;
import com.kamco.cd.training.postgres.entity.ModelTrainJobEntity;
import com.kamco.cd.training.postgres.repository.train.ModelTrainJobRepository;
import com.kamco.cd.training.train.dto.ModelTrainJobDto;
import java.time.ZonedDateTime;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.Optional;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.http.HttpStatus;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
@Log4j2
@Service
@RequiredArgsConstructor
@Transactional(readOnly = true)
@@ -42,17 +49,23 @@ public class ModelTrainJobCoreService {
job.setQueuedDttm(queuedDttm != null ? queuedDttm : ZonedDateTime.now());
modelTrainJobRepository.save(job);
modelTrainJobRepository.flush();
return job.getId();
}
/** Mark job as running */
@Transactional
public void markRunning(
Long jobId, String containerName, String logPath, String lockedBy, Integer totalEpoch) {
Long jobId,
String containerName,
String logPath,
String lockedBy,
Integer totalEpoch,
String jobType) {
ModelTrainJobEntity job =
modelTrainJobRepository
.findById(jobId)
.orElseThrow(() -> new IllegalArgumentException("Job not found: " + jobId));
.orElseThrow(() -> new CustomApiException("NOT_FOUND_DATA", HttpStatus.NOT_FOUND));
job.setStatusCd("RUNNING");
job.setContainerName(containerName);
@@ -60,37 +73,73 @@ public class ModelTrainJobCoreService {
job.setStartedDttm(ZonedDateTime.now());
job.setLockedDttm(ZonedDateTime.now());
job.setLockedBy(lockedBy);
job.setJobType(jobType);
if (totalEpoch != null) {
job.setTotalEpoch(totalEpoch);
}
}
/** Mark success */
/**
* Mark success
*
* @param jobId
* @param exitCode
*/
@Transactional
public void markSuccess(Long jobId, int exitCode) {
ModelTrainJobEntity job =
modelTrainJobRepository
.findById(jobId)
.orElseThrow(() -> new IllegalArgumentException("Job not found: " + jobId));
.orElseThrow(() -> new CustomApiException("NOT_FOUND_DATA", HttpStatus.NOT_FOUND));
job.setStatusCd("SUCCESS");
job.setExitCode(exitCode);
job.setFinishedDttm(ZonedDateTime.now());
}
/** Mark failed */
/**
* Mark failed
*
* @param jobId
* @param exitCode
* @param errorMessage
*/
@Transactional
public void markFailed(Long jobId, Integer exitCode, String errorMessage) {
ModelTrainJobEntity job =
modelTrainJobRepository
.findById(jobId)
.orElseThrow(() -> new IllegalArgumentException("Job not found: " + jobId));
.orElseThrow(() -> new CustomApiException("NOT_FOUND_DATA", HttpStatus.NOT_FOUND));
job.setStatusCd("FAILED");
job.setExitCode(exitCode);
job.setErrorMessage(errorMessage);
job.setFinishedDttm(ZonedDateTime.now());
log.info("[TRAIN JOB FAIL] jobId={}, errorMessage={}", jobId, errorMessage);
}
/**
* Mark paused
*
* @param jobId
* @param exitCode
* @param errorMessage
*/
@Transactional
public void markPaused(Long jobId, Integer exitCode, String errorMessage) {
ModelTrainJobEntity job =
modelTrainJobRepository
.findById(jobId)
.orElseThrow(() -> new CustomApiException("NOT_FOUND_DATA", HttpStatus.NOT_FOUND));
job.setStatusCd("STOPPED");
job.setExitCode(exitCode);
job.setErrorMessage(errorMessage);
job.setFinishedDttm(ZonedDateTime.now());
log.info("[TRAIN JOB PAUSED] jobId={}, errorMessage={}", jobId, errorMessage);
}
/** Mark cancelled */
@@ -99,9 +148,40 @@ public class ModelTrainJobCoreService {
ModelTrainJobEntity job =
modelTrainJobRepository
.findById(jobId)
.orElseThrow(() -> new IllegalArgumentException("Job not found: " + jobId));
.orElseThrow(() -> new CustomApiException("NOT_FOUND_DATA", HttpStatus.NOT_FOUND));
job.setStatusCd("STOPPED");
job.setFinishedDttm(ZonedDateTime.now());
}
@Transactional
public void updateEpoch(String containerName, Integer epoch) {
ModelTrainJobEntity job =
modelTrainJobRepository
.findByContainerName(containerName)
.orElseThrow(() -> new CustomApiException("NOT_FOUND_DATA", HttpStatus.NOT_FOUND));
job.setCurrentEpoch(epoch);
}
public void insertModelTestTrainingRun(Long modelId, Long jobId, int epoch) {
modelTrainJobRepository.insertModelTestTrainingRun(modelId, jobId, epoch);
}
/**
* Check whether any training runs are in progress
*
* @return
*/
public List<ModelTrainJobDto> findRunningJobs() {
List<ModelTrainJobEntity> entity = modelTrainJobRepository.findRunningJobs();
if (entity == null || entity.isEmpty()) {
return Collections.emptyList();
}
return entity.stream().map(ModelTrainJobEntity::toDto).toList();
}
}
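`ModelTrainJobCoreService` moves a string-typed `statusCd` through the `mark*` methods. The allowed transitions below are an assumption inferred from those methods (including the assumed initial `QUEUED` state suggested by `queuedDttm`), sketched as a lookup table:

```java
import java.util.Map;
import java.util.Set;

public class JobStatus {
    // Status codes seen in ModelTrainJobCoreService; transitions are inferred,
    // not taken from any enum in the source.
    static final Map<String, Set<String>> NEXT = Map.of(
        "QUEUED", Set.of("RUNNING", "STOPPED"),
        "RUNNING", Set.of("SUCCESS", "FAILED", "STOPPED"),
        "SUCCESS", Set.of(),
        "FAILED", Set.of(),
        "STOPPED", Set.of());

    static boolean canMove(String from, String to) {
        return NEXT.getOrDefault(from, Set.of()).contains(to);
    }

    public static void main(String[] args) {
        System.out.println(canMove("RUNNING", "SUCCESS")); // true
        System.out.println(canMove("SUCCESS", "RUNNING")); // false
    }
}
```

A guard like `canMove` would catch, for example, a stale worker trying to fail a job that was already cancelled.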

View File

@@ -29,4 +29,9 @@ public class ModelTrainMetricsJobCoreService {
public void insertModelMetricsValidation(List<Object[]> batchArgs) {
modelTrainMetricsJobRepository.insertModelMetricsValidation(batchArgs);
}
@Transactional
public void updateModelSelectedBestEpoch(Long modelId, Integer epoch) {
modelTrainMetricsJobRepository.updateModelSelectedBestEpoch(modelId, epoch);
}
}

View File

@@ -8,9 +8,10 @@ import com.kamco.cd.training.common.exception.CustomApiException;
import com.kamco.cd.training.common.utils.UserUtil;
import com.kamco.cd.training.dataset.dto.DatasetDto.DatasetReq;
import com.kamco.cd.training.dataset.dto.DatasetDto.SelectDataSet;
import com.kamco.cd.training.dataset.dto.DatasetDto.SelectTransferDataSet;
import com.kamco.cd.training.model.dto.ModelConfigDto;
import com.kamco.cd.training.model.dto.ModelTrainMngDto;
import com.kamco.cd.training.model.dto.ModelTrainMngDto.Basic;
import com.kamco.cd.training.model.dto.ModelTrainMngDto.ListDto;
import com.kamco.cd.training.model.dto.ModelTrainMngDto.TrainingDataset;
import com.kamco.cd.training.postgres.entity.ModelConfigEntity;
import com.kamco.cd.training.postgres.entity.ModelDatasetEntity;
@@ -23,6 +24,7 @@ import com.kamco.cd.training.postgres.repository.model.ModelConfigRepository;
import com.kamco.cd.training.postgres.repository.model.ModelDatasetMappRepository;
import com.kamco.cd.training.postgres.repository.model.ModelDatasetRepository;
import com.kamco.cd.training.postgres.repository.model.ModelMngRepository;
import com.kamco.cd.training.train.dto.ModelTrainLinkDto;
import com.kamco.cd.training.train.dto.TrainRunRequest;
import java.time.ZonedDateTime;
import java.util.ArrayList;
@@ -53,9 +55,10 @@ public class ModelTrainMngCoreService {
* @param searchReq search conditions
* @return paged model list
*/
public Page<Basic> findByModelList(ModelTrainMngDto.SearchReq searchReq) {
Page<ModelMasterEntity> entityPage = modelMngRepository.findByModels(searchReq);
return entityPage.map(ModelMasterEntity::toDto);
public Page<ListDto> findByModelList(ModelTrainMngDto.SearchReq searchReq) {
// Page<ModelMasterEntity> entityPage = modelMngRepository.findByModels(searchReq);
// return entityPage.map(ModelMasterEntity::toDto);
return modelMngRepository.findByModels(searchReq);
}
/**
@@ -86,7 +89,11 @@ public class ModelTrainMngCoreService {
// optimized parameters use the model type's default
if (HyperParamSelectType.OPTIMIZED.getId().equals(addReq.getHyperParamType())) {
ModelType modelType = ModelType.getValueData(addReq.getModelNo());
hyperParamEntity = hyperParamRepository.getHyperparamByType(modelType).orElse(null);
hyperParamEntity =
hyperParamRepository.getHyperParamByType(modelType).stream()
.filter(e -> Boolean.TRUE.equals(e.getIsDefault()))
.findFirst()
.orElse(null);
// hyperParamEntity = hyperParamRepository.findByHyperVer("HPs_0001").orElse(null);
} else {
@@ -97,6 +104,12 @@ public class ModelTrainMngCoreService {
if (hyperParamEntity == null || hyperParamEntity.getHyperVer() == null) {
throw new CustomApiException("NOT_FOUND_DATA", HttpStatus.NOT_FOUND);
}
// update hyperparameter use count
hyperParamEntity.setTotalUseCnt(
hyperParamEntity.getTotalUseCnt() == null ? 1 : hyperParamEntity.getTotalUseCnt() + 1);
// update last-used timestamp
hyperParamEntity.setLastUsedDttm(ZonedDateTime.now());
String modelVer =
String.join(
@@ -107,16 +120,8 @@ public class ModelTrainMngCoreService {
entity.setTrainType(addReq.getTrainType()); // normal or transfer
entity.setBeforeModelId(addReq.getBeforeModelId());
if (addReq.getIsStart()) {
entity.setModelStep((short) 1);
entity.setStatusCd(TrainStatusType.IN_PROGRESS.getId());
entity.setStrtDttm(ZonedDateTime.now());
entity.setStep1StrtDttm(ZonedDateTime.now());
entity.setStep1State(TrainStatusType.IN_PROGRESS.getId());
} else {
entity.setStatusCd(TrainStatusType.READY.getId());
entity.setStep1State(TrainStatusType.READY.getId());
}
entity.setCreatedUid(userUtil.getId());
ModelMasterEntity resultEntity = modelMngRepository.save(entity);
@@ -168,13 +173,10 @@ public class ModelTrainMngCoreService {
modelMngRepository
.findById(modelId)
.orElseThrow(() -> new CustomApiException("NOT_FOUND_DATA", HttpStatus.NOT_FOUND));
// update temporary-folder UID
if (req.getRequestPath() != null && !req.getRequestPath().isEmpty()) {
entity.setRequestPath(req.getRequestPath());
}
if (req.getResponsePath() != null && !req.getResponsePath().isEmpty()) {
entity.setResponsePath(req.getResponsePath());
}
}
/**
@@ -205,7 +207,10 @@ public class ModelTrainMngCoreService {
ModelConfigEntity entity = new ModelConfigEntity();
modelMasterEntity.setId(modelId);
entity.setModel(modelMasterEntity);
entity.setEpochCount(req.getEpochCnt());
entity.setEpochCount(
req.getEpochCnt() < 10
? 10
: req.getEpochCnt()); // clamp epoch counts below 10 up to 10: best-epoch files are only produced when training runs for at least 10 epochs
entity.setTrainPercent(req.getTrainingCnt());
entity.setValidationPercent(req.getValidationCnt());
entity.setTestPercent(req.getTestCnt());
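The epoch clamp in `saveModelConfig` above reduces to a single `Math.max`, which reads more directly than the ternary:

```java
public class EpochClamp {
    // Same rule as saveModelConfig: best-epoch files only appear
    // when training runs for at least 10 epochs, so floor the request at 10.
    static int clamp(int requested) {
        return Math.max(requested, 10);
    }

    public static void main(String[] args) {
        System.out.println(clamp(3));  // 10
        System.out.println(clamp(25)); // 25
    }
}
```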
@@ -273,6 +278,13 @@ public class ModelTrainMngCoreService {
.orElseThrow(() -> new CustomApiException("NOT_FOUND_DATA", HttpStatus.NOT_FOUND));
}
public ModelConfigDto.TransferBasic findModelTransferConfigByModelId(UUID uuid) {
ModelMasterEntity modelEntity = findByUuid(uuid);
return modelConfigRepository
.findModelTransferConfigByModelId(modelEntity.getId())
.orElseThrow(() -> new CustomApiException("NOT_FOUND_DATA", HttpStatus.NOT_FOUND));
}
/**
* Dataset G1 list
*
@@ -283,6 +295,16 @@ public class ModelTrainMngCoreService {
return datasetRepository.getDatasetSelectG1List(req);
}
/**
* Transfer-learning dataset G1 list
*
* @param modelId model id
* @return
*/
public List<SelectTransferDataSet> getDatasetTransferSelectG1List(Long modelId) {
return datasetRepository.getDatasetTransferSelectG1List(modelId);
}
/**
* Dataset G2/G3 list
*
@@ -293,6 +315,18 @@ public class ModelTrainMngCoreService {
return datasetRepository.getDatasetSelectG2G3List(req);
}
/**
* Transfer-learning dataset G2/G3 list
*
* @param modelId model id
* @param modelNo G2 or G3
* @return
*/
public List<SelectTransferDataSet> getDatasetTransferSelectG2G3List(
Long modelId, String modelNo) {
return datasetRepository.getDatasetTransferSelectG2G3List(modelId, modelNo);
}
/**
* Model management lookup
*
@@ -307,9 +341,7 @@ public class ModelTrainMngCoreService {
return entity.toDto();
}
/**
* Move the master to IN_PROGRESS and attach the running jobId - UI, stop, and status lookups all key off currentAttemptId
*/
/** Move the master to IN_PROGRESS and attach the running jobId - UI, stop, and status lookups all key off currentAttemptId */
@Transactional
public void markInProgress(Long modelId, Long jobId) {
ModelMasterEntity master =
@@ -321,11 +353,10 @@ public class ModelTrainMngCoreService {
master.setCurrentAttemptId(jobId);
// stamp the start time here too if needed
modelMngRepository.flush();
}
/**
* Clear the last error message - wipes previous error traces before a restart or new run
*/
/** Clear the last error message - wipes previous error traces before a restart or new run */
@Transactional
public void clearLastError(Long modelId) {
ModelMasterEntity master =
@@ -334,11 +365,10 @@ public class ModelTrainMngCoreService {
.orElseThrow(() -> new IllegalArgumentException("Model not found: " + modelId));
master.setLastError(null);
modelMngRepository.flush();
}
/**
* Stop handling (optional) - implement alongside if used from cancel
*/
/** Stop handling (optional) - implement alongside if used from cancel */
@Transactional
public void markStopped(Long modelId) {
ModelMasterEntity master =
@@ -349,9 +379,7 @@ public class ModelTrainMngCoreService {
master.setStatusCd(TrainStatusType.STOPPED.getId());
}
/**
* Completion handling (optional) - called by the worker on success
*/
/** Completion handling (optional) - called by the worker on success */
@Transactional
public void markCompleted(Long modelId) {
ModelMasterEntity master =
@@ -364,6 +392,9 @@ public class ModelTrainMngCoreService {
/**
* Step 1 error handling (optional) - called by the worker on failure
*
* @param modelId
* @param errorMessage
*/
@Transactional
public void markError(Long modelId, String errorMessage) {
@@ -381,6 +412,9 @@ public class ModelTrainMngCoreService {
/**
* Step 2 error handling (optional) - called by the worker on failure
*
* @param modelId
* @param errorMessage
*/
@Transactional
public void markStep2Error(Long modelId, String errorMessage) {
@@ -396,6 +430,44 @@ public class ModelTrainMngCoreService {
master.setUpdatedDttm(ZonedDateTime.now());
}
/**
* Step 1 stop handling
*
* @param modelId
* @param errorMessage
*/
public void markStep1Stop(Long modelId, String errorMessage) {
ModelMasterEntity master =
modelMngRepository
.findById(modelId)
.orElseThrow(() -> new IllegalArgumentException("Model not found: " + modelId));
master.setStatusCd(TrainStatusType.STOPPED.getId());
master.setStep1State(TrainStatusType.STOPPED.getId());
master.setLastError(errorMessage);
master.setUpdatedUid(userUtil.getId());
master.setUpdatedDttm(ZonedDateTime.now());
}
/**
* Step 2 stop handling
*
* @param modelId
* @param errorMessage
*/
public void markStep2Stop(Long modelId, String errorMessage) {
ModelMasterEntity master =
modelMngRepository
.findById(modelId)
.orElseThrow(() -> new IllegalArgumentException("Model not found: " + modelId));
master.setStatusCd(TrainStatusType.STOPPED.getId());
master.setStep2State(TrainStatusType.STOPPED.getId());
master.setLastError(errorMessage);
master.setUpdatedUid(userUtil.getId());
master.setUpdatedDttm(ZonedDateTime.now());
}
@Transactional
public void markSuccess(Long modelId) {
ModelMasterEntity master =
@@ -434,6 +506,7 @@ public class ModelTrainMngCoreService {
.orElseThrow(() -> new IllegalArgumentException("Model not found: " + modelId));
entity.setStatusCd(TrainStatusType.IN_PROGRESS.getId());
entity.setStrtDttm(ZonedDateTime.now());
entity.setStep1StrtDttm(ZonedDateTime.now());
entity.setStep1State(TrainStatusType.IN_PROGRESS.getId());
entity.setCurrentAttemptId(jobId);
@@ -531,4 +604,34 @@ public class ModelTrainMngCoreService {
public Long findModelStep1InProgressCnt() {
return modelMngRepository.findModelStep1InProgressCnt();
}
/**
* File paths to link for train
*
* @param modelId
* @return
*/
public List<ModelTrainLinkDto> findDatasetTrainPath(Long modelId) {
return modelDatasetMapRepository.findDatasetTrainPath(modelId);
}
/**
* File paths to link for validation
*
* @param modelId
* @return
*/
public List<ModelTrainLinkDto> findDatasetValPath(Long modelId) {
return modelDatasetMapRepository.findDatasetValPath(modelId);
}
/**
* File paths to link for test
*
* @param modelId
* @return
*/
public List<ModelTrainLinkDto> findDatasetTestPath(Long modelId) {
return modelDatasetMapRepository.findDatasetTestPath(modelId);
}
}

View File

@@ -5,6 +5,7 @@ import com.kamco.cd.training.log.dto.EventStatus;
import com.kamco.cd.training.log.dto.EventType;
import com.kamco.cd.training.postgres.CommonCreateEntity;
import jakarta.persistence.*;
import java.util.UUID;
import lombok.AccessLevel;
import lombok.Getter;
import lombok.NoArgsConstructor;
@@ -14,6 +15,7 @@ import lombok.NoArgsConstructor;
@NoArgsConstructor(access = AccessLevel.PROTECTED)
@Table(name = "tb_audit_log")
public class AuditLogEntity extends CommonCreateEntity {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "audit_log_uid", nullable = false)
@@ -43,6 +45,12 @@ public class AuditLogEntity extends CommonCreateEntity {
@Column(name = "error_log_uid")
private Long errorLogUid;
@Column(name = "download_uuid")
private UUID downloadUuid;
@Column(name = "login_attempt_id")
private String loginAttemptId;
public AuditLogEntity(
Long userUid,
EventType eventType,
@@ -51,7 +59,9 @@ public class AuditLogEntity extends CommonCreateEntity {
String ipAddress,
String requestUri,
String requestBody,
Long errorLogUid) {
Long errorLogUid,
UUID downloadUuid,
String loginAttemptId) {
this.userUid = userUid;
this.eventType = eventType;
this.eventStatus = eventStatus;
@@ -60,6 +70,31 @@ public class AuditLogEntity extends CommonCreateEntity {
this.requestUri = requestUri;
this.requestBody = requestBody;
this.errorLogUid = errorLogUid;
this.downloadUuid = downloadUuid;
this.loginAttemptId = loginAttemptId;
}
/** Create a file-download audit record */
public static AuditLogEntity forFileDownload(
Long userId,
String requestUri,
String menuUid,
String ip,
int httpStatus,
UUID downloadUuid) {
return new AuditLogEntity(
userId,
EventType.DOWNLOAD, // fixed event type
httpStatus < 400 ? EventStatus.SUCCESS : EventStatus.FAILED, // success flag
menuUid,
ip,
requestUri,
null, // no requestBody
null, // no errorLogUid
downloadUuid,
null // no loginAttemptId
);
}
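`forFileDownload` above maps the HTTP status to an event status with a simple `< 400` threshold: every 4xx/5xx response counts as FAILED. That rule, isolated as a sketch (plain strings stand in for the `EventStatus` enum):

```java
public class DownloadStatus {
    // Same rule forFileDownload applies: anything below 400 is a success,
    // every 4xx/5xx response is recorded as a failure.
    static String of(int httpStatus) {
        return httpStatus < 400 ? "SUCCESS" : "FAILED";
    }

    public static void main(String[] args) {
        System.out.println(of(200)); // SUCCESS
        System.out.println(of(404)); // FAILED
    }
}
```

Note that redirects (3xx) land on the success side of this threshold, which may or may not be intended for download auditing.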
public AuditLogDto.Basic toDto() {

View File

@@ -192,10 +192,10 @@ public class ModelHyperParamEntity {
@Column(name = "save_best_rule", nullable = false, length = 10)
private String saveBestRule = "greater";
/** Default: 10 */
/** Default: 1 */
@NotNull
@Column(name = "val_interval", nullable = false)
private Integer valInterval = 10;
private Integer valInterval = 1;
/** Default: 400 */
@NotNull
@@ -303,29 +303,24 @@ public class ModelHyperParamEntity {
@Column(name = "last_used_dttm")
private ZonedDateTime lastUsedDttm;
@Column(name = "m1_use_cnt")
private Long m1UseCnt = 0L;
@Column(name = "m2_use_cnt")
private Long m2UseCnt = 0L;
@Column(name = "m3_use_cnt")
private Long m3UseCnt = 0L;
@Column(name = "model_type")
@Enumerated(EnumType.STRING)
private ModelType modelType;
@Column(name = "default_param")
private Boolean isDefault = false;
@Column(name = "total_use_cnt")
private Integer totalUseCnt = 0;
public HyperParamDto.Basic toDto() {
return new HyperParamDto.Basic(
this.modelType,
this.uuid,
this.hyperVer,
this.createdDttm,
this.lastUsedDttm,
this.totalUseCnt,
// -------------------------
// Important
// -------------------------
@@ -395,8 +390,7 @@ public class ModelHyperParamEntity {
// -------------------------
this.gpuCnt,
this.gpuIds,
this.masterPort
, this.isDefault
);
this.masterPort,
this.isDefault);
}
}

View File

@@ -112,6 +112,15 @@ public class ModelMasterEntity {
@Column(name = "response_path")
private String responsePath;
@Column(name = "packing_state")
private String packingState;
@Column(name = "packing_strt_dttm")
private ZonedDateTime packingStrtDttm;
@Column(name = "packing_end_dttm")
private ZonedDateTime packingEndDttm;
public ModelTrainMngDto.Basic toDto() {
return new ModelTrainMngDto.Basic(
this.id,
@@ -127,6 +136,11 @@ public class ModelMasterEntity {
this.statusCd,
this.trainType,
this.modelNo,
this.currentAttemptId);
this.currentAttemptId,
this.requestPath,
this.packingState,
this.packingStrtDttm,
this.packingEndDttm,
this.beforeModelId);
}
}

View File

@@ -0,0 +1,42 @@
package com.kamco.cd.training.postgres.entity;
import jakarta.persistence.Column;
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;
import jakarta.persistence.Table;
import jakarta.validation.constraints.NotNull;
import java.time.OffsetDateTime;
import lombok.Getter;
import lombok.Setter;
import org.hibernate.annotations.ColumnDefault;
@Getter
@Setter
@Entity
@Table(name = "tb_model_test_training_run")
public class ModelTestTrainingRunEntity {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "tsr_id", nullable = false)
private Long id;
@NotNull
@Column(name = "model_id", nullable = false)
private Long modelId;
@Column(name = "attempt_no")
private Integer attemptNo;
@Column(name = "job_id")
private Long jobId;
@Column(name = "epoch")
private Integer epoch;
@ColumnDefault("now()")
@Column(name = "created_dttm")
private OffsetDateTime createdDttm;
}

View File

@@ -57,8 +57,7 @@ public class ModelTrainJobEntity {
@Column(name = "exit_code")
private Integer exitCode;
@Size(max = 2000)
@Column(name = "error_message", length = 2000)
@Column(name = "error_message", columnDefinition = "TEXT")
private String errorMessage;
@ColumnDefault("now()")
@@ -84,6 +83,9 @@ public class ModelTrainJobEntity {
@Column(name = "current_epoch")
private Integer currentEpoch;
@Column(name = "job_type")
private String jobType;
public ModelTrainJobDto toDto() {
return new ModelTrainJobDto(
this.id,

View File

@@ -4,6 +4,7 @@ import com.kamco.cd.training.dataset.dto.DatasetDto;
import com.kamco.cd.training.dataset.dto.DatasetDto.DatasetMngRegDto;
import com.kamco.cd.training.dataset.dto.DatasetDto.DatasetReq;
import com.kamco.cd.training.dataset.dto.DatasetDto.SelectDataSet;
import com.kamco.cd.training.dataset.dto.DatasetDto.SelectTransferDataSet;
import com.kamco.cd.training.postgres.entity.DatasetEntity;
import java.util.List;
import java.util.Optional;
@@ -17,6 +18,10 @@ public interface DatasetRepositoryCustom {
List<SelectDataSet> getDatasetSelectG1List(DatasetReq req);
public List<SelectTransferDataSet> getDatasetTransferSelectG1List(Long modelId);
public List<SelectTransferDataSet> getDatasetTransferSelectG2G3List(Long modelId, String modelNo);
List<SelectDataSet> getDatasetSelectG2G3List(DatasetReq req);
Long getDatasetMaxStage(int compareYyyy, int targetYyyy);

View File

@@ -1,14 +1,20 @@
package com.kamco.cd.training.postgres.repository.dataset;
import static com.kamco.cd.training.postgres.entity.QDatasetObjEntity.datasetObjEntity;
import static com.kamco.cd.training.postgres.entity.QModelDatasetMappEntity.modelDatasetMappEntity;
import static com.kamco.cd.training.postgres.entity.QModelMasterEntity.modelMasterEntity;
import com.kamco.cd.training.common.enums.ModelType;
import com.kamco.cd.training.dataset.dto.DatasetDto.DatasetMngRegDto;
import com.kamco.cd.training.dataset.dto.DatasetDto.DatasetReq;
import com.kamco.cd.training.dataset.dto.DatasetDto.SearchReq;
import com.kamco.cd.training.dataset.dto.DatasetDto.SelectDataSet;
import com.kamco.cd.training.dataset.dto.DatasetDto.SelectTransferDataSet;
import com.kamco.cd.training.postgres.entity.DatasetEntity;
import com.kamco.cd.training.postgres.entity.QDatasetEntity;
import com.kamco.cd.training.postgres.entity.QDatasetObjEntity;
import com.kamco.cd.training.postgres.entity.QModelDatasetMappEntity;
import com.kamco.cd.training.postgres.entity.QModelMasterEntity;
import com.querydsl.core.BooleanBuilder;
import com.querydsl.core.types.Projections;
import com.querydsl.core.types.dsl.CaseBuilder;
@@ -67,7 +73,11 @@ public class DatasetRepositoryImpl implements DatasetRepositoryCustom {
// Run the count query separately (null-safe handling)
long total =
Optional.ofNullable(
queryFactory.select(dataset.count()).from(dataset).where(builder).fetchOne())
queryFactory
.select(dataset.count())
.from(dataset)
.where(builder.and(dataset.deleted.isFalse()))
.fetchOne())
.orElse(0L);
return new PageImpl<>(content, pageable, total);
@@ -88,6 +98,8 @@ public class DatasetRepositoryImpl implements DatasetRepositoryCustom {
BooleanBuilder builder = new BooleanBuilder();
builder.and(dataset.deleted.isFalse());
if (StringUtils.isNotBlank(req.getDataType()) && !"CURRENT".equals(req.getDataType())) {
builder.and(dataset.dataType.eq(req.getDataType()));
}
@@ -138,10 +150,108 @@ public class DatasetRepositoryImpl implements DatasetRepositoryCustom {
.fetch();
}
@Override
public List<SelectTransferDataSet> getDatasetTransferSelectG1List(Long modelId) {
QModelMasterEntity beforeMaster = new QModelMasterEntity("beforeMaster");
QModelDatasetMappEntity beforeMapp = new QModelDatasetMappEntity("beforeMapp");
QDatasetEntity beforeDataset = new QDatasetEntity("beforeDataset");
QDatasetObjEntity beforeObj = new QDatasetObjEntity("beforeObj");
return queryFactory
.select(
Projections.constructor(
SelectTransferDataSet.class,
// ===== current =====
modelMasterEntity.modelNo,
dataset.id,
dataset.uuid,
dataset.dataType,
dataset.title,
dataset.roundNo,
dataset.compareYyyy,
dataset.targetYyyy,
dataset.memo,
new CaseBuilder()
.when(datasetObjEntity.targetClassCd.eq("building"))
.then(1)
.otherwise(0)
.sum(),
new CaseBuilder()
.when(datasetObjEntity.targetClassCd.eq("container"))
.then(1)
.otherwise(0)
.sum(),
// ===== before (via join) =====
beforeMaster.modelNo,
beforeDataset.id,
beforeDataset.uuid,
beforeDataset.dataType,
beforeDataset.title,
beforeDataset.roundNo,
beforeDataset.compareYyyy,
beforeDataset.targetYyyy,
beforeDataset.memo,
new CaseBuilder()
.when(beforeObj.targetClassCd.eq("building"))
.then(1)
.otherwise(0)
.sum(),
new CaseBuilder()
.when(beforeObj.targetClassCd.eq("container"))
.then(1)
.otherwise(0)
.sum()))
.from(modelMasterEntity)
// ===== current dataset join =====
.leftJoin(modelDatasetMappEntity)
.on(modelDatasetMappEntity.modelUid.eq(modelMasterEntity.id))
.leftJoin(dataset)
.on(modelDatasetMappEntity.datasetUid.eq(dataset.id))
.leftJoin(datasetObjEntity)
.on(dataset.id.eq(datasetObjEntity.datasetUid))
// ===== before model join =====
.leftJoin(beforeMaster)
.on(beforeMaster.id.eq(modelMasterEntity.beforeModelId))
.leftJoin(beforeMapp)
.on(beforeMapp.modelUid.eq(beforeMaster.id))
.leftJoin(beforeDataset)
.on(beforeMapp.datasetUid.eq(beforeDataset.id))
.leftJoin(beforeObj)
.on(beforeDataset.id.eq(beforeObj.datasetUid))
.where(modelMasterEntity.id.eq(modelId))
.groupBy(
modelMasterEntity.modelNo,
dataset.id,
dataset.uuid,
dataset.dataType,
dataset.title,
dataset.roundNo,
dataset.compareYyyy,
dataset.targetYyyy,
dataset.memo,
beforeMaster.modelNo,
beforeDataset.id,
beforeDataset.uuid,
beforeDataset.dataType,
beforeDataset.title,
beforeDataset.roundNo,
beforeDataset.compareYyyy,
beforeDataset.targetYyyy,
beforeDataset.memo)
.orderBy(dataset.createdDttm.desc())
.fetch();
}
@Override
public List<SelectDataSet> getDatasetSelectG2G3List(DatasetReq req) {
BooleanBuilder builder = new BooleanBuilder();
builder.and(dataset.deleted.isFalse());
NumberExpression<Long> selectedCnt = null;
NumberExpression<Long> wasteCnt =
@@ -201,6 +311,116 @@ public class DatasetRepositoryImpl implements DatasetRepositoryCustom {
.fetch();
}
@Override
public List<SelectTransferDataSet> getDatasetTransferSelectG2G3List(
Long modelId, String modelNo) {
// aliases for the before join
QModelMasterEntity beforeMaster = new QModelMasterEntity("beforeMaster");
QModelDatasetMappEntity beforeMapp = new QModelDatasetMappEntity("beforeMapp");
QDatasetEntity beforeDataset = new QDatasetEntity("beforeDataset");
QDatasetObjEntity beforeObj = new QDatasetObjEntity("beforeObj");
BooleanBuilder builder = new BooleanBuilder();
NumberExpression<Long> wasteCnt =
datasetObjEntity.targetClassCd.when("waste").then(1L).otherwise(0L).sum();
NumberExpression<Long> elseCnt =
new CaseBuilder()
.when(datasetObjEntity.targetClassCd.notIn("building", "container", "waste"))
.then(1L)
.otherwise(0L)
.sum();
NumberExpression<Long> selectedCnt = ModelType.G2.getId().equals(modelNo) ? wasteCnt : elseCnt;
// compute the before counts with the same logic
NumberExpression<Long> beforeWasteCnt =
beforeObj.targetClassCd.when("waste").then(1L).otherwise(0L).sum();
NumberExpression<Long> beforeElseCnt =
new CaseBuilder()
.when(beforeObj.targetClassCd.notIn("building", "container", "waste"))
.then(1L)
.otherwise(0L)
.sum();
NumberExpression<Long> beforeSelectedCnt =
ModelType.G2.getId().equals(modelNo) ? beforeWasteCnt : beforeElseCnt;
return queryFactory
.select(
Projections.constructor(
SelectTransferDataSet.class,
// ===== current =====
modelMasterEntity.modelNo, // uses the modelNo parameter (req.getModelNo() removed)
dataset.id,
dataset.uuid,
dataset.dataType,
dataset.title,
dataset.roundNo,
dataset.compareYyyy,
dataset.targetYyyy,
dataset.memo,
selectedCnt, // count (Long) that fills the classCount slot
// ===== before =====
beforeMaster.modelNo,
beforeDataset.id,
beforeDataset.uuid,
beforeDataset.dataType,
beforeDataset.title,
beforeDataset.roundNo,
beforeDataset.compareYyyy,
beforeDataset.targetYyyy,
beforeDataset.memo,
beforeSelectedCnt))
.from(modelMasterEntity)
// ===== current dataset =====
.leftJoin(modelDatasetMappEntity)
.on(modelDatasetMappEntity.modelUid.eq(modelMasterEntity.id))
.leftJoin(dataset)
.on(modelDatasetMappEntity.datasetUid.eq(dataset.id))
.leftJoin(datasetObjEntity)
.on(dataset.id.eq(datasetObjEntity.datasetUid))
// ===== before dataset =====
.leftJoin(beforeMaster)
.on(beforeMaster.id.eq(modelMasterEntity.beforeModelId))
.leftJoin(beforeMapp)
.on(beforeMapp.modelUid.eq(beforeMaster.id))
.leftJoin(beforeDataset)
.on(beforeMapp.datasetUid.eq(beforeDataset.id))
.leftJoin(beforeObj)
.on(beforeDataset.id.eq(beforeObj.datasetUid))
.where(modelMasterEntity.id.eq(modelId).and(builder))
// groupBy is required because of the sum() aggregates
.groupBy(
dataset.id,
dataset.uuid,
dataset.dataType,
dataset.title,
dataset.roundNo,
dataset.compareYyyy,
dataset.targetYyyy,
dataset.memo,
beforeMaster.modelNo,
beforeDataset.id,
beforeDataset.uuid,
beforeDataset.dataType,
beforeDataset.title,
beforeDataset.roundNo,
beforeDataset.compareYyyy,
beforeDataset.targetYyyy,
beforeDataset.memo)
.orderBy(dataset.createdDttm.desc())
.fetch();
}
@Override
public Long getDatasetMaxStage(int compareYyyy, int targetYyyy) {
return queryFactory
@@ -239,7 +459,7 @@ public class DatasetRepositoryImpl implements DatasetRepositoryCustom {
return queryFactory
.select(dataset.id)
.from(dataset)
.where(dataset.uid.eq(mngRegDto.getUid()))
.where(dataset.uid.eq(mngRegDto.getUid()), dataset.deleted.isFalse())
.fetchOne();
}
@@ -253,7 +473,7 @@ public class DatasetRepositoryImpl implements DatasetRepositoryCustom {
return queryFactory
.select(dataset.id.count())
.from(dataset)
.where(dataset.uid.eq(uid))
.where(dataset.uid.eq(uid), dataset.deleted.isFalse())
.fetchOne();
}
}
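The transfer queries above use `CaseBuilder` sums to count objects per target class (1 when the class matches, 0 otherwise, then summed per group). The same conditional-count logic, extracted into plain Java for clarity (the list contents and method name are hypothetical):

```java
import java.util.List;

// Illustrative: count objects per target class the way the CaseBuilder
// expressions do — map each row to 1 or 0, then sum.
public class ClassCountSketch {
    static long countByClass(List<String> targetClassCds, String targetClass) {
        return targetClassCds.stream()
                .mapToLong(cd -> targetClass.equals(cd) ? 1L : 0L)
                .sum();
    }

    public static void main(String[] args) {
        List<String> objs = List.of("building", "container", "building", "waste");
        System.out.println(countByClass(objs, "building"));  // 2
        System.out.println(countByClass(objs, "container")); // 1
    }
}
```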

View File

@@ -4,6 +4,7 @@ import com.kamco.cd.training.common.enums.ModelType;
import com.kamco.cd.training.hyperparam.dto.HyperParamDto;
import com.kamco.cd.training.hyperparam.dto.HyperParamDto.SearchReq;
import com.kamco.cd.training.postgres.entity.ModelHyperParamEntity;
import java.util.List;
import java.util.Optional;
import java.util.UUID;
import org.springframework.data.domain.Page;
@@ -28,9 +29,28 @@ public interface HyperParamRepositoryCustom {
Optional<ModelHyperParamEntity> findHyperParamByHyperVer(String hyperVer);
/**
* Fetches hyperparameter detail by UUID.
*
* @param uuid hyperparameter UUID
* @return matching hyperparameter, if any
*/
Optional<ModelHyperParamEntity> findHyperParamByUuid(UUID uuid);
/**
* Fetches the hyperparameter list.
*
* @param model model type filter
* @param req search and paging request
* @return page of matching hyperparameters
*/
Page<HyperParamDto.List> findByHyperVerList(ModelType model, SearchReq req);
Optional<ModelHyperParamEntity> getHyperparamByType(ModelType modelType);
/**
* Fetches hyperparameters by model type.
*
* @param modelType model type to match
* @return matching hyperparameters
*/
List<ModelHyperParamEntity> getHyperParamByType(ModelType modelType);
}

View File

@@ -9,7 +9,6 @@ import com.kamco.cd.training.hyperparam.dto.HyperParamDto.SearchReq;
import com.kamco.cd.training.postgres.entity.ModelHyperParamEntity;
import com.querydsl.core.BooleanBuilder;
import com.querydsl.core.types.Projections;
import com.querydsl.core.types.dsl.NumberExpression;
import com.querydsl.jpa.impl.JPAQuery;
import com.querydsl.jpa.impl.JPAQueryFactory;
import java.time.ZoneId;
@@ -82,7 +81,7 @@ public class HyperParamRepositoryImpl implements HyperParamRepositoryCustom {
queryFactory
.select(modelHyperParamEntity)
.from(modelHyperParamEntity)
.where(modelHyperParamEntity.delYn.isFalse().and(modelHyperParamEntity.uuid.eq(uuid)))
.where(modelHyperParamEntity.uuid.eq(uuid))
.fetchOne());
}
@@ -91,9 +90,13 @@ public class HyperParamRepositoryImpl implements HyperParamRepositoryCustom {
Pageable pageable = req.toPageable();
BooleanBuilder builder = new BooleanBuilder();
builder.and(modelHyperParamEntity.modelType.eq(model));
builder.and(modelHyperParamEntity.delYn.isFalse());
if (model != null) {
builder.and(modelHyperParamEntity.modelType.eq(model));
}
if (req.getHyperVer() != null && !req.getHyperVer().isEmpty()) {
// version
builder.and(modelHyperParamEntity.hyperVer.contains(req.getHyperVer()));
@@ -116,26 +119,18 @@ public class HyperParamRepositoryImpl implements HyperParamRepositoryCustom {
}
}
NumberExpression<Long> totalUseCnt =
modelHyperParamEntity
.m1UseCnt
.coalesce(0L)
.add(modelHyperParamEntity.m2UseCnt.coalesce(0L))
.add(modelHyperParamEntity.m3UseCnt.coalesce(0L));
JPAQuery<HyperParamDto.List> query =
queryFactory
.select(
Projections.constructor(
HyperParamDto.List.class,
modelHyperParamEntity.uuid,
modelHyperParamEntity.modelType.as("model"),
modelHyperParamEntity.hyperVer,
modelHyperParamEntity.createdDttm,
modelHyperParamEntity.lastUsedDttm,
modelHyperParamEntity.m1UseCnt,
modelHyperParamEntity.m2UseCnt,
modelHyperParamEntity.m3UseCnt,
totalUseCnt.as("totalUseCnt")))
modelHyperParamEntity.memo,
modelHyperParamEntity.totalUseCnt))
.from(modelHyperParamEntity)
.where(builder);
@@ -160,8 +155,11 @@ public class HyperParamRepositoryImpl implements HyperParamRepositoryCustom {
asc
? modelHyperParamEntity.lastUsedDttm.asc()
: modelHyperParamEntity.lastUsedDttm.desc());
case "totalUseCnt" -> query.orderBy(asc ? totalUseCnt.asc() : totalUseCnt.desc());
case "totalUseCnt" ->
query.orderBy(
asc
? modelHyperParamEntity.totalUseCnt.asc()
: modelHyperParamEntity.totalUseCnt.desc());
default -> query.orderBy(modelHyperParamEntity.createdDttm.desc());
}
@@ -183,12 +181,15 @@ public class HyperParamRepositoryImpl implements HyperParamRepositoryCustom {
}
@Override
public Optional<ModelHyperParamEntity> getHyperparamByType(ModelType modelType) {
return Optional.ofNullable(
queryFactory
public List<ModelHyperParamEntity> getHyperParamByType(ModelType modelType) {
return queryFactory
.select(modelHyperParamEntity)
.from(modelHyperParamEntity)
.where(modelHyperParamEntity.delYn.isFalse().and(modelHyperParamEntity.modelType.eq(modelType)))
.fetchOne());
.where(
modelHyperParamEntity
.delYn
.isFalse()
.and(modelHyperParamEntity.modelType.eq(modelType)))
.fetch();
}
}

View File

@@ -1,6 +1,7 @@
package com.kamco.cd.training.postgres.repository.log;
import com.kamco.cd.training.log.dto.AuditLogDto;
import com.kamco.cd.training.log.dto.AuditLogDto.DownloadReq;
import java.time.LocalDate;
import org.springframework.data.domain.Page;
@@ -15,6 +16,9 @@ public interface AuditLogRepositoryCustom {
Page<AuditLogDto.UserAuditList> findLogByAccount(
AuditLogDto.searchReq searchReq, String searchValue);
Page<AuditLogDto.DownloadRes> findDownloadLog(
AuditLogDto.searchReq searchReq, DownloadReq downloadReq);
Page<AuditLogDto.DailyDetail> findLogByDailyResult(
AuditLogDto.searchReq searchReq, LocalDate logDate);

View File

@@ -6,32 +6,42 @@ import static com.kamco.cd.training.postgres.entity.QMemberEntity.memberEntity;
import static com.kamco.cd.training.postgres.entity.QMenuEntity.menuEntity;
import com.kamco.cd.training.log.dto.AuditLogDto;
import com.kamco.cd.training.log.dto.AuditLogDto.DownloadReq;
import com.kamco.cd.training.log.dto.AuditLogDto.searchReq;
import com.kamco.cd.training.log.dto.ErrorLogDto;
import com.kamco.cd.training.log.dto.EventStatus;
import com.kamco.cd.training.log.dto.EventType;
import com.kamco.cd.training.postgres.entity.AuditLogEntity;
import com.kamco.cd.training.postgres.entity.QMenuEntity;
import com.querydsl.core.BooleanBuilder;
import com.querydsl.core.types.Projections;
import com.querydsl.core.types.dsl.*;
import com.querydsl.jpa.impl.JPAQueryFactory;
import io.micrometer.common.util.StringUtils;
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.util.List;
import java.util.Objects;
import lombok.RequiredArgsConstructor;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.Pageable;
import org.springframework.data.jpa.repository.support.QuerydslRepositorySupport;
import org.springframework.stereotype.Repository;
@Repository
@RequiredArgsConstructor
public class AuditLogRepositoryImpl implements AuditLogRepositoryCustom {
public class AuditLogRepositoryImpl extends QuerydslRepositorySupport
implements AuditLogRepositoryCustom {
private static final ZoneId ZONE = ZoneId.of("Asia/Seoul");
private final JPAQueryFactory queryFactory;
private final StringExpression NULL_STRING = Expressions.stringTemplate("cast(null as text)");
public AuditLogRepositoryImpl(JPAQueryFactory queryFactory) {
super(AuditLogEntity.class);
this.queryFactory = queryFactory;
}
@Override
public Page<AuditLogDto.DailyAuditList> findLogByDaily(
AuditLogDto.searchReq searchReq, LocalDate startDate, LocalDate endDate) {
@@ -87,7 +97,7 @@ public class AuditLogRepositoryImpl implements AuditLogRepositoryCustom {
.from(auditLogEntity)
.leftJoin(menuEntity)
.on(auditLogEntity.menuUid.eq(menuEntity.menuUid))
.where(menuNameEquals(searchValue))
.where(auditLogEntity.menuUid.ne("SYSTEM"), menuNameEquals(searchValue))
.groupBy(auditLogEntity.menuUid)
.offset(pageable.getOffset())
.limit(pageable.getPageSize())
@@ -128,7 +138,7 @@ public class AuditLogRepositoryImpl implements AuditLogRepositoryCustom {
.from(auditLogEntity)
.leftJoin(memberEntity)
.on(auditLogEntity.userUid.eq(memberEntity.id))
.where(loginIdOrUsernameContains(searchValue))
.where(auditLogEntity.userUid.isNotNull(), loginIdOrUsernameContains(searchValue))
.groupBy(auditLogEntity.userUid, memberEntity.employeeNo, memberEntity.name)
.offset(pageable.getOffset())
.limit(pageable.getPageSize())
@@ -147,6 +157,62 @@ public class AuditLogRepositoryImpl implements AuditLogRepositoryCustom {
return new PageImpl<>(foundContent, pageable, countQuery);
}
@Override
public Page<AuditLogDto.DownloadRes> findDownloadLog(
AuditLogDto.searchReq searchReq, DownloadReq req) {
Pageable pageable = searchReq.toPageable();
BooleanBuilder whereBuilder = new BooleanBuilder();
whereBuilder.and(auditLogEntity.eventStatus.ne(EventStatus.FAILED));
whereBuilder.and(auditLogEntity.eventType.eq(EventType.DOWNLOAD));
// if (req.getMenuId() != null && !req.getMenuId().isEmpty()) {
// whereBuilder.and(auditLogEntity.menuUid.eq(req.getMenuId()));
// }
if (req.getUuid() != null) {
whereBuilder.and(auditLogEntity.requestUri.contains(req.getRequestUri()));
whereBuilder.and(auditLogEntity.downloadUuid.eq(req.getUuid()));
}
if (req.getSearchValue() != null && !req.getSearchValue().isEmpty()) {
whereBuilder.and(
memberEntity
.name
.contains(req.getSearchValue())
.or(memberEntity.employeeNo.contains(req.getSearchValue())));
}
List<AuditLogDto.DownloadRes> foundContent =
queryFactory
.select(
Projections.constructor(
AuditLogDto.DownloadRes.class,
memberEntity.name,
memberEntity.employeeNo,
auditLogEntity.createdDate.as("downloadDttm")))
.from(auditLogEntity)
.leftJoin(memberEntity)
.on(auditLogEntity.userUid.eq(memberEntity.id))
.where(whereBuilder, createdDateBetween(req.getStartDate(), req.getEndDate()))
.offset(pageable.getOffset())
.limit(pageable.getPageSize())
.orderBy(auditLogEntity.createdDate.desc())
.fetch();
// total must count log rows (not distinct users) for correct paging
Long total =
queryFactory
.select(auditLogEntity.count())
.from(auditLogEntity)
.leftJoin(memberEntity)
.on(auditLogEntity.userUid.eq(memberEntity.id))
.where(whereBuilder, createdDateBetween(req.getStartDate(), req.getEndDate()))
.fetchOne();
return new PageImpl<>(foundContent, pageable, total == null ? 0L : total);
}
@Override
public Page<AuditLogDto.DailyDetail> findLogByDailyResult(
AuditLogDto.searchReq searchReq, LocalDate logDate) {
@@ -176,6 +242,9 @@ public class AuditLogRepositoryImpl implements AuditLogRepositoryCustom {
memberEntity.employeeNo.as("loginId"),
menuEntity.menuNm.as("menuName"),
auditLogEntity.eventType.as("eventType"),
Expressions.stringTemplate(
"to_char({0}, 'YYYY-MM-DD HH24:MI')", auditLogEntity.createdDate)
.as("logDateTime"),
Projections.constructor(
AuditLogDto.LogDetail.class,
Expressions.constant("한국자산관리공사"), // serviceName
@@ -184,7 +253,7 @@ public class AuditLogRepositoryImpl implements AuditLogRepositoryCustom {
menuEntity.menuUrl.as("menuUrl"),
menuEntity.description.as("menuDescription"),
menuEntity.menuOrder.as("sortOrder"),
menuEntity.isUse.as("used"))))
menuEntity.isUse.as("used")))) // TODO
.from(auditLogEntity)
.leftJoin(menuEntity)
.on(auditLogEntity.menuUid.eq(menuEntity.menuUid))
@@ -238,8 +307,8 @@ public class AuditLogRepositoryImpl implements AuditLogRepositoryCustom {
AuditLogDto.MenuDetail.class,
auditLogEntity.id.as("logId"),
Expressions.stringTemplate(
"to_char({0}, 'YYYY-MM-DD')", auditLogEntity.createdDate)
.as("logDateTime"), // ??
"to_char({0}, 'YYYY-MM-DD HH24:MI')", auditLogEntity.createdDate)
.as("logDateTime"),
memberEntity.name.as("userName"),
memberEntity.employeeNo.as("loginId"),
auditLogEntity.eventType.as("eventType"),
@@ -305,7 +374,7 @@ public class AuditLogRepositoryImpl implements AuditLogRepositoryCustom {
AuditLogDto.UserDetail.class,
auditLogEntity.id.as("logId"),
Expressions.stringTemplate(
"to_char({0}, 'YYYY-MM-DD')", auditLogEntity.createdDate)
"to_char({0}, 'YYYY-MM-DD HH24:MI')", auditLogEntity.createdDate)
.as("logDateTime"),
menuEntity.menuNm.as("menuName"),
auditLogEntity.eventType.as("eventType"),
@@ -349,12 +418,23 @@ public class AuditLogRepositoryImpl implements AuditLogRepositoryCustom {
if (Objects.isNull(startDate) || Objects.isNull(endDate)) {
return null;
}
LocalDateTime startDateTime = startDate.atStartOfDay();
LocalDateTime endDateTime = endDate.plusDays(1).atStartOfDay();
ZoneId zoneId = ZoneId.of("Asia/Seoul");
ZonedDateTime startDateTime = startDate.atStartOfDay(zoneId);
ZonedDateTime endDateTime = endDate.plusDays(1).atStartOfDay(zoneId);
return auditLogEntity
.createdDate
.goe(ZonedDateTime.from(startDateTime))
.and(auditLogEntity.createdDate.lt(ZonedDateTime.from(endDateTime)));
.goe(startDateTime)
.and(auditLogEntity.createdDate.lt(endDateTime));
}
private BooleanExpression createdDateBetween(LocalDate startDate, LocalDate endDate) {
if (startDate == null || endDate == null) {
return null;
}
ZonedDateTime start = startDate.atStartOfDay(ZONE);
ZonedDateTime endExclusive = endDate.plusDays(1).atStartOfDay(ZONE);
return auditLogEntity.createdDate.goe(start).and(auditLogEntity.createdDate.lt(endExclusive));
}
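`createdDateBetween` builds a half-open interval `[startDate 00:00, endDate+1 00:00)` in Asia/Seoul, which includes every timestamp on `endDate` without string formatting. The boundary arithmetic in plain `java.time` (the method and sample timestamps are illustrative):

```java
import java.time.LocalDate;
import java.time.ZoneId;
import java.time.ZonedDateTime;

// Illustrative: the half-open [start, endExclusive) window used by the
// createdDateBetween predicate, built with zone-aware timestamps.
public class DateRangeSketch {
    static final ZoneId ZONE = ZoneId.of("Asia/Seoul");

    static boolean inRange(ZonedDateTime ts, LocalDate startDate, LocalDate endDate) {
        ZonedDateTime start = startDate.atStartOfDay(ZONE);
        ZonedDateTime endExclusive = endDate.plusDays(1).atStartOfDay(ZONE);
        // equivalent to goe(start).and(lt(endExclusive))
        return !ts.isBefore(start) && ts.isBefore(endExclusive);
    }

    public static void main(String[] args) {
        ZonedDateTime lastMinute = LocalDate.of(2026, 3, 10).atStartOfDay(ZONE)
                .plusHours(23).plusMinutes(59);
        // 23:59 on the end date is still inside the window
        System.out.println(inRange(lastMinute,
                LocalDate.of(2026, 3, 1), LocalDate.of(2026, 3, 10))); // true
    }
}
```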
private BooleanExpression menuNameEquals(String searchValue) {
@@ -393,11 +473,11 @@ public class AuditLogRepositoryImpl implements AuditLogRepositoryCustom {
}
private BooleanExpression eventEndedAtEqDate(LocalDate logDate) {
StringExpression eventEndedDate =
Expressions.stringTemplate("to_char({0}, 'YYYY-MM-DD')", auditLogEntity.createdDate);
LocalDateTime comparisonDate = logDate.atStartOfDay();
ZoneId zoneId = ZoneId.of("Asia/Seoul");
ZonedDateTime start = logDate.atStartOfDay(zoneId);
ZonedDateTime end = logDate.plusDays(1).atStartOfDay(zoneId);
return eventEndedDate.eq(comparisonDate.toString());
return auditLogEntity.createdDate.goe(start).and(auditLogEntity.createdDate.lt(end));
}
private BooleanExpression menuUidEq(String menuUid) {
@@ -410,7 +490,7 @@ public class AuditLogRepositoryImpl implements AuditLogRepositoryCustom {
private NumberExpression<Integer> readCount() {
return new CaseBuilder()
.when(auditLogEntity.eventType.eq(EventType.READ))
.when(auditLogEntity.eventType.in(EventType.LIST, EventType.DETAIL))
.then(1)
.otherwise(0)
.sum();
@@ -418,7 +498,7 @@ public class AuditLogRepositoryImpl implements AuditLogRepositoryCustom {
private NumberExpression<Integer> cudCount() {
return new CaseBuilder()
.when(auditLogEntity.eventType.in(EventType.CREATE, EventType.UPDATE, EventType.DELETE))
.when(auditLogEntity.eventType.in(EventType.ADDED, EventType.MODIFIED, EventType.REMOVE))
.then(1)
.otherwise(0)
.sum();
@@ -426,7 +506,7 @@ public class AuditLogRepositoryImpl implements AuditLogRepositoryCustom {
private NumberExpression<Integer> printCount() {
return new CaseBuilder()
.when(auditLogEntity.eventType.eq(EventType.PRINT))
.when(auditLogEntity.eventType.eq(EventType.OTHER))
.then(1)
.otherwise(0)
.sum();

View File

@@ -8,29 +8,35 @@ import static com.kamco.cd.training.postgres.entity.QMenuEntity.menuEntity;
import com.kamco.cd.training.log.dto.ErrorLogDto;
import com.kamco.cd.training.log.dto.EventStatus;
import com.kamco.cd.training.log.dto.EventType;
import com.kamco.cd.training.postgres.entity.AuditLogEntity;
import com.querydsl.core.types.Projections;
import com.querydsl.core.types.dsl.BooleanExpression;
import com.querydsl.core.types.dsl.Expressions;
import com.querydsl.core.types.dsl.StringExpression;
import com.querydsl.jpa.impl.JPAQueryFactory;
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.util.List;
import java.util.Objects;
import lombok.RequiredArgsConstructor;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageImpl;
import org.springframework.data.domain.Pageable;
import org.springframework.data.jpa.repository.support.QuerydslRepositorySupport;
import org.springframework.stereotype.Repository;
@Repository
@RequiredArgsConstructor
public class ErrorLogRepositoryImpl implements ErrorLogRepositoryCustom {
public class ErrorLogRepositoryImpl extends QuerydslRepositorySupport
implements ErrorLogRepositoryCustom {
private final JPAQueryFactory queryFactory;
private final StringExpression NULL_STRING = Expressions.stringTemplate("cast(null as text)");
public ErrorLogRepositoryImpl(JPAQueryFactory queryFactory) {
super(AuditLogEntity.class);
this.queryFactory = queryFactory;
}
@Override
public Page<ErrorLogDto.Basic> findLogByError(ErrorLogDto.ErrorSearchReq searchReq) {
Pageable pageable = searchReq.toPageable();
@@ -52,7 +58,7 @@ public class ErrorLogRepositoryImpl implements ErrorLogRepositoryCustom {
errorLogEntity.errorMessage.as("errorMessage"),
errorLogEntity.stackTrace.as("errorDetail"),
Expressions.stringTemplate(
"to_char({0}, 'YYYY-MM-DD')", errorLogEntity.createdDate)))
"to_char({0}, 'YYYY-MM-DD HH24:MI:SS.FF3')", errorLogEntity.createdDate)))
.from(errorLogEntity)
.leftJoin(auditLogEntity)
.on(errorLogEntity.id.eq(auditLogEntity.errorLogUid))
@@ -94,12 +100,14 @@ public class ErrorLogRepositoryImpl implements ErrorLogRepositoryCustom {
if (Objects.isNull(startDate) || Objects.isNull(endDate)) {
return null;
}
LocalDateTime startDateTime = startDate.atStartOfDay();
LocalDateTime endDateTime = endDate.plusDays(1).atStartOfDay();
ZoneId zoneId = ZoneId.of("Asia/Seoul");
ZonedDateTime startDateTime = startDate.atStartOfDay(zoneId);
ZonedDateTime endDateTime = endDate.plusDays(1).atStartOfDay(zoneId);
return auditLogEntity
.createdDate
.goe(ZonedDateTime.from(startDateTime))
.and(auditLogEntity.createdDate.lt(ZonedDateTime.from(endDateTime)));
.goe(startDateTime)
.and(auditLogEntity.createdDate.lt(endDateTime));
}
private BooleanExpression eventStatusEqFailed() {

View File

@@ -5,4 +5,6 @@ import java.util.Optional;
public interface ModelConfigRepositoryCustom {
Optional<ModelConfigDto.Basic> findModelConfigByModelId(Long modelId);
Optional<ModelConfigDto.TransferBasic> findModelTransferConfigByModelId(Long modelId);
}

View File

@@ -1,8 +1,12 @@
package com.kamco.cd.training.postgres.repository.model;
import static com.kamco.cd.training.postgres.entity.QModelConfigEntity.modelConfigEntity;
import static com.kamco.cd.training.postgres.entity.QModelMasterEntity.modelMasterEntity;
import com.kamco.cd.training.model.dto.ModelConfigDto.Basic;
import com.kamco.cd.training.model.dto.ModelConfigDto.TransferBasic;
import com.kamco.cd.training.postgres.entity.QModelConfigEntity;
import com.kamco.cd.training.postgres.entity.QModelMasterEntity;
import com.querydsl.core.types.Projections;
import com.querydsl.jpa.impl.JPAQueryFactory;
import java.util.Optional;
@@ -34,4 +38,44 @@ public class ModelConfigRepositoryImpl implements ModelConfigRepositoryCustom {
.where(modelConfigEntity.model.id.eq(modelId))
.fetchOne());
}
@Override
public Optional<TransferBasic> findModelTransferConfigByModelId(Long modelId) {
QModelConfigEntity beforeConfig = new QModelConfigEntity("beforeConfig");
QModelMasterEntity beforeMaster = new QModelMasterEntity("beforeMaster");
return Optional.ofNullable(
queryFactory
.select(
Projections.constructor(
TransferBasic.class,
// ===== current =====
modelConfigEntity.id,
modelConfigEntity.model.id,
modelConfigEntity.epochCount,
modelConfigEntity.trainPercent,
modelConfigEntity.validationPercent,
modelConfigEntity.testPercent,
modelConfigEntity.memo,
// ===== before =====
beforeConfig.id,
beforeConfig.model.id,
beforeConfig.epochCount,
beforeConfig.trainPercent,
beforeConfig.validationPercent,
beforeConfig.testPercent,
beforeConfig.memo))
.from(modelConfigEntity)
.innerJoin(modelConfigEntity.model, modelMasterEntity)
// join the before model
.leftJoin(beforeMaster)
.on(beforeMaster.id.eq(modelMasterEntity.beforeModelId))
.leftJoin(beforeConfig)
.on(beforeConfig.model.id.eq(beforeMaster.id))
.where(modelMasterEntity.id.eq(modelId))
.fetchOne());
}
}
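`findModelTransferConfigByModelId` pairs each model's config with the config of its predecessor (`beforeModelId`) via alias joins (`beforeMaster`, `beforeConfig`), and the left joins mean a missing predecessor yields nulls rather than an empty result. The same pairing, sketched as an in-memory lookup (the record types are hypothetical stand-ins for the entities):

```java
import java.util.Map;
import java.util.Optional;

// Illustrative: resolve a model's config together with the config of its
// predecessor (beforeModelId), as the left-join query above does.
public class TransferConfigSketch {
    record Model(long id, Long beforeModelId) {}
    record Config(long modelId, int epochCount) {}
    record TransferBasic(Config current, Config before) {} // before may be null

    static Optional<TransferBasic> find(long modelId,
                                        Map<Long, Model> models,
                                        Map<Long, Config> configsByModelId) {
        Model model = models.get(modelId);
        if (model == null) return Optional.empty();
        Config current = configsByModelId.get(modelId);
        // left-join semantics: a missing predecessor yields null, not an empty result
        Config before = model.beforeModelId() == null
                ? null
                : configsByModelId.get(model.beforeModelId());
        return Optional.of(new TransferBasic(current, before));
    }

    public static void main(String[] args) {
        Map<Long, Model> models = Map.of(1L, new Model(1L, null), 2L, new Model(2L, 1L));
        Map<Long, Config> configs = Map.of(1L, new Config(1L, 100), 2L, new Config(2L, 400));
        System.out.println(find(2L, models, configs).get().before().epochCount()); // 100
    }
}
```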

View File

@@ -1,8 +1,15 @@
package com.kamco.cd.training.postgres.repository.model;
import com.kamco.cd.training.postgres.entity.ModelDatasetMappEntity;
import com.kamco.cd.training.train.dto.ModelTrainLinkDto;
import java.util.List;
public interface ModelDatasetMappRepositoryCustom {
List<ModelDatasetMappEntity> findByModelUid(Long modelId);
List<ModelTrainLinkDto> findDatasetTrainPath(Long modelId);
List<ModelTrainLinkDto> findDatasetValPath(Long modelId);
List<ModelTrainLinkDto> findDatasetTestPath(Long modelId);
}

View File

@@ -1,8 +1,16 @@
package com.kamco.cd.training.postgres.repository.model;
import static com.kamco.cd.training.postgres.entity.QDatasetEntity.datasetEntity;
import static com.kamco.cd.training.postgres.entity.QModelDatasetMappEntity.modelDatasetMappEntity;
import static com.kamco.cd.training.postgres.entity.QModelMasterEntity.modelMasterEntity;
import com.kamco.cd.training.common.enums.ModelType;
import com.kamco.cd.training.postgres.entity.ModelDatasetMappEntity;
import com.kamco.cd.training.postgres.entity.QDatasetObjEntity;
import com.kamco.cd.training.postgres.entity.QDatasetTestObjEntity;
import com.kamco.cd.training.postgres.entity.QDatasetValObjEntity;
import com.kamco.cd.training.train.dto.ModelTrainLinkDto;
import com.querydsl.core.types.Projections;
import com.querydsl.jpa.impl.JPAQueryFactory;
import java.util.List;
import lombok.RequiredArgsConstructor;
@@ -22,4 +30,136 @@ public class ModelDatasetMappRepositoryImpl implements ModelDatasetMappRepositor
.where(modelDatasetMappEntity.modelUid.eq(modelId))
.fetch();
}
@Override
public List<ModelTrainLinkDto> findDatasetTrainPath(Long modelId) {
QDatasetObjEntity datasetObjEntity = QDatasetObjEntity.datasetObjEntity;
return queryFactory
.select(
Projections.constructor(
ModelTrainLinkDto.class,
modelMasterEntity.id,
modelMasterEntity.trainType,
modelMasterEntity.modelNo,
modelDatasetMappEntity.datasetUid,
datasetObjEntity.targetClassCd,
datasetObjEntity.comparePath,
datasetObjEntity.targetPath,
datasetObjEntity.labelPath,
datasetObjEntity.geojsonPath,
datasetEntity.uid))
.from(modelMasterEntity)
.leftJoin(modelDatasetMappEntity)
.on(modelDatasetMappEntity.modelUid.eq(modelMasterEntity.id))
.leftJoin(datasetEntity)
.on(datasetEntity.id.eq(modelDatasetMappEntity.datasetUid))
.leftJoin(datasetObjEntity)
.on(
datasetObjEntity
.datasetUid
.eq(modelDatasetMappEntity.datasetUid)
.and(
modelMasterEntity
.modelNo
.eq(ModelType.G1.getId())
.and(datasetObjEntity.targetClassCd.upper().in("CONTAINER", "BUILDING"))
.or(
modelMasterEntity
.modelNo
.eq(ModelType.G2.getId())
.and(datasetObjEntity.targetClassCd.upper().eq("WASTE")))
.or(modelMasterEntity.modelNo.eq(ModelType.G3.getId()))))
.where(modelMasterEntity.id.eq(modelId))
.fetch();
}
@Override
public List<ModelTrainLinkDto> findDatasetValPath(Long modelId) {
QDatasetValObjEntity datasetValObjEntity = QDatasetValObjEntity.datasetValObjEntity;
return queryFactory
.select(
Projections.constructor(
ModelTrainLinkDto.class,
modelMasterEntity.id,
modelMasterEntity.trainType,
modelMasterEntity.modelNo,
modelDatasetMappEntity.datasetUid,
datasetValObjEntity.targetClassCd,
datasetValObjEntity.comparePath,
datasetValObjEntity.targetPath,
datasetValObjEntity.labelPath,
datasetValObjEntity.geojsonPath,
datasetEntity.uid))
.from(modelMasterEntity)
.leftJoin(modelDatasetMappEntity)
.on(modelDatasetMappEntity.modelUid.eq(modelMasterEntity.id))
.leftJoin(datasetEntity)
.on(datasetEntity.id.eq(modelDatasetMappEntity.datasetUid))
.leftJoin(datasetValObjEntity)
.on(
datasetValObjEntity
.datasetUid
.eq(modelDatasetMappEntity.datasetUid)
.and(
modelMasterEntity
.modelNo
.eq(ModelType.G1.getId())
.and(datasetValObjEntity.targetClassCd.upper().in("CONTAINER", "BUILDING"))
.or(
modelMasterEntity
.modelNo
.eq(ModelType.G2.getId())
.and(datasetValObjEntity.targetClassCd.upper().eq("WASTE")))
.or(modelMasterEntity.modelNo.eq(ModelType.G3.getId()))))
.where(modelMasterEntity.id.eq(modelId))
.fetch();
}
@Override
public List<ModelTrainLinkDto> findDatasetTestPath(Long modelId) {
QDatasetTestObjEntity datasetTestObjEntity = QDatasetTestObjEntity.datasetTestObjEntity;
return queryFactory
.select(
Projections.constructor(
ModelTrainLinkDto.class,
modelMasterEntity.id,
modelMasterEntity.trainType,
modelMasterEntity.modelNo,
modelDatasetMappEntity.datasetUid,
datasetTestObjEntity.targetClassCd,
datasetTestObjEntity.comparePath,
datasetTestObjEntity.targetPath,
datasetTestObjEntity.labelPath,
datasetTestObjEntity.geojsonPath,
datasetEntity.uid))
.from(modelMasterEntity)
.leftJoin(modelDatasetMappEntity)
.on(modelDatasetMappEntity.modelUid.eq(modelMasterEntity.id))
.leftJoin(datasetEntity)
.on(datasetEntity.id.eq(modelDatasetMappEntity.datasetUid))
.leftJoin(datasetTestObjEntity)
.on(
datasetTestObjEntity
.datasetUid
.eq(modelDatasetMappEntity.datasetUid)
.and(
modelMasterEntity
.modelNo
.eq(ModelType.G1.getId())
.and(datasetTestObjEntity.targetClassCd.upper().in("CONTAINER", "BUILDING"))
.or(
modelMasterEntity
.modelNo
.eq(ModelType.G2.getId())
.and(datasetTestObjEntity.targetClassCd.upper().eq("WASTE")))
.or(modelMasterEntity.modelNo.eq(ModelType.G3.getId()))))
.where(modelMasterEntity.id.eq(modelId))
.fetch();
}
}
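The three dataset-path queries above share one join rule keyed on model number: G1 models join CONTAINER/BUILDING objects, G2 joins WASTE, and G3 joins every class. A minimal plain-Java sketch of that rule, with the literal strings "G1"/"G2"/"G3" standing in for the `ModelType.*.getId()` values the source resolves at runtime:

```java
// Sketch of the class-matching rule encoded in the leftJoin .on(...) predicates.
// Model-number strings here are hypothetical stand-ins for ModelType ids.
class ClassMatchRule {
    // Returns true when a dataset object row with the given class code
    // should be joined for a model with the given model number.
    static boolean matches(String modelNo, String targetClassCd) {
        String cls = targetClassCd == null ? "" : targetClassCd.toUpperCase();
        if ("G1".equals(modelNo)) {
            return cls.equals("CONTAINER") || cls.equals("BUILDING");
        }
        if ("G2".equals(modelNo)) {
            return cls.equals("WASTE");
        }
        // G3 joins regardless of class code.
        return "G3".equals(modelNo);
    }
}
```

The `upper()` call in the query corresponds to the `toUpperCase()` normalization here, so class codes match case-insensitively.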

View File

@@ -4,10 +4,12 @@ import com.kamco.cd.training.model.dto.ModelTrainDetailDto.DetailSummary;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.HyperSummary;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.MappingDataset;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelBestEpoch;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelFileInfo;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelTestMetrics;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelTrainMetrics;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelValidationMetrics;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.TransferHyperSummary;
import com.kamco.cd.training.model.dto.ModelTrainMngDto.ModelProgressStepDto;
import com.kamco.cd.training.postgres.entity.ModelMasterEntity;
import java.util.List;
import java.util.Optional;
@@ -34,4 +36,10 @@ public interface ModelDetailRepositoryCustom {
List<ModelTestMetrics> getModelTestMetricResult(UUID uuid);
ModelBestEpoch getModelTrainBestEpoch(UUID uuid);
ModelFileInfo getModelTrainFileInfo(UUID uuid);
List<ModelProgressStepDto> findModelTrainProgressInfo(UUID uuid);
ModelMasterEntity findByModelBeforeId(Long beforeModelId);
}

View File

@@ -9,20 +9,25 @@ import static com.kamco.cd.training.postgres.entity.QModelMetricsTestEntity.mode
import static com.kamco.cd.training.postgres.entity.QModelMetricsTrainEntity.modelMetricsTrainEntity;
import static com.kamco.cd.training.postgres.entity.QModelMetricsValidationEntity.modelMetricsValidationEntity;
import com.kamco.cd.training.common.enums.TrainStatusType;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.DetailSummary;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.HyperSummary;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.MappingDataset;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelBestEpoch;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelFileInfo;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelTestMetrics;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelTrainMetrics;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.ModelValidationMetrics;
import com.kamco.cd.training.model.dto.ModelTrainDetailDto.TransferHyperSummary;
import com.kamco.cd.training.model.dto.ModelTrainMngDto.ModelProgressStepDto;
import com.kamco.cd.training.postgres.entity.ModelMasterEntity;
import com.kamco.cd.training.postgres.entity.QModelHyperParamEntity;
import com.kamco.cd.training.postgres.entity.QModelMasterEntity;
import com.querydsl.core.types.Expression;
import com.querydsl.core.types.Projections;
import com.querydsl.jpa.JPAExpressions;
import com.querydsl.jpa.impl.JPAQueryFactory;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import java.util.UUID;
@@ -55,6 +60,13 @@ public class ModelDetailRepositoryImpl implements ModelDetailRepositoryCustom {
@Override
public DetailSummary getModelDetailSummary(UUID uuid) {
QModelMasterEntity beforeModel = new QModelMasterEntity("beforeModel"); // alias
Expression<UUID> beforeModelUuid =
JPAExpressions.select(beforeModel.uuid)
.from(beforeModel)
.where(beforeModel.id.eq(modelMasterEntity.beforeModelId));
return queryFactory
.select(
Projections.constructor(
@@ -66,7 +78,8 @@ public class ModelDetailRepositoryImpl implements ModelDetailRepositoryCustom {
modelMasterEntity.step1StrtDttm,
modelMasterEntity.step2EndDttm,
modelMasterEntity.statusCd,
modelMasterEntity.trainType))
modelMasterEntity.trainType,
beforeModelUuid))
.from(modelMasterEntity)
.where(modelMasterEntity.uuid.eq(uuid))
.fetchOne();
@@ -269,4 +282,94 @@ public class ModelDetailRepositoryImpl implements ModelDetailRepositoryCustom {
modelMetricsTrainEntity.epoch.eq(modelMasterEntity.getBestEpoch()))
.fetchOne();
}
@Override
public ModelFileInfo getModelTrainFileInfo(UUID uuid) {
return queryFactory
.select(
Projections.constructor(
ModelFileInfo.class,
modelMasterEntity
.packingState
.eq(TrainStatusType.COMPLETED.getId())
.coalesce(false),
modelMasterEntity.modelVer))
.from(modelMasterEntity)
.where(modelMasterEntity.uuid.eq(uuid))
.fetchOne();
}
@Override
public List<ModelProgressStepDto> findModelTrainProgressInfo(UUID uuid) {
ModelMasterEntity entity = findByModelByUUID(uuid);
if (entity == null) {
return List.of();
}
List<ModelProgressStepDto> steps = new ArrayList<>();
// Step 0: waiting
steps.add(
ModelProgressStepDto.builder()
.step(0)
.status(TrainStatusType.READY.getId())
.startTime(entity.getCreatedDttm())
.endTime(null)
.isError(false)
.build());
// Step 1: Train/Validation run
boolean step1Active =
entity.getStep1StrtDttm() != null
&& !TrainStatusType.READY.getId().equals(entity.getStep1State());
if (step1Active) {
steps.add(
ModelProgressStepDto.builder()
.step(1)
.status(entity.getStep1State())
.startTime(entity.getStep1StrtDttm())
.endTime(entity.getStep1EndDttm())
.isError(TrainStatusType.ERROR.getId().equals(entity.getStep1State()))
.build());
}
// Step 2: Test run
boolean step2Done = entity.getStep2State() != null;
if (step2Done) {
steps.add(
ModelProgressStepDto.builder()
.step(2)
.status(entity.getStep2State())
.startTime(entity.getStep2StrtDttm())
.endTime(entity.getStep2EndDttm())
.isError(TrainStatusType.ERROR.getId().equals(entity.getStep2State()))
.build());
}
// Step 3: packaging
boolean step3Done = entity.getPackingState() != null;
if (step3Done) {
steps.add(
ModelProgressStepDto.builder()
.step(3)
.status(entity.getPackingState())
.startTime(entity.getPackingStrtDttm())
.endTime(entity.getPackingEndDttm())
.isError(TrainStatusType.ERROR.getId().equals(entity.getPackingState()))
.build());
}
return steps;
}
@Override
public ModelMasterEntity findByModelBeforeId(Long beforeModelId) {
return queryFactory
.selectFrom(modelMasterEntity)
.where(modelMasterEntity.id.eq(beforeModelId))
.fetchOne();
}
}
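`findModelTrainProgressInfo` assembles the step list conditionally: step 0 is always present, step 1 only once it has actually started, and steps 2 and 3 only once their state has been recorded. A minimal sketch of that assembly rule, assuming plain-string states and using a boolean for the "step1 start time is set" check (the source compares against `TrainStatusType.READY.getId()`):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the conditional step assembly; "READY" is a hypothetical
// stand-in for TrainStatusType.READY.getId().
class ProgressSteps {
    static List<Integer> visibleSteps(
            String step1State, boolean step1Started, String step2State, String packingState) {
        List<Integer> steps = new ArrayList<>();
        steps.add(0); // the waiting step is always shown
        if (step1Started && !"READY".equals(step1State)) steps.add(1);
        if (step2State != null) steps.add(2);
        if (packingState != null) steps.add(3);
        return steps;
    }
}
```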

View File

@@ -1,6 +1,7 @@
package com.kamco.cd.training.postgres.repository.model;
import com.kamco.cd.training.model.dto.ModelTrainMngDto;
import com.kamco.cd.training.model.dto.ModelTrainMngDto.ListDto;
import com.kamco.cd.training.postgres.entity.ModelMasterEntity;
import com.kamco.cd.training.train.dto.TrainRunRequest;
import java.util.Optional;
@@ -15,7 +16,7 @@ public interface ModelMngRepositoryCustom {
* @param searchReq
* @return
*/
Page<ListDto> findByModels(ModelTrainMngDto.SearchReq searchReq);
Optional<ModelMasterEntity> findByUuid(UUID uuid);

View File

@@ -1,14 +1,18 @@
package com.kamco.cd.training.postgres.repository.model;
import static com.kamco.cd.training.postgres.entity.QMemberEntity.memberEntity;
import static com.kamco.cd.training.postgres.entity.QModelConfigEntity.modelConfigEntity;
import static com.kamco.cd.training.postgres.entity.QModelHyperParamEntity.modelHyperParamEntity;
import static com.kamco.cd.training.postgres.entity.QModelMasterEntity.modelMasterEntity;
import com.kamco.cd.training.common.enums.TrainStatusType;
import com.kamco.cd.training.model.dto.ModelTrainMngDto;
import com.kamco.cd.training.model.dto.ModelTrainMngDto.ListDto;
import com.kamco.cd.training.postgres.entity.ModelMasterEntity;
import com.kamco.cd.training.postgres.entity.QModelMasterEntity;
import com.kamco.cd.training.train.dto.TrainRunRequest;
import com.querydsl.core.BooleanBuilder;
import com.querydsl.core.types.Expression;
import com.querydsl.core.types.Projections;
import com.querydsl.core.types.dsl.Expressions;
import com.querydsl.jpa.impl.JPAQueryFactory;
@@ -34,12 +38,23 @@ public class ModelMngRepositoryImpl implements ModelMngRepositoryCustom {
* @return
*/
@Override
public Page<ListDto> findByModels(ModelTrainMngDto.SearchReq req) {
QModelMasterEntity beforeModel = new QModelMasterEntity("beforeModel"); // alias
Expression<UUID> beforeModelUuid =
com.querydsl.jpa.JPAExpressions.select(beforeModel.uuid)
.from(beforeModel)
.where(beforeModel.id.eq(modelMasterEntity.beforeModelId));
Pageable pageable = req.toPageable();
BooleanBuilder builder = new BooleanBuilder();
if (req.getStatus() != null && !req.getStatus().isEmpty()) {
builder.and(
modelMasterEntity
.step1State
.eq(req.getStatus())
.or(modelMasterEntity.step2State.eq(req.getStatus())));
}
if (req.getModelNo() != null && !req.getModelNo().isEmpty()) {
@@ -48,9 +63,37 @@ public class ModelMngRepositoryImpl implements ModelMngRepositoryCustom {
builder.and(modelMasterEntity.delYn.isFalse());
List<ListDto> content =
queryFactory
.select(
Projections.constructor(
ListDto.class,
modelMasterEntity.id,
modelMasterEntity.uuid,
modelMasterEntity.modelVer,
modelMasterEntity.strtDttm,
modelMasterEntity.step1StrtDttm,
modelMasterEntity.step1EndDttm,
modelMasterEntity.step2StrtDttm,
modelMasterEntity.step2EndDttm,
modelMasterEntity.step1State,
modelMasterEntity.step2State,
modelMasterEntity.statusCd,
modelMasterEntity.trainType,
modelMasterEntity.modelNo,
modelMasterEntity.currentAttemptId,
modelMasterEntity.requestPath,
modelMasterEntity.packingState,
modelMasterEntity.packingStrtDttm,
modelMasterEntity.packingEndDttm,
modelConfigEntity.memo,
memberEntity.name,
beforeModelUuid))
.from(modelMasterEntity)
.innerJoin(modelConfigEntity)
.on(modelMasterEntity.id.eq(modelConfigEntity.model.id))
.leftJoin(memberEntity)
.on(modelMasterEntity.createdUid.eq(memberEntity.id))
.where(builder)
.offset(pageable.getOffset())
.limit(pageable.getPageSize())
@@ -62,6 +105,10 @@ public class ModelMngRepositoryImpl implements ModelMngRepositoryCustom {
queryFactory
.select(modelMasterEntity.count())
.from(modelMasterEntity)
.innerJoin(modelConfigEntity)
.on(modelMasterEntity.id.eq(modelConfigEntity.model.id))
.leftJoin(memberEntity)
.on(modelMasterEntity.createdUid.eq(memberEntity.id))
.where(builder)
.fetchOne();
@@ -154,7 +201,11 @@ public class ModelMngRepositoryImpl implements ModelMngRepositoryCustom {
return queryFactory
.select(modelMasterEntity.id.count())
.from(modelMasterEntity)
.where(
modelMasterEntity
.step1State
.eq(TrainStatusType.IN_PROGRESS.getId())
.or(modelMasterEntity.step2State.eq(TrainStatusType.IN_PROGRESS.getId())))
.fetchOne();
}
}
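The list filter above was changed so that a requested status matches a model when either step state equals it, instead of comparing only `statusCd`. A small sketch of that predicate, with plain strings in place of the entity fields:

```java
// Sketch of the status filter: no filter when the request is empty,
// otherwise match against either step state.
class StatusFilter {
    static boolean matches(String requested, String step1State, String step2State) {
        if (requested == null || requested.isEmpty()) return true; // filter not applied
        return requested.equals(step1State) || requested.equals(step2State);
    }
}
```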

View File

@@ -1,6 +1,9 @@
package com.kamco.cd.training.postgres.repository.train;
import com.kamco.cd.training.train.dto.ModelTrainMetricsDto.ModelMetricJsonDto;
import com.kamco.cd.training.train.dto.ModelTrainMetricsDto.ModelTestFileName;
import com.kamco.cd.training.train.dto.ModelTrainMetricsDto.ResponsePathDto;
import java.time.ZonedDateTime;
import java.util.List;
public interface ModelTestMetricsJobRepositoryCustom {
@@ -10,4 +13,12 @@ public interface ModelTestMetricsJobRepositoryCustom {
List<ResponsePathDto> getTestMetricSaveNotYetModelIds();
void insertModelMetricsTest(List<Object[]> batchArgs);
ModelMetricJsonDto getTestMetricPackingInfo(Long modelId);
ModelTestFileName findModelTestFileNames(Long modelId);
void updatePackingStart(Long modelId, ZonedDateTime now);
void updatePackingEnd(Long modelId, ZonedDateTime now, String failSuccState);
}

View File

@@ -1,12 +1,18 @@
package com.kamco.cd.training.postgres.repository.train;
import static com.kamco.cd.training.postgres.entity.QModelMasterEntity.modelMasterEntity;
import static com.kamco.cd.training.postgres.entity.QModelMetricsTestEntity.modelMetricsTestEntity;
import static com.kamco.cd.training.postgres.entity.QModelMetricsTrainEntity.modelMetricsTrainEntity;
import com.kamco.cd.training.common.enums.TrainStatusType;
import com.kamco.cd.training.postgres.entity.ModelMetricsTestEntity;
import com.kamco.cd.training.train.dto.ModelTrainMetricsDto.ModelMetricJsonDto;
import com.kamco.cd.training.train.dto.ModelTrainMetricsDto.ModelTestFileName;
import com.kamco.cd.training.train.dto.ModelTrainMetricsDto.Properties;
import com.kamco.cd.training.train.dto.ModelTrainMetricsDto.ResponsePathDto;
import com.querydsl.core.types.Projections;
import com.querydsl.jpa.impl.JPAQueryFactory;
import java.time.ZonedDateTime;
import java.util.List;
import org.springframework.data.jpa.repository.support.QuerydslRepositorySupport;
import org.springframework.jdbc.core.JdbcTemplate;
@@ -59,15 +65,107 @@ public class ModelTestMetricsJobRepositoryImpl extends QuerydslRepositorySupport
@Override
public void insertModelMetricsTest(List<Object[]> batchArgs) {
// AS-IS
// String sql =
//     """
//     insert into tb_model_metrics_test
//     (model_id, model, tp, fp, fn, precisions, recall, f1_score, accuracy, iou,
//      detection_count, gt_count
//     )
//     values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
//     """;
//
// jdbcTemplate.batchUpdate(sql, batchArgs);
// TO-BE: if a row with the same (model_id, model, e.g. best_fscore_10) exists, update it; otherwise insert
String updateSql =
    """
    UPDATE tb_model_metrics_test
    SET tp=?, fp=?, fn=?, precisions=?, recall=?, f1_score=?, accuracy=?, iou=?,
        detection_count=?, gt_count=?
    WHERE model_id=? AND model=?
    """;
String insertSql =
"""
INSERT INTO tb_model_metrics_test
(model_id, model, tp, fp, fn, precisions, recall, f1_score, accuracy, iou,
detection_count, gt_count)
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
""";
// per-row processing (upsert within the batch)
for (Object[] row : batchArgs) {
// row order: (model_id, model, tp, fp, fn, precisions, recall, f1_score, accuracy, iou,
// detection_count, gt_count)
int updated =
jdbcTemplate.update(
updateSql, row[2], row[3], row[4], row[5], row[6], row[7], row[8], row[9], row[10],
row[11], row[0], row[1]);
if (updated == 0) {
jdbcTemplate.update(insertSql, row);
}
}
}
@Override
public ModelMetricJsonDto getTestMetricPackingInfo(Long modelId) {
return queryFactory
.select(
Projections.constructor(
ModelMetricJsonDto.class,
modelMasterEntity.modelNo,
modelMasterEntity.modelVer,
Projections.constructor(
Properties.class,
modelMetricsTestEntity.f1Score,
modelMetricsTestEntity.precisions,
modelMetricsTestEntity.recall,
modelMetricsTestEntity.iou,
modelMetricsTrainEntity.loss)))
.from(modelMetricsTestEntity)
.innerJoin(modelMasterEntity)
.on(modelMetricsTestEntity.model.id.eq(modelMasterEntity.id))
.innerJoin(modelMetricsTrainEntity)
.on(
modelMetricsTestEntity.model.eq(modelMetricsTrainEntity.model),
modelMasterEntity.bestEpoch.eq(modelMetricsTrainEntity.epoch))
.where(modelMetricsTestEntity.model.id.eq(modelId))
.orderBy(modelMetricsTestEntity.createdDttm.desc())
.fetchFirst();
}
@Override
public ModelTestFileName findModelTestFileNames(Long modelId) {
return queryFactory
.select(
Projections.constructor(
ModelTestFileName.class, modelMetricsTestEntity.model1, modelMasterEntity.modelVer))
.from(modelMetricsTestEntity)
.innerJoin(modelMasterEntity)
.on(modelMetricsTestEntity.model.id.eq(modelMasterEntity.id))
.where(modelMetricsTestEntity.model.id.eq(modelId))
.fetchOne();
}
@Override
public void updatePackingStart(Long modelId, ZonedDateTime now) {
queryFactory
.update(modelMasterEntity)
.set(modelMasterEntity.packingStrtDttm, now)
.set(modelMasterEntity.packingState, TrainStatusType.READY.getId())
.where(modelMasterEntity.id.eq(modelId))
.execute();
}
@Override
public void updatePackingEnd(Long modelId, ZonedDateTime now, String failSuccState) {
queryFactory
.update(modelMasterEntity)
.set(modelMasterEntity.packingEndDttm, now)
.set(modelMasterEntity.packingState, failSuccState)
.where(modelMasterEntity.id.eq(modelId))
.execute();
}
}
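The upsert loop above reuses each insert-ordered row for the UPDATE statement, whose parameter order moves the key columns `(model_id, model)` to the end for the WHERE clause. A self-contained sketch of that reordering (the column list is taken from the comment in the source):

```java
// Reorders an insert-ordered row
// (model_id, model, tp, fp, fn, precisions, recall, f1_score, accuracy, iou,
//  detection_count, gt_count)
// into the UPDATE parameter order: metric columns first, then model_id, model.
class UpsertArgs {
    static Object[] toUpdateOrder(Object[] row) {
        Object[] out = new Object[row.length];
        System.arraycopy(row, 2, out, 0, row.length - 2); // metric columns first
        out[row.length - 2] = row[0]; // model_id
        out[row.length - 1] = row[1]; // model
        return out;
    }
}
```

This is equivalent to the explicit `row[2] … row[11], row[0], row[1]` argument list in `insertModelMetricsTest`.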

View File

@@ -1,10 +1,17 @@
package com.kamco.cd.training.postgres.repository.train;
import com.kamco.cd.training.postgres.entity.ModelTrainJobEntity;
import java.util.List;
import java.util.Optional;
public interface ModelTrainJobRepositoryCustom {
int findMaxAttemptNo(Long modelId);
Optional<ModelTrainJobEntity> findLatestByModelId(Long modelId);
Optional<ModelTrainJobEntity> findByContainerName(String containerName);
void insertModelTestTrainingRun(Long modelId, Long jobId, int epoch);
List<ModelTrainJobEntity> findRunningJobs();
}

View File

@@ -1,9 +1,15 @@
package com.kamco.cd.training.postgres.repository.train;
import static com.kamco.cd.training.postgres.entity.QModelTestTrainingRunEntity.modelTestTrainingRunEntity;
import static com.kamco.cd.training.postgres.entity.QModelTrainJobEntity.modelTrainJobEntity;
import com.kamco.cd.training.common.enums.JobStatusType;
import com.kamco.cd.training.common.enums.JobType;
import com.kamco.cd.training.postgres.entity.ModelTrainJobEntity;
import com.kamco.cd.training.postgres.entity.QModelTrainJobEntity;
import com.querydsl.jpa.impl.JPAQueryFactory;
import jakarta.persistence.EntityManager;
import java.util.List;
import java.util.Optional;
import org.springframework.stereotype.Repository;
@@ -19,7 +25,7 @@ public class ModelTrainJobRepositoryImpl implements ModelTrainJobRepositoryCusto
/** Maximum attempt_no for the given modelId (0 if none). */
@Override
public int findMaxAttemptNo(Long modelId) {
QModelTrainJobEntity j = modelTrainJobEntity;
Integer max =
queryFactory.select(j.attemptNo.max()).from(j).where(j.modelId.eq(modelId)).fetchOne();
@@ -33,11 +39,61 @@ public class ModelTrainJobRepositoryImpl implements ModelTrainJobRepositoryCusto
*/
@Override
public Optional<ModelTrainJobEntity> findLatestByModelId(Long modelId) {
QModelTrainJobEntity j = modelTrainJobEntity;
ModelTrainJobEntity job =
queryFactory.selectFrom(j).where(j.modelId.eq(modelId)).orderBy(j.id.desc()).fetchFirst();
return Optional.ofNullable(job);
}
@Override
public Optional<ModelTrainJobEntity> findByContainerName(String containerName) {
QModelTrainJobEntity j = modelTrainJobEntity;
ModelTrainJobEntity job =
queryFactory
.selectFrom(j)
.where(j.containerName.eq(containerName))
.orderBy(j.id.desc())
.fetchFirst();
return Optional.ofNullable(job);
}
@Override
public void insertModelTestTrainingRun(Long modelId, Long jobId, int epoch) {
Integer maxAttemptNo =
queryFactory
.select(modelTestTrainingRunEntity.attemptNo.max().coalesce(0))
.from(modelTestTrainingRunEntity)
.where(modelTestTrainingRunEntity.modelId.eq(modelId))
.fetchOne();
int nextAttemptNo = (maxAttemptNo == null ? 1 : maxAttemptNo + 1);
queryFactory
.insert(modelTestTrainingRunEntity)
.columns(
modelTestTrainingRunEntity.modelId,
modelTestTrainingRunEntity.attemptNo,
modelTestTrainingRunEntity.jobId,
modelTestTrainingRunEntity.epoch)
.values(modelId, nextAttemptNo, jobId, epoch)
.execute();
}
@Override
public List<ModelTrainJobEntity> findRunningJobs() {
return queryFactory
.select(modelTrainJobEntity)
.from(modelTrainJobEntity)
.where(
modelTrainJobEntity
.statusCd
.eq(JobStatusType.RUNNING.getId())
.and(modelTrainJobEntity.jobType.eq(JobType.TRAIN.getId())))
.orderBy(modelTrainJobEntity.id.desc())
.fetch();
}
}
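`insertModelTestTrainingRun` derives the next attempt number as `max(attempt_no) + 1`, where the query coalesces the aggregate to 0 for an empty table and the Java side additionally guards against a null result. A minimal sketch of that computation:

```java
// Mirrors the next-attempt computation: coalesce-to-0 plus a defensive
// null check both yield 1 for the first attempt.
class AttemptNo {
    static int next(Integer maxAttemptNo) {
        return maxAttemptNo == null ? 1 : maxAttemptNo + 1;
    }
}
```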

View File

@@ -12,4 +12,6 @@ public interface ModelTrainMetricsJobRepositoryCustom {
void updateModelMetricsTrainSaveYn(Long modelId, String stepNo);
void insertModelMetricsValidation(List<Object[]> batchArgs);
void updateModelSelectedBestEpoch(Long modelId, Integer epoch);
}

View File

@@ -82,4 +82,13 @@ public class ModelTrainMetricsJobRepositoryImpl extends QuerydslRepositorySuppor
jdbcTemplate.batchUpdate(sql, batchArgs);
}
@Override
public void updateModelSelectedBestEpoch(Long modelId, Integer epoch) {
queryFactory
.update(modelMasterEntity)
.set(modelMasterEntity.bestEpoch, epoch)
.where(modelMasterEntity.id.eq(modelId))
.execute();
}
}

View File

@@ -1,6 +1,7 @@
package com.kamco.cd.training.train;
import com.kamco.cd.training.config.api.ApiResponseDto;
import com.kamco.cd.training.train.service.DataSetCountersService;
import com.kamco.cd.training.train.service.TestJobService;
import com.kamco.cd.training.train.service.TrainJobService;
import io.swagger.v3.oas.annotations.Operation;
@@ -12,6 +13,8 @@ import io.swagger.v3.oas.annotations.responses.ApiResponses;
import io.swagger.v3.oas.annotations.tags.Tag;
import java.util.UUID;
import lombok.RequiredArgsConstructor;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestMapping;
@@ -25,6 +28,7 @@ public class TrainApiController {
private final TrainJobService trainJobService;
private final TestJobService testJobService;
private final DataSetCountersService dataSetCountersService;
@Operation(summary = "학습 실행", description = "학습 실행 API")
@ApiResponses(
@@ -46,6 +50,7 @@ public class TrainApiController {
UUID uuid) {
Long modelId = trainJobService.getModelIdByUuid(uuid);
trainJobService.enqueue(modelId);
return ApiResponseDto.ok("ok");
}
@@ -180,10 +185,32 @@ public class TrainApiController {
})
@PostMapping("/create-tmp/{uuid}")
public ApiResponseDto<UUID> createTmpFile(
@Parameter(description = "model uuid", example = "80a0e544-36ed-4999-b705-97427f23337d")
@PathVariable
UUID uuid) {
return ApiResponseDto.ok(trainJobService.createTmpFile(uuid));
}
@Operation(summary = "getCount", description = "getCount 서버 로그확인")
@ApiResponses(
value = {
@ApiResponse(
responseCode = "200",
description = "데이터셋 tmp 파일생성 성공",
content =
@Content(
mediaType = "application/json",
schema = @Schema(implementation = String.class))),
@ApiResponse(responseCode = "400", description = "잘못된 검색 조건", content = @Content),
@ApiResponse(responseCode = "500", description = "서버 오류", content = @Content)
})
@GetMapping(path = "/counts/{uuid}", produces = MediaType.APPLICATION_JSON_VALUE)
public ApiResponseDto<String> getCount(
@Parameter(description = "uuid", example = "e22181eb-2ac4-4100-9941-d06efce25c49")
@PathVariable
UUID uuid) {
Long modelId = trainJobService.getModelIdByUuid(uuid);
return ApiResponseDto.ok(dataSetCountersService.getCount(modelId));
}
}

View File

@@ -13,4 +13,10 @@ public class EvalRunRequest {
private String uuid;
private int epoch; // best_changed_fscore_epoch_1.pth
private Integer timeoutSeconds;
private String datasetFolder;
private String outputFolder;
public String getOutputFolder() {
return this.outputFolder;
}
}

View File

@@ -0,0 +1,24 @@
package com.kamco.cd.training.train.dto;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
public class ModelTrainLinkDto {
private Long modelId;
private String trainType;
private String modelNo;
private Long datasetId;
private String targetClassCd;
private String comparePath;
private String targetPath;
private String labelPath;
private String geoJsonPath;
private String datasetUid;
}

View File

@@ -1,6 +1,8 @@
package com.kamco.cd.training.train.dto;
import com.fasterxml.jackson.annotation.JsonProperty;
import io.swagger.v3.oas.annotations.media.Schema;
import java.util.UUID;
import lombok.AllArgsConstructor;
import lombok.Getter;
@@ -20,4 +22,38 @@ public class ModelTrainMetricsDto {
private String responsePath;
private UUID uuid;
}
@Getter
@AllArgsConstructor
public static class ModelMetricJsonDto {
@JsonProperty("cd_model_type")
private String cdModelType;
@JsonProperty("model_version")
private String modelVersion;
private Properties properties;
}
@Getter
@AllArgsConstructor
public static class Properties {
@JsonProperty("f1_score")
private Float f1Score;
private Float precision;
private Float recall;
private Float loss;
private Double iou;
}
@Getter
@AllArgsConstructor
public static class ModelTestFileName {
private String bestEpochFileName;
private String modelVersion;
}
}

View File

@@ -0,0 +1,230 @@
package com.kamco.cd.training.train.service;
import com.kamco.cd.training.model.dto.ModelTrainMngDto;
import com.kamco.cd.training.postgres.core.ModelTrainMngCoreService;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;
import lombok.Getter;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;
/** Hard links for training run files. */
@Service
@Log4j2
@RequiredArgsConstructor
public class DataSetCountersService {
private final ModelTrainMngCoreService modelTrainMngCoreService;
@Value("${train.docker.requestDir}")
private String requestDir;
@Value("${train.docker.basePath}")
private String trainBaseDir;
public String getCount(Long modelId) {
ModelTrainMngDto.Basic basic = modelTrainMngCoreService.findModelById(modelId);
List<Long> datasetIds = modelTrainMngCoreService.findModelDatasetMapp(modelId);
List<String> uids = modelTrainMngCoreService.findDatasetUid(datasetIds);
StringBuilder allLogs = new StringBuilder();
try {
// request folders
for (String uid : uids) {
Path path = Path.of(requestDir, uid);
DatasetCounters counters = countTmpAfterBuild(path);
allLogs.append(counters.prints(uid, "REQUEST")).append(System.lineSeparator());
}
// tmp
Path tmpPath = Path.of(trainBaseDir, "tmp", basic.getRequestPath());
// log the differences
diffMergedRequestsVsTmp(uids, tmpPath);
DatasetCounters counters2 = countTmpAfterBuild(tmpPath);
allLogs
.append(counters2.prints(basic.getRequestPath(), "TMP"))
.append(System.lineSeparator());
} catch (IOException e) {
log.error(e.getMessage());
}
return allLogs.toString();
}
private int countTif(Path dir) throws IOException {
if (!Files.isDirectory(dir)) return 0;
try (var stream = Files.walk(dir)) {
return (int)
stream.filter(Files::isRegularFile).filter(p -> p.toString().endsWith(".tif")).count();
}
/*
If case-insensitive matching and geojson counting are needed:
* try (var stream = Files.walk(dir)) {
return (int)
stream
.filter(Files::isRegularFile)
.filter(p -> {
String name = p.getFileName().toString().toLowerCase();
return name.endsWith(".tif") || name.endsWith(".geojson");
})
.count();
}
* */
}
public DatasetCounters countTmpAfterBuild(Path path) throws IOException {
// input1
int in1Train = countTif(path.resolve("train/input1"));
int in1Val = countTif(path.resolve("val/input1"));
int in1Test = countTif(path.resolve("test/input1"));
// input2
int in2Train = countTif(path.resolve("train/input2"));
int in2Val = countTif(path.resolve("val/input2"));
int in2Test = countTif(path.resolve("test/input2"));
List<DatasetCounter> input1List = new ArrayList<>();
List<DatasetCounter> input2List = new ArrayList<>();
input1List.add(new DatasetCounter(in1Train, in1Test, in1Val));
input2List.add(new DatasetCounter(in2Train, in2Test, in2Val));
return new DatasetCounters(input1List, input2List);
}
@Getter
public static class DatasetCounter {
private int inputTrain = 0;
private int inputTest = 0;
private int inputVal = 0;
public DatasetCounter(int inputTrain, int inputTest, int inputVal) {
this.inputTrain = inputTrain;
this.inputTest = inputTest;
this.inputVal = inputVal;
}
}
@Getter
public static class DatasetCounters {
private List<DatasetCounter> input1 = new ArrayList<>();
private List<DatasetCounter> input2 = new ArrayList<>();
public DatasetCounters(List<DatasetCounter> input1, List<DatasetCounter> input2) {
this.input1 = input1;
this.input2 = input2;
}
public String prints(String uuid, String type) {
int train = 0, test = 0, val = 0;
int train2 = 0, test2 = 0, val2 = 0;
for (DatasetCounter datasetCounter : input1) {
train += datasetCounter.inputTrain;
test += datasetCounter.inputTest;
val += datasetCounter.inputVal;
}
for (DatasetCounter datasetCounter : input2) {
train2 += datasetCounter.inputTrain;
test2 += datasetCounter.inputTest;
val2 += datasetCounter.inputVal;
}
log.info("======== UUID FOLDER COUNT {} : {}", type, uuid);
log.info("input 1 = train : {} | val : {} | test : {} ", train, val, test);
log.info("input 2 = train : {} | val : {} | test : {} ", train2, val2, test2);
log.info(
"*total* = train : {} | val : {} | test : {} ",
train + train2,
val + val2,
test + test2);
return String.format(
"======== UUID FOLDER COUNT %s : %s%n"
+ "input 1 = train : %s | val : %s | test : %s%n"
+ "input 2 = train : %s | val : %s | test : %s%n"
+ "*total* = train : %s | val : %s | test : %s",
type,
uuid,
train,
val,
test,
train2,
val2,
test2,
train + train2,
val + val2,
test + test2);
}
}
private Set<String> listTifRelative(Path root) throws IOException {
if (!Files.isDirectory(root)) return Set.of();
try (var stream = Files.walk(root)) {
return stream
.filter(Files::isRegularFile)
.filter(p -> p.getFileName().toString().toLowerCase().endsWith(".tif"))
.map(p -> root.relativize(p).toString().replace("\\", "/"))
.collect(Collectors.toSet());
}
}
private Set<String> listTifFileNameOnly(Path root) throws IOException {
if (!Files.isDirectory(root)) return Set.of();
try (var stream = Files.walk(root)) {
return stream
.filter(Files::isRegularFile)
.filter(p -> p.getFileName().toString().toLowerCase().endsWith(".tif"))
.map(p -> p.getFileName().toString()) // file name only
.collect(Collectors.toSet());
}
}
public void diffMergedRequestsVsTmp(List<String> uids, Path tmpRoot) throws IOException {
// 1) Union of tif "file names" across all requested uids
Set<String> reqAll = new HashSet<>();
for (String uid : uids) {
Path reqRoot = Path.of(requestDir, uid);
// Note: the merged tmp usually has a different folder layout, so comparing file names is more useful than comparing "relative paths".
reqAll.addAll(listTifFileNameOnly(reqRoot));
}
// 2) Set of tif file names under tmp
Set<String> tmpAll = listTifFileNameOnly(tmpRoot);
Set<String> missing = new HashSet<>(reqAll);
missing.removeAll(tmpAll);
Set<String> extra = new HashSet<>(tmpAll);
extra.removeAll(reqAll);
log.info("==== MERGED DIFF (filename-based) ====");
log.info("request(all uids) tif = {}", reqAll.size());
log.info("tmp tif = {}", tmpAll.size());
log.info("missing = {}", missing.size());
log.info("extra = {}", extra.size());
log.info("[MISSING] total = {}", missing.size());
missing.stream().sorted().limit(50).forEach(f -> log.warn("[MISSING] {}", f));
log.info("[EXTRA] total = {}", extra.size());
extra.stream().sorted().limit(50).forEach(f -> log.warn("[EXTRA] {}", f));
}
}
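The filename-based comparison in diffMergedRequestsVsTmp reduces to two set subtractions. A minimal standalone sketch (class name is illustrative):

```java
import java.util.HashSet;
import java.util.Set;

public class FilenameDiff {
    // Elements of expected that are absent from actual ("missing").
    public static Set<String> missing(Set<String> expected, Set<String> actual) {
        Set<String> m = new HashSet<>(expected);
        m.removeAll(actual);
        return m;
    }

    // Elements of actual that were never requested ("extra").
    public static Set<String> extra(Set<String> expected, Set<String> actual) {
        Set<String> e = new HashSet<>(actual);
        e.removeAll(expected);
        return e;
    }
}
```

Because only file names are compared, two files with the same name in different folders collapse into one set element, which is exactly why this check runs before any relative-path comparison.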

View File

@@ -1,22 +1,29 @@
package com.kamco.cd.training.train.service;
import com.kamco.cd.training.postgres.core.ModelTrainJobCoreService;
import com.kamco.cd.training.train.dto.EvalRunRequest;
import com.kamco.cd.training.train.dto.TrainRunRequest;
import com.kamco.cd.training.train.dto.TrainRunResult;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;
@Log4j2
@Service
@RequiredArgsConstructor
public class DockerTrainService {
// Docker image name to run
@@ -39,11 +46,27 @@ public class DockerTrainService {
@Value("${train.docker.shmSize:16g}")
private String shmSize;
// Parent folder of the request/response data paths
@Value("${train.docker.basePath}")
private String basePath;
// Whether to use IPC host
@Value("${train.docker.ipcHost:true}")
private boolean ipcHost;
/** Runs the Docker training container synchronously - executes docker run on the request thread - waits for the container to exit - collects stdout/stderr logs and returns the result */
@Value("${spring.profiles.active}")
private String profile;
private final ModelTrainJobCoreService modelTrainJobCoreService;
/**
* Runs the Docker training container synchronously - executes docker run on the request thread - waits for the container to exit - collects stdout/stderr logs and returns the result
*
* @param req
* @param containerName
* @return
* @throws Exception
*/
public TrainRunResult runTrainSync(TrainRunRequest req, String containerName) throws Exception {
List<String> cmd = buildDockerRunCommand(containerName, req);
@@ -56,12 +79,42 @@ public class DockerTrainService {
ProcessBuilder pb = new ProcessBuilder(cmd);
pb.redirectErrorStream(true);
Process p = pb.start();
// Read logs on a separate thread (so the main thread does not block in readLine)
StringBuilder logBuilder = new StringBuilder();
Process p = pb.start();
Pattern epochPattern = Pattern.compile("(?i)\\bepoch\\s*\\[?(\\d+)\\s*/\\s*(\\d+)\\]?\\b");
log.info("[TRAIN-BOOT] docker run started. container={}", containerName);
try {
log.info("[TRAIN-BOOT] pid={}", p.pid()); // Java 9+
} catch (Throwable ignore) {
}
try {
// Check for just 100 ms whether the process died immediately
if (p.waitFor(100, TimeUnit.MILLISECONDS)) {
int exit = p.exitValue();
String earlyLogs;
synchronized (logBuilder) {
earlyLogs = logBuilder.toString();
}
log.error(
"[TRAIN-BOOT] docker run exited immediately. container={} exit={}",
containerName,
exit);
log.error("[TRAIN-BOOT] early logs:\n{}", earlyLogs);
} else {
log.info("[TRAIN-BOOT] docker run is still running. container={}", containerName);
}
} catch (Exception e) {
log.warn("[TRAIN-BOOT] early-exit check failed: {}", e.toString(), e);
}
Pattern epochPattern = Pattern.compile("Epoch\\(train\\)\\s+\\[(\\d+)\\]\\[(\\d+)/(\\d+)\\]");
// Guards against overly frequent updates
AtomicInteger lastEpoch = new AtomicInteger(0);
AtomicInteger lastIter = new AtomicInteger(0);
Thread logThread =
new Thread(
@@ -73,23 +126,40 @@ public class DockerTrainService {
String line;
while ((line = br.readLine()) != null) {
// 1) Accumulate logs
synchronized (logBuilder) {
logBuilder.append(line).append('\n');
}
// 2) Detect epoch + update DB
Matcher m = epochPattern.matcher(line);
if (m.find()) {
int currentEpoch = Integer.parseInt(m.group(1));
int totalEpoch = Integer.parseInt(m.group(2));
int epoch = Integer.parseInt(m.group(1));
int iter = Integer.parseInt(m.group(2));
int totalIter = Integer.parseInt(m.group(3));
log.info("[EPOCH] container={} {}/{}", containerName, currentEpoch, totalEpoch);
// (Optional) maxEpochs is known from req, so use something like req.getEpochs()
int maxEpochs = req.getEpochs() != null ? req.getEpochs() : 0;
// TODO Persist the running epoch here if needed
// TODO But a transactional DB write here is said to be bad practice...?
// modelTrainMngCoreService.updateCurrentEpoch(modelId,
// currentEpoch, totalEpoch);
// Throttling: at the end of each epoch or every 10 iters
boolean shouldUpdate = (iter == totalIter) || (iter % 10 == 0);
// Avoid duplicate updates
if (shouldUpdate) {
int prevEpoch = lastEpoch.get();
int prevIter = lastIter.get();
if (epoch != prevEpoch || iter != prevIter) {
lastEpoch.set(epoch);
lastIter.set(iter);
log.info(
"[TRAIN] container={} epoch={} iter={}/{}",
containerName,
epoch,
iter,
totalIter);
modelTrainJobCoreService.updateEpoch(containerName, epoch);
}
}
}
}
} catch (Exception e) {
@@ -97,21 +167,6 @@ public class DockerTrainService {
}
},
"train-log-" + containerName);
// new Thread(
// () -> {
// try (BufferedReader br =
// new BufferedReader(
// new InputStreamReader(p.getInputStream(), StandardCharsets.UTF_8))) {
// String line;
// while ((line = br.readLine()) != null) {
// synchronized (log) {
// log.append(line).append('\n');
// }
// }
// } catch (Exception ignored) {
// }
// },
// "train-log-" + containerName);
logThread.setDaemon(true);
logThread.start();
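The epoch/iteration extraction and throttling above can be exercised in isolation. A small sketch of the same regex and update rule (the sample log line in the test is hypothetical):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class EpochLineParser {
    // Same pattern as the service: "Epoch(train) [epoch][iter/totalIter]"
    private static final Pattern EPOCH =
        Pattern.compile("Epoch\\(train\\)\\s+\\[(\\d+)\\]\\[(\\d+)/(\\d+)\\]");

    // Parses a log line into {epoch, iter, totalIter}, or null if it does not match.
    public static int[] parse(String line) {
        Matcher m = EPOCH.matcher(line);
        if (!m.find()) return null;
        return new int[] {
            Integer.parseInt(m.group(1)),
            Integer.parseInt(m.group(2)),
            Integer.parseInt(m.group(3))
        };
    }

    // Same throttling rule: update at the end of an epoch or every 10 iters.
    public static boolean shouldUpdate(int iter, int totalIter) {
        return iter == totalIter || iter % 10 == 0;
    }
}
```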
@@ -206,7 +261,9 @@ public class DockerTrainService {
// Mount the request/result directories as volumes
c.add("-v");
c.add(requestDir + "/tmp:/data");
c.add(basePath + ":" + basePath); // the real paths behind the symbolic links must also be mounted
c.add("-v");
c.add(basePath + "/tmp:/data");
c.add("-v");
c.add(responseDir + ":/checkpoints");
@@ -225,9 +282,15 @@ public class DockerTrainService {
addArg(c, "--output-folder", req.getOutputFolder());
addArg(c, "--input-size", req.getInputSize());
addArg(c, "--crop-size", req.getCropSize());
addArg(c, "--batch-size", req.getBatchSize());
addArg(c, "--gpu-ids", req.getGpuIds());
// addArg(c, "--gpus", req.getGpus());
// addArg(c, "--batch-size", req.getBatchSize());
// addArg(c, "--gpu-ids", req.getGpuIds()); // null
if ("prod".equals(profile)) {
addArg(c, "--batch-size", 2); // training servers with a single GPU support batch-size up to 2
addArg(c, "--gpus", "1"); // must be 1 on single-GPU training servers
addArg(c, "--gpu-ids", "0"); // must be 0 on single-GPU training servers
} else {
addArg(c, "--batch-size", req.getBatchSize()); // single-GPU training servers support batch-size up to 2
}
addArg(c, "--lr", req.getLearningRate());
addArg(c, "--backbone", req.getBackbone());
addArg(c, "--epochs", req.getEpochs());
@@ -264,15 +327,20 @@ public class DockerTrainService {
// ===== Augmentation =====
addArg(c, "--rot-prob", req.getRotProb());
// addArg(c, "--rot-degree", req.getRotDegree()); // TODO uncomment once the AI side is fixed
addArg(c, "--rot-degree", req.getRotDegree());
addArg(c, "--flip-prob", req.getFlipProb());
addArg(c, "--exchange-prob", req.getExchangeProb());
addArg(c, "--brightness-delta", req.getBrightnessDelta());
// addArg(c, "--contrast-range", req.getContrastRange()); // TODO uncomment once the AI side is fixed
// addArg(c, "--saturation-range", req.getSaturationRange()); // TODO uncomment once the AI side is fixed
addArg(c, "--contrast-range", req.getContrastRange());
addArg(c, "--saturation-range", req.getSaturationRange());
addArg(c, "--hue-delta", req.getHueDelta());
addArg(c, "--resume-from", req.getResumeFrom());
if (req.getResumeFrom() != null && !req.getResumeFrom().isBlank()) {
c.add("--resume");
addArg(c, "--load-from", req.getResumeFrom());
}
addArg(c, "--save-interval", 1);
return c;
}
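The command builder above relies on addArg appending a flag/value pair only when the value is non-null, with the prod branch pinning single-GPU settings. A sketch of that conditional argument assembly (values are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class DockerArgs {
    // Adds "--flag value" only when value is non-null (mirrors the addArg helper).
    static void addArg(List<String> c, String flag, Object value) {
        if (value == null) return;
        c.add(flag);
        c.add(String.valueOf(value));
    }

    // Builds GPU-related args: single-GPU (prod) servers are pinned, others pass through.
    public static List<String> gpuArgs(boolean prod, Integer batchSize) {
        List<String> c = new ArrayList<>();
        if (prod) {
            addArg(c, "--batch-size", 2); // single-GPU servers cap batch size at 2
            addArg(c, "--gpus", "1");
            addArg(c, "--gpu-ids", "0");
        } else {
            addArg(c, "--batch-size", batchSize);
        }
        return c;
    }
}
```

The null check means an unset request field simply omits the flag, leaving the container image's own default in effect.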
@@ -296,7 +364,15 @@ public class DockerTrainService {
}
}
public TrainRunResult runEvalSync(EvalRunRequest req, String containerName) throws Exception {
/**
* Runs the Docker evaluation container synchronously - executes docker run on the request thread - waits for the container to exit - collects stdout/stderr logs and returns the result
*
* @param containerName
* @param req
* @return
* @throws Exception
*/
public TrainRunResult runEvalSync(String containerName, EvalRunRequest req) throws Exception {
List<String> cmd = buildDockerEvalCommand(containerName, req);
@@ -341,7 +417,6 @@ public class DockerTrainService {
synchronized (log) {
logs = log.toString();
}
return new TrainRunResult(null, containerName, -1, "TIMEOUT", logs);
}
@@ -370,37 +445,61 @@ public class DockerTrainService {
if (uuid == null || uuid.isBlank()) throw new IllegalArgumentException("uuid is required");
if (epoch == null || epoch <= 0) throw new IllegalArgumentException("epoch must be > 0");
String modelFile = "best_changed_fscore_epoch_" + epoch + ".pth";
Path epochPath = Paths.get(responseDir, req.getOutputFolder());
// Check whether the output folder has a file starting with best_changed_fscore_epoch_ for the best epoch passed as a parameter, then return the pth file name
String modelFile = findCheckpoint(epochPath, epoch);
List<String> c = new ArrayList<>();
c.add("docker");
c.add("run");
c.add("--name");
c.add(containerName);
c.add("--rm");
c.add("--gpus");
c.add("all");
if (ipcHost) c.add("--ipc=host");
c.add("--ipc=host");
c.add("--shm-size=" + shmSize);
c.add("-v");
c.add(requestDir + ":/data");
c.add(basePath + ":" + basePath); // the real paths behind the symbolic links must also be mounted
c.add("-v");
c.add(basePath + "/tmp:/data");
c.add("-v");
c.add(responseDir + ":/checkpoints");
c.add(image);
c.add("kamco-cd-train:latest");
c.add("python");
c.add("/workspace/change-detection-code/run_evaluation_pipeline.py");
c.add("--dataset_dir");
c.add("/data/" + uuid);
addArg(c, "--dataset-folder", req.getDatasetFolder());
addArg(c, "--output-folder", req.getOutputFolder());
c.add("--model");
c.add("/checkpoints/" + uuid + "/" + modelFile);
c.add("--epoch");
c.add(modelFile);
return c;
}
public String findCheckpoint(Path dir, int epoch) {
String bestFileName = String.format("best_changed_fscore_epoch_%d.pth", epoch);
String normalFileName = String.format("epoch_%d.pth", epoch);
Path bestPath = dir.resolve(bestFileName);
Path normalPath = dir.resolve(normalFileName);
// 1. Use the best file if it exists
if (Files.isRegularFile(bestPath)) {
return bestFileName;
}
// 2. Otherwise use the plain epoch file
if (Files.isRegularFile(normalPath)) {
return normalFileName;
}
throw new IllegalStateException("No checkpoint file found. epoch=" + epoch);
}
}
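findCheckpoint prefers the best-fscore checkpoint and falls back to the plain epoch file. The naming rule alone can be sketched as a pure function (class name is illustrative):

```java
public class CheckpointNames {
    // The two candidate checkpoint names tried by findCheckpoint, in priority order.
    public static String[] candidates(int epoch) {
        return new String[] {
            String.format("best_changed_fscore_epoch_%d.pth", epoch), // preferred
            String.format("epoch_%d.pth", epoch) // fallback
        };
    }
}
```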

View File

@@ -0,0 +1,533 @@
package com.kamco.cd.training.train.service;
import com.kamco.cd.training.model.dto.ModelTrainMngDto;
import com.kamco.cd.training.postgres.core.ModelTrainJobCoreService;
import com.kamco.cd.training.postgres.core.ModelTrainMngCoreService;
import com.kamco.cd.training.train.dto.ModelTrainJobDto;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.TimeUnit;
import java.util.stream.Stream;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.annotation.Profile;
import org.springframework.context.event.EventListener;
import org.springframework.stereotype.Component;
import org.springframework.transaction.annotation.Transactional;
/**
* Service that recovers (cleans up) training jobs left in RUNNING state after a server restart.
*
* <p>Example situation: - the server went down due to a forced restart or failure - job_state is still RUNNING in the DB - while the actual docker container may: 1)
* still be alive (running=true) 2) have exited 3) no longer exist because the --rm option already removed it
*
* <p>This class runs at ApplicationReadyEvent (after Spring finishes booting), fetches the RUNNING jobs from the DB, checks each container's state, and marks the job
* SUCCESS/FAILED accordingly.
*/
@Profile("!local")
@Component
@RequiredArgsConstructor
@Log4j2
public class JobRecoveryOnStartupService {
private final ModelTrainJobCoreService modelTrainJobCoreService;
private final ModelTrainMngCoreService modelTrainMngCoreService;
private final ModelTrainMetricsJobService modelTrainMetricsJobService;
/**
* Host-side base path of the response (artifact) directory used by the Docker container. e.g. /data/train/response
*
* <p>Even when the container was removed by --rm, normal termination can still be judged on a "file basis" if val.csv / *.pth files remain under this path.
*/
@Value("${train.docker.responseDir}")
private String responseDir;
/**
* Runs the recovery logic once Spring has fully booted (after all beans are created and initialized).
*
* <p>@Transactional: - the whole recover() method is wrapped in a single transaction. - If an exception occurs while processing one job, the whole batch may roll
* back, so if per-job commits are required, consider (recommended) a separate REQUIRES_NEW transaction per job.
*/
@EventListener(ApplicationReadyEvent.class)
@Transactional
public void recover() {
// 1) Fetch the jobs still left in "RUNNING" state from the DB
List<ModelTrainJobDto> runningJobs = modelTrainJobCoreService.findRunningJobs();
// Nothing to do if there are no running jobs
if (runningJobs == null || runningJobs.isEmpty()) {
return;
}
// 2) Check the docker container state for each job and act on it
for (ModelTrainJobDto job : runningJobs) {
String containerName = job.getContainerName();
try {
// 2-1) Query the container state via docker inspect
DockerInspectState state = inspectContainer(containerName);
// 3) Container does not exist
// - a container started with docker run --rm may be removed right after a normal exit
// - so a missing container does not necessarily mean failure
if (!state.exists()) {
log.warn(
"[RECOVERY] container missing. try file-based reconcile. container={}",
containerName);
// 3-1) With no container, "estimate" completion from the artifacts (responseDir)
OutputResult out = probeOutputs(job);
// 3-2) If the artifacts look sufficient, mark success
if (out.completed()) {
log.info("[RECOVERY] outputs look completed. mark SUCCESS. jobId={}", job.getId());
modelTrainJobCoreService.markSuccess(job.getId(), 0);
// Update the model state
markStepSuccessByJobType(job);
// Register result csv file info
modelTrainMetricsJobService.findTrainValidMetricCsvFiles();
} else {
// 3-3) If the artifacts are insufficient, mark as PAUSED/STOP
// so that the operator can restart the job.
log.warn(
"[RECOVERY] outputs incomplete. mark PAUSED/STOP for restart. jobId={} reason={}",
job.getId(),
out.reason());
Integer modelId = job.getModelId() == null ? null : Math.toIntExact(job.getModelId());
// PAUSED/STOP
modelTrainJobCoreService.markPaused(
job.getId(), modelId, "SERVER_RESTART_CONTAINER_MISSING_OUTPUT_INCOMPLETE");
// Mark the model as STOP, not ERROR
markStepStopByJobType(
job, "SERVER_RESTART_CONTAINER_MISSING_OUTPUT_INCOMPLETE: " + out.reason());
}
continue;
}
// 4) Container exists and running=true
// - only the server restarted while the container stayed alive
// - kill the running docker container, then resume later
if (state.running()) {
log.warn("[RECOVERY] container still running. force kill. container={}", containerName);
try {
// ============================================================
// 1) Send docker kill (SIGKILL) immediately
// - kill terminates the container right away
// ============================================================
ProcessBuilder pb = new ProcessBuilder("docker", "kill", containerName);
pb.redirectErrorStream(true);
Process p = pb.start();
boolean finished = p.waitFor(20, TimeUnit.SECONDS);
if (!finished) {
p.destroyForcibly();
throw new IOException("docker kill timeout");
}
int code = p.exitValue();
if (code != 0) {
throw new IOException("docker kill failed. exit=" + code);
}
// ============================================================
// 2) Verify the container actually died after the kill
// ============================================================
DockerInspectState after = inspectContainer(containerName);
if (after.exists() && after.running()) {
throw new IOException("docker kill returned 0 but container still running");
}
log.info("[RECOVERY] container killed successfully. container={}", containerName);
// ============================================================
// 3) Mark the job PAUSED (forcibly stopped by server restart)
// ============================================================
Integer modelId = job.getModelId() == null ? null : Math.toIntExact(job.getModelId());
modelTrainJobCoreService.markPaused(
job.getId(), modelId, "AUTO_KILLED_ON_SERVER_RESTART");
log.info("job = {}", job);
markStepStopByJobType(job, "AUTO_KILLED_ON_SERVER_RESTART");
} catch (Exception e) {
log.error("[RECOVERY] docker kill failed. container={}", containerName, e);
modelTrainJobCoreService.markFailed(
job.getId(), -1, "AUTO_KILL_FAILED_ON_SERVER_RESTART");
markStepErrorByJobType(job, "AUTO_KILL_FAILED_ON_SERVER_RESTART");
}
continue;
}
// 5) Container exists but running=false
// - states such as exited / dead
Integer exitCode = state.exitCode();
String status = state.status();
// 5-1) exitCode=0 counts as a normal exit → mark SUCCESS
if (exitCode != null && exitCode == 0) {
log.info("[RECOVERY] container exited(0). mark SUCCESS. container={}", containerName);
modelTrainJobCoreService.markSuccess(job.getId(), 0);
// Update the model state
markStepSuccessByJobType(job);
// Register result csv file info
modelTrainMetricsJobService.findTrainValidMetricCsvFiles();
} else {
// 5-2) exitCode != 0 or null counts as a failure → mark FAILED
log.warn(
"[RECOVERY] container exited non-zero. mark FAILED. container={} status={} exitCode={}",
containerName,
status,
exitCode);
modelTrainJobCoreService.markFailed(
job.getId(), exitCode, "SERVER_RESTART_CONTAINER_EXIT_NONZERO");
// Update the model state
markStepErrorByJobType(job, "exit=" + exitCode + " status=" + status);
}
} catch (Exception e) {
// 6) docker inspect itself failed
// - possibly a docker daemon issue, permission issue, or transient error
// - depending on operations policy, a "grace period" instead of immediate failure is also an option
log.error("[RECOVERY] container inspect failed. container={}", containerName, e);
modelTrainJobCoreService.markFailed(
job.getId(), -1, "SERVER_RESTART_CONTAINER_INSPECT_ERROR");
// Update the model state
markStepErrorByJobType(job, "inspect-error");
}
}
}
/**
* Updates the "success step" in the training management table according to jobType.
*
* <p>e.g. - jobType == "EVAL" → step2 (evaluation step) success - otherwise → step1 (training step) success
*/
private void markStepSuccessByJobType(ModelTrainJobDto job) {
Map<String, Object> params = job.getParamsJson();
boolean isEval = params != null && "EVAL".equals(String.valueOf(params.get("jobType")));
if (isEval) {
modelTrainMngCoreService.markStep2Success(job.getModelId());
} else {
modelTrainMngCoreService.markStep1Success(job.getModelId());
}
}
/**
* Updates the "error step" in the training management table according to jobType.
*
* <p>e.g. - jobType == "EVAL" → step2 (evaluation step) error - otherwise → step1 or overall error
*/
private void markStepStopByJobType(ModelTrainJobDto job, String msg) {
Map<String, Object> params = job.getParamsJson();
boolean isEval = params != null && "EVAL".equals(String.valueOf(params.get("jobType")));
if (isEval) {
modelTrainMngCoreService.markStep2Stop(job.getModelId(), msg);
} else {
modelTrainMngCoreService.markStep1Stop(job.getModelId(), msg);
}
}
private void markStepErrorByJobType(ModelTrainJobDto job, String msg) {
Map<String, Object> params = job.getParamsJson();
boolean isEval = params != null && "EVAL".equals(String.valueOf(params.get("jobType")));
if (isEval) {
modelTrainMngCoreService.markStep2Error(job.getModelId(), msg);
} else {
modelTrainMngCoreService.markError(job.getModelId(), msg);
}
}
/**
* Queries the container state via docker inspect.
*
* <p>Template used: {{.State.Status}} {{.State.Running}} {{.State.ExitCode}}
*
* <p>Expected output examples: - "running true 0" - "exited false 0" - "exited false 137"
*
* <p>Notes: - If the container is missing or inspect fails, a non-zero exitCode or empty output yields missing() - A 5-second
* timeout guards against waiting forever
*/
private DockerInspectState inspectContainer(String containerName)
throws IOException, InterruptedException {
ProcessBuilder pb =
new ProcessBuilder(
"docker",
"inspect",
"-f",
"{{.State.Status}} {{.State.Running}} {{.State.ExitCode}}",
containerName);
// Merge stderr into stdout and read them as one stream (error messages included)
pb.redirectErrorStream(true);
Process p = pb.start();
// One line of inspect output is enough, so a single readLine() suffices
String output;
try (BufferedReader br = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
output = br.readLine();
}
// Guard against hanging: force-kill if not finished within 5 seconds
boolean finished = p.waitFor(5, TimeUnit.SECONDS);
if (!finished) {
p.destroyForcibly();
throw new IOException("docker inspect timeout");
}
// Exit code of the docker inspect process itself
int code = p.exitValue();
// Treat a non-zero code or empty output as "container missing"
if (code != 0 || output == null || output.isBlank()) {
return DockerInspectState.missing();
}
// Split into "status running exitCode"
String[] parts = output.trim().split("\\s+");
// status: running/exited/dead, etc.
String status = parts.length > 0 ? parts[0] : "unknown";
// running: true/false
boolean running = parts.length > 1 && Boolean.parseBoolean(parts[1]);
// exitCode: parsed as an integer (null if parsing fails)
Integer exitCode = null;
if (parts.length > 2) {
try {
exitCode = Integer.parseInt(parts[2]);
} catch (Exception ignore) {
// ignore
}
}
return new DockerInspectState(true, running, exitCode, status);
}
/**
* Record holding the docker inspect result.
*
* <p>exists: - true : docker inspect succeeded (container exists) - false : container missing (or an inspect failure treated as missing)
*/
private record DockerInspectState(
boolean exists, boolean running, Integer exitCode, String status) {
static DockerInspectState missing() {
return new DockerInspectState(false, false, null, "missing");
}
}
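The "status running exitCode" parsing in inspectContainer can be isolated from the process handling. A sketch assuming the same one-line format produced by docker inspect -f "{{.State.Status}} {{.State.Running}} {{.State.ExitCode}}":

```java
public class InspectOutputParser {
    public record State(boolean exists, boolean running, Integer exitCode, String status) {}

    // Parses output such as "exited false 137"; null/blank output means the container is missing.
    public static State parse(String output) {
        if (output == null || output.isBlank()) {
            return new State(false, false, null, "missing");
        }
        String[] parts = output.trim().split("\\s+");
        String status = parts.length > 0 ? parts[0] : "unknown";
        boolean running = parts.length > 1 && Boolean.parseBoolean(parts[1]);
        Integer exitCode = null;
        if (parts.length > 2) {
            try {
                exitCode = Integer.parseInt(parts[2]);
            } catch (NumberFormatException ignore) {
                // leave exitCode null when the third token is not numeric
            }
        }
        return new State(true, running, exitCode, status);
    }
}
```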
// ============================================================================================
// Logic that judges complete/incomplete from files when the container is "gone"
// ============================================================================================
/**
* Judges completion on a file basis when the container is gone (only responseDir artifacts remain).
*
* <p>Rules (deliberately conservative): 1) total_epoch must be present in paramsJson (otherwise completion cannot be judged) 2) val.csv must exist and its
* non-header line count must be >= total_epoch 3) there must be at least total_epoch *.pth files, or a best*.pth (or *best*.pth) file must exist
*
* <p>Why? - Some runs save a pth per epoch - others save only the best one, so relying on "pthCount >= total_epoch" alone could misjudge a normal
* finish as a failure.
*/
private OutputResult probeOutputs(ModelTrainJobDto job) {
try {
log.info(
"[RECOVERY] probeOutputs start. jobId={}, modelId={}", job.getId(), job.getModelId());
// 1) Check the output directory
Path outDir = resolveOutputDir(job);
if (outDir == null || !Files.isDirectory(outDir)) {
log.warn("[RECOVERY] output directory missing. jobId={}, path={}", job.getId(), outDir);
return new OutputResult(false, "output-dir-missing");
}
log.info("[RECOVERY] output directory found. jobId={}, path={}", job.getId(), outDir);
// 2) Check totalEpoch
Integer totalEpoch = extractTotalEpoch(job).orElse(null);
if (totalEpoch == null || totalEpoch <= 0) {
log.warn(
"[RECOVERY] totalEpoch missing or invalid. jobId={}, totalEpoch={}",
job.getId(),
totalEpoch);
return new OutputResult(false, "total-epoch-missing");
}
log.info("[RECOVERY] totalEpoch={}. jobId={}", totalEpoch, job.getId());
// 3) Check that val.csv exists
Path valCsv = outDir.resolve("val.csv");
if (!Files.exists(valCsv)) {
log.warn("[RECOVERY] val.csv missing. jobId={}, path={}", job.getId(), valCsv);
return new OutputResult(false, "val.csv-missing");
}
// 4) Count the val.csv lines
long lines = countNonHeaderLines(valCsv);
log.info(
"[RECOVERY] val.csv lines counted. jobId={}, lines={}, expected={}",
job.getId(),
lines,
totalEpoch);
// 5) Judge completion
if (lines == totalEpoch) {
log.info("[RECOVERY] outputs look COMPLETE. jobId={}", job.getId());
return new OutputResult(true, "ok");
}
log.warn(
"[RECOVERY] val.csv line mismatch. jobId={}, lines={}, expected={}",
job.getId(),
lines,
totalEpoch);
return new OutputResult(
false, "val.csv-lines-mismatch lines=" + lines + " expected=" + totalEpoch);
} catch (Exception e) {
log.error("[RECOVERY] probeOutputs error. jobId={}", job.getId(), e);
return new OutputResult(false, "probe-error");
}
}
/**
* Locates the job's artifact directory under responseDir.
*
* <p>The key customization point: adjust only this method to match whatever path convention the production environment uses for artifacts.
*
* <p>Current default search order: 1) {responseDir}/{jobId} 2) {responseDir}/{modelId} 3)
* {responseDir}/{containerName} 4) last fallback: responseDir itself
*
* <p>Tip: - if your convention is something like "{responseDir}/{modelId}/{jobId}", make base.resolve(modelId).resolve(jobId)
* the first candidate.
*/
private Path resolveOutputDir(ModelTrainJobDto job) {
ModelTrainMngDto.Basic model = modelTrainMngCoreService.findModelById(job.getModelId());
Path base = Paths.get(responseDir, model.getUuid().toString(), "metrics");
return Files.isDirectory(base) ? base : null;
}
/**
* Extracts the total_epoch value from paramsJson.
*
* <p>Candidate keys: - "total_epoch" (snake_case) - "totalEpoch" (camelCase)
*
* <p>e.g. paramsJson = {"jobType":"TRAIN","total_epoch":50,...}
*/
private Optional<Integer> extractTotalEpoch(ModelTrainJobDto job) {
Map<String, Object> params = job.getParamsJson();
if (params == null) return Optional.empty();
Object v = params.get("total_epoch");
if (v == null) v = params.get("totalEpoch");
if (v == null) return Optional.empty();
try {
return Optional.of(Integer.parseInt(String.valueOf(v)));
} catch (Exception ignore) {
return Optional.empty();
}
}
/**
* Counts the lines of a CSV file excluding the header (first line).
*
* <p>Assumptions: - the first line of val.csv is a header - subsequent lines are per-epoch records (or a similar cumulative log)
*
* <p>Notes: - the file is assumed to be UTF-8 encoded - blank lines are excluded
*/
private long countNonHeaderLines(Path csv) throws IOException {
try (Stream<String> lines = Files.lines(csv, StandardCharsets.UTF_8)) {
return lines.skip(1).filter(s -> s != null && !s.isBlank()).count();
}
}
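The skip-header-and-blank-lines rule above can be checked end to end against a temporary file (file contents are illustrative):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

public class CsvLineCount {
    // Mirrors countNonHeaderLines: skip the header, ignore blank lines.
    public static long countNonHeaderLines(Path csv) throws IOException {
        try (Stream<String> lines = Files.lines(csv, StandardCharsets.UTF_8)) {
            return lines.skip(1).filter(s -> s != null && !s.isBlank()).count();
        }
    }

    // Writes a small val.csv-style file (header + 3 data lines + 1 blank) and counts it.
    public static long countSample() {
        try {
            Path tmp = Files.createTempFile("val", ".csv");
            Files.writeString(tmp, "Epoch,Loss\n1,0.5\n2,0.4\n\n3,0.3\n", StandardCharsets.UTF_8);
            try {
                return countNonHeaderLines(tmp);
            } finally {
                Files.deleteIfExists(tmp);
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```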
/**
* Counts the files in a directory that match a glob pattern.
*
* <p>e.g. - "*.pth" - "best*.pth"
*/
private long countFilesByGlob(Path dir, String glob) throws IOException {
try (DirectoryStream<Path> ds = Files.newDirectoryStream(dir, glob)) {
long cnt = 0;
for (Path p : ds) {
if (Files.isRegularFile(p)) cnt++;
}
return cnt;
}
}
/** Checks whether at least one file matching the glob pattern exists in the directory. */
private boolean existsByGlob(Path dir, String glob) throws IOException {
try (DirectoryStream<Path> ds = Files.newDirectoryStream(dir, glob)) {
return ds.iterator().hasNext();
}
}
// ============================================================================================
// probeOutputs() result object
// ============================================================================================
/**
* Completion verdict when the container is gone (based on responseDir).
*
* <p>completed: - true : artifacts look complete (safe to mark success) - false : artifacts are insufficient or unclear (judge as failure or defer)
*
* <p>reason: - why it failed or is incomplete (for log/DB messages)
*/
private static final class OutputResult {
private final boolean completed;
private final String reason;
private OutputResult(boolean completed, String reason) {
this.completed = completed;
this.reason = reason;
}
boolean completed() {
return completed;
}
String reason() {
return reason;
}
}
}

View File

@@ -1,14 +1,28 @@
package com.kamco.cd.training.train.service;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import com.kamco.cd.training.common.enums.TrainStatusType;
import com.kamco.cd.training.postgres.core.ModelTestMetricsJobCoreService;
import com.kamco.cd.training.train.dto.ModelTrainMetricsDto.ModelMetricJsonDto;
import com.kamco.cd.training.train.dto.ModelTrainMetricsDto.ModelTestFileName;
import com.kamco.cd.training.train.dto.ModelTrainMetricsDto.ResponsePathDto;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.time.ZonedDateTime;
import java.util.ArrayList;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;
import java.util.stream.Stream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.csv.CSVFormat;
@@ -31,20 +45,11 @@ public class ModelTestMetricsJobService {
@Value("${train.docker.responseDir}")
private String responseDir;
/**
* The currently active profile
*
* @return
*/
private boolean isLocalProfile() {
return "local".equalsIgnoreCase(profile);
}
@Value("${file.pt-path}")
private String ptPathDir;
// @Scheduled(cron = "0 * * * * *")
/** Registers result csv file info */
public void findTestValidMetricCsvFiles() {
// if (isLocalProfile()) {
// return;
// }
List<ResponsePathDto> modelIds =
modelTestMetricsJobCoreService.getTestMetricSaveNotYetModelIds();
@@ -96,11 +101,116 @@ public class ModelTestMetricsJobService {
modelTestMetricsJobCoreService.insertModelMetricsTest(batchArgs);
// Used only as a marker that test.csv has been read and saved
modelTestMetricsJobCoreService.updateModelMetricsTrainSaveYn(
modelInfo.getModelId(), "step2");
} catch (IOException e) {
throw new RuntimeException(e);
}
modelTestMetricsJobCoreService.updateModelMetricsTrainSaveYn(modelInfo.getModelId(), "step2");
// Build the files to package
modelTestMetricsJobCoreService.updatePackingStart(
modelInfo.getModelId(), ZonedDateTime.now());
ModelMetricJsonDto jsonDto =
modelTestMetricsJobCoreService.getTestMetricPackingInfo(modelInfo.getModelId());
try {
writeJsonFile(
jsonDto,
Paths.get(
responseDir
+ "/"
+ modelInfo.getUuid()
+ "/"
+ jsonDto.getModelVersion()
+ ".json"));
} catch (IOException e) {
throw new RuntimeException(e);
}
Path responsePath = Paths.get(responseDir + "/" + modelInfo.getUuid());
ModelTestFileName fileInfo =
modelTestMetricsJobCoreService.findModelTestFileNames(modelInfo.getModelId());
Path zipPath =
Paths.get(
responseDir + "/" + modelInfo.getUuid() + "/" + fileInfo.getModelVersion() + ".zip");
Set<String> targetNames =
Set.of(
"model_config.py",
fileInfo.getBestEpochFileName() + ".pth",
fileInfo.getModelVersion() + ".json");
List<Path> files = new ArrayList<>();
try (Stream<Path> s = Files.list(responsePath)) {
files.addAll(
s.filter(Files::isRegularFile)
.filter(p -> targetNames.contains(p.getFileName().toString()))
.collect(Collectors.toList()));
} catch (IOException e) {
throw new RuntimeException(e);
}
try (Stream<Path> s = Files.list(Path.of(ptPathDir))) {
files.addAll(
s.filter(Files::isRegularFile)
.limit(1) // only one file: yolov8_6th-6m.pt
.collect(Collectors.toList()));
} catch (IOException e) {
throw new RuntimeException(e);
}
try {
zipFiles(files, zipPath);
modelTestMetricsJobCoreService.updatePackingEnd(
modelInfo.getModelId(), ZonedDateTime.now(), TrainStatusType.COMPLETED.getId());
} catch (IOException e) {
modelTestMetricsJobCoreService.updatePackingEnd(
modelInfo.getModelId(), ZonedDateTime.now(), TrainStatusType.ERROR.getId());
throw new RuntimeException(e);
}
}
}
private void writeJsonFile(Object data, Path outputPath) throws IOException {
Path parent = outputPath.getParent();
if (parent != null) {
Files.createDirectories(parent);
}
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.enable(SerializationFeature.INDENT_OUTPUT);
try (OutputStream os =
Files.newOutputStream(
outputPath, StandardOpenOption.CREATE, StandardOpenOption.TRUNCATE_EXISTING)) {
objectMapper.writeValue(os, data);
}
}
private void zipFiles(List<Path> files, Path zipPath) throws IOException {
Path parent = zipPath.getParent();
if (parent != null) {
Files.createDirectories(parent);
}
try (ZipOutputStream zos = new ZipOutputStream(Files.newOutputStream(zipPath))) {
for (Path file : files) {
ZipEntry entry = new ZipEntry(file.getFileName().toString());
zos.putNextEntry(entry);
Files.copy(file, zos);
zos.closeEntry();
}
}
}
}
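zipFiles writes each file as a flat entry named after its file name, with no directory structure. A quick round-trip check using temporary paths (names are illustrative):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;

public class ZipRoundTrip {
    // Same flat-entry zipping as zipFiles: entry name = file name only.
    static void zipFiles(List<Path> files, Path zipPath) throws IOException {
        try (ZipOutputStream zos = new ZipOutputStream(Files.newOutputStream(zipPath))) {
            for (Path file : files) {
                zos.putNextEntry(new ZipEntry(file.getFileName().toString()));
                Files.copy(file, zos);
                zos.closeEntry();
            }
        }
    }

    // Zips two temp files and returns the number of entries in the resulting archive.
    public static int entryCount() {
        try {
            Path a = Files.createTempFile("model", ".json");
            Path b = Files.createTempFile("best", ".pth");
            Path zip = Files.createTempFile("bundle", ".zip");
            zipFiles(List.of(a, b), zip);
            try (ZipFile zf = new ZipFile(zip.toFile())) {
                return zf.size();
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

One consequence of the flat layout: two input files with the same file name would collide on entry names, which is fine here because the target set (model_config.py, the best pth, the version json) is distinct by construction.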

View File

@@ -6,9 +6,14 @@ import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Stream;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.csv.CSVFormat;
@@ -31,20 +36,8 @@ public class ModelTrainMetricsJobService {
@Value("${train.docker.responseDir}")
private String responseDir;
/**
* Whether the currently active profile is "local".
*
* @return
*/
private boolean isLocalProfile() {
return "local".equalsIgnoreCase(profile);
}
// @Scheduled(cron = "0 * * * * *")
/** Register result CSV file information */
public void findTrainValidMetricCsvFiles() {
// if (isLocalProfile()) {
// return;
// }
List<ResponsePathDto> modelIds =
modelTrainMetricsJobCoreService.getTrainMetricSaveNotYetModelIds();
@@ -65,7 +58,7 @@ public class ModelTrainMetricsJobService {
for (CSVRecord record : parser) {
int epoch = Integer.parseInt(record.get("Epoch")) + 1; // TODO: apply -1 once AI development is complete
int epoch = Integer.parseInt(record.get("Epoch"));
long iteration = Long.parseLong(record.get("Iteration"));
double Loss = Double.parseDouble(record.get("Loss"));
double LR = Double.parseDouble(record.get("LR"));
@@ -129,6 +122,33 @@ public class ModelTrainMetricsJobService {
throw new RuntimeException(e);
}
Path responsePath = Paths.get(responseDir + "/" + modelInfo.getUuid());
Integer epoch = null;
boolean exists;
Pattern pattern = Pattern.compile("best_changed_fscore_epoch_(\\d+)\\.pth");
try (Stream<Path> s = Files.list(responsePath)) {
epoch =
s.filter(Files::isRegularFile)
.map(
p -> {
Matcher matcher = pattern.matcher(p.getFileName().toString());
if (matcher.matches()) {
return Integer.parseInt(matcher.group(1)); // extract the numeric part
}
return null;
})
.filter(Objects::nonNull)
.findFirst()
.orElse(null);
} catch (IOException e) {
throw new RuntimeException(e);
}
// best_changed_fscore_epoch_<number>.pth -> take the number and update it as the best epoch
modelTrainMetricsJobCoreService.updateModelSelectedBestEpoch(modelInfo.getModelId(), epoch);
modelTrainMetricsJobCoreService.updateModelMetricsTrainSaveYn(
modelInfo.getModelId(), "step1");
}
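The best-epoch detection above hinges on a single regex over checkpoint file names. A standalone sketch of the same extraction (class and method names are illustrative):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Extracts the epoch number from a checkpoint file name such as
// "best_changed_fscore_epoch_12.pth"; returns null when the name does not match.
public class BestEpochSketch {

    private static final Pattern PATTERN =
        Pattern.compile("best_changed_fscore_epoch_(\\d+)\\.pth");

    static Integer extractBestEpoch(String fileName) {
        Matcher matcher = PATTERN.matcher(fileName);
        // matches() requires the whole file name to fit the pattern
        return matcher.matches() ? Integer.parseInt(matcher.group(1)) : null;
    }
}
```

Note that `matches()` anchors the whole string, so a name with a prefix or suffix around the pattern is rejected rather than partially matched.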


@@ -1,10 +1,10 @@
package com.kamco.cd.training.train.service;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.kamco.cd.training.model.dto.ModelTrainMngDto;
import com.kamco.cd.training.postgres.core.ModelTrainJobCoreService;
import com.kamco.cd.training.postgres.core.ModelTrainMngCoreService;
import com.kamco.cd.training.train.dto.ModelTrainJobQueuedEvent;
import com.kamco.cd.training.train.dto.TrainRunRequest;
import java.time.ZonedDateTime;
import java.util.Map;
import java.util.UUID;
@@ -20,22 +20,38 @@ public class TestJobService {
private final ModelTrainJobCoreService modelTrainJobCoreService;
private final ModelTrainMngCoreService modelTrainMngCoreService;
private final DockerTrainService dockerTrainService;
private final ObjectMapper objectMapper;
private final ApplicationEventPublisher eventPublisher;
private final DataSetCountersService dataSetCounters;
/**
* Reserve an execution (register it in the QUEUE)
*
* @param modelId
* @param uuid
* @param epoch
* @return
*/
@Transactional
public Long enqueue(Long modelId, UUID uuid, int epoch) {
// verify the master record
modelTrainMngCoreService.findModelById(modelId);
// count dataset folders
dataSetCounters.getCount(modelId);
// update the best epoch
modelTrainMngCoreService.updateModelMasterBestEpoch(modelId, epoch);
// look up parameters
TrainRunRequest trainRunRequest = modelTrainMngCoreService.findTrainRunRequest(modelId);
Map<String, Object> params = new java.util.LinkedHashMap<>();
params.put("jobType", "EVAL");
params.put("uuid", String.valueOf(uuid));
params.put("epoch", epoch);
params.put("datasetFolder", trainRunRequest.getDatasetFolder());
params.put("outputFolder", trainRunRequest.getOutputFolder());
int nextAttemptNo = modelTrainJobCoreService.findMaxAttemptNo(modelId) + 1;
@@ -43,13 +59,18 @@ public class TestJobService {
modelTrainJobCoreService.createQueuedJob(
modelId, nextAttemptNo, params, ZonedDateTime.now());
// mark step2 as started
modelTrainMngCoreService.markStep2InProgress(modelId, jobId);
// insert a row into the test training run table
modelTrainJobCoreService.insertModelTestTrainingRun(modelId, jobId, epoch);
eventPublisher.publishEvent(new ModelTrainJobQueuedEvent(jobId));
return jobId;
}
/**
* Cancel
*
* @param modelId
*/
@Transactional
public void cancel(Long modelId) {


@@ -0,0 +1,213 @@
package com.kamco.cd.training.train.service;
import com.kamco.cd.training.train.dto.ModelTrainLinkDto;
import java.io.IOException;
import java.nio.file.*;
import java.util.List;
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;
@Slf4j
@Service
@RequiredArgsConstructor
public class TmpDatasetService {
@Value("${train.docker.requestDir}")
private String requestDir;
@Value("${train.docker.basePath}")
private String trainBaseDir;
/**
* Create links per train, val, and test folder
*
* @param uid temporary folder uuid
* @param type train, val, or test
* @param links full paths of the tif files
* @throws IOException
*/
public void buildTmpDatasetHardlink(String uid, String type, List<ModelTrainLinkDto> links)
throws IOException {
if (links == null || links.isEmpty()) {
throw new IOException("links is empty");
}
Path tmp = Path.of(trainBaseDir, "tmp", uid);
long linksMade = 0;
for (ModelTrainLinkDto dto : links) {
if (type == null) {
log.warn("SKIP - trainType null: {}", dto);
continue;
}
// create the per-type directories
Files.createDirectories(tmp.resolve(type).resolve("input1"));
Files.createDirectories(tmp.resolve(type).resolve("input2"));
Files.createDirectories(tmp.resolve(type).resolve("label"));
Files.createDirectories(tmp.resolve(type).resolve("label-json"));
// comparePath → input1
linksMade += link(tmp, type, "input1", dto.getComparePath());
// targetPath → input2
linksMade += link(tmp, type, "input2", dto.getTargetPath());
// labelPath → label
linksMade += link(tmp, type, "label", dto.getLabelPath());
// geoJsonPath -> label-json
linksMade += link(tmp, type, "label-json", dto.getGeoJsonPath());
}
if (linksMade == 0) {
throw new IOException("No symlinks created.");
}
log.info("tmp dataset created: {}, linksMade={}", tmp, linksMade);
}
private long link(Path tmp, String type, String part, String fullPath) throws IOException {
if (fullPath == null || fullPath.isBlank()) return 0;
Path src = Path.of(fullPath);
if (!Files.isRegularFile(src)) {
log.warn("SKIP (not file): {}", src);
return 0;
}
String fileName = src.getFileName().toString();
Path dst = tmp.resolve(type).resolve(part).resolve(fileName);
Files.createDirectories(dst.getParent());
if (Files.exists(dst) || Files.isSymbolicLink(dst)) {
Files.delete(dst);
}
Files.createSymbolicLink(dst, src);
log.info("symlink created: {} -> {}", dst, src);
return 1;
}
/**
* Link the entire request folder
*
* @param uid
* @param datasetUids
* @return
* @throws IOException
*/
public String buildTmpDatasetSymlink(String uid, List<String> datasetUids) throws IOException {
log.info("========== buildTmpDatasetSymlink START ==========");
log.info("uid={}", uid);
log.info("datasetUids={}", datasetUids);
log.info("requestDir(raw)={}", requestDir);
Path BASE = toPath(requestDir);
Path tmp = Path.of(trainBaseDir, "tmp", uid);
log.info("BASE={}", BASE);
log.info("BASE exists? {}", Files.isDirectory(BASE));
log.info("tmp={}", tmp);
long noDir = 0, scannedDirs = 0, regularFiles = 0, symlinksMade = 0;
// prepare the tmp directory tree
for (String type : List.of("train", "val", "test")) {
for (String part : List.of("input1", "input2", "label", "label-json")) {
Path dir = tmp.resolve(type).resolve(part);
Files.createDirectories(dir);
log.info("createDirectories: {}", dir);
}
}
// symbolic links work across file systems, so no FileStore check is needed
for (String id : datasetUids) {
Path srcRoot = BASE.resolve(id);
log.info("---- dataset id={} srcRoot={} exists? {}", id, srcRoot, Files.isDirectory(srcRoot));
for (String type : List.of("train", "val", "test")) {
for (String part : List.of("input1", "input2", "label", "label-json")) {
Path srcDir = srcRoot.resolve(type).resolve(part);
if (!Files.isDirectory(srcDir)) {
log.warn("SKIP (not directory): {}", srcDir);
noDir++;
continue;
}
scannedDirs++;
log.info("SCAN dir={}", srcDir);
try (DirectoryStream<Path> stream = Files.newDirectoryStream(srcDir)) {
for (Path f : stream) {
if (!Files.isRegularFile(f)) {
log.debug("skip non-regular file: {}", f);
continue;
}
regularFiles++;
String dstName = id + "__" + f.getFileName();
Path dst = tmp.resolve(type).resolve(part).resolve(dstName);
// delete any leftover dst (whether a symlink or a regular file)
if (Files.exists(dst) || Files.isSymbolicLink(dst)) {
Files.delete(dst);
log.debug("deleted existing: {}", dst);
}
try {
// create the symbolic link (works across file systems)
Files.createSymbolicLink(dst, f);
symlinksMade++;
log.debug("created symlink: {} => {}", dst, f);
} catch (IOException e) {
log.error("FAILED create symlink: {} => {}", dst, f, e);
throw e;
}
}
}
}
}
}
if (symlinksMade == 0) {
throw new IOException(
"No symlinks created. regularFiles="
+ regularFiles
+ ", scannedDirs="
+ scannedDirs
+ ", noDir="
+ noDir);
}
log.info("tmp dataset created: {}", tmp);
log.info(
"summary: scannedDirs={}, noDir={}, regularFiles={}, symlinksMade={}",
scannedDirs,
noDir,
regularFiles,
symlinksMade);
return uid;
}
private static Path toPath(String p) {
if (p.startsWith("~/")) {
return Paths.get(System.getProperty("user.home")).resolve(p.substring(2)).normalize();
}
return Paths.get(p).toAbsolutePath().normalize();
}
}
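The `toPath` helper above normalizes configured directories, expanding a leading `~/` against `user.home`. A self-contained sketch of the same logic (class name is illustrative):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

// Sketch of TmpDatasetService.toPath: paths starting with "~/" are resolved
// under user.home; everything else is absolutized and normalized.
public class HomePathSketch {

    static Path toPath(String p) {
        if (p.startsWith("~/")) {
            return Paths.get(System.getProperty("user.home")).resolve(p.substring(2)).normalize();
        }
        return Paths.get(p).toAbsolutePath().normalize();
    }
}
```

One caveat of this approach: a bare `~` without a trailing slash is not expanded, which matches the original implementation.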


@@ -1,12 +1,13 @@
package com.kamco.cd.training.train.service;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.kamco.cd.training.common.enums.TrainStatusType;
import com.kamco.cd.training.common.exception.CustomApiException;
import com.kamco.cd.training.model.dto.ModelTrainMngDto;
import com.kamco.cd.training.model.service.TmpDatasetService;
import com.kamco.cd.training.postgres.core.ModelTrainJobCoreService;
import com.kamco.cd.training.postgres.core.ModelTrainMngCoreService;
import com.kamco.cd.training.train.dto.ModelTrainJobDto;
import com.kamco.cd.training.train.dto.ModelTrainJobQueuedEvent;
import com.kamco.cd.training.train.dto.ModelTrainLinkDto;
import com.kamco.cd.training.train.dto.TrainRunRequest;
import java.io.IOException;
import java.nio.file.Files;
@@ -17,12 +18,15 @@ import java.util.List;
import java.util.Map;
import java.util.UUID;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.http.HttpStatus;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
@Service
@Log4j2
@RequiredArgsConstructor
@Transactional(readOnly = true)
public class TrainJobService {
@@ -33,6 +37,7 @@ public class TrainJobService {
private final ObjectMapper objectMapper;
private final ApplicationEventPublisher eventPublisher;
private final TmpDatasetService tmpDatasetService;
private final DataSetCountersService dataSetCounters;
// host directory where training results are stored
@Value("${train.docker.responseDir}")
@@ -42,10 +47,18 @@ public class TrainJobService {
return modelTrainMngCoreService.findModelIdByUuid(uuid);
}
/** Reserve an execution (register it in the QUEUE) */
/**
* Reserve an execution (register it in the QUEUE)
*
* @param modelId
* @return
*/
@Transactional
public Long enqueue(Long modelId) {
// count dataset folders
dataSetCounters.getCount(modelId);
// verify the master exists (throws if missing)
modelTrainMngCoreService.findModelById(modelId);
@@ -131,15 +144,18 @@ public class TrainJobService {
modelTrainMngCoreService.markStopped(modelId);
}
/**
* Resume training
*
* @param modelId model id
* @param mode NONE to start fresh, REQUIRE to resume
* @return
*/
private Long createNextAttempt(Long modelId, ResumeMode mode) {
ModelTrainMngDto.Basic master = modelTrainMngCoreService.findModelById(modelId);
if (TrainStatusType.IN_PROGRESS.getId().equals(master.getStatusCd())) {
throw new IllegalStateException("Already in progress.");
}
var lastJob =
ModelTrainJobDto lastJob =
modelTrainJobCoreService
.findLatestByModelId(modelId)
.orElseThrow(() -> new IllegalStateException("No previous run history."));
@@ -163,7 +179,7 @@ public class TrainJobService {
// detect a checkpoint and set resumeFrom
String resumeFrom = findResumeFromOrNull(nextParams);
if (resumeFrom == null) {
throw new IllegalStateException("No checkpoint to resume from.");
throw new CustomApiException("NOT_FOUND_DATA", HttpStatus.NOT_FOUND, "No checkpoint to resume from.");
}
nextParams.put("resumeFrom", resumeFrom);
nextParams.put("resume", true);
@@ -185,55 +201,112 @@ public class TrainJobService {
REQUIRE // resume
}
/**
* Detect the resume checkpoint and set resumeFrom
*
* @param paramsJson
* @return
*/
public String findResumeFromOrNull(Map<String, Object> paramsJson) {
if (paramsJson == null) return null;
Object out = paramsJson.get("outputFolder");
if (out == null) return null;
String outputFolder = String.valueOf(out).trim(); // uuid
String outputFolder = String.valueOf(out).trim();
if (outputFolder.isEmpty()) return null;
// path from the host's perspective
Path outDir = Paths.get(responseDir, outputFolder);
log.info("resume outDir response path: {}", outDir);
Path last = outDir.resolve("last_checkpoint");
log.info("resume last response path: {}", last);
if (!Files.isRegularFile(last)) return null;
try {
String ckptFile = Files.readString(last).trim(); // epoch_10.pth
if (ckptFile.isEmpty()) return null;
// read the last_checkpoint contents as-is
String containerPath = Files.readString(last).trim();
log.info("resume containerPath: {}", containerPath);
Path ckptHost = outDir.resolve(ckptFile);
if (!Files.isRegularFile(ckptHost)) return null;
if (containerPath.isEmpty()) return null;
// return the container path
return "/checkpoints/" + outputFolder + "/" + ckptFile;
// convert to a host path and verify the file actually exists
String hostPathStr = containerPath.replace("/checkpoints", responseDir);
Path hostPath = Paths.get(hostPathStr);
log.info("resume hostPath: {}", hostPath);
if (!Files.isRegularFile(hostPath)) return null;
// return the container path unchanged
return containerPath;
} catch (Exception e) {
log.error("resume error", e);
return null;
}
}
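The resume logic above reads a container-side path from `last_checkpoint` and maps it to a host path by swapping the `/checkpoints` prefix for the configured `responseDir`. That mapping is a pure string operation and can be sketched on its own (class name and the example `responseDir` value are illustrative):

```java
// Pure-string sketch of the container -> host checkpoint path mapping used in
// findResumeFromOrNull. The host path is only used for an existence check;
// the container path is what gets handed back to the docker run.
public class ResumePathSketch {

    // e.g. "/checkpoints/<uuid>/epoch_10.pth"
    //   -> "/home/kcomu/data/response/<uuid>/epoch_10.pth"
    static String toHostPath(String containerPath, String responseDir) {
        return containerPath.replace("/checkpoints", responseDir);
    }
}
```

A plain `replace` assumes `/checkpoints` appears only as the leading prefix; checkpoint file names written by the trainer satisfy that.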
/**
* Merge the dataset files needed for training into a single temporary folder
*
* @param modelUuid
* @return
*/
@Transactional
public UUID createTmpFile(UUID modelUuid) {
UUID tmpUuid = UUID.randomUUID();
String raw = tmpUuid.toString().toUpperCase().replace("-", "");
// fetch the model id
Long modelId = modelTrainMngCoreService.findModelIdByUuid(modelUuid);
// fetch the dataset ids linked to the model
List<Long> datasetIds = modelTrainMngCoreService.findModelDatasetMapp(modelId);
List<String> uids = modelTrainMngCoreService.findDatasetUid(datasetIds);
try {
// create dataset symlinks
String pathUid = tmpDatasetService.buildTmpDatasetSymlink(raw, uids);
// String pathUid = tmpDatasetService.buildTmpDatasetSymlink(raw, uids);
// look up train paths per model class
List<ModelTrainLinkDto> trainList = modelTrainMngCoreService.findDatasetTrainPath(modelId);
// look up validation paths per model class
List<ModelTrainLinkDto> valList = modelTrainMngCoreService.findDatasetValPath(modelId);
// look up test paths per model class
List<ModelTrainLinkDto> testList = modelTrainMngCoreService.findDatasetTestPath(modelId);
log.info(
"createTmpFile class list trainList = {} valList = {} testList = {}",
trainList.size(),
valList.size(),
testList.size());
// create train dataset symlinks
tmpDatasetService.buildTmpDatasetHardlink(raw, "train", trainList);
// create val dataset symlinks
tmpDatasetService.buildTmpDatasetHardlink(raw, "val", valList);
// create test dataset symlinks
tmpDatasetService.buildTmpDatasetHardlink(raw, "test", testList);
ModelTrainMngDto.UpdateReq updateReq = new ModelTrainMngDto.UpdateReq();
updateReq.setRequestPath(pathUid);
updateReq.setRequestPath(raw);
// update the training model record
modelTrainMngCoreService.updateModelMaster(modelId, updateReq);
} catch (IOException e) {
throw new RuntimeException(e);
}
log.error(
"createTmpFile failed. modelUuid={}, modelId={}, tmpRaw={}, datasetIdsSize={}, uidsSize={}",
modelUuid,
modelId,
raw,
(datasetIds == null ? null : datasetIds.size()),
(uids == null ? null : uids.size()),
e);
// wrap in a runtime exception, keeping the key details in the message
throw new CustomApiException(
"INTERNAL_SERVER_ERROR", HttpStatus.INTERNAL_SERVER_ERROR, "Failed to create the temporary dataset.");
}
return modelUuid;
}
}
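`createTmpFile` above derives the temp folder name from a random UUID by uppercasing it and stripping the dashes, yielding a fixed 32-character hex name. A sketch of that naming (class and method names are illustrative):

```java
import java.util.UUID;

// Sketch of the tmp-folder naming used in createTmpFile: a UUID rendered as
// 32 uppercase hex characters with the dashes removed.
public class TmpNameSketch {

    static String toRawName(UUID uuid) {
        return uuid.toString().toUpperCase().replace("-", "");
    }
}
```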


@@ -11,24 +11,31 @@ import com.kamco.cd.training.train.dto.TrainRunRequest;
import com.kamco.cd.training.train.dto.TrainRunResult;
import java.util.Map;
import lombok.RequiredArgsConstructor;
import lombok.extern.log4j.Log4j2;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Component;
import org.springframework.transaction.event.TransactionPhase;
import org.springframework.transaction.event.TransactionalEventListener;
/** Job execution */
@Log4j2
@Component
@RequiredArgsConstructor
public class TrainJobWorker {
private final ModelTrainJobCoreService modelTrainJobCoreService;
private final ModelTrainMngCoreService modelTrainMngCoreService;
private final ModelTrainMetricsJobService modelTrainMetricsJobService;
private final ModelTestMetricsJobService modelTestMetricsJobService;
private final DockerTrainService dockerTrainService;
private final ObjectMapper objectMapper;
@Async
@Async("trainJobExecutor")
@TransactionalEventListener(phase = TransactionPhase.AFTER_COMMIT)
public void handle(ModelTrainJobQueuedEvent event) {
log.info("[JOB] thread={}, jobId={}", Thread.currentThread().getName(), event.getJobId());
Long jobId = event.getJobId();
ModelTrainJobDto job =
@@ -50,29 +57,45 @@ public class TrainJobWorker {
String containerName =
(isEval ? "eval-" : "train-") + jobId + "-" + params.get("uuid").toString().substring(0, 8);
String type = isEval ? "TEST" : "TRAIN";
Integer totalEpoch = null;
if (params.containsKey("totalEpoch")) {
if (params.get("totalEpoch") != null) {
totalEpoch = Integer.parseInt(params.get("totalEpoch").toString());
}
}
modelTrainJobCoreService.markRunning(jobId, containerName, null, "TRAIN_WORKER", totalEpoch);
log.info("[JOB] markRunning start jobId={}, containerName={}", jobId, containerName);
// mark the start of execution
modelTrainJobCoreService.markRunning(
jobId, containerName, null, "TRAIN_WORKER", totalEpoch, type);
log.info("[JOB] markRunning done jobId={}", jobId);
try {
TrainRunResult result;
if (isEval) {
// mark step2 in progress
modelTrainMngCoreService.markStep2InProgress(modelId, jobId);
String uuid = String.valueOf(params.get("uuid"));
int epoch = (int) params.get("epoch");
String datasetFolder = String.valueOf(params.get("datasetFolder"));
String outputFolder = String.valueOf(params.get("outputFolder"));
EvalRunRequest evalReq = new EvalRunRequest(uuid, epoch, null);
result = dockerTrainService.runEvalSync(evalReq, containerName);
EvalRunRequest evalReq = new EvalRunRequest();
evalReq.setUuid(uuid);
evalReq.setEpoch(epoch);
evalReq.setTimeoutSeconds(null);
evalReq.setDatasetFolder(datasetFolder);
evalReq.setOutputFolder(outputFolder);
log.info("[JOB] selected test epoch={}", epoch);
// run docker and collect logs
result = dockerTrainService.runEvalSync(containerName, evalReq);
} else {
// mark step1 in progress
modelTrainMngCoreService.markStep1InProgress(modelId, jobId);
TrainRunRequest trainReq = toTrainRunRequest(params);
// run docker and collect logs
result = dockerTrainService.runTrainSync(trainReq, containerName);
}
@@ -85,32 +108,65 @@ public class TrainJobWorker {
return;
}
/**
* Exit codes: 0 normal exit -> SUCCESS; 1~125 training-code error -> FAILED; 137 OOMKill ->
* FAILED; 143 SIGTERM (stop) -> STOP; -1 internal forced stop -> STOP
*/
if (result.getExitCode() == 0) {
// mark success
modelTrainJobCoreService.markSuccess(jobId, result.getExitCode());
if (isEval) {
// mark step2 complete
modelTrainMngCoreService.markStep2Success(modelId);
// register result CSV file info
modelTestMetricsJobService.findTestValidMetricCsvFiles();
} else {
modelTrainMngCoreService.markStep1Success(modelId);
// register result CSV file info
modelTrainMetricsJobService.findTrainValidMetricCsvFiles();
}
} else {
String failMsg = result.getStatus() + "\n" + result.getLogs();
log.info("training fail exitCode={} Msg ={}", result.getExitCode(), failMsg);
if (result.getExitCode() == -1 || result.getExitCode() == 143) {
// mark as paused
modelTrainJobCoreService.markPaused(
jobId, result.getExitCode(), result.getStatus() + "\n" + result.getLogs());
if (isEval) {
// record error info
modelTrainMngCoreService.markStep2Stop(modelId, "exit=" + result.getExitCode());
} else {
// record error info
modelTrainMngCoreService.markStep1Stop(modelId, "exit=" + result.getExitCode());
}
} else {
// mark as failed
modelTrainJobCoreService.markFailed(
jobId, result.getExitCode(), result.getStatus() + "\n" + result.getLogs());
if (isEval) {
// record error info
modelTrainMngCoreService.markStep2Error(modelId, "exit=" + result.getExitCode());
} else {
// record error info
modelTrainMngCoreService.markError(modelId, "exit=" + result.getExitCode());
}
}
}
} catch (Exception e) {
modelTrainJobCoreService.markFailed(jobId, null, e.toString());
modelTrainJobCoreService.markFailed(jobId, null, e.getMessage());
if ("EVAL".equals(params.get("jobType"))) {
// record error info
modelTrainMngCoreService.markStep2Error(modelId, e.getMessage());
} else {
// record error info
modelTrainMngCoreService.markError(modelId, e.getMessage());
}
}

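The worker's branching above reduces to one rule over exit codes: 0 succeeds, -1 and 143 pause/stop, everything else (including 137 OOMKill) fails. That classification, pulled out as a pure function for clarity (class name is illustrative):

```java
// Sketch of the exit-code classification described in TrainJobWorker:
// 0 -> SUCCESS; -1 (internal forced stop) and 143 (SIGTERM) -> STOP;
// anything else (1..125 training errors, 137 OOMKill, ...) -> FAILED.
public class ExitCodeSketch {

    static String classify(int exitCode) {
        if (exitCode == 0) return "SUCCESS";
        if (exitCode == -1 || exitCode == 143) return "STOP";
        return "FAILED";
    }
}
```

Keeping the rule in one place like this would make the markPaused/markFailed branching easier to test, though the worker also has per-branch side effects (step1/step2 marking) that a pure function cannot capture.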

@@ -16,7 +16,6 @@ spring:
datasource:
url: jdbc:postgresql://192.168.2.127:15432/kamco_training_db
# url: jdbc:postgresql://localhost:15432/kamco_training_db
username: kamco_training_user
password: kamco_training_user_2025_!@#
hikari:
@@ -48,7 +47,7 @@ member:
init_password: kamco1234!
swagger:
local-port: 9080
local-port: 8080
file:
sync-root-dir: /app/original-images/
@@ -58,11 +57,16 @@ file:
dataset-dir: /home/kcomu/data/request/
dataset-tmp-dir: ${file.dataset-dir}tmp/
pt-path: /home/kcomu/data/response/v6-cls-checkpoints/
pt-FileName: yolov8_6th-6m.pt
train:
docker:
image: "kamco-cd-train:love_latest"
requestDir: "/home/kcomu/data/request"
responseDir: "/home/kcomu/data/response"
containerPrefix: "kamco-cd-train"
shmSize: "16g"
image: kamco-cd-train:latest
requestDir: /home/kcomu/data/request
responseDir: /home/kcomu/data/response
basePath: /home/kcomu/data
containerPrefix: kamco-cd-train
shmSize: 16g
ipcHost: true


@@ -4,19 +4,16 @@ spring:
on-profile: prod
jpa:
show-sql: true
show-sql: false # disabled in production for performance
hibernate:
ddl-auto: validate
properties:
hibernate:
default_batch_fetch_size: 100 # ✅ performance - prevents N+1 queries
order_updates: true # ✅ performance - ordered updates prevent deadlocks
use_sql_comments: true # ⚠️ optional - adds comments to SQL (for debugging)
format_sql: true # ⚠️ optional - SQL formatting (readability)
default_batch_fetch_size: 100 # prevents N+1 queries
order_updates: true # ordered updates prevent deadlocks
datasource:
url: jdbc:postgresql://127.0.01:15432/kamco_training_db
# url: jdbc:postgresql://localhost:15432/kamco_training_db
url: jdbc:postgresql://kamco-cd-train-db:5432/kamco_training_db
username: kamco_training_user
password: kamco_training_user_2025_!@#
hikari:
@@ -31,13 +28,14 @@ spring:
default-timeout: 300 # 5-minute transaction timeout
jwt:
secret: "kamco_token_dev_dfc6446d-68fc-4eba-a2ff-c80a14a0bf3a"
# ⚠️ Be sure to use a separate, strong secret key in production
secret: "kamco_token_prod_CHANGE_THIS_TO_SECURE_SECRET_KEY"
access-token-validity-in-ms: 86400000 # 1 day
refresh-token-validity-in-ms: 604800000 # 7 days
token:
refresh-cookie-name: kamco # cookie name for development
refresh-cookie-secure: false # false for local HTTP testing
refresh-cookie-name: kamco
refresh-cookie-secure: true # required in HTTPS environments
springdoc:
swagger-ui:
@@ -46,11 +44,29 @@ springdoc:
member:
init_password: kamco1234!
swagger:
local-port: 9080
file:
sync-root-dir: /app/original-images/
sync-tmp-dir: ${file.sync-root-dir}tmp/
sync-file-extention: tfw,tif
dataset-dir: /home/kcomu/data/request/
dataset-tmp-dir: ${file.dataset-dir}tmp/
pt-path: /home/kcomu/data/response/v6-cls-checkpoints/
pt-FileName: yolov8_6th-6m.pt
train:
docker:
image: "kamco-cd-train:latest"
requestDir: "/home/kcomu/data/request"
responseDir: "/home/kcomu/data/response"
containerPrefix: "kamco-cd-train"
shmSize: "16g"
image: kamco-cd-train:latest
requestDir: /home/kcomu/data/request
responseDir: /home/kcomu/data/response
basePath: /home/kcomu/data
containerPrefix: kamco-cd-train
shmSize: 16g
ipcHost: true


@@ -10,10 +10,6 @@ spring:
datasource:
driver-class-name: org.postgresql.Driver
hikari:
jdbc:
time_zone: UTC
batch_size: 50
# 권장 설정
minimum-idle: 2
maximum-pool-size: 2
connection-timeout: 20000
@@ -21,18 +17,11 @@ spring:
max-lifetime: 1800000
leak-detection-threshold: 60000
data:
redis:
host: localhost
port: 6379
password:
jpa:
hibernate:
ddl-auto: update # enable automatic schema management
properties:
hibernate:
hbm2ddl:
auto: update
javax:
persistence:
validation:
@@ -51,9 +40,12 @@ logging:
level:
org:
springframework:
web: DEBUG
security: DEBUG
web: INFO
security: INFO
root: INFO
# actuator
management:
health:
@@ -77,20 +69,3 @@ management:
exposure:
include:
- "health"
# GeoJSON file monitoring settings
geojson:
monitor:
watch-directory: ~/geojson/upload
processed-directory: ~/geojson/processed
error-directory: ~/geojson/error
temp-directory: /tmp/geojson_extract
cron-expression: "0/30 * * * * *" # run every 30 seconds
supported-extensions:
- zip
- tar
- tar.gz
- tgz
max-file-size: 104857600 # 100MB


@@ -0,0 +1,87 @@
<!doctype html>
<html lang="ko">
<head>
<meta charset="utf-8" />
<title>Training Data ZIP Download</title>
</head>
<body>
<h3>Training Data ZIP Download</h3>
UUID:
<input id="uuid" value="95cb116c-380a-41c0-98d8-4d1142f15bbf" />
<br><br>
modelVer:
<input id="moderVer" value="G2.HPs_0001.95cb116c-380a-41c0-98d8-4d1142f15bbf" />
<br><br>
JWT Token:
<input id="token" style="width:600px;" placeholder="Paste Bearer token" />
<br><br>
<button onclick="download()">Download</button>
<br><br>
<progress id="bar" value="0" max="100" style="width:400px;"></progress>
<div id="status"></div>
<script>
async function download() {
const uuid = document.getElementById("uuid").value.trim();
const moderVer = document.getElementById("moderVer").value.trim();
const token = document.getElementById("token").value.trim();
if (!uuid) {
alert("Enter a UUID");
return;
}
if (!token) {
alert("Enter a token");
return;
}
const url = `/api/models/download/${uuid}`;
const res = await fetch(url, {
headers: {
"Authorization": token.startsWith("Bearer ")
? token
: `Bearer ${token}`,
"kamco-download-uuid": uuid
}
});
if (!res.ok) {
document.getElementById("status").innerText =
"Failed: " + res.status;
return;
}
const total = parseInt(res.headers.get("Content-Length") || "0", 10);
const reader = res.body.getReader();
const chunks = [];
let received = 0;
while (true) {
const { done, value } = await reader.read();
if (done) break;
chunks.push(value);
received += value.length;
if (total) {
document.getElementById("bar").value =
(received / total) * 100;
}
}
const blob = new Blob(chunks);
const a = document.createElement("a");
a.href = URL.createObjectURL(blob);
a.download = moderVer + ".zip";
a.click();
document.getElementById("status").innerText = "Done ✅";
}
</script>
</body>
</html>


@@ -18,12 +18,6 @@ spring:
max-lifetime: 1800000
leak-detection-threshold: 60000
data:
redis:
host: localhost
port: 6379
password:
jpa:
hibernate:
ddl-auto: none # disable automatic DDL creation/updates in the test environment
@@ -69,20 +63,6 @@ management:
include:
- "health"
geojson:
monitor:
watch-directory: ~/geojson/upload
processed-directory: ~/geojson/processed
error-directory: ~/geojson/error
temp-directory: /tmp/geojson_extract
cron-expression: "0/30 * * * * *"
supported-extensions:
- zip
- tar
- tar.gz
- tgz
max-file-size: 104857600
jwt:
secret: "test_secret_key_for_testing_purposes_only"
access-token-validity-in-ms: 86400000