Merge branch 'develop' into feat/demo-20251205

.gitignore (vendored): 1 line added

@@ -71,3 +71,4 @@ docker-compose.override.yml
 *.swp
 *.swo
 *~
+!/CLAUDE.md

CLAUDE.md: 212 lines deleted

@@ -1,212 +0,0 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

**kamco-change-detection-api** is a Spring Boot 3.5.7 application using Java 21 and PostgreSQL for KAMCO's change detection system. The project handles geospatial data with JTS/GeoJSON integration and uses QueryDSL for type-safe database queries.

## Build & Run Commands

### Build
```bash
# Full build with tests
./gradlew build

# Build without tests (Jenkins CI uses this)
./gradlew clean build -x test

# Build JAR (creates ROOT.jar)
./gradlew bootJar
```

### Run Application
```bash
# Run with local profile (default)
./gradlew bootRun

# Run with specific profile
./gradlew bootRun --args='--spring.profiles.active=dev'
# or
java -jar build/libs/ROOT.jar --spring.profiles.active=dev
```

### Testing
```bash
# Run all tests
./gradlew test

# Run specific test class
./gradlew test --tests com.kamco.cd.kamcoback.KamcoBackApplicationTests
```

## Architecture

### Technology Stack
- **Framework**: Spring Boot 3.5.7
- **Language**: Java 21
- **Database**: PostgreSQL
- **Connection Pool**: HikariCP
- **ORM**: Spring Data JPA + Hibernate
- **Query DSL**: QueryDSL 5.0.0 (Jakarta)
- **Geospatial**: JTS (Java Topology Suite) with GeoJSON serialization
- **Build Tool**: Gradle
- **Monitoring**: Spring Boot Actuator

### Configuration Profiles

The application uses **profile-based configuration** with three environments:

| Profile | File | Purpose | Activation |
|---------|------|---------|------------|
| `local` | application.yml | Local development | default |
| `dev` | application-dev.yml | Development server | `--spring.profiles.active=dev` |
| `prod` | application.yml | Production server | `--spring.profiles.active=prod` |

**CRITICAL**: Profile names must match the configuration exactly (see the sketch after this list):
- ✅ Use `dev` (not `develop`)
- ✅ Use `prod` (not `production`)
- ✅ Use `local` (default, not `localhost`)
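
Profiles can also be pinned in code rather than via the CLI flag. The snippet below is an illustrative sketch only, not part of this repository: `DevProfileLauncher` is an invented name, and the normal entry point should keep using the commands shown above.

```java
package com.kamco.cd.kamcoback;

import org.springframework.boot.builder.SpringApplicationBuilder;

// Hypothetical launcher, for illustration only: pins the "dev" profile in code.
// Note the exact profile name: "dev", not "develop".
public class DevProfileLauncher {

  public static void main(String[] args) {
    new SpringApplicationBuilder(KamcoBackApplication.class)
        .profiles("dev")
        .run(args);
  }
}
```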

### Geospatial Data Handling

The application has **custom Jackson serializers/deserializers** for JTS Geometry types:

**Location**: `com.kamco.cd.kamcoback.common.utils.geometry`

- `GeometrySerializer`: Converts JTS Geometry → GeoJSON with **16-digit precision** (increased from default 8 digits)
- `GeometryDeserializer`: Converts GeoJSON → JTS Geometry types (Point, Polygon, Geometry)

**Registered for types**: `Geometry`, `Point`, `Polygon`

**Configuration**: `WebConfig.java` configures ObjectMapper bean with custom module
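
The actual `WebConfig.java`, `GeometrySerializer`, and `GeometryDeserializer` are not reproduced here. The sketch below only illustrates the registration pattern described above, with a deliberately simplified Point-only serializer standing in for the real classes; `GeoJsonModuleConfig` and `SimplePointSerializer` are invented names.

```java
package com.kamco.cd.kamcoback.config;

import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.module.SimpleModule;
import java.io.IOException;
import org.locationtech.jts.geom.Point;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Illustrative sketch of the ObjectMapper + custom-module wiring described above.
// The real WebConfig registers GeometrySerializer/GeometryDeserializer for
// Geometry, Point, and Polygon; a minimal Point-only serializer stands in here.
@Configuration
public class GeoJsonModuleConfig {

  @Bean
  public ObjectMapper objectMapper() {
    SimpleModule geometryModule = new SimpleModule("geometry-module");
    geometryModule.addSerializer(Point.class, new SimplePointSerializer());
    // addSerializer/addDeserializer calls for Geometry and Polygon would go here.

    ObjectMapper mapper = new ObjectMapper();
    mapper.registerModule(geometryModule);
    return mapper;
  }

  // Writes a JTS Point as a GeoJSON object: {"type":"Point","coordinates":[x,y]}.
  static class SimplePointSerializer extends JsonSerializer<Point> {
    @Override
    public void serialize(Point value, JsonGenerator gen, SerializerProvider serializers)
        throws IOException {
      gen.writeStartObject();
      gen.writeStringField("type", "Point");
      gen.writeFieldName("coordinates");
      gen.writeStartArray();
      gen.writeNumber(value.getX());
      gen.writeNumber(value.getY());
      gen.writeEndArray();
      gen.writeEndObject();
    }
  }
}
```

In a real configuration, building on Spring Boot's `Jackson2ObjectMapperBuilder` (or a `Jackson2ObjectMapperBuilderCustomizer`) is usually preferable to a bare `new ObjectMapper()`, so that Boot's other Jackson defaults are preserved.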

**Usage in entities**: Use JTS geometry types directly in JPA entities:
```java
@Column(columnDefinition = "geometry(Point,4326)")
private Point location;

@Column(columnDefinition = "geometry(Polygon,4326)")
private Polygon boundary;
```

### Database Configuration

**HikariCP Connection Pool** is configured per profile:

| Setting | Local | Dev | Prod |
|---------|-------|-----|------|
| Min Idle | 10 | 5 | N/A |
| Max Pool Size | 50 | 20 | N/A |
| Connection Timeout | 20s | Default | Default |
| Idle Timeout | 5min | Default | Default |
| Max Lifetime | 30min | Default | Default |

**JPA/Hibernate Settings**:
- DDL Auto: `validate` (schema changes require manual migration)
- JDBC Batch Size: 50
- Default Batch Fetch Size: 100 (N+1 query prevention)
- Show SQL: `true` in dev, configurable per profile

**Startup Logging**: `StartupLogger.java` displays full configuration on startup including active profiles, database connection, pool settings, and JPA configuration.
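
The real `StartupLogger.java` is not shown in this diff; the sketch below is one plausible shape for such a component (an `ApplicationRunner` reading from Spring's `Environment`). The class name `StartupDiagnosticsLogger` and the exact properties logged are assumptions for illustration.

```java
package com.kamco.cd.kamcoback.config;

import lombok.extern.slf4j.Slf4j;
import org.springframework.boot.ApplicationArguments;
import org.springframework.boot.ApplicationRunner;
import org.springframework.core.env.Environment;
import org.springframework.stereotype.Component;

// Hypothetical startup diagnostics logger; the repository's StartupLogger.java may differ.
@Slf4j
@Component
public class StartupDiagnosticsLogger implements ApplicationRunner {

  private final Environment env;

  public StartupDiagnosticsLogger(Environment env) {
    this.env = env;
  }

  @Override
  public void run(ApplicationArguments args) {
    log.info("Active profiles : {}", String.join(", ", env.getActiveProfiles()));
    log.info("Datasource URL  : {}", env.getProperty("spring.datasource.url"));
    log.info("Hikari max pool : {}", env.getProperty("spring.datasource.hikari.maximum-pool-size"));
    log.info("Hibernate ddl   : {}", env.getProperty("spring.jpa.hibernate.ddl-auto"));
  }
}
```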

### QueryDSL Setup

**Dependencies** (build.gradle):
- Implementation: `querydsl-jpa:5.0.0:jakarta`
- Annotation Processor: `querydsl-apt:5.0.0:jakarta`

**Q-Class Generation**: Q-classes are generated via annotation processors for Jakarta Persistence entities.

**Pattern** (from TODO): Future repositories should follow the `CustomRepository` + `CustomRepositoryImpl` pattern for complex queries.
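
Since no repository in this diff implements the pattern yet, the sketch below is only a hedged illustration of the intended `CustomRepository` + `CustomRepositoryImpl` layout. The entity `ChangeArea`, its `regionCode` field, and the presence of a `JPAQueryFactory` bean are invented assumptions; `QChangeArea` would be generated by the QueryDSL annotation processor.

```java
package com.kamco.cd.kamcoback.changearea;

import com.querydsl.jpa.impl.JPAQueryFactory;
import jakarta.persistence.Entity;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.GenerationType;
import jakarta.persistence.Id;
import java.util.List;
import lombok.RequiredArgsConstructor;
import org.springframework.data.jpa.repository.JpaRepository;

// Hypothetical entity used only to make the example concrete.
@Entity
class ChangeArea {
  @Id
  @GeneratedValue(strategy = GenerationType.IDENTITY)
  private Long id;

  private String regionCode;
}

// Custom fragment interface for complex queries.
interface ChangeAreaCustomRepository {
  List<ChangeArea> findByRegionCode(String regionCode);
}

// Implementation name must be the fragment interface name + "Impl" so Spring Data picks it up.
@RequiredArgsConstructor
class ChangeAreaCustomRepositoryImpl implements ChangeAreaCustomRepository {

  // Assumes a JPAQueryFactory bean is configured (typically new JPAQueryFactory(entityManager)).
  private final JPAQueryFactory queryFactory;

  @Override
  public List<ChangeArea> findByRegionCode(String regionCode) {
    QChangeArea changeArea = QChangeArea.changeArea; // generated Q-class
    return queryFactory
        .selectFrom(changeArea)
        .where(changeArea.regionCode.eq(regionCode))
        .fetch();
  }
}

// Spring Data stitches the custom fragment into the main repository interface.
public interface ChangeAreaRepository
    extends JpaRepository<ChangeArea, Long>, ChangeAreaCustomRepository {}
```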

### Actuator Monitoring

**Endpoints** configured at `/monitor` base path:
- `/monitor/health` - Health checks with readiness/liveness probes
- JMX endpoints excluded for security

**Health Probes**: Enabled for Kubernetes-style readiness and liveness checks.
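
As a hedged illustration (not part of the repository), a smoke test along the following lines could exercise the `/monitor/health` endpoint. It assumes a reachable PostgreSQL instance, since with `ddl-auto: validate` the context will not start without one.

```java
package com.kamco.cd.kamcoback;

import static org.assertj.core.api.Assertions.assertThat;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.web.client.TestRestTemplate;
import org.springframework.http.ResponseEntity;

// Hypothetical smoke test for the Actuator health endpoint exposed under /monitor.
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
class MonitorHealthSmokeTest {

  @Autowired private TestRestTemplate restTemplate;

  @Test
  void healthEndpointIsUp() {
    ResponseEntity<String> response = restTemplate.getForEntity("/monitor/health", String.class);

    assertThat(response.getStatusCode().is2xxSuccessful()).isTrue();
    assertThat(response.getBody()).contains("UP");
  }
}
```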

## Code Style

**Standard**: Google Java Style Guide with project-specific modifications

**Key Rules**:
- **Indentation**: 2 spaces (not tabs)
- **Line Length**: 100 characters max
- **Line Endings**: LF (Unix-style)
- **Charset**: UTF-8
- **Trailing Whitespace**: Removed
- **Final Newline**: Required

**Style XML**: `intellij-java-google-style.xml` (IntelliJ IDEA code style configuration)

### Automated Code Formatting

**Spotless Gradle Plugin**: Automatically enforces Google Java Format

```bash
# Check formatting
./gradlew spotlessCheck

# Apply formatting
./gradlew spotlessApply
```

**Pre-commit Hook**: Automatically checks formatting before each commit
- Location: `.git/hooks/pre-commit`
- Runs `spotlessCheck` before allowing commits
- If formatting fails, run `./gradlew spotlessApply` to fix

**IntelliJ Setup**: See `CODE_STYLE_SETUP.md` for detailed instructions on:
- Importing the code style XML
- Enabling format-on-save
- Configuring Actions on Save

**Important**: The project uses **spaces** (not tabs) for indentation, with a 2-space indent size.

## Package Structure

```
com.kamco.cd.kamcoback
├── KamcoBackApplication.java   # Main application entry point
├── config/
│   ├── WebConfig.java          # Jackson ObjectMapper + Geometry serializers
│   └── StartupLogger.java      # Application startup diagnostics
└── common/
    ├── api/                    # API DTOs
    └── utils/
        └── geometry/           # GeoJSON serialization utilities
```

## Development Notes

### Current State (Early Stage)
The project is in the initial setup phase. From `common/README.md`, planned but not yet implemented:
- QueryDSL custom/impl repository pattern
- Common code management system
- Redis caching layer

### When Adding New Features
1. **Entities with Geometry**: Use JTS types directly; serialization is handled automatically
2. **Repository Queries**: Use QueryDSL for type-safe queries (Q-classes auto-generated)
3. **Configuration**: Add profile-specific settings in the appropriate application-{profile}.yml
4. **Database Changes**: DDL is set to `validate`; schema changes need manual migration

### Profile Activation Troubleshooting
If you see "Failed to configure a DataSource", check:
1. Profile name matches the configuration file exactly (`dev`, `prod`, or `local`)
2. DataSource URL, username, and password are set in the profile configuration
3. PostgreSQL JDBC driver is on the classpath (runtimeOnly dependency)

### Testing Geometry Serialization
Use the configured ObjectMapper bean for JSON operations. Manual ObjectMapper creation will miss the custom geometry serializers.
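
A hedged sketch of such a test is shown below; it is not part of the repository. It assumes the application context can start (a database must be reachable for `ddl-auto: validate`), and the assertion only checks the broad GeoJSON shape rather than the exact output of `GeometrySerializer`.

```java
package com.kamco.cd.kamcoback;

import static org.assertj.core.api.Assertions.assertThat;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.junit.jupiter.api.Test;
import org.locationtech.jts.geom.Coordinate;
import org.locationtech.jts.geom.GeometryFactory;
import org.locationtech.jts.geom.Point;
import org.locationtech.jts.geom.PrecisionModel;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;

// Hypothetical test: inject the Spring-configured ObjectMapper so the geometry module
// registered in WebConfig is applied; a bare `new ObjectMapper()` would miss it.
@SpringBootTest
class GeometrySerializationTest {

  @Autowired private ObjectMapper objectMapper;

  @Test
  void pointIsWrittenAsGeoJson() throws Exception {
    GeometryFactory factory = new GeometryFactory(new PrecisionModel(), 4326);
    Point point = factory.createPoint(new Coordinate(126.978, 37.5665));

    String json = objectMapper.writeValueAsString(point);

    // Indicative check only; the serializer is expected to emit a GeoJSON object such as
    // {"type":"Point","coordinates":[126.978,37.5665]}.
    assertThat(json).contains("\"type\"").contains("coordinates");
  }
}
```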

## CI/CD

**Jenkins Pipeline**: `Jenkinsfile-dev`
- Branch: `develop`
- Build: `./gradlew clean build -x test`
- JDK: Java 21
- Artifact: `ROOT.jar`

@@ -10,11 +10,15 @@ services:
     environment:
       - SPRING_PROFILES_ACTIVE=dev
       - TZ=Asia/Seoul
+    volumes:
+      - /mnt/nfs_share/images:/app/original-images
+      - /mnt/nfs_share/model_output:/app/model-outputs
+      - /mnt/nfs_share/train_dataset:/app/train-dataset
     networks:
       - kamco-cds
     restart: unless-stopped
     healthcheck:
-      test: ["CMD", "curl", "-f", "http://localhost:8080/monitor/health"]
+      test: [ "CMD", "curl", "-f", "http://localhost:8080/monitor/health" ]
       interval: 10s
       timeout: 5s
       retries: 5

@@ -0,0 +1,94 @@
package com.kamco.cd.kamcoback.common.api;

import java.io.File;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import lombok.extern.slf4j.Slf4j;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Diagnostic endpoints that list the contents of the NFS-mounted directories
// (see the volume mounts added in the compose hunk above).
@Slf4j
@RestController
@RequestMapping("/api/nfs-test")
public class NfsTestApiController {

  private static final String ORIGINAL_IMAGES_PATH = "/app/original-images";
  private static final String MODEL_OUTPUTS_PATH = "/app/model-outputs";
  private static final String TRAIN_DATASET_PATH = "/app/train-dataset";

  @GetMapping("/original-images")
  public Map<String, Object> listOriginalImages() {
    return listDirectory(ORIGINAL_IMAGES_PATH);
  }

  @GetMapping("/model-outputs")
  public Map<String, Object> listModelOutputs() {
    return listDirectory(MODEL_OUTPUTS_PATH);
  }

  @GetMapping("/train-dataset")
  public Map<String, Object> listTrainDataset() {
    return listDirectory(TRAIN_DATASET_PATH);
  }

  @GetMapping("/all")
  public Map<String, Object> listAllDirectories() {
    Map<String, Object> result = new HashMap<>();
    result.put("originalImages", listDirectory(ORIGINAL_IMAGES_PATH));
    result.put("modelOutputs", listDirectory(MODEL_OUTPUTS_PATH));
    result.put("trainDataset", listDirectory(TRAIN_DATASET_PATH));
    return result;
  }

  // Collects existence/permission metadata and a shallow file listing for the given directory.
  private Map<String, Object> listDirectory(String directoryPath) {
    Map<String, Object> result = new HashMap<>();

    try {
      Path path = Paths.get(directoryPath);
      File directory = path.toFile();

      result.put("path", directoryPath);
      result.put("exists", directory.exists());
      result.put("isDirectory", directory.isDirectory());
      result.put("canRead", directory.canRead());
      result.put("canWrite", directory.canWrite());

      if (directory.exists() && directory.isDirectory()) {
        List<Map<String, Object>> files = new ArrayList<>();

        File[] fileArray = directory.listFiles();
        if (fileArray != null) {
          for (File file : fileArray) {
            Map<String, Object> fileInfo = new HashMap<>();
            fileInfo.put("name", file.getName());
            fileInfo.put("isDirectory", file.isDirectory());
            fileInfo.put("size", file.length());
            fileInfo.put("lastModified", file.lastModified());
            fileInfo.put("canRead", file.canRead());
            fileInfo.put("canWrite", file.canWrite());
            files.add(fileInfo);
          }
        }

        result.put("files", files);
        result.put("fileCount", files.size());
      } else {
        result.put("files", new ArrayList<>());
        result.put("fileCount", 0);
        result.put("error", "Directory does not exist or is not accessible");
      }

    } catch (Exception e) {
      log.error("Error listing directory: {}", directoryPath, e);
      result.put("error", e.getMessage());
      result.put("exception", e.getClass().getSimpleName());
    }

    return result;
  }
}