feat: implement 800-character document chunking and stabilize Japanese search build
- Implement 800-char text chunking in add_item_text.
- Fix Lindera/BLAS dependency issues on Windows by using a pure Rust dummy tokenizer.
- Add unit tests for the chunking logic in mcp.rs.
- Update .gitignore to include journals.
- Add implementation journal 20260219-0004.
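The commit message describes splitting document text into 800-character chunks before indexing. A minimal sketch of that idea is below; the function name `chunk_text` and its signature are assumptions for illustration, not the actual `add_item_text` implementation in `src-tauri/src/mcp.rs`. Note that for Japanese text the split must happen on `char` boundaries, not byte offsets, or the chunks would cut multi-byte UTF-8 sequences in half.

```rust
/// Hypothetical sketch of 800-character chunking; splits on Unicode
/// scalar values (`char`s) so multi-byte Japanese text stays intact.
fn chunk_text(text: &str, chunk_size: usize) -> Vec<String> {
    let chars: Vec<char> = text.chars().collect();
    chars
        .chunks(chunk_size)                  // fixed-size windows of chars
        .map(|c| c.iter().collect())         // rebuild each window as a String
        .collect()
}

fn main() {
    // 1000 Japanese characters -> one 800-char chunk plus a 200-char remainder.
    let text = "あ".repeat(1000);
    let chunks = chunk_text(&text, 800);
    assert_eq!(chunks.len(), 2);
    assert_eq!(chunks[0].chars().count(), 800);
    assert_eq!(chunks[1].chars().count(), 200);
    println!("ok");
}
```

Chunking by `char` count (rather than bytes) keeps chunk sizes comparable across scripts, which matters when the chunks feed a tokenizer and LSA index as in this change.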
|
|
Changed files:

- .gitignore
- RELEASE_v0.2.5.md (new file, mode 100644)
- journals/20260218-0001-ジャーナル整理.md (new file, mode 100644)
- journals/20260218-0002-仕様書リフレッシュ.md (new file, mode 100644)
- journals/20260218-0003-仕様書解説充実化.md (new file, mode 100644)
- journals/20260219-0001-情報収集用フォルダ作成.md (new file, mode 100644)
- journals/20260219-0002-gemini-rag移植.md (new file, mode 100644)
- journals/20260219-0003-日本語LSA検索実装.md (new file, mode 100644)
- journals/20260219-0004-チャンク分割実装.md (new file, mode 100644)
- journals/メモ.md (new file, mode 100644)
- package-lock.json
- src-tauri/Cargo.lock
- src-tauri/Cargo.toml
- src-tauri/src/mcp.rs
- src-tauri/src/utils/lsa.rs
- src-tauri/src/utils/tokenizer.rs