Migrated from Node and pnpm to Bun. This also brings in a Dev Container, which enables quick and easy development of the project. Additionally, this adds S3 connectivity (with a default MinIO server pre-created), which enables files to be uploaded against Mantis issues. There is also a new Internal Notes feature for storing arbitrary text notes against a Mantis issue.
This commit is contained in:
parent
80ca48be70
commit
3b846b8c8e
23 changed files with 3210 additions and 6490 deletions
12
.devcontainer/Dockerfile
Normal file
@@ -0,0 +1,12 @@
FROM oven/bun:slim

# Config Bun
ENV PATH="~/.bun/bin:${PATH}"
RUN ln -s /usr/local/bin/bun /usr/local/bin/node

# Install dependencies
RUN apt update
RUN apt install -y git procps curl build-essential python3

# Install Quasar CLI and ESLint
RUN bun install -g @quasar/cli eslint
22
.devcontainer/devcontainer.json
Normal file
@@ -0,0 +1,22 @@
// For format details, see https://aka.ms/devcontainer.json. For config options, see the
// README at: https://github.com/marcosgomesneto/bun-devcontainers/tree/main/src/basic-bun
{
  "name": "Bun",
  "dockerFile": "Dockerfile",
  // Configure tool-specific properties.
  "customizations": {
    // Configure properties specific to VS Code.
    "vscode": {
      // Add the IDs of extensions you want installed when the container is created.
      "extensions": [
        "oven.bun-vscode",
        "prisma.prisma",
        "dbaeumer.vscode-eslint",
        "vue.volar"
      ]
    }
  },
  "features": {
    "ghcr.io/devcontainers/features/docker-in-docker:2": {}
  }
}
@@ -1,5 +1,5 @@
RP_ID=localhost
RP_NAME=StylePoint
ORIGIN=http://localhost:9100
DATABASE_URL="postgresql://sts-sls-utility:MY_SECURE_PASSWORD@localhost:5432/sts-sls-utility?schema=public"
SESSION_SECRET=MY_SECURE_SESSION_SECRET
DATABASE_URL="postgresql://stylepoint:{{POSTGRES_PASSWORD_REPLACE_ME}}@localhost:5432/stylepoint?schema=public"
SESSION_SECRET=SESSION_SECRET_REPLACE_ME
2
.gitignore
vendored
@@ -34,6 +34,8 @@ yarn-error.log*
.env.local*

/postgres
/postgres_old
/minio_data

docker-compose.yml
7
.vscode/settings.json
vendored
@@ -22,5 +22,10 @@
  },
  "[prisma]": {
    "editor.defaultFormatter": "Prisma.prisma"
  }
  },
  "[json]": {
    "editor.defaultFormatter": "vscode.json-language-features"
  },
  "editor.indentSize": "tabSize",
  "editor.tabSize": 2
}
64
README.md
@@ -3,54 +3,28 @@
Multi-functional internal tool for StyleTech

## Setup
This project is using VS Code Dev Containers. To set up the development environment, follow these steps:
1. Install [Docker](https://www.docker.com/get-started) and [VS Code](https://code.visualstudio.com/).
2. Install the [Remote - Containers](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers) extension in VS Code.
3. Clone this repository to your local machine.
4. Open the repository in VS Code.
5. When prompted, select "Reopen in Container" to build the development container.
6. Once the container is built, you can start developing the project.

### Prerequisites

1. **Docker**:
   * **Windows/Mac**: Install [Docker Desktop](https://www.docker.com/products/docker-desktop/) (this is available free; ignore the "Choose Plan" button and scroll down to the download button).
   * **Linux**: Install [Docker Engine](https://docs.docker.com/engine/install/).
2. **Node.js**: Install Node.js version 22. You can download it from the [official Node.js website](https://nodejs.org/) or use a version manager like [volta](https://docs.volta.sh/guide/getting-started).
3. **pnpm**: Install `pnpm` globally using npm (which comes with Node.js):
   ```bash
   npm install -g pnpm
   ```
4. **quasar cli**: Install the Quasar CLI globally using npm:
   ```bash
   npm install -g @quasar/cli
   ```

### Installation

Clone the repository and install the project dependencies using `pnpm`:

```bash
git clone <repository-url>
cd <repository-directory>
pnpm install
```

## Project Structure
```
src/        -- Frontend source code
src-server/ -- Backend source code
```

### Setting up Environment Variables & Compose File
Copy the example `.env.example` file to `.env` and update the values (primarily passwords or access tokens) as required.
## Frontend
The frontend is built using [Quasar](https://quasar.dev/) and [Vue 3](https://vuejs.org/). It uses [Pinia](https://pinia.vuejs.org/) for state management and [Vue Router](https://router.vuejs.org/) for routing. The frontend is located in the `src/` directory.

Copy the example `docker-compose.example.yml` file to `docker-compose.yml` and update the password as required (matching that set in the `.env` file).
## Backend
The backend is built using [Express](https://expressjs.com/). It uses [Prisma](https://www.prisma.io/) for database access. The backend is located in the `src-server/` directory.

### Running the Development Server
## Running the Project
To run the project, simply run `bun dev`. This will automatically start everything up for you, and validate that all necessary dependencies are installed. You can then access the frontend at `http://localhost:9100`, and the backend at `http://localhost:9101`.

To start the development server, run:

```bash
docker compose up -d
pnpm run dev
```

This will start the PostgreSQL server on `localhost:5432`, the API server on `localhost:8000` (you shouldn't need to work with this directly), and the application server accessible at [http://localhost:9000](http://localhost:9000).

I recommend using [Postico](https://eggerapps.io/postico/) for Mac or [pgAdmin](https://www.pgadmin.org/) for Windows to manage the PostgreSQL database.

### Building for Production

To create a production build of the application, run:

```bash
pnpm run build
```
## Registering a new user
The application requires a Registration Token to register a new user. This defaults to `INITIAL_REGISTRATION_TOKEN_PLEASE_CHANGE_ME`. Use this to register an initial account; you can then change it via the Application Settings.
@@ -1,17 +1,52 @@
services:
  postgres_db:
    image: postgres:latest # Use the latest official PostgreSQL image. Consider pinning to a specific version (e.g., postgres:15) for production.
    container_name: sts_sls_utility_postgres # A friendly name for the container
    restart: unless-stopped # Automatically restart the container unless it was manually stopped
    image: postgres:latest # Use the official PostgreSQL image. Pin to a specific version (e.g., postgres:15) for production stability.
    container_name: stylepoint_postgres # Define a specific name for this container
    restart: unless-stopped # Restart the container unless it is explicitly stopped
    environment:
      POSTGRES_USER: sts-sls-utility # Sets the default username as requested
      POSTGRES_PASSWORD: MY_RANDOM_PASSWORD # Replace with a secure password
      POSTGRES_DB: sts-sls-utility
      POSTGRES_USER: stylepoint # Database username
      POSTGRES_PASSWORD: "{{POSTGRES_PASSWORD_REPLACE_ME}}" # Database password (replace with a secure one, consider using secrets)
      POSTGRES_DB: stylepoint # Database name
    volumes:
      # Mounts the host directory './postgres' into the container's data directory
      # This ensures data persists even if the container is removed and recreated.
      - ./postgres:/var/lib/postgresql/data
      # Mount a named volume 'postgres' to persist database data across container restarts/recreations.
      - postgres:/var/lib/postgresql/data
    ports:
      # Maps port 5432 on your host machine to port 5432 inside the container
      # You can change the host port if 5432 is already in use (e.g., "5433:5432")
      - "5432:5432"
      # Map host port 5432 to container port 5432. Change host port if 5432 is occupied (e.g., "5433:5432").
      - "5432:5432"
    healthcheck:
      # Check if the PostgreSQL server is ready to accept connections
      test: ["CMD-SHELL", "pg_isready -U stylepoint"]
      interval: 10s
      timeout: 5s
      retries: 5

  minio:
    image: quay.io/minio/minio:latest # Use the official MinIO image. Pin to a specific version for production stability.
    container_name: stylepoint_minio # Define a specific name for this container
    restart: unless-stopped # Restart the container unless it is explicitly stopped
    environment:
      MINIO_ROOT_USER: stylepoint # MinIO access key (username)
      MINIO_ROOT_PASSWORD: "{{MINIO_PASSWORD_REPLACE_ME}}" # MinIO secret key (password - replace with a secure one, consider using secrets)
    volumes:
      # Mount a named volume 'minio_data' to persist object storage data.
      - minio_data:/data
    ports:
      # Map host port 9000 (API) to container port 9000
      - "9000:9000"
      # Map host port 9001 (Console UI) to container port 9001
      - "9001:9001"
    # Start the MinIO server, serve data from /data, and make the console available on port 9001
    command: server /data --console-address ":9001"
    healthcheck:
      # Check if the MinIO server is live and responding
      test: ["CMD", "curl", "-f", "http://localhost:9000/minio/health/live"]
      interval: 30s
      timeout: 20s
      retries: 3

# Define named volumes for persistent storage
volumes:
  postgres:
    driver: local # Use the local driver for the postgres volume
  minio_data:
    driver: local # Use the local driver for the minio_data volume
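The `{{POSTGRES_PASSWORD_REPLACE_ME}}` token above also appears in `.env.example`; the setup script replaces every occurrence of a given token with one shared generated secret, so the two files stay in agreement. A sketch of the resulting pair after setup, with a purely illustrative generated value:

```
# .env (value shown is illustrative)
DATABASE_URL="postgresql://stylepoint:3f9c1a...@localhost:5432/stylepoint?schema=public"

# docker-compose.yml (receives the same generated value)
POSTGRES_PASSWORD: "3f9c1a..."
```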
41
package.json
@@ -7,58 +7,71 @@
  "type": "module",
  "private": true,
  "scripts": {
    "test": "echo \"No test specified\" && exit 0",
    "dev": "pnpm i && pnpm prisma migrate dev && concurrently \"quasar dev -m spa\" \"nodemon src-server/server.js\"",
    "build": "quasar build -m spa",
    "test": "playwright test",
    "test:ui": "bun run-tests.js --ui",
    "test:headed": "bun run-tests.js --headed",
    "test:docker": "bun run-tests.js --docker",
    "test:file": "bun run-tests.js --file",
    "test:debug": "PWDEBUG=1 playwright test",
    "migrate": "prisma migrate dev",
    "dev": "bun project-setup.js && bun dev-prep && bun dev-internal",
    "dev-prep": "bun install && bun migrate",
    "dev-internal": "concurrently \"quasar dev -m spa\" \"bun run --no-clear-screen --watch src-server/server.js\"",
    "build": "concurrently \"bun run build-client\" \"bun run build-server-linux\" \"bun run build-server-windows\"",
    "build-client": "quasar build -m spa",
    "build-server-linux": "bun build --compile --minify --sourcemap --target bun-linux-x64 src-server/server.js --outfile dist/stylepoint-server.bin",
    "build-server-windows": "bun build --compile --minify --sourcemap --target bun-win-x64 src-server/server.js --outfile dist/stylepoint-server.exe",
    "postinstall": "quasar prepare"
  },
  "dependencies": {
    "@google/genai": "^0.9.0",
    "@kenjiuno/msgreader": "^1.22.0",
    "@prisma/client": "^6.6.0",
    "@quasar/extras": "^1.16.4",
    "@quasar/extras": "^1.16.17",
    "@quixo3/prisma-session-store": "^3.1.13",
    "@simplewebauthn/browser": "^13.1.0",
    "@simplewebauthn/server": "^13.1.1",
    "apexcharts": "^4.7.0",
    "axios": "^1.8.4",
    "better-sqlite3": "^11.9.1",
    "axios": "^1.9.0",
    "date-fns": "^4.1.0",
    "dompurify": "^3.2.5",
    "dotenv": "^16.5.0",
    "express-session": "^1.18.1",
    "mailparser": "^3.7.2",
    "marked": "^15.0.9",
    "marked": "^15.0.11",
    "multer": "^1.4.5-lts.2",
    "node-cron": "^3.0.3",
    "node-imap": "^0.9.6",
    "pdfkit": "^0.17.0",
    "pdfmake": "^0.2.18",
    "pdfmake": "^0.2.19",
    "pinia": "^3.0.2",
    "pino": "^9.6.0",
    "pino-abstract-transport": "^2.0.0",
    "pino-http": "^10.4.0",
    "pino-pretty": "^13.0.0",
    "quasar": "^2.16.0",
    "quasar": "^2.18.1",
    "superjson": "^2.2.2",
    "uuid": "^11.1.0",
    "vue": "^3.4.18",
    "vue-router": "^4.0.0",
    "vue": "^3.5.13",
    "vue-router": "^4.5.1",
    "vue3-apexcharts": "^1.8.0"
  },
  "devDependencies": {
    "@eslint/js": "^9.25.1",
    "@quasar/app-vite": "^2.1.0",
    "@playwright/test": "^1.52.0",
    "@quasar/app-vite": "^2.2.0",
    "@stylistic/eslint-plugin": "^4.2.0",
    "@types/express-session": "^1.18.1",
    "@types/uuid": "^10.0.0",
    "@vue/eslint-config-prettier": "^10.2.0",
    "autoprefixer": "^10.4.2",
    "autoprefixer": "^10.4.21",
    "concurrently": "^9.1.2",
    "eslint": "^9.25.1",
    "eslint-plugin-vue": "^10.0.0",
    "globals": "^16.0.0",
    "nodemon": "^3.1.10",
    "postcss": "^8.4.14",
    "playwright": "^1.52.0",
    "postcss": "^8.5.3",
    "prettier": "^3.5.3",
    "prisma": "^6.6.0",
    "vite-plugin-checker": "^0.9.1"
6356
pnpm-lock.yaml
generated
File diff suppressed because it is too large
@@ -1,7 +0,0 @@
onlyBuiltDependencies:
  - '@prisma/client'
  - '@prisma/engines'
  - better-sqlite3
  - esbuild
  - prisma
  - sqlite3
@@ -0,0 +1,20 @@
-- CreateTable
CREATE TABLE "MantisFile" (
    "id" SERIAL NOT NULL,
    "mantis_issue_id" INTEGER NOT NULL,
    "filename" TEXT NOT NULL,
    "file_key" TEXT NOT NULL,
    "mime_type" TEXT,
    "size" INTEGER,
    "uploaded_by" TEXT,
    "uploaded_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "description" TEXT,

    CONSTRAINT "MantisFile_pkey" PRIMARY KEY ("id")
);

-- CreateIndex
CREATE INDEX "MantisFile_mantis_issue_id_idx" ON "MantisFile"("mantis_issue_id");

-- AddForeignKey
ALTER TABLE "MantisFile" ADD CONSTRAINT "MantisFile_mantis_issue_id_fkey" FOREIGN KEY ("mantis_issue_id") REFERENCES "MantisIssue"("id") ON DELETE CASCADE ON UPDATE CASCADE;
@@ -0,0 +1,20 @@
-- CreateTable
CREATE TABLE "MantisNote" (
    "id" SERIAL NOT NULL,
    "mantis_issue_id" INTEGER NOT NULL,
    "content" TEXT NOT NULL,
    "created_by" TEXT NOT NULL,
    "created_at" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
    "updated_at" TIMESTAMP(3) NOT NULL,

    CONSTRAINT "MantisNote_pkey" PRIMARY KEY ("id")
);

-- CreateIndex
CREATE INDEX "MantisNote_mantis_issue_id_idx" ON "MantisNote"("mantis_issue_id");

-- CreateIndex
CREATE INDEX "MantisNote_created_by_idx" ON "MantisNote"("created_by");

-- AddForeignKey
ALTER TABLE "MantisNote" ADD CONSTRAINT "MantisNote_mantis_issue_id_fkey" FOREIGN KEY ("mantis_issue_id") REFERENCES "MantisIssue"("id") ON DELETE CASCADE ON UPDATE CASCADE;
@@ -153,6 +153,8 @@ model MantisIssue {
  updatedAt DateTime @updatedAt @map("updated_at")

  comments MantisComment[]
  files    MantisFile[]
  notes    MantisNote[] // Add relation to MantisNote

  fts Unsupported("tsvector")?

@@ -189,4 +191,36 @@ model MantisAttachment {
  comment MantisComment @relation(fields: [commentId], references: [id], onDelete: Cascade)
}

// New model for user-uploaded files to S3
model MantisFile {
  id            Int      @id @default(autoincrement())
  mantisIssueId Int      @map("mantis_issue_id")
  filename      String
  fileKey       String   @map("file_key") // S3 object key
  mimeType      String?  @map("mime_type")
  size          Int?
  uploadedBy    String?  @map("uploaded_by") // Username of uploader
  uploadedAt    DateTime @default(now()) @map("uploaded_at")
  description   String?

  mantisIssue MantisIssue @relation(fields: [mantisIssueId], references: [id], onDelete: Cascade)

  @@index([mantisIssueId])
}

// New model for internal notes
model MantisNote {
  id            Int      @id @default(autoincrement())
  mantisIssueId Int      @map("mantis_issue_id")
  content       String
  createdBy     String   @map("created_by") // Username of the creator
  createdAt     DateTime @default(now()) @map("created_at")
  updatedAt     DateTime @updatedAt @map("updated_at")

  mantisIssue MantisIssue @relation(fields: [mantisIssueId], references: [id], onDelete: Cascade)

  @@index([mantisIssueId])
  @@index([createdBy])
}

// --- Mantis Models End ---
236
project-setup.js
Normal file
@@ -0,0 +1,236 @@
import { randomBytes } from 'crypto';

const c = (colour) =>
{
  if(colour === 'reset')
  {
    return '\x1b[0m';
  }
  return Bun.color(colour, 'ansi');
};

const log = (message) => console.log(message);
const logInfo = (message) => log(c('blue') + '[INFO] ' + c('reset') + message);
const logSuccess = (message) => log(c('green') + '[SUCCESS] ' + c('reset') + message);
const logWarn = (message) => log(c('yellow') + '[WARN] ' + c('reset') + message);
const logError = (message) => log(c('red') + '[ERROR] ' + c('reset') + message);

const ENV_FILE = '.env';
const ENV_EXAMPLE_FILE = '.env.example';
const DOCKER_COMPOSE_FILE = 'docker-compose.yml';
const DOCKER_COMPOSE_EXAMPLE_FILE = 'docker-compose-example.yml';

async function generateSecureRandomString(bytes = 32)
{
  return randomBytes(bytes).toString('hex');
}

async function ensureFileFromExample(targetPath, examplePath)
{
  logInfo(`Checking for ${targetPath}...`);
  const targetExists = await Bun.file(targetPath).exists();

  if (!targetExists)
  {
    logWarn(`${targetPath} not found. Copying from ${examplePath}...`);
    const exampleExists = await Bun.file(examplePath).exists();
    if (!exampleExists)
    {
      logError(`${examplePath} not found. Cannot create ${targetPath}. Please ensure ${examplePath} exists.`);
      // Don't exit immediately, maybe other checks can proceed.
      // Consider if this should be a fatal error depending on project needs.
      return { copied: false, error: true };
    }
    try
    {
      const exampleContent = await Bun.file(examplePath).text();
      await Bun.write(targetPath, exampleContent);
      logSuccess(`Copied ${examplePath} to ${targetPath}.`);
      return { copied: true, error: false }; // Indicates file was copied
    }
    catch (error)
    {
      logError(`Failed to copy ${examplePath} to ${targetPath}: ${error.message}`);
      return { copied: false, error: true }; // Indicate error
    }
  }
  else
  {
    logInfo(`${targetPath} already exists.`);
    return { copied: false, error: false }; // Indicates file already existed
  }
}

const secrets = new Map();
async function replacePlaceholders(filePath)
{
  logInfo(`Checking for placeholders in ${filePath}...`);
  try
  {
    let content = await Bun.file(filePath).text();
    const placeholderRegex = /{{\s*([A-Z0-9_]+)\s*}}/g;
    const placeholders = [...content.matchAll(placeholderRegex)];
    const uniquePlaceholderNames = [...new Set(placeholders.map(match => match[1]))];

    if (uniquePlaceholderNames.length === 0)
    {
      logInfo(`No placeholders found in ${filePath}.`);
      return true; // Indicate success (no action needed)
    }

    logInfo(`Found placeholders: ${uniquePlaceholderNames.join(', ')}. Generating secrets...`);
    for (const name of uniquePlaceholderNames)
    {
      // Reuse existing secret if already generated for another file in this run
      if (!secrets.has(name))
      {
        secrets.set(name, await generateSecureRandomString());
      }
    }

    let replacementsMade = false;
    content = content.replace(placeholderRegex, (match, name) =>
    {
      const secret = secrets.get(name);
      if (secret)
      {
        replacementsMade = true;
        return secret;
      }
      return match; // Return original match if name not found (shouldn't happen with current logic)
    });

    if (replacementsMade)
    {
      await Bun.write(filePath, content);
      logSuccess(`Replaced placeholders in ${filePath} with generated secrets.`);
    }
    else
    {
      logInfo(`No placeholder values needed replacement in ${filePath}.`);
    }
    return true; // Indicate success

  }
  catch (error)
  {
    logError(`Failed to process placeholders in ${filePath}: ${error.message}`);
    return false; // Indicate failure
  }
}

async function ensureDockerComposeRunning()
{
  logInfo('Ensuring Docker Compose services are running...');
  // Check if docker-compose.yml exists first
  const composeFileExists = await Bun.file(DOCKER_COMPOSE_FILE).exists();
  if (!composeFileExists)
  {
    logWarn(`Skipping Docker Compose setup because ${DOCKER_COMPOSE_FILE} does not exist.`);
    return true; // Not an error, just skipping
  }

  try
  {
    logInfo(`Running docker compose -f ${DOCKER_COMPOSE_FILE} up -d...`);
    const proc = Bun.spawn({
      cmd: ['docker', 'compose', '-f', DOCKER_COMPOSE_FILE, 'up', '-d'],
      stdout: 'inherit', // Pipe output to the setup script's stdout
      stderr: 'inherit', // Pipe errors to the setup script's stderr
    });

    const exitCode = await proc.exited;

    if (exitCode === 0)
    {
      logSuccess('Docker Compose services started successfully (or were already running).');
      return true; // Indicate success
    }
    else
    {
      logError(`Docker Compose command failed with exit code ${exitCode}.`);
      return false; // Indicate failure
    }
  }
  catch (error)
  {
    logError(`Failed to run Docker Compose: ${error.message}`);
    logError('Please ensure Docker is installed, running, and you have permissions to use it.');
    return false; // Indicate failure
  }
}

async function main()
{
  let overallSuccess = true;
  console.log(c('aqua') + ` _____ _ _ _____ _ _
 / ____| | | | | __ \\ (_) | |
| (___ | |_ _ _| | ___| |__) |__ _ _ __ | |_
 \\___ \\| __| | | | |/ _ \\ ___/ _ \\| | '_ \\| __|
 ____) | |_| |_| | | __/ | | (_) | | | | | |_
|_____/ \\__|\\__, |_|\\___|_| \\___/|_|_| |_|\\__|
             __/ |
            |___/ ` + c('reset'));

  logInfo('Starting project setup validation...');

  // Ensure .env file exists and replace placeholders if copied
  const envResult = await ensureFileFromExample(ENV_FILE, ENV_EXAMPLE_FILE);
  if (envResult.error)
  {
    overallSuccess = false;
  }
  else if (envResult.copied)
  {
    if (!await replacePlaceholders(ENV_FILE))
    {
      overallSuccess = false;
    }
  }

  // Ensure docker-compose.yml exists and replace placeholders if copied
  const composeResult = await ensureFileFromExample(DOCKER_COMPOSE_FILE, DOCKER_COMPOSE_EXAMPLE_FILE);
  if (composeResult.error)
  {
    overallSuccess = false;
  }
  else if (composeResult.copied)
  {
    if (!await replacePlaceholders(DOCKER_COMPOSE_FILE))
    {
      overallSuccess = false;
    }
  }

  // Only attempt to run docker compose if the previous steps were generally successful
  // and the compose file actually exists now.
  if (overallSuccess && await Bun.file(DOCKER_COMPOSE_FILE).exists())
  {
    if (!await ensureDockerComposeRunning())
    {
      overallSuccess = false;
    }
  }
  else if (!await Bun.file(DOCKER_COMPOSE_FILE).exists())
  {
    logWarn(`Skipping Docker Compose execution as ${DOCKER_COMPOSE_FILE} is missing.`);
    // Not necessarily a failure if the example file was also missing.
  }

  if (overallSuccess)
  {
    logSuccess('Project setup validation completed successfully.');
  }
  else
  {
    logError('Project setup validation failed. Please check the logs above.');
    process.exit(1); // Exit with error code if any step failed
  }
}

main().catch(err =>
{
  logError(`Unhandled error during setup: ${err.message}\n${err.stack}`);
  process.exit(1);
});
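The placeholder replacement above is the heart of the setup script. A standalone sketch of the same regex-driven logic, with a deterministic stand-in for the random secret generation so the behaviour is easy to see:

```javascript
// Same token syntax as project-setup.js: {{NAME}}, uppercase letters, digits, underscores.
const placeholderRegex = /{{\s*([A-Z0-9_]+)\s*}}/g;

// Replace each distinct token with one value, reusing it across calls
// (mirrors the script's shared `secrets` map, so .env and docker-compose.yml agree).
function fillPlaceholders(content, secrets) {
  return content.replace(placeholderRegex, (match, name) => {
    if (!secrets.has(name)) {
      secrets.set(name, 'secret_' + name.toLowerCase()); // stand-in for randomBytes(32).toString('hex')
    }
    return secrets.get(name);
  });
}

const secrets = new Map();
const env = fillPlaceholders('PASSWORD={{POSTGRES_PASSWORD_REPLACE_ME}}', secrets);
const compose = fillPlaceholders('POSTGRES_PASSWORD: "{{ POSTGRES_PASSWORD_REPLACE_ME }}"', secrets);
// Both outputs contain the same generated value for the shared token.
```

Note the `\s*` in the pattern: tokens tolerate inner whitespace, so `{{ NAME }}` and `{{NAME}}` resolve to the same secret.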
@ -2,6 +2,7 @@
|
|||
// https://v2.quasar.dev/quasar-cli-vite/quasar-config-file
|
||||
|
||||
import { defineConfig } from '#q-app/wrappers';
|
||||
import { mergeConfig } from 'vite';
|
||||
|
||||
export default defineConfig((/* ctx */) =>
|
||||
{
|
||||
|
@ -61,7 +62,17 @@ export default defineConfig((/* ctx */) =>
|
|||
// polyfillModulePreload: true,
|
||||
// distDir
|
||||
|
||||
// extendViteConf (viteConf) {},
|
||||
extendViteConf(viteConf)
|
||||
{
|
||||
viteConf.server = mergeConfig(viteConf.server, {
|
||||
watch: {
|
||||
ignored: [
|
||||
(path) => path.includes('postgres') || path.includes('minio')
|
||||
],
|
||||
ignorePermissionErrors: true
|
||||
},
|
||||
});
|
||||
},
|
||||
// viteVuePluginOptions: {},
|
||||
|
||||
// vitePlugins: [
|
||||
|
@ -81,11 +92,12 @@ export default defineConfig((/* ctx */) =>
|
|||
devServer: {
|
||||
// https: true,
|
||||
open: true, // opens browser window automatically
|
||||
port: 9100,
|
||||
|
||||
//Add a proxy from /api to the backend server for dev usage
|
||||
proxy: {
|
||||
'/api': {
|
||||
target : 'http://localhost:8000',
|
||||
target : 'http://localhost:9101',
|
||||
changeOrigin: true
|
||||
}
|
||||
}
|
||||
|
|
@@ -1,13 +1,15 @@
import express from 'express';
import { PrismaClient } from '@prisma/client'; // Import Prisma Client
import { v4 as uuidv4 } from 'uuid'; // Import uuid for unique filenames
import { getMantisSettings, saveTicketToDatabase } from '../services/mantisDownloader.js';
import axios from 'axios';
import reader from '@kenjiuno/msgreader';
import MsgReader from '@kenjiuno/msgreader';
import { askGemini } from '../utils/gemini.js';
import { usernameMap } from '../services/mantisSummarizer.js';
const MsgReader = reader.default;
import { getS3Client } from '../utils/s3.js';
import { getUserById } from './auth.js';

import prisma from '../database.js';

const prisma = new PrismaClient(); // Instantiate Prisma Client
const router = express.Router();

// Helper function to fetch distinct values
@@ -186,11 +188,18 @@ router.get('/:id', async(req, res) =>

  try
  {
    const issue = await prisma.mantisIssue.findUnique({
    let issue = await prisma.mantisIssue.findUnique({ // Changed const to let
      where: { id: issueId },
      include: {
        comments: { // Include comments
          orderBy: { createdAt: 'asc' }, // Keep original order for comments unless preference changes it
          include: { attachments: true } // And include attachments for each comment
        },
        files: { // Include user-uploaded files
          orderBy: { uploadedAt: 'desc' }
        },
        notes: { // Include internal notes
          orderBy: { createdAt: 'desc' } // Show newest notes first
        }
      }
    });
@@ -198,22 +207,50 @@
    if (!issue)
    {
      //Try to download the issue from Mantis
      const data = await saveTicketToDatabase(issueId);
      let data;
      try
      {
        data = await saveTicketToDatabase(issueId);
      }
      catch (downloadError)
      {
        console.error(`Error downloading ticket ${issueId} from Mantis:`, downloadError.message);
        // Don't immediately return 404, maybe it exists locally but download failed
      }

      if (!data)
      {
        return res.status(404).json({ error: 'Mantis issue not found' });
        // Check if it exists locally even if download failed or wasn't attempted
        const localIssue = await prisma.mantisIssue.findUnique({ where: { id: issueId } });
        if (!localIssue)
        {
          return res.status(404).json({ error: 'Mantis issue not found locally or via download.' });
        }
      }

      // Fetch the issue again from the database
      const issue = await prisma.mantisIssue.findUnique({
      // Fetch the issue again from the database (it might have been created by saveTicketToDatabase)
      issue = await prisma.mantisIssue.findUnique({ // Assign to issue
        where: { id: issueId },
        include: {
          comments: { // Include comments
            orderBy: { createdAt: 'asc' },
            include: { attachments: true } // And include attachments for each comment
          },
          files: { // Include user-uploaded files
            orderBy: { uploadedAt: 'desc' }
          },
          notes: { // Include internal notes
            orderBy: { createdAt: 'desc' }
          }
        }
      });

      // Check again if issue is still null after attempting download/check
      if (!issue)
      {
        return res.status(404).json({ error: 'Mantis issue not found after attempting download/check.' });
      }
    }

    res.json(issue);
@@ -225,6 +262,191 @@ router.get('/:id', async(req, res) =>
  }
});

// REMOVE OLD UPLOAD ROUTE
// POST /mantis/:id/files - Upload a file for a Mantis issue
// router.post('/:id/files', upload.single('file'), async(req, res) => { ... });


// NEW ROUTE: Generate Presigned URL for Upload
router.post('/:id/files/presign', async(req, res) =>
{
  const { id } = req.params;
  const issueId = parseInt(id, 10);
  const { filename, filetype } = req.body; // Expect filename and filetype from client

  if (isNaN(issueId))
  {
    return res.status(400).json({ error: 'Invalid issue ID format' });
  }
  if (!filename || !filetype)
  {
    return res.status(400).json({ error: 'Missing filename or filetype in request body' });
  }

  // Sanitize filename (optional, but recommended)
  const safeFilename = filename.replace(/[^a-zA-Z0-9._-]/g, '_');
  const fileKey = `mantis/${issueId}/${uuidv4()}/${safeFilename}`; // Unique key

  try
  {
    // 1. Check if the Mantis issue exists
    const issueExists = await prisma.mantisIssue.findUnique({
      where: { id: issueId },
      select: { id: true } // Only select id for efficiency
    });

    if (!issueExists)
    {
      return res.status(404).json({ error: 'Mantis issue not found' });
    }

    const s3Client = await getS3Client();

    // 2. Generate a presigned URL for the file upload
    const presignedUrl = await s3Client.file(fileKey).presign({
      method: 'PUT',
      type: filetype, // Use the provided filetype
      expiresIn: 3600 // URL expires in 1 hour
    });

    // 3. Return the URL and the key to the client
    res.status(200).json({ presignedUrl, fileKey });

  }
  catch (error)
  {
    console.error(`Error generating presigned URL for Mantis issue ${issueId}:`, error);
    res.status(500).json({ error: `Failed to generate presigned URL. ${error.message}` });
  }
});
|
||||
|
||||
// NEW ROUTE: Confirm Upload and Save Metadata
router.post('/:id/files/confirm', async(req, res) =>
{
  const { id } = req.params;
  const issueId = parseInt(id, 10);

  const user = await getUserById(req.session.loggedInUserId);
  const { fileKey, filename, mimeType, size /*, description */ } = req.body; // Expect details from client

  if (isNaN(issueId))
  {
    return res.status(400).json({ error: 'Invalid issue ID format' });
  }
  if (!fileKey || !filename || !mimeType || size === undefined)
  {
    return res.status(400).json({ error: 'Missing required file details (fileKey, filename, mimeType, size)' });
  }

  try
  {
    // Optional: Verify file exists in S3 (requires HEAD request capability in S3 client or separate SDK call)
    // const s3Client = await getS3Client();
    // const s3file = s3Client.file(fileKey);
    // try {
    //   await s3file.head(); // Or equivalent method to check existence/metadata
    // } catch (s3Error) {
    //   console.error(`File not found in S3 or error checking: ${fileKey}`, s3Error);
    //   return res.status(400).json({ error: 'File upload confirmation failed: File not found in S3.' });
    // }

    // Save file metadata to database
    const mantisFile = await prisma.mantisFile.create({
      data: {
        mantisIssueId: issueId,
        filename: filename,
        fileKey: fileKey,
        mimeType: mimeType,
        size: parseInt(size, 10), // Ensure size is an integer
        uploadedBy: user?.username || 'unknown', // Get username from authenticated user
        // description: description || null,
      },
    });

    res.status(201).json(mantisFile); // Return the created DB record
  }
  catch (error)
  {
    console.error(`Error confirming upload for Mantis issue ${issueId}, fileKey ${fileKey}:`, error);
    // If the error is due to duplicate fileKey or other constraint, handle appropriately
    if (error.code === 'P2002')
    { // Example: Prisma unique constraint violation
      return res.status(409).json({ error: 'File metadata already exists or conflict.' });
    }
    res.status(500).json({ error: `Failed to confirm upload. ${error.message}` });
  }
});

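// Taken together, the presign and confirm routes give clients a two-step upload
// flow: request a presigned PUT URL, upload the bytes directly to S3, then
// confirm so the metadata row is written. A minimal client-side sketch of that
// flow (hypothetical helper, not part of this commit; `api` is an injected
// transport with post/put methods so the sequence can be exercised in isolation):

```javascript
// Two-step upload against the routes above.
// `api.post(url, body)` returns parsed JSON; `api.put(url, body, type)` sends raw bytes.
async function uploadViaPresign(api, issueId, file)
{
  // 1. Ask the backend for a presigned URL and the generated file key
  const { presignedUrl, fileKey } = await api.post(`/api/mantis/${issueId}/files/presign`, {
    filename: file.name,
    filetype: file.type
  });

  // 2. Upload the file body straight to S3 (the server never handles the bytes)
  await api.put(presignedUrl, file.data, file.type);

  // 3. Confirm, so the server creates the MantisFile record
  return api.post(`/api/mantis/${issueId}/files/confirm`, {
    fileKey,
    filename: file.name,
    mimeType: file.type,
    size: file.size
  });
}
```

// Uploading directly to S3 keeps large request bodies off the Express server;
// the confirm step exists because S3 cannot write the database row itself.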
// GET /mantis/:id/files - List files for a Mantis issue
router.get('/:id/files', async(req, res) =>
{
  const { id } = req.params;
  const issueId = parseInt(id, 10);

  if (isNaN(issueId))
  {
    return res.status(400).json({ error: 'Invalid issue ID format' });
  }

  try
  {
    const files = await prisma.mantisFile.findMany({
      where: { mantisIssueId: issueId },
      orderBy: { uploadedAt: 'desc' },
    });
    res.json(files);
  }
  catch (error)
  {
    console.error(`Error fetching files for Mantis issue ${issueId}:`, error);
    res.status(500).json({ error: 'Failed to fetch files' });
  }
});

// GET /mantis/files/:fileId/download - Download a specific file
router.get('/files/:fileId/download', async(req, res) =>
{
  const { fileId } = req.params;
  const id = parseInt(fileId, 10);

  if (isNaN(id))
  {
    return res.status(400).json({ error: 'Invalid file ID format' });
  }

  try
  {
    const fileRecord = await prisma.mantisFile.findUnique({
      where: { id: id },
    });

    if (!fileRecord)
    {
      return res.status(404).json({ error: 'File not found' });
    }

    const s3Client = await getS3Client();

    const presignedUrl = await s3Client.file(fileRecord.fileKey).presign({
      method: 'GET',
      expiresIn: 3600 // URL expires in 1 hour
    });

    // Redirect to the presigned URL
    res.redirect(307, presignedUrl);
  }
  catch (error)
  {
    console.error(`Error preparing file download for ID ${id}:`, error);
    if (!res.headersSent)
    {
      res.status(500).json({ error: `Failed to download file. ${error.message}` });
    }
  }
});

router.get('/attachment/:ticketId/:attachmentId', async(req, res) =>
{

@@ -247,7 +469,7 @@ router.get('/attachment/:ticketId/:attachmentId', async(req, res) =>
  }

  const buffer = Buffer.from(attachment.content, 'base64');
  res.setHeader('Content-Type', attachment.content_type);
  res.setHeader('Content-Type', attachment.content_type); // Use content_type from Mantis API
  res.setHeader('Content-Disposition', `attachment; filename="${attachment.filename}"`);
  res.setHeader('Content-Length', buffer.length);
  res.send(buffer);

@@ -255,7 +477,15 @@ router.get('/attachment/:ticketId/:attachmentId', async(req, res) =>
  catch (error)
  {
    console.error('Error fetching Mantis attachment:', error.message);
    res.status(500).json({ error: 'Failed to fetch Mantis attachment' });
    // Check if the error is from Axios and has a response status
    if (error.response && error.response.status === 404)
    {
      res.status(404).json({ error: 'Attachment not found on Mantis server' });
    }
    else
    {
      res.status(500).json({ error: 'Failed to fetch Mantis attachment' });
    }
  }
});

@@ -281,17 +511,22 @@ router.get('/msg-extract/:ticketId/:attachmentId', async(req, res) =>

  const buffer = Buffer.from(attachment.content, 'base64');

  console.log(MsgReader);

  const reader = new MsgReader(buffer);
  const msg = reader.getFileData();
  const msgReader = new MsgReader(buffer);
  const msg = msgReader.getFileData();

  res.status(200).json(msg);
}
catch (error)
{
  console.error('Error fetching Mantis attachment:', error.message);
  res.status(500).json({ error: 'Failed to fetch Mantis attachment' });
  console.error('Error fetching or parsing Mantis MSG attachment:', error.message);
  if (error.response && error.response.status === 404)
  {
    res.status(404).json({ error: 'Attachment not found on Mantis server' });
  }
  else
  {
    res.status(500).json({ error: 'Failed to fetch or parse Mantis MSG attachment' });
  }
}
});

@@ -300,6 +535,12 @@ router.get('/msg-extract/:ticketId/:attachmentId/:innerAttachmentId', async(req,
  const { url, headers } = await getMantisSettings();

  const { ticketId, attachmentId, innerAttachmentId } = req.params;
  const innerIndex = parseInt(innerAttachmentId, 10); // Ensure index is a number

  if (isNaN(innerIndex))
  {
    return res.status(400).json({ error: 'Invalid inner attachment ID format' });
  }

  const attachmentUrl = `${url}/issues/${ticketId}/files/${attachmentId}`;

@@ -320,25 +561,55 @@ router.get('/msg-extract/:ticketId/:attachmentId/:innerAttachmentId', async(req,
  const reader = new MsgReader(buffer);
  const msg = reader.getFileData();

  // Find the inner attachment
  const innerAttachment = msg.attachments[innerAttachmentId];

  if (!innerAttachment)
  // Find the inner attachment by index
  if (!msg || !msg.attachments || innerIndex < 0 || innerIndex >= msg.attachments.length)
  {
    return res.status(404).json({ error: 'Inner attachment not found' });
    return res.status(404).json({ error: 'Inner attachment not found at the specified index' });
  }
  const innerAttachment = msg.attachments[innerIndex];

  const attachmentData = reader.getAttachment(innerAttachment); // Or reader.getAttachment(innerIndex) if that's the API

  // Assuming attachmentData.content is base64 encoded if it's binary
  // Check the structure of attachmentData - it might already be a buffer
  let innerBuffer;
  if (Buffer.isBuffer(attachmentData.content))
  {
    innerBuffer = attachmentData.content;
  }
  else if (typeof attachmentData.content === 'string')
  {
    // Attempt base64 decoding if it's a string, might need adjustment based on actual content
    innerBuffer = Buffer.from(attachmentData.content, 'base64');
  }
  else
  {
    console.error('Unexpected inner attachment content type:', typeof attachmentData.content);
    return res.status(500).json({ error: 'Could not process inner attachment content' });
  }

  const attachmentData = reader.getAttachment(innerAttachment);
  // Determine Content-Type if possible, otherwise use a default
  // The msgreader library might provide a mime type, check innerAttachment properties
  const mimeType = innerAttachment.mimeType || 'application/octet-stream'; // Example fallback

  const innerBuffer = Buffer.from(attachmentData.content, 'base64');
  res.setHeader('Content-Disposition', `attachment; filename="${innerAttachment.fileName}"`);
  res.setHeader('Content-Type', mimeType);
  res.setHeader('Content-Disposition', `attachment; filename="${innerAttachment.fileName}"`); // Use fileName from inner attachment
  res.setHeader('Content-Length', innerBuffer.length);

  res.status(200).send(innerBuffer);
}
catch (error)
{
  console.error('Error fetching Mantis attachment:', error.message);
  res.status(500).json({ error: 'Failed to fetch Mantis attachment' });
  console.error('Error fetching or processing inner MSG attachment:', error.message);
  if (error.response && error.response.status === 404)
  {
    res.status(404).json({ error: 'Outer attachment not found on Mantis server' });
  }
  else
  {
    res.status(500).json({ error: 'Failed to fetch or process inner MSG attachment' });
  }
}
});

@@ -356,6 +627,7 @@ router.get('/stats/issues', async(req, res) =>
  startDate.setHours(0, 0, 0, 0); // Start of the day

  // Query for daily counts of issues created in the last 7 days
  // Ensure table and column names match Prisma schema (case-sensitive in raw queries sometimes)
  const dailyIssues = await prisma.$queryRaw`
    SELECT
      DATE(created_at) as date,

@@ -370,7 +642,14 @@ router.get('/stats/issues', async(req, res) =>
      date ASC
  `;

  res.status(200).json(dailyIssues);
  // Convert count to number as BigInt might not serialize correctly
  const result = dailyIssues.map(row => ({
    date: row.date,
    count: Number(row.count) // Convert BigInt to Number
  }));

  res.status(200).json(result);
}
catch (error)
{

@@ -407,7 +686,13 @@ router.get('/stats/comments', async(req, res) =>
      date ASC
  `;

  res.status(200).json(dailyComments);
  // Convert count to number as BigInt might not serialize correctly
  const result = dailyComments.map(row => ({
    date: row.date,
    count: Number(row.count) // Convert BigInt to Number
  }));

  res.status(200).json(result);
}
catch (error)
{

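// The Number(row.count) mapping in the stats routes is not cosmetic: Postgres
// COUNT(*) comes back from $queryRaw as a BigInt, and JSON.stringify throws on
// BigInt values. A small standalone sketch of the failure and the fix:

```javascript
// Shape of a row as returned by the raw aggregate query (count is a BigInt)
const rows = [{ date: '2024-01-01', count: 42n }];

// Serializing the raw rows fails with a TypeError
let failed = false;
try { JSON.stringify(rows); }
catch (e) { failed = true; } // "Do not know how to serialize a BigInt"

// Mapping BigInt -> Number (safe here, counts are far below 2^53) fixes it
const safe = rows.map(row => ({ date: row.date, count: Number(row.count) }));
const body = JSON.stringify(safe);
```

// This is why the routes build `result` before calling res.json().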
@@ -419,13 +704,21 @@ router.get('/stats/comments', async(req, res) =>
router.get('/summary/:ticketId', async(req, res) =>
{
  const { ticketId } = req.params;
  const id = parseInt(ticketId, 10);

  if (isNaN(id))
  {
    return res.status(400).json({ error: 'Invalid ticket ID format' });
  }

  try
  {
    const ticket = await prisma.mantisIssue.findUnique({
      where: { id: parseInt(ticketId, 10) },
      where: { id: id },
      include: {
        comments: true,
        comments: {
          orderBy: { createdAt: 'asc' } // Ensure comments are ordered for summary
        },
      },
    });

@@ -443,7 +736,7 @@ router.get('/summary/:ticketId', async(req, res) =>
    });

    // Ask Gemini to summarize the ticket
    const summary = await askGemini(`Please summarize the following Mantis ticket in the form of a markdown list of bullet points formatted as "[Date] Point" (ensure a newline between each point, format the date as DD/MM/YYY and surround it with square brackets "[]"). Then after your summary, list any outstanding actions as a markdown list in the format "[Name] Action" (again surrounding the name with square brackets).
    const summary = await askGemini(`Please summarize the following Mantis ticket in the form of a markdown list of bullet points formatted as "[Date] Point" (ensure a newline between each point, format the date as DD/MM/YYYY and surround it with square brackets "[]"). Then after your summary, list any outstanding actions as a markdown list in the format "[Name] Action" (again surrounding the name with square brackets).
    Output a heading 6 "Summary:", a newline, the summary, then two newlines, a heading 6 "Actions:" then the actions. Do not wrap the output in a code block.\n\n### Ticket Data ###\n\n` + JSON.stringify(ticket, null, 2));
    res.status(200).json({ summary });
  }

@@ -454,4 +747,56 @@ router.get('/summary/:ticketId', async(req, res) =>
  }
});

// NEW ROUTE: Add an internal note to a Mantis issue
router.post('/:id/notes', async(req, res) =>
{
  const { id } = req.params;
  const issueId = parseInt(id, 10);
  const { content } = req.body;
  const user = await getUserById(req.session.loggedInUserId); // Assumes user is logged in

  if (isNaN(issueId))
  {
    return res.status(400).json({ error: 'Invalid issue ID format' });
  }
  if (!content)
  {
    return res.status(400).json({ error: 'Note content cannot be empty' });
  }
  if (!user)
  {
    return res.status(401).json({ error: 'User not authenticated' });
  }

  try
  {
    // 1. Check if the Mantis issue exists
    const issueExists = await prisma.mantisIssue.findUnique({
      where: { id: issueId },
      select: { id: true }
    });

    if (!issueExists)
    {
      return res.status(404).json({ error: 'Mantis issue not found' });
    }

    // 2. Create the new note
    const newNote = await prisma.mantisNote.create({
      data: {
        mantisIssueId: issueId,
        content: content,
        createdBy: user.username, // Store the username of the creator
      },
    });

    res.status(201).json(newNote); // Return the created note
  }
  catch (error)
  {
    console.error(`Error adding note to Mantis issue ${issueId}:`, error);
    res.status(500).json({ error: `Failed to add note. ${error.message}` });
  }
});

export default router;

@@ -36,10 +36,16 @@ dotenv.config();

const httpLogger = pinoHttp({ logger });

// Define host and port with defaults
const HOST = process.env.HOST || '0.0.0.0'; // Listen on all interfaces by default
const PORT = parseInt(process.env.BACKEND_PORT || '9101', 10);
const FRONTEND_PORT = parseInt(process.env.FRONTEND_PORT || '9100', 10);

// Define Relying Party details (Update with your actual details)
export const rpID = process.env.NODE_ENV === 'production' ? 'stylepoint.uk' : 'localhost';
export const rpName = 'StylePoint';
export const origin = process.env.NODE_ENV === 'production' ? `https://${rpID}` : `http://${rpID}:9000`;
// Use the configured PORT for the origin URL
export const origin = process.env.NODE_ENV === 'production' ? `https://${rpID}` : `http://${rpID}:${FRONTEND_PORT}`;

export const challengeStore = new Map();

@@ -130,9 +136,10 @@ if (process.env.PROD)

app.use(express.static('public', { index: false }));

app.listen(8000, () =>
app.listen(PORT, HOST, () =>
{
  logger.info('Server is running on http://localhost:8000');
  // Use the configured HOST and PORT in the log message
  logger.info(`Server is running on http://${HOST}:${PORT}`);

  setupMantisDownloader();
});
166
src-server/utils/s3.js
Normal file

@@ -0,0 +1,166 @@
import { S3Client } from 'bun';

import { getSetting } from './settings';

let s3Client = null;

export async function getS3Client()
{
  if (s3Client)
  {
    return s3Client;
  }

  const s3AccessKey = await getSetting('S3_ACCESS_KEY_ID');
  const s3SecretKey = await getSetting('S3_SECRET_ACCESS_KEY');
  const s3Endpoint = await getSetting('S3_ENDPOINT');
  const s3Bucket = await getSetting('S3_BUCKET_NAME');

  if (s3AccessKey && s3SecretKey && s3Endpoint && s3Bucket)
  {
    s3Client = new S3Client({
      endpoint: s3Endpoint,
      accessKeyId: s3AccessKey,
      secretAccessKey: s3SecretKey,
      bucket: s3Bucket,
    });
  }
  else
  {
    throw new Error('S3 settings are not configured properly.');
  }
  return s3Client;
}
/* S3Client documentation

Working with S3 Files

The file method in S3Client returns a lazy reference to a file on S3.

// A lazy reference to a file on S3
const s3file: S3File = client.file("123.json");

Like Bun.file(path), the S3Client's file method is synchronous. It does zero network requests until you call a method that depends on a network request.

Reading files from S3

If you've used the fetch API, you're familiar with the Response and Blob APIs. S3File extends Blob. The same methods that work on Blob also work on S3File.

// Read an S3File as text
const text = await s3file.text();

// Read an S3File as JSON
const json = await s3file.json();

// Read an S3File as an ArrayBuffer
const buffer = await s3file.arrayBuffer();

// Get only the first 1024 bytes
const partial = await s3file.slice(0, 1024).text();

// Stream the file
const stream = s3file.stream();
for await (const chunk of stream) {
  console.log(chunk);
}

Memory optimization

Methods like text(), json(), bytes(), or arrayBuffer() avoid duplicating the string or bytes in memory when possible.

If the text happens to be ASCII, Bun directly transfers the string to JavaScriptCore (the engine) without transcoding and without duplicating the string in memory. When you use .bytes() or .arrayBuffer(), it will also avoid duplicating the bytes in memory.

These helper methods not only simplify the API, they also make it faster.

Writing & uploading files to S3

Writing to S3 is just as simple.

// Write a string (replacing the file)
await s3file.write("Hello World!");

// Write a Buffer (replacing the file)
await s3file.write(Buffer.from("Hello World!"));

// Write a Response (replacing the file)
await s3file.write(new Response("Hello World!"));

// Write with content type
await s3file.write(JSON.stringify({ name: "John", age: 30 }), {
  type: "application/json",
});

// Write using a writer (streaming)
const writer = s3file.writer({ type: "application/json" });
writer.write("Hello");
writer.write(" World!");
await writer.end();

// Write using Bun.write
await Bun.write(s3file, "Hello World!");

Working with large files (streams)

Bun automatically handles multipart uploads for large files and provides streaming capabilities. The same API that works for local files also works for S3 files.

// Write a large file
const bigFile = Buffer.alloc(10 * 1024 * 1024); // 10MB
const writer = s3file.writer({
  // Automatically retry on network errors up to 3 times
  retry: 3,

  // Queue up to 10 requests at a time
  queueSize: 10,

  // Upload in 5 MB chunks
  partSize: 5 * 1024 * 1024,
});
for (let i = 0; i < 10; i++) {
  await writer.write(bigFile);
}
await writer.end();

Presigning URLs

When your production service needs to let users upload files to your server, it's often more reliable for the user to upload directly to S3 instead of your server acting as an intermediary.

To facilitate this, you can presign URLs for S3 files. This generates a URL with a signature that allows a user to securely upload that specific file to S3, without exposing your credentials or granting them unnecessary access to your bucket.

The default behaviour is to generate a GET URL that expires in 24 hours. Bun attempts to infer the content type from the file extension. If inference is not possible, it will default to application/octet-stream.

import { s3 } from "bun";

// Generate a presigned URL that expires in 24 hours (default)
const download = s3.presign("my-file.txt"); // GET, text/plain, expires in 24 hours

const upload = s3.presign("my-file", {
  expiresIn: 3600, // 1 hour
  method: "PUT",
  type: "application/json", // No extension for inferring, so we can specify the content type to be JSON
});

// You can call .presign() if on a file reference, but avoid doing so
// unless you already have a reference (to avoid memory usage).
const myFile = s3.file("my-file.txt");
const presignedFile = myFile.presign({
  expiresIn: 3600, // 1 hour
});

Setting ACLs

To set an ACL (access control list) on a presigned URL, pass the acl option:

const url = s3file.presign({
  acl: "public-read",
  expiresIn: 3600,
});

You can pass any of the following ACLs:

ACL                           Explanation
"public-read"                 The object is readable by the public.
"private"                     The object is readable only by the bucket owner.
"public-read-write"           The object is readable and writable by the public.
"authenticated-read"          The object is readable by the bucket owner and authenticated users.
"aws-exec-read"               The object is readable by the AWS account that made the request.
"bucket-owner-read"           The object is readable by the bucket owner.
"bucket-owner-full-control"   The object is readable and writable by the bucket owner.
"log-delivery-write"          The object is writable by AWS services used for log delivery.
*/
@@ -7,8 +7,6 @@ export async function getSetting(key)
    select: { value: true }
  });

  console.log(`getSetting(${key})`, setting);

  return setting?.value ? JSON.parse(setting.value) : null;
}

@@ -292,20 +292,264 @@
        name="files"
        class="q-pa-md"
      >
        <!-- Content for Files tab goes here -->
        <div class="text-grey">
          Files content will be added here.
        <div class="row">
          <!-- File Upload Section -->
          <div class="q-mb-md q-mr-sm col">
            <q-file
              v-model="filesToUpload"
              label="Select files to upload"
              multiple
              outlined
              clearable
              :disable="uploading"
              @update:model-value="uploadFiles"
            >
              <template #append>
                <q-btn
                  round
                  dense
                  flat
                  icon="cloud_upload"
                  @click="uploadFiles"
                  :disable="!filesToUpload || filesToUpload.length === 0 || uploading"
                  :loading="uploading"
                >
                  <q-tooltip>Upload Selected Files</q-tooltip>
                </q-btn>
              </template>
            </q-file>
            <q-linear-progress
              v-if="uploading"
              :value="uploadProgress"
              class="q-mt-sm"
              animation-speed="300"
              color="primary"
            />
            <div
              v-if="uploading && currentUploadingFile"
              class="text-caption q-mt-xs text-grey"
            >
              Uploading: {{ currentUploadingFile }} ({{ currentFileIndex + 1 }} of {{ totalFilesToUpload }})
            </div>
          </div>

          <!-- Search Input for Files Table -->
          <q-input
            v-model="filesSearchTerm"
            outlined
            placeholder="Search files..."
            clearable
            class="q-mb-md q-ml-sm col"
          >
            <template #prepend>
              <q-icon name="search" />
            </template>
          </q-input>
        </div>

        <!-- Files Table -->
        <q-table
          :rows="filteredFiles"
          :columns="filesTableColumns"
          row-key="id"
          flat
          bordered
          :rows-per-page-options="[10, 25, 50, 0]"
          :loading="loading"
          no-data-label="No files found for this ticket."
          class="files-table q-mb-xl"
        >
          <template #body="fileProps">
            <q-tr :props="fileProps">
              <q-td
                key="filename"
                :props="fileProps"
                style="max-width: 300px; overflow-wrap: break-word; white-space: normal;"
              >
                <a
                  :href="`/api/mantis/files/${fileProps.row.id}/download`"
                  target="_blank"
                  class="text-primary"
                >
                  {{ fileProps.row.filename }}
                </a>
                <!-- Inline Previews -->
                <div
                  v-if="isImageFile(fileProps.row.filename)"
                  class="q-mt-sm"
                >
                  <img
                    :src="`/api/mantis/files/${fileProps.row.id}/download`"
                    style="max-width: 100%; max-height: 100px; border-radius: 4px; cursor: pointer;"
                    @click="openImageFullscreen(`/api/mantis/files/${fileProps.row.id}/download`, fileProps.row.filename)"
                    alt="Image Preview"
                  >
                </div>
                <div
                  v-else-if="isVideoFile(fileProps.row.filename)"
                  class="q-mt-sm"
                >
                  <video
                    controls
                    preload="metadata"
                    :src="`/api/mantis/files/${fileProps.row.id}/download#t=0.1`"
                    style="max-width: 100%; max-height: 150px; border-radius: 4px;"
                  >
                    Your browser does not support the video tag.
                  </video>
                </div>
                <div
                  v-else-if="isAudioFile(fileProps.row.filename)"
                  class="q-mt-sm"
                >
                  <audio
                    controls
                    preload="metadata"
                    :src="`/api/mantis/files/${fileProps.row.id}/download`"
                    style="width: 100%;"
                  >
                    Your browser does not support the audio element.
                  </audio>
                </div>
              </q-td>
              <q-td
                key="uploadedBy"
                :props="fileProps"
              >
                {{ fileProps.row.uploadedBy }}
              </q-td>
              <q-td
                key="uploadedAt"
                :props="fileProps"
              >
                {{ new Date(fileProps.row.uploadedAt).toLocaleString() }}
              </q-td>
              <q-td
                key="size"
                :props="fileProps"
              >
                {{ formatFileSize(fileProps.row.size) }}
              </q-td>
              <q-td
                key="actions"
                :props="fileProps"
                class="text-center"
              >
                <q-btn
                  flat
                  round
                  dense
                  color="primary"
                  icon="download"
                  :href="`/api/mantis/files/${fileProps.row.id}/download`"
                  target="_blank"
                >
                  <q-tooltip>Download File</q-tooltip>
                </q-btn>
              </q-td>
            </q-tr>
          </template>
          <template #no-data="{ message }">
            <div class="full-width row flex-center text-grey q-gutter-sm q-pa-md">
              <q-icon
                size="2em"
                name="cloud_off"
              />
              <span>
                {{ filesSearchTerm ? 'No files match your search.' : message }}
              </span>
            </div>
          </template>
        </q-table>
      </q-tab-panel>

      <q-tab-panel
        name="notes"
        class="q-pa-md"
      >
        <!-- Content for Internal Notes tab goes here -->
        <div class="text-grey">
          Internal Notes content will be added here.
        <!-- Add New Note Section -->
        <div class="row items-start q-mb-md">
          <q-input
            v-model="newNoteContent"
            type="textarea"
            outlined
            label="Add an internal note (Ctrl+Enter to submit)..."
            rows="3"
            :disable="addingNote"
            class="col q-mr-sm"
            autogrow
            @keydown.ctrl.enter.prevent="addNote"
          />
          <q-btn
            label="Add Note"
            color="primary"
            @click="addNote"
            :loading="addingNote"
            :disable="!newNoteContent || addingNote"
            class="self-end"
          />
        </div>

        <!-- Search Input for Notes Table -->
        <q-input
          v-model="notesSearchTerm"
          outlined
          placeholder="Search notes..."
          clearable
          class="q-mb-md"
        >
          <template #prepend>
            <q-icon name="search" />
          </template>
        </q-input>

        <!-- Notes Table -->
        <q-table
          :rows="filteredNotes"
          :columns="notesTableColumns"
          row-key="id"
          flat
          bordered
          :rows-per-page-options="[10, 25, 50, 0]"
          :loading="loading"
          no-data-label="No internal notes found for this ticket."
          class="notes-table"
        >
          <template #body="noteProps">
            <q-tr :props="noteProps">
              <q-td
                key="createdBy"
                :props="noteProps"
              >
                {{ noteProps.row.createdBy }}
              </q-td>
              <q-td
                key="content"
                :props="noteProps"
                style="white-space: normal; word-wrap: break-word;"
              >
                <div v-html="renderMarkdown(noteProps.row.content)" />
              </q-td>
              <q-td
                key="createdAt"
                :props="noteProps"
              >
                {{ new Date(noteProps.row.createdAt).toLocaleString() }}
              </q-td>
            </q-tr>
          </template>
          <template #no-data="{ message }">
            <div class="full-width row flex-center text-grey q-gutter-sm q-pa-md">
              <q-icon
                size="2em"
                name="note_alt"
              />
              <span>
                {{ notesSearchTerm ? 'No notes match your search.' : message }}
              </span>
            </div>
          </template>
        </q-table>
      </q-tab-panel>
    </q-tab-panels>
  </q-card-section>

@@ -314,10 +558,11 @@
</template>

<script setup>
import { ref, watch, defineProps, defineEmits } from 'vue';
import { ref, watch, defineProps, defineEmits, computed } from 'vue';
import axios from 'axios';
import { useQuasar } from 'quasar';
import DOMPurify from 'dompurify';
import { marked } from 'marked'; // Import marked

import {usePreferencesStore} from 'stores/preferences.js';


@@ -337,8 +582,63 @@ const $q = useQuasar();
const ticket = ref(null);
const loading = ref(false);
const error = ref(null);
const tab = ref('details'); // Add state for the active tab
const loadingMsgId = ref(null); // Track which MSG attachment is currently loading
const tab = ref('details');
const loadingMsgId = ref(null);
const filesSearchTerm = ref(''); // For filtering the files table
const newNoteContent = ref(''); // For the new note input
const addingNote = ref(false); // Loading state for adding a note
const notesSearchTerm = ref(''); // For filtering the notes table

// --- New state for manual file upload ---
const filesToUpload = ref([]); // Holds files selected via q-file
const uploading = ref(false); // Tracks if an upload process is active
const uploadProgress = ref(0); // Overall progress (0 to 1)
const currentUploadingFile = ref(''); // Name of the file currently being uploaded
const currentFileIndex = ref(0); // Index of the current file being uploaded
const totalFilesToUpload = ref(0); // Total files in the current batch
// --- End new state ---

// --- Table Columns Definition ---
const filesTableColumns = ref([
  { name: 'filename', label: 'Filename', field: 'filename', align: 'left', sortable: true, style: 'min-width: 200px; max-width: 300px; overflow-wrap: break-word; white-space: normal;' },
  { name: 'uploadedBy', label: 'Uploaded By', field: 'uploadedBy', align: 'left', sortable: true },
  { name: 'uploadedAt', label: 'Uploaded At', field: 'uploadedAt', align: 'left', sortable: true, format: val => new Date(val).toLocaleString() },
  { name: 'size', label: 'Size', field: 'size', align: 'right', sortable: true, format: val => formatFileSize(val) },
  { name: 'actions', label: 'Actions', field: 'id', align: 'center', sortable: false }
]);
|
||||
|
||||
const notesTableColumns = ref([
|
||||
{ name: 'createdBy', label: 'Created By', field: 'createdBy', align: 'left', sortable: true, style: 'width: 150px; vertical-align: top;' }, // Adjust width and alignment
|
||||
{ name: 'createdAt', label: 'Created At', field: 'createdAt', align: 'left', sortable: true, format: val => new Date(val).toLocaleString(), style: 'width: 200px; vertical-align: top;' }, // Adjust width and alignment
|
||||
{ name: 'content', label: 'Content', field: 'content', align: 'left', sortable: false, style: 'white-space: normal; word-wrap: break-word; min-width: 300px;' } // Ensure wrapping
|
||||
]);
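The `size` column above formats values with `formatFileSize`, a helper whose definition falls outside the visible hunks of this diff. A minimal sketch of what such a helper typically looks like (the implementation below is an assumption for illustration, not code from the commit):

```javascript
// Hypothetical formatFileSize: the diff references this helper but does not
// define it in the visible hunks; this is one common implementation.
const formatFileSize = (bytes) =>
{
  if (!bytes) return '0 B';
  const units = ['B', 'KB', 'MB', 'GB', 'TB'];
  // Pick the largest unit whose value is >= 1, capped at the last unit.
  const exponent = Math.min(Math.floor(Math.log(bytes) / Math.log(1024)), units.length - 1);
  return `${(bytes / Math.pow(1024, exponent)).toFixed(1)} ${units[exponent]}`;
};

console.log(formatFileSize(1536)); // "1.5 KB"
```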
+
+// --- Computed property for filtered files ---
+const filteredFiles = computed(() =>
+{
+  const searchTerm = filesSearchTerm.value?.toLowerCase() || '';
+  if (!ticket.value?.files) return [];
+  if (!searchTerm) return ticket.value.files;
+
+  return ticket.value.files.filter(file =>
+    file.filename?.toLowerCase().includes(searchTerm) ||
+    file.uploadedBy?.toLowerCase().includes(searchTerm) ||
+    file.description?.toLowerCase().includes(searchTerm) // Optional: include description in search
+  );
+});
+
+// --- Computed property for filtered notes ---
+const filteredNotes = computed(() =>
+{
+  const searchTerm = notesSearchTerm.value?.toLowerCase() || '';
+  if (!ticket.value?.notes) return [];
+  if (!searchTerm) return ticket.value.notes;
+
+  return ticket.value.notes.filter(note =>
+    note.content?.toLowerCase().includes(searchTerm) ||
+    note.createdBy?.toLowerCase().includes(searchTerm)
+  );
+});
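Both computed filters follow the same shape: lowercase the search term, return everything when it is empty, otherwise keep rows where any of a few string fields contain the term. The shared pattern as a standalone sketch (the helper name and sample data are illustrative, not part of the commit):

```javascript
// Illustrative helper, not from the commit: case-insensitive filtering of
// rows by matching a search term against a list of string fields.
const filterRows = (rows, term, fields) =>
{
  const searchTerm = (term || '').toLowerCase();
  if (!rows) return [];
  if (!searchTerm) return rows;
  return rows.filter(row =>
    fields.some(field => row[field]?.toLowerCase().includes(searchTerm))
  );
};

const notes = [
  { content: 'Checked the S3 bucket', createdBy: 'alice' },
  { content: 'Waiting on customer reply', createdBy: 'bob' }
];
console.log(filterRows(notes, 'S3', ['content', 'createdBy']).length); // 1
```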
+
 const sanitiseComment = (comment) =>
 {

@@ -355,6 +655,19 @@ const sanitiseComment = (comment) =>
   });
 };

+// Function to render and sanitize Markdown
+const renderMarkdown = (content) =>
+{
+  if (!content) return '';
+  // Configure marked (optional, e.g., enable GitHub Flavored Markdown)
+  marked.setOptions({
+    gfm: true,
+    breaks: true, // Convert single line breaks to <br>
+  });
+  const rawHtml = marked.parse(content);
+  return DOMPurify.sanitize(rawHtml, { USE_PROFILES: { html: true } }); // Use html profile for common tags
+};
+
 // Function to preview MSG file content
 const previewMsgFile = async(ticketId, attachmentId, filename) =>
 {
@@ -369,8 +682,7 @@ const previewMsgFile = async(ticketId, attachmentId, filename) =>
     message: createEmailPreview(response.data, ticketId, attachmentId),
     html: true,
     style: 'min-width: 70vw; min-height: 60vh;',
-    maximized: $q.screen.lt.md,
-    persistent: true
+    maximized: $q.screen.lt.md
   });
 }
 catch (err)
@@ -457,14 +769,17 @@ const fetchTicketDetails = async(id) =>
   if (!id) return;
   loading.value = true;
   error.value = null;
-  ticket.value = null;
+  // ticket.value = null; // Don't nullify immediately if refreshing
   try
   {
     const response = await axios.get(`/api/mantis/${id}`);
+    // Ensure notes array exists even if empty
+    response.data.notes = response.data.notes || [];
     ticket.value = response.data;

     //Check user preference for comment order
-    if(preferencesStore.values.mantisCommentsOrder === 'newest')
+    if(preferencesStore.values.mantisCommentsOrder === 'newest' && ticket.value.comments)
     {
       ticket.value.comments.reverse();
     }
@@ -477,6 +792,8 @@ const fetchTicketDetails = async(id) =>
       type: 'negative',
       message: error.value
     });
+    // Clear ticket if fetch fails completely
+    if (!ticket.value) ticket.value = null;
   }
   finally
   {
@@ -489,6 +806,16 @@ const resetDialog = () =>
   ticket.value = null;
   loading.value = false;
   error.value = null;
+  filesToUpload.value = []; // Reset selected files on close
+  uploading.value = false;
+  uploadProgress.value = 0;
+  currentUploadingFile.value = '';
+  currentFileIndex.value = 0;
+  totalFilesToUpload.value = 0;
+  filesSearchTerm.value = ''; // Clear search on dialog reset
+  newNoteContent.value = ''; // Clear new note input
+  addingNote.value = false; // Reset note adding state
+  notesSearchTerm.value = ''; // Clear notes search on dialog reset
   // Emit close event if needed, though v-close-popup handles visibility
   emit('close');
 };
@@ -561,6 +888,16 @@ const isImageFile = (filename) =>
   return (/\.(jpg|jpeg|png|gif|bmp|webp)$/i).test(filename);
 };

+const isVideoFile = (filename) =>
+{
+  return (/\.(mp4|webm|ogg|mov|avi|wmv|flv)$/i).test(filename);
+};
+
+const isAudioFile = (filename) =>
+{
+  return (/\.(mp3|wav|ogg|aac|flac|m4a)$/i).test(filename);
+};
+
 const isMsgFile = (filename) =>
 {
   return (/\.(msg)$/i).test(filename);
@@ -581,6 +918,164 @@ const openImageFullscreen = (src, filename) =>
   });
 };

+
+// --- New Manual Upload Logic ---
+const uploadFiles = async() =>
+{
+  if (!filesToUpload.value || filesToUpload.value.length === 0 || !props.ticketId)
+  {
+    return;
+  }
+
+  uploading.value = true;
+  uploadProgress.value = 0;
+  totalFilesToUpload.value = filesToUpload.value.length;
+  let successfulUploads = 0;
+  const errors = [];
+
+  for (let i = 0; i < filesToUpload.value.length; i++)
+  {
+    const file = filesToUpload.value[i];
+    currentFileIndex.value = i;
+    currentUploadingFile.value = file.name;
+    uploadProgress.value = i / totalFilesToUpload.value; // Progress before starting upload
+
+    try
+    {
+      // 1. Get presigned URL
+      console.log(`Requesting presign for: ${file.name}, type: ${file.type}`);
+      const presignResponse = await axios.post(`/api/mantis/${props.ticketId}/files/presign`, {
+        filename: file.name,
+        filetype: file.type || 'application/octet-stream',
+      });
+      const { presignedUrl, fileKey } = presignResponse.data;
+      console.log(`Got presign URL: ${presignedUrl}, fileKey: ${fileKey}`);
+
+      // 2. Upload to S3 using PUT request
+      console.log(`Uploading ${file.name} to S3...`);
+      await axios.put(presignedUrl, file, {
+        headers: {
+          'Content-Type': file.type || 'application/octet-stream',
+        },
+        onUploadProgress: (progressEvent) =>
+        {
+          const percentCompleted = progressEvent.loaded / progressEvent.total;
+          // Update overall progress based on current file's progress
+          uploadProgress.value = (i + percentCompleted) / totalFilesToUpload.value;
+        },
+        withCredentials: false // Ensure cookies are not sent with the request
+      });
+      console.log(`Successfully uploaded ${file.name} to S3.`);
+
+      // 3. Confirm upload with backend
+      console.log(`Confirming upload for ${file.name} with fileKey: ${fileKey}`);
+      const confirmResponse = await axios.post(`/api/mantis/${props.ticketId}/files/confirm`, {
+        fileKey: fileKey,
+        filename: file.name,
+        mimeType: file.type || 'application/octet-stream',
+        size: file.size,
+      });
+      const newMantisFile = confirmResponse.data;
+      console.log(`Upload confirmed for ${file.name}.`);
+
+      // 4. Add confirmed file to local state
+      if (ticket.value && ticket.value.files)
+      {
+        ticket.value.files.push(newMantisFile);
+      }
+      else if (ticket.value)
+      {
+        ticket.value.files = [newMantisFile];
+      }
+      successfulUploads++;
+    }
+    catch (err)
+    {
+      console.error(`Error processing file ${file.name}:`, err);
+      const errorMessage = err.response?.data?.error || err.message || 'Unknown error';
+      errors.push(`Failed to upload ${file.name}: ${errorMessage}`);
+      // Stop on first error? Or continue? Let's continue for now.
+    }
+  }
+
+  // Finalize
+  uploading.value = false;
+  currentUploadingFile.value = '';
+  uploadProgress.value = 1; // Show complete
+  filesToUpload.value = []; // Clear selection after upload attempt
+
+  // Notify user
+  if (errors.length === 0)
+  {
+    $q.notify({
+      type: 'positive',
+      icon: 'cloud_done',
+      message: `${successfulUploads} file(s) uploaded successfully.`
+    });
+  }
+  else
+  {
+    $q.notify({
+      type: 'negative',
+      message: `Upload finished with ${errors.length} error(s).`,
+      caption: errors.join('<br>'),
+      html: true,
+      multiLine: true,
+      timeout: 10000 // Longer timeout for errors
+    });
+  }
+  // Short delay before resetting progress bar
+  setTimeout(() =>
+  {
+    uploadProgress.value = 0;
+  }, 1500);
+};
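The running progress figure used in the upload loop is the count of finished files plus the in-flight file's own completion fraction, divided by the batch size, so the bar advances smoothly across the whole batch instead of jumping once per file. The arithmetic in isolation (the helper name is illustrative, not from the commit):

```javascript
// Illustrative helper, not from the commit: overall batch progress from the
// index of the file currently uploading and that file's completion fraction.
const overallProgress = (currentIndex, fileFraction, totalFiles) =>
{
  if (!totalFiles) return 0; // Avoid dividing by zero on an empty batch
  return (currentIndex + fileFraction) / totalFiles;
};

// Second file of four, halfway through its own upload:
console.log(overallProgress(1, 0.5, 4)); // 0.375
```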
+// --- End New Manual Upload Logic ---
+
+// --- New Internal Note Logic ---
+const addNote = async() =>
+{
+  if (!newNoteContent.value || !props.ticketId) return;
+
+  addingNote.value = true;
+  try
+  {
+    const response = await axios.post(`/api/mantis/${props.ticketId}/notes`, {
+      content: newNoteContent.value,
+    });
+
+    // Add the new note to the beginning of the local array
+    if (ticket.value && ticket.value.notes)
+    {
+      ticket.value.notes.unshift(response.data); // Add to top for immediate visibility
+    }
+    else if (ticket.value)
+    {
+      ticket.value.notes = [response.data];
+    }
+
+    newNoteContent.value = ''; // Clear the input field
+    $q.notify({
+      type: 'positive',
+      message: 'Internal note added successfully.'
+    });
+  }
+  catch (err)
+  {
+    console.error('Error adding internal note:', err);
+    $q.notify({
+      type: 'negative',
+      message: `Failed to add note: ${err.response?.data?.error || err.message}`
+    });
+  }
+  finally
+  {
+    addingNote.value = false;
+  }
+};
+// --- End Internal Note Logic ---
+
 </script>

 <style lang="scss" scoped>
@@ -595,4 +1090,45 @@ const openImageFullscreen = (src, filename) =>
   max-height: 300px; /* Limit height if descriptions are long */
   overflow-y: auto;
 }
+
+.files-table {
+  /* Optional: Adjust table appearance */
+  thead tr th {
+    position: sticky;
+    top: 0;
+    z-index: 1;
+    background-color: $grey-2; /* Adjust background for sticky header */
+  }
+  /* Ensure table cells don't collapse too much */
+  td {
+    white-space: nowrap; /* Prevent text wrapping by default */
+  }
+  /* Allow wrapping specifically for filename */
+  td[key="filename"], th[name="filename"] {
+    white-space: normal;
+    word-wrap: break-word;
+  }
+}
+
+.notes-table {
+  /* Optional: Adjust table appearance */
+  thead tr th {
+    position: sticky;
+    top: 0;
+    z-index: 1;
+    background-color: $grey-2; /* Adjust background for sticky header */
+  }
+
+  /* Style for rendered markdown content if needed */
+  td[key="content"] div {
+    /* Example: Add some padding or adjust line height */
+    line-height: 1.5;
+  }
+  td[key="content"] div :first-child {
+    margin-top: 0; /* Remove potential top margin from first element (e.g., <p>) */
+  }
+  td[key="content"] div :last-child {
+    margin-bottom: 0; /* Remove potential bottom margin from last element */
+  }
+}
 </style>
@@ -207,7 +207,7 @@
   v-model="isChatVisible"
   style="z-index: 100000000"
 >
-  <q-card style="width: max(400px, 25%); height: 600px; max-height: 80vh;">
+  <q-card style=" height: 100%; max-width: 100%; width: min(750px, max(400px, 50%))">
   <q-bar class="bg-primary text-white">
     <div>Chat</div>
     <q-space />
@@ -135,6 +135,24 @@ const settings = ref({
       name: 'MySQL Database',
       key: 'MYSQL_DATABASE'
     }
   ],
+  S3: [
+    {
+      name: 'S3 Endpoint [Internal]',
+      key: 'S3_ENDPOINT'
+    },
+    {
+      name: 'S3 Bucket Name',
+      key: 'S3_BUCKET_NAME'
+    },
+    {
+      name: 'S3 Access Key ID',
+      key: 'S3_ACCESS_KEY_ID'
+    },
+    {
+      name: 'S3 Secret Access Key',
+      key: 'S3_SECRET_ACCESS_KEY'
+    }
+  ]
 });