WIP: Umami service
CONTRIBUTING.md
```diff
@@ -12,9 +12,6 @@ This is a little list of what you can do to help the project:
 
 - [🧑💻 Develop your own ideas](#developer-contribution)
 - [🌐 Translate the project](#translation)
-- [📄 Help sorting out the issues](#help-sorting-out-the-issues)
-- [🎯 Test Pull Requests](#test-pull-requests)
-- [✒️ Help with the documentation](#help-with-the-documentation)
 
 ## 👋 Introduction
 
```
```diff
@@ -60,6 +57,7 @@ You need to have [Docker Engine](https://docs.docker.com/engine/install/) installed
 - **Languages**: Node.js / Javascript / Typescript
 - **Framework JS/TS**: Svelte / SvelteKit
 - **Database ORM**: Prisma.io
+- **Docker Engine**
 
 ### Database migrations
 
```
@@ -83,59 +81,159 @@ You can add any open-source and self-hostable software (service/application) to

## Backend

There are 5 steps you should take on the backend side:

1. Create the Prisma / database schema for the new service.
2. Add supported versions of the service.
3. Update global functions.
4. Create API endpoints.
5. Define automatically generated variables.

> I will use [Umami](https://umami.is/) as an example service.

### Create the Prisma / database schema for the new service

You only need to do this if you store passwords or any persistent configuration. Most services require it, but there are exceptions, like NocoDB.

Update the Prisma schema in [prisma/schema.prisma](prisma/schema.prisma):

- Add a new model with the new service name.
- Make a relationship with the `Service` model.
- In the `Service` model, the name of the new field should be lowercase.
- If the service needs a database, define a `publicPort` field so that its database can be made public. An example field name in the case of PostgreSQL is `postgresqlPublicPort`. It should be an optional field.

Once you are finished with the Prisma schema, update the database schema with the `pnpm db:push` command.

> You must restart the running development environment to be able to use the new model.

> If you use VSCode, you probably need to restart the `Typescript Language Server` to get the new types loaded in the running VSCode.

### Add supported versions

Supported versions are hardcoded into Coolify (for now).

You need to update the `supportedServiceTypesAndVersions` function at [src/lib/components/common.ts](src/lib/components/common.ts). Example JSON:

```js
{
	// Name used to identify the service internally
	name: 'umami',
	// Fancier name to show to the user
	fancyName: 'Umami',
	// Docker base image for the service
	baseImage: 'ghcr.io/mikecao/umami',
	// Optional: if there is any dependent image, you should list it here
	images: ['postgres:12-alpine'],
	// Usable tags
	versions: ['postgresql-latest'],
	// Which tag is recommended
	recommendedVersion: 'postgresql-latest',
	// Application's default port, Umami listens on 3000
	ports: {
		main: 3000
	}
}
```
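
The entry above is only metadata: when the service is started, the selected version becomes the Docker tag of the base image. A minimal sketch, assuming `getServiceImage` (exported from `$lib/database` and used in `start.json.ts` below) resolves the `baseImage` for a service type:

```js
import { getServiceImage } from '$lib/database';

// Sketch only: resolve the base image for the service type and tag it with the
// version the user selected in the UI.
const type = 'umami';
const version = 'postgresql-latest';
const image = getServiceImage(type); // assumed to resolve to 'ghcr.io/mikecao/umami'
const dockerImage = `${image}:${version}`; // 'ghcr.io/mikecao/umami:postgresql-latest'
```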

### Update global functions

1. Add the new service to the `include` variable in [src/lib/database/services.ts](src/lib/database/services.ts), so that it is included in all the database queries where it is required.

```js
const include: Prisma.ServiceInclude = {
	destinationDocker: true,
	persistentStorage: true,
	serviceSecret: true,
	minio: true,
	plausibleAnalytics: true,
	vscodeserver: true,
	wordpress: true,
	ghost: true,
	meiliSearch: true,
	umami: true // This line!
};
```

2. Update the database update query with the new service type in the `configureServiceType` function in [src/lib/database/services.ts](src/lib/database/services.ts). This function defines the automatically generated variables (passwords, users, etc.) and their encryption process (where applicable).

```js
[...]
else if (type === 'umami') {
	const umamiAdminPassword = encrypt(generatePassword());
	const postgresqlUser = cuid();
	const postgresqlPassword = encrypt(generatePassword());
	const postgresqlDatabase = 'umami';
	const hashSalt = encrypt(generatePassword(64));
	await prisma.service.update({
		where: { id },
		data: {
			type,
			umami: {
				create: {
					umamiAdminPassword,
					postgresqlDatabase,
					postgresqlPassword,
					postgresqlUser,
					hashSalt
				}
			}
		}
	});
}
```

3. Add the decryption process for configurations and passwords to the `getService` function in [src/lib/database/services.ts](src/lib/database/services.ts):

```js
if (body.umami?.postgresqlPassword)
	body.umami.postgresqlPassword = decrypt(body.umami.postgresqlPassword);
if (body.umami?.umamiAdminPassword)
	body.umami.umamiAdminPassword = decrypt(body.umami.umamiAdminPassword);
if (body.umami?.hashSalt) body.umami.hashSalt = decrypt(body.umami.hashSalt);
```

4. Add the service deletion query to the `removeService` function in [src/lib/database/services.ts](src/lib/database/services.ts).
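
For Umami this is a single `deleteMany` call on the related model, placed alongside the cleanup queries of the other services. Roughly:

```js
// Inside removeService({ id }) in src/lib/database/services.ts,
// next to the deleteMany calls for the other services:
await prisma.umami.deleteMany({ where: { serviceId: id } });
```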

### Create API endpoints

You need to add a new folder under [src/routes/services/[id]](src/routes/services/[id]) with the lowercase name of the service. You need 3 default files in that folder.

#### `index.json.ts`

It has a POST endpoint that updates the service details in Coolify's database, such as the name, the URL and other configuration like passwords. It should look something like this:

```js
import { getUserDetails } from '$lib/common';
import * as db from '$lib/database';
import { ErrorHandler } from '$lib/database';
import type { RequestHandler } from '@sveltejs/kit';

export const post: RequestHandler = async (event) => {
	const { status, body } = await getUserDetails(event);
	if (status === 401) return { status, body };

	const { id } = event.params;

	let { name, fqdn } = await event.request.json();
	if (fqdn) fqdn = fqdn.toLowerCase();

	try {
		await db.updateService({ id, fqdn, name });
		return { status: 201 };
	} catch (error) {
		return ErrorHandler(error);
	}
};
```

If necessary, you can create your own database update function, specifically for the new service.
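
A hypothetical sketch of such a helper in [src/lib/database/services.ts](src/lib/database/services.ts); the function name and the extra field are illustrative only and not part of the current codebase:

```js
// Hypothetical helper: updates the base Service fields plus one Umami-specific field.
export async function updateUmamiService({
	id,
	fqdn,
	name,
	umamiAdminPassword
}: {
	id: string;
	fqdn: string;
	name: string;
	umamiAdminPassword: string;
}): Promise<void> {
	await prisma.service.update({
		where: { id },
		data: {
			fqdn,
			name,
			umami: { update: { umamiAdminPassword: encrypt(umamiAdminPassword) } }
		}
	});
}
```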

#### `start.json.ts`

It has a POST endpoint that sets up all the required secrets, the persistent volumes and the `docker-compose.yaml` file, and then sends a request to the specified Docker Engine.

You could also define an `HTTP` or `TCP` proxy for every other port that should be proxied to your server. (See the `startHttpProxy` and `startTcpProxy` functions in [src/lib/haproxy/index.ts](src/lib/haproxy/index.ts).)
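
A trimmed sketch of the endpoint, using the same helpers as the other services (the full Umami implementation is included further below; error handling, proxy setup and most of the compose definition are omitted here):

```js
import { asyncExecShell, createDirectories, getEngine, getUserDetails } from '$lib/common';
import * as db from '$lib/database';
import { ErrorHandler } from '$lib/database';
import { promises as fs } from 'fs';
import yaml from 'js-yaml';
import type { RequestHandler } from '@sveltejs/kit';
import type { ComposeFile } from '$lib/types/composeFile';

export const post: RequestHandler = async (event) => {
	const { teamId, status, body } = await getUserDetails(event);
	if (status === 401) return { status, body };
	const { id } = event.params;
	try {
		// Load the service, including the generated credentials and secrets.
		const service = await db.getService({ id, teamId });
		const { type, destinationDocker } = service;
		const host = getEngine(destinationDocker.engine);
		const { workdir } = await createDirectories({ repository: type, buildId: id });

		// The compose file must contain every required environment variable and
		// every volume needed to persist data across restarts.
		const composeFile: ComposeFile = {
			version: '3.8',
			services: { /* the umami container and its postgresql companion */ },
			networks: { [destinationDocker.network]: { external: true } },
			volumes: { /* named volumes referenced by the services above */ }
		};
		const composeFileDestination = `${workdir}/docker-compose.yaml`;
		await fs.writeFile(composeFileDestination, yaml.dump(composeFile));

		await asyncExecShell(`DOCKER_HOST=${host} docker compose -f ${composeFileDestination} pull`);
		await asyncExecShell(`DOCKER_HOST=${host} docker compose -f ${composeFileDestination} up -d`);
		return { status: 200 };
	} catch (error) {
		return ErrorHandler(error);
	}
};
```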

#### `stop.json.ts`

It has a POST endpoint that stops the service and all of its dependent containers (TCP/HTTP proxies). If a `publicPort` is specified, it also needs to be cleaned up from the database.
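
A condensed sketch of the Umami version shown further below:

```js
import { getUserDetails, removeDestinationDocker } from '$lib/common';
import * as db from '$lib/database';
import { ErrorHandler } from '$lib/database';
import { checkContainer } from '$lib/haproxy';
import type { RequestHandler } from '@sveltejs/kit';

export const post: RequestHandler = async (event) => {
	const { teamId, status, body } = await getUserDetails(event);
	if (status === 401) return { status, body };
	const { id } = event.params;
	try {
		const { destinationDocker } = await db.getService({ id, teamId });
		const engine = destinationDocker.engine;
		// Stop the main container and its PostgreSQL companion.
		for (const containerId of [id, `${id}-postgresql`]) {
			const found = await checkContainer(engine, containerId);
			if (found) await removeDestinationDocker({ id: containerId, engine });
		}
		return { status: 200 };
	} catch (error) {
		return ErrorHandler(error);
	}
};
```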

## Frontend

prisma/schema.prisma

```diff
@@ -301,6 +301,7 @@ model Service {
 	serviceSecret     ServiceSecret[]
 	meiliSearch       MeiliSearch?
 	persistentStorage ServicePersistentStorage[]
+	umami             Umami?
 }
 
 model PlausibleAnalytics {
```
```diff
@@ -385,3 +386,17 @@ model MeiliSearch {
 	createdAt DateTime @default(now())
 	updatedAt DateTime @updatedAt
 }
+
+model Umami {
+	id                   String   @id @default(cuid())
+	serviceId            String   @unique
+	postgresqlUser       String
+	postgresqlPassword   String
+	postgresqlDatabase   String
+	postgresqlPublicPort Int?
+	umamiAdminPassword   String
+	hashSalt             String
+	service              Service  @relation(fields: [serviceId], references: [id])
+	createdAt            DateTime @default(now())
+	updatedAt            DateTime @updatedAt
+}
```
```diff
@@ -26,7 +26,7 @@ try {
 		initialScope: {
 			tags: {
 				appId: process.env['COOLIFY_APP_ID'],
-				'os.arch': os.arch(),
+				'os.arch': getOsArch(),
 				'os.platform': os.platform(),
 				'os.release': os.release()
 			}
```
```diff
@@ -175,3 +175,7 @@ export function generateTimestamp(): string {
 export function getDomain(domain: string): string {
 	return domain?.replace('https://', '').replace('http://', '');
 }
+
+export function getOsArch() {
+	return os.arch();
+}
```
src/lib/components/common.ts

```diff
@@ -180,5 +180,16 @@ export const supportedServiceTypesAndVersions = [
 		ports: {
 			main: 7700
 		}
+	},
+	{
+		name: 'umami',
+		fancyName: 'Umami',
+		baseImage: 'ghcr.io/mikecao/umami',
+		images: ['postgres:12-alpine'],
+		versions: ['postgresql-latest'],
+		recommendedVersion: 'postgresql-latest',
+		ports: {
+			main: 3000
+		}
 	}
 ];
```
src/lib/database/services.ts

```diff
@@ -1,20 +1,24 @@
 import { decrypt, encrypt } from '$lib/crypto';
-import type { Minio, Service } from '@prisma/client';
+import type { Minio, Prisma, Service } from '@prisma/client';
 import cuid from 'cuid';
 import { generatePassword } from '.';
 import { prisma } from './common';
 
+const include: Prisma.ServiceInclude = {
+	destinationDocker: true,
+	persistentStorage: true,
+	serviceSecret: true,
+	minio: true,
+	plausibleAnalytics: true,
+	vscodeserver: true,
+	wordpress: true,
+	ghost: true,
+	meiliSearch: true,
+	umami: true
+};
 export async function listServicesWithIncludes() {
 	return await prisma.service.findMany({
-		include: {
-			destinationDocker: true,
-			minio: true,
-			plausibleAnalytics: true,
-			vscodeserver: true,
-			wordpress: true,
-			ghost: true,
-			meiliSearch: true
-		},
+		include,
 		orderBy: { createdAt: 'desc' }
 	});
 }
```
```diff
@@ -44,35 +48,21 @@ export async function getService({ id, teamId }: { id: string; teamId: string })
 	if (teamId === '0') {
 		body = await prisma.service.findFirst({
 			where: { id },
-			include: {
-				destinationDocker: true,
-				plausibleAnalytics: true,
-				minio: true,
-				vscodeserver: true,
-				wordpress: true,
-				ghost: true,
-				serviceSecret: true,
-				meiliSearch: true,
-				persistentStorage: true
-			}
+			include
 		});
 	} else {
 		body = await prisma.service.findFirst({
 			where: { id, teams: { some: { id: teamId } } },
-			include: {
-				destinationDocker: true,
-				plausibleAnalytics: true,
-				minio: true,
-				vscodeserver: true,
-				wordpress: true,
-				ghost: true,
-				serviceSecret: true,
-				meiliSearch: true,
-				persistentStorage: true
-			}
+			include
 		});
 	}
 
+	if (body?.serviceSecret.length > 0) {
+		body.serviceSecret = body.serviceSecret.map((s) => {
+			s.value = decrypt(s.value);
+			return s;
+		});
+	}
 	if (body.plausibleAnalytics?.postgresqlPassword)
 		body.plausibleAnalytics.postgresqlPassword = decrypt(
 			body.plausibleAnalytics.postgresqlPassword
```
```diff
@@ -99,15 +89,14 @@ export async function getService({ id, teamId }: { id: string; teamId: string })
 	if (body.meiliSearch?.masterKey) body.meiliSearch.masterKey = decrypt(body.meiliSearch.masterKey);
 
-	if (body?.serviceSecret.length > 0) {
-		body.serviceSecret = body.serviceSecret.map((s) => {
-			s.value = decrypt(s.value);
-			return s;
-		});
-	}
-	if (body.wordpress?.ftpPassword) {
-		body.wordpress.ftpPassword = decrypt(body.wordpress.ftpPassword);
-	}
+	if (body.wordpress?.ftpPassword) body.wordpress.ftpPassword = decrypt(body.wordpress.ftpPassword);
+
+	if (body.umami?.postgresqlPassword)
+		body.umami.postgresqlPassword = decrypt(body.umami.postgresqlPassword);
+	if (body.umami?.umamiAdminPassword)
+		body.umami.umamiAdminPassword = decrypt(body.umami.umamiAdminPassword);
+	if (body.umami?.hashSalt) body.umami.hashSalt = decrypt(body.umami.hashSalt);
 
 	const settings = await prisma.setting.findFirst();
 
 	return { ...body, settings };
```
```diff
@@ -233,6 +222,27 @@ export async function configureServiceType({
 				meiliSearch: { create: { masterKey } }
 			}
 		});
+	} else if (type === 'umami') {
+		const umamiAdminPassword = encrypt(generatePassword());
+		const postgresqlUser = cuid();
+		const postgresqlPassword = encrypt(generatePassword());
+		const postgresqlDatabase = 'umami';
+		const hashSalt = encrypt(generatePassword(64));
+		await prisma.service.update({
+			where: { id },
+			data: {
+				type,
+				umami: {
+					create: {
+						umamiAdminPassword,
+						postgresqlDatabase,
+						postgresqlPassword,
+						postgresqlUser,
+						hashSalt
+					}
+				}
+			}
+		});
 	}
 }
```
@@ -389,6 +399,7 @@ export async function removeService({ id }: { id: string }): Promise<void> {
|
|||||||
await prisma.servicePersistentStorage.deleteMany({ where: { serviceId: id } });
|
await prisma.servicePersistentStorage.deleteMany({ where: { serviceId: id } });
|
||||||
await prisma.meiliSearch.deleteMany({ where: { serviceId: id } });
|
await prisma.meiliSearch.deleteMany({ where: { serviceId: id } });
|
||||||
await prisma.ghost.deleteMany({ where: { serviceId: id } });
|
await prisma.ghost.deleteMany({ where: { serviceId: id } });
|
||||||
|
await prisma.umami.deleteMany({ where: { serviceId: id } });
|
||||||
await prisma.plausibleAnalytics.deleteMany({ where: { serviceId: id } });
|
await prisma.plausibleAnalytics.deleteMany({ where: { serviceId: id } });
|
||||||
await prisma.minio.deleteMany({ where: { serviceId: id } });
|
await prisma.minio.deleteMany({ where: { serviceId: id } });
|
||||||
await prisma.vscodeserver.deleteMany({ where: { serviceId: id } });
|
await prisma.vscodeserver.deleteMany({ where: { serviceId: id } });
|
||||||
src/routes/services/[id]/umami/index.json.ts (new file)

@@ -0,0 +1,21 @@
```js
import { getUserDetails } from '$lib/common';
import * as db from '$lib/database';
import { ErrorHandler } from '$lib/database';
import type { RequestHandler } from '@sveltejs/kit';

export const post: RequestHandler = async (event) => {
	const { status, body } = await getUserDetails(event);
	if (status === 401) return { status, body };

	const { id } = event.params;

	let { name, fqdn } = await event.request.json();
	if (fqdn) fqdn = fqdn.toLowerCase();

	try {
		await db.updateService({ id, fqdn, name });
		return { status: 201 };
	} catch (error) {
		return ErrorHandler(error);
	}
};
```
src/routes/services/[id]/umami/start.json.ts (new file)

@@ -0,0 +1,210 @@
```js
import { asyncExecShell, createDirectories, getEngine, getUserDetails } from '$lib/common';
import * as db from '$lib/database';
import { promises as fs } from 'fs';
import yaml from 'js-yaml';
import type { RequestHandler } from '@sveltejs/kit';
import { ErrorHandler, getFreePort, getServiceImage } from '$lib/database';
import { makeLabelForServices } from '$lib/buildPacks/common';
import type { ComposeFile } from '$lib/types/composeFile';
import type { Service, DestinationDocker, ServiceSecret, Prisma } from '@prisma/client';

export const post: RequestHandler = async (event) => {
	const { teamId, status, body } = await getUserDetails(event);
	if (status === 401) return { status, body };

	const { id } = event.params;

	try {
		const service: Service & Prisma.ServiceInclude & { destinationDocker: DestinationDocker } =
			await db.getService({ id, teamId });
		const {
			type,
			version,
			destinationDockerId,
			destinationDocker,
			serviceSecret,
			umami: {
				umamiAdminPassword,
				postgresqlUser,
				postgresqlPassword,
				postgresqlDatabase,
				hashSalt
			}
		} = service;
		const network = destinationDockerId && destinationDocker.network;
		const host = getEngine(destinationDocker.engine);

		const { workdir } = await createDirectories({ repository: type, buildId: id });
		const image = getServiceImage(type);

		const config = {
			umami: {
				image: `${image}:${version}`,
				environmentVariables: {
					DATABASE_URL: `postgresql://${postgresqlUser}:${postgresqlPassword}@${id}-postgresql:5432/${postgresqlDatabase}`,
					DATABASE_TYPE: 'postgresql',
					HASH_SALT: hashSalt
				}
			},
			postgresql: {
				image: 'postgres:12-alpine',
				volume: `${id}-postgresql-data:/var/lib/postgresql/data`,
				environmentVariables: {
					POSTGRES_USER: postgresqlUser,
					POSTGRES_PASSWORD: postgresqlPassword,
					POSTGRES_DB: postgresqlDatabase
				}
			}
		};
		if (serviceSecret.length > 0) {
			serviceSecret.forEach((secret) => {
				config.umami.environmentVariables[secret.name] = secret.value;
			});
		}
		console.log(umamiAdminPassword);
		const initDbSQL = `
			drop table if exists event;
			drop table if exists pageview;
			drop table if exists session;
			drop table if exists website;
			drop table if exists account;

			create table account (
				user_id serial primary key,
				username varchar(255) unique not null,
				password varchar(60) not null,
				is_admin bool not null default false,
				created_at timestamp with time zone default current_timestamp,
				updated_at timestamp with time zone default current_timestamp
			);

			create table website (
				website_id serial primary key,
				website_uuid uuid unique not null,
				user_id int not null references account(user_id) on delete cascade,
				name varchar(100) not null,
				domain varchar(500),
				share_id varchar(64) unique,
				created_at timestamp with time zone default current_timestamp
			);

			create table session (
				session_id serial primary key,
				session_uuid uuid unique not null,
				website_id int not null references website(website_id) on delete cascade,
				created_at timestamp with time zone default current_timestamp,
				hostname varchar(100),
				browser varchar(20),
				os varchar(20),
				device varchar(20),
				screen varchar(11),
				language varchar(35),
				country char(2)
			);

			create table pageview (
				view_id serial primary key,
				website_id int not null references website(website_id) on delete cascade,
				session_id int not null references session(session_id) on delete cascade,
				created_at timestamp with time zone default current_timestamp,
				url varchar(500) not null,
				referrer varchar(500)
			);

			create table event (
				event_id serial primary key,
				website_id int not null references website(website_id) on delete cascade,
				session_id int not null references session(session_id) on delete cascade,
				created_at timestamp with time zone default current_timestamp,
				url varchar(500) not null,
				event_type varchar(50) not null,
				event_value varchar(50) not null
			);

			create index website_user_id_idx on website(user_id);

			create index session_created_at_idx on session(created_at);
			create index session_website_id_idx on session(website_id);

			create index pageview_created_at_idx on pageview(created_at);
			create index pageview_website_id_idx on pageview(website_id);
			create index pageview_session_id_idx on pageview(session_id);
			create index pageview_website_id_created_at_idx on pageview(website_id, created_at);
			create index pageview_website_id_session_id_created_at_idx on pageview(website_id, session_id, created_at);

			create index event_created_at_idx on event(created_at);
			create index event_website_id_idx on event(website_id);
			create index event_session_id_idx on event(session_id);

			insert into account (username, password, is_admin) values ('admin', '$2b$10$BUli0c.muyCW1ErNJc3jL.vFRFtFJWrT8/GcR4A.sUdCznaXiqFXa', true);`;
		await fs.writeFile(`${workdir}/schema.postgresql.sql`, initDbSQL);
		const Dockerfile = `
FROM ${config.postgresql.image}
COPY ./schema.postgresql.sql /docker-entrypoint-initdb.d/schema.postgresql.sql`;
		await fs.writeFile(`${workdir}/Dockerfile`, Dockerfile);
		const composeFile: ComposeFile = {
			version: '3.8',
			services: {
				[id]: {
					container_name: id,
					image: config.umami.image,
					environment: config.umami.environmentVariables,
					networks: [network],
					volumes: [],
					restart: 'always',
					labels: makeLabelForServices('umami'),
					deploy: {
						restart_policy: {
							condition: 'on-failure',
							delay: '5s',
							max_attempts: 3,
							window: '120s'
						}
					},
					depends_on: [`${id}-postgresql`]
				},
				[`${id}-postgresql`]: {
					build: workdir,
					container_name: `${id}-postgresql`,
					environment: config.postgresql.environmentVariables,
					networks: [network],
					volumes: [config.postgresql.volume],
					restart: 'always',
					deploy: {
						restart_policy: {
							condition: 'on-failure',
							delay: '5s',
							max_attempts: 3,
							window: '120s'
						}
					}
				}
			},
			networks: {
				[network]: {
					external: true
				}
			},
			volumes: {
				[config.postgresql.volume.split(':')[0]]: {
					name: config.postgresql.volume.split(':')[0]
				}
			}
		};
		const composeFileDestination = `${workdir}/docker-compose.yaml`;
		await fs.writeFile(composeFileDestination, yaml.dump(composeFile));

		try {
			await asyncExecShell(`DOCKER_HOST=${host} docker compose -f ${composeFileDestination} pull`);
			await asyncExecShell(`DOCKER_HOST=${host} docker compose -f ${composeFileDestination} up -d`);
			return {
				status: 200
			};
		} catch (error) {
			console.log(error);
			return ErrorHandler(error);
		}
	} catch (error) {
		return ErrorHandler(error);
	}
};
```
src/routes/services/[id]/umami/stop.json.ts (new file)

@@ -0,0 +1,42 @@
```js
import { getUserDetails, removeDestinationDocker } from '$lib/common';
import * as db from '$lib/database';
import { ErrorHandler } from '$lib/database';
import { checkContainer, stopTcpHttpProxy } from '$lib/haproxy';
import type { RequestHandler } from '@sveltejs/kit';

export const post: RequestHandler = async (event) => {
	const { teamId, status, body } = await getUserDetails(event);
	if (status === 401) return { status, body };

	const { id } = event.params;

	try {
		const service = await db.getService({ id, teamId });
		const { destinationDockerId, destinationDocker } = service;
		if (destinationDockerId) {
			const engine = destinationDocker.engine;

			try {
				const found = await checkContainer(engine, id);
				if (found) {
					await removeDestinationDocker({ id, engine });
				}
			} catch (error) {
				console.error(error);
			}
			try {
				const found = await checkContainer(engine, `${id}-postgresql`);
				if (found) {
					await removeDestinationDocker({ id: `${id}-postgresql`, engine });
				}
			} catch (error) {
				console.error(error);
			}
		}
		return {
			status: 200
		};
	} catch (error) {
		return ErrorHandler(error);
	}
};
```