mirror of https://github.com/logseq/logseq

refactor: no need for the backend jar and pg

parent f0c0e44628
commit 552ec6d1fa

83 README.md

@@ -62,51 +62,12 @@ The following is for developers and designers who want to build and run Logseq locally.
## Set up development environment

If you are on Windows, use the [Windows setup](#windows-setup) below.

### 1. Requirements

- [Node.js](https://nodejs.org/en/download/) & [Yarn](https://classic.yarnpkg.com/en/docs/install/)
- [Java & Clojure](https://clojure.org/guides/getting_started)
- [PostgreSQL](https://www.postgresql.org/download/)

### 2. Create a GitHub app

Follow the guide at <https://docs.github.com/en/free-pro-team@latest/developers/apps/creating-a-github-app>, where the user authorization "Callback URL" should be `http://localhost:3000/auth/github`.

Remember to download the `private-key.pem`, which will be used in the next step. Also take note of your `App ID`, `Client ID`, and your newly generated `Client Secret` for use in step 4.

![Screenshot 2020-11-27 22-22-39 +0800](https://user-images.githubusercontent.com/479169/100460276-e0bad100-3101-11eb-8fed-1f7c85824b62.png)

**Add contents permission**:

![Screenshot 2020-11-27 22-22-57 +0800](https://user-images.githubusercontent.com/479169/100460271-def10d80-3101-11eb-91bb-f2339a52d4f8.png)

### 3. Set up PostgreSQL

Make sure you have PostgreSQL running. You can check whether it's running with `pg_ctl -D /usr/local/var/postgres status` and start it with `pg_ctl -D /usr/local/var/postgres start`. You'll also need to create a Logseq database in PostgreSQL; do that with `createdb logseq`.

### 4. Add environment variables
``` bash
export ENVIRONMENT="dev"
export JWT_SECRET="xxxxxxxxxxxxxxxxxxxx"
export COOKIE_SECRET="xxxxxxxxxxxxxxxxxxxx"
export DATABASE_URL="postgres://localhost:5432/logseq"
export GITHUB_APP2_NAME="logseq-test-your-username-app"
export GITHUB_APP2_ID="your id"
export GITHUB_APP2_KEY="xxxxxxxxxxxxxxxxxxxx" # Your GitHub App's Client ID
export GITHUB_APP2_SECRET="xxxxxxxxxxxxxxxxxxxx"
# Replace your-code-directory and your-app.private-key.pem with yours
export GITHUB_APP_PEM="/your-code-directory/your-app.private-key.pem"
export LOG_PATH="/tmp/logseq"
export PG_USERNAME="xxx"
export PG_PASSWORD="xxx"
```
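As a quick sanity check on the `DATABASE_URL` value above (a minimal sketch, not part of the Logseq codebase), Node's built-in WHATWG `URL` class can split a `postgres://` connection URI into its parts:

```javascript
// Hypothetical check of the DATABASE_URL format used above.
// Node's URL parser accepts non-http schemes such as postgres://.
const dbUrl = new URL("postgres://localhost:5432/logseq");

console.log(dbUrl.protocol);          // "postgres:"
console.log(dbUrl.hostname);          // "localhost"
console.log(dbUrl.port);              // "5432"
console.log(dbUrl.pathname.slice(1)); // database name: "logseq"
```

If your PostgreSQL server listens on a different host or port, adjust the URI accordingly.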

### 5. Compile to JavaScript

``` bash
git clone https://github.com/logseq/logseq
```

@@ -114,45 +75,9 @@

``` bash
yarn
yarn watch
```

### 6. Start the Clojure server

1. Download jar

   Go to <https://github.com/logseq/logseq/releases>, download the `logseq.jar`, and put it in the `logseq` directory.

2. Run jar

   ``` bash
   java -Duser.timezone=UTC -jar logseq.jar
   ```

### 7. Open the browser

Open <http://localhost:3000>.
## Windows setup

### 1. Required software

Install Clojure through scoop-clojure: <https://github.com/littleli/scoop-clojure>. You can also install [Node.js](https://nodejs.org/en/), [Yarn](https://yarnpkg.com/), and [PostgreSQL](https://www.postgresql.org/download/) through scoop if you want to.

### 2. Create a GitHub app

Follow [Step 2](#2-create-a-github-app) above if you want Logseq to connect to GitHub; if not, skip this section. The `GITHUB_APP_PEM` variable in `run-windows.bat` needs to be set to the correct path for your system.

### 3. Set up PostgreSQL

Make sure you have PostgreSQL running. You can check whether it's running with `pg_ctl status` and start it with `pg_ctl start`. You'll also need to create a Logseq database in PostgreSQL; do that with `createdb logseq`.

### 4. Download the Clojure server

Go to <https://github.com/logseq/logseq/releases>, download the `logseq.jar`, and move it into the root directory of the repo.

### 5. Start Logseq

Run `start-windows.bat`, which is located in the repo. This will open a second terminal that runs Logseq's backend server. To completely stop Logseq, you'll also need to close that second terminal.

`start-windows.bat` will try to start PostgreSQL for you if it's not already started.

Open <http://localhost:3001>.

## Build errors

### 1. The required namespace `devtools.preload` is not available.

@@ -1,2 +0,0 @@
@echo off
cmd-clojure %*

File diff suppressed because one or more lines are too long

@@ -0,0 +1 @@
../static
@@ -1,24 +0,0 @@
<!DOCTYPE html>
<html><head><meta charset="utf-8"><meta content="minimum-scale=1, initial-scale=1, width=device-width, shrink-to-fit=no" name="viewport"><meta content="Agp2znmEoRKqxMhzbNL2R3UOCNcagP7+fu0KSM+09O21u7EHdJgqhTrslpfyFC/dSt6jvpaDzNiFf2769fLHMAUAAABoeyJvcmlnaW4iOiJodHRwczovL2xvZ3NlcS5jb206NDQzIiwiZmVhdHVyZSI6Ik5hdGl2ZUZpbGVTeXN0ZW0yIiwiZXhwaXJ5IjoxNTk3Mjg5MzY5LCJpc1N1YmRvbWFpbiI6dHJ1ZX0=" http-equiv="origin-trial"><link href="https://asset.logseq.com/static/style.css" rel="stylesheet" type="text/css"><link href="https://asset.logseq.com/static/img/logo.png" rel="shortcut icon" type="image/png"><link href="https://asset.logseq.com/static/img/logo.png" rel="shortcut icon" sizes="192x192"><link href="https://asset.logseq.com/static/img/logo.png" rel="apple-touch-icon"><meta content="summary" name="twitter:card"><meta content="A local-first notes app which uses Git to store and sync your knowledge." name="twitter:description"><meta content="@logseq" name="twitter:site"><meta content="A local-first notes app." name="twitter:title"><meta content="https://asset.logseq.com/static/img/logo.png" name="twitter:image:src"><meta content="A local-first notes app." name="twitter:image:alt"><meta content="A local-first notes app." property="og:title"><meta content="site" property="og:type"><meta content="https://logseq.com" property="og:url"><meta content="https://asset.logseq.com/static/img/logo.png" property="og:image"><meta content="A local-first notes app which uses Git to store and sync your knowledge." property="og:description"><title>Logseq: A local-first notes app</title><meta content="logseq" property="og:site_name"><meta description="A local-first notes app which uses Git to store and sync your knowledge."><script crossorigin="anonymous" defer onload="if (window.location.host != 'localhost:3000') {
Sentry.init({dsn: 'https://636e9174ffa148c98d2b9d3369661683@o416451.ingest.sentry.io/5311485'});
};" src="https://asset.logseq.com/static/js/sentry.min.js"></script></head><body><div id="root"></div><script>window.user={"name":"tiensonqin","email":"tiensonqin@gmail.com","avatar":"https://avatars3.githubusercontent.com/u/479169?v=4","repos":[{"id":"bc80efff-1420-4eb7-9e07-9506b8d9bbe0","url":"https://github.com/tiensonqin/notes"}],"preferred_format":"org","encrypt_object_key":"snRsaP8r9VG6KsXxu0IfDA"};</script><script src="https://asset.logseq.com/static/js/mldoc.min.js"></script><script src="/js/magic_portal.js"></script><script>let worker = new Worker("/js/worker.js");
const portal = new MagicPortal(worker);
;(async () => {
const git = await portal.get('git');
window.git = git;
const fs = await portal.get('fs');
window.fs = fs;
const pfs = await portal.get('pfs');
window.pfs = pfs;
const workerThread = await portal.get('workerThread');
window.workerThread = workerThread;
})();
</script><script src="https://asset.logseq.com/static/js/main.js"></script><script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-171599883-1', 'logseq.com');
ga('send', 'pageview');
</script></body></html>
File diff suppressed because one or more lines are too long
@@ -0,0 +1,2 @@
!function(e,t){"object"==typeof exports&&"undefined"!=typeof module?module.exports=t():"function"==typeof define&&define.amd?define(t):e.MagicPortal=t()}(this,function(){var e=function(e){var t=this;this.rpc_counter=0,this.channel=e,this.foreign=new Map,this.local=new Map,this.calls=new Map,this.queue=[],this.connectionEstablished=!1,this.channel.addEventListener("message",function(e){var n=e.data;if(n&&"object"==typeof n)switch(n.type){case"MP_INIT":return t.onInit(n);case"MP_SET":return t.onSet(n);case"MP_CALL":return t.onCall(n);case"MP_RETURN":return t.onReturn(n)}}),this.channel.postMessage({type:"MP_INIT",id:1,reply:!0})};e.prototype.onInit=function(e){this.connectionEstablished=!0;var t=this.queue;this.queue=[];for(var n=0,o=t;n<o.length;n+=1){this.channel.postMessage(o[n])}e.reply&&this.channel.postMessage({type:"MP_INIT",reply:!1})},e.prototype.onSet=function(e){for(var t=this,n={},o=e.object,i=function(){var i=r[s],c=!e.void.includes(i);n[i]=function(){for(var e=[],n=arguments.length;n--;)e[n]=arguments[n];return t.rpc_counter=(t.rpc_counter+1)%Number.MAX_SAFE_INTEGER,new Promise(function(n,s){t.postMessage({type:"MP_CALL",object:o,method:i,id:t.rpc_counter,args:e,reply:c}),c?t.calls.set(t.rpc_counter,{resolve:n,reject:s}):n()})}},s=0,r=e.methods;s<r.length;s+=1)i();var c=this.foreign.get(e.object);this.foreign.set(e.object,n),"function"==typeof c&&c(n)},e.prototype.onCall=function(e){var t=this,n=this.local.get(e.object);n&&n[e.method].apply(n,e.args).then(function(n){return e.reply&&t.channel.postMessage({type:"MP_RETURN",id:e.id,result:n})}).catch(function(n){return t.channel.postMessage({type:"MP_RETURN",id:e.id,error:n.message})})},e.prototype.onReturn=function(e){if(this.calls.has(e.id)){var t=this.calls.get(e.id),n=t.resolve,o=t.reject;this.calls.delete(e.id),e.error?o(e.error):n(e.result)}},e.prototype.postMessage=function(e){this.connectionEstablished?this.channel.postMessage(e):this.queue.push(e)},e.prototype.set=function(e,t,n){void 0===n&&(n={}),this.local.set(e,t);var o=Object.entries(t).filter(function(e){return"function"==typeof e[1]}).map(function(e){return e[0]});this.postMessage({type:"MP_SET",object:e,methods:o,void:n.void||[]})},e.prototype.get=function(e){return new Promise(function(t,n){var o=this;return this.foreign.has(e)?t(this.foreign.get(e)):t(new Promise(function(t,n){return o.foreign.set(e,t)}))}.bind(this))};return function(t){var n=new e(t);Object.defineProperties(this,{get:{writable:!1,configurable:!1,value:n.get.bind(n)},set:{writable:!1,configurable:!1,value:n.set.bind(n)}})}});
//# sourceMappingURL=index.umd.js.map
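The minified bundle above implements a small RPC protocol (`MP_SET`, `MP_CALL`, `MP_RETURN`) between the page and the worker. Below is a simplified sketch of that pattern, using a hypothetical in-process channel pair instead of a real `Worker`; it omits MagicPortal's `MP_INIT` handshake, message queueing, and error handling.

```javascript
// Hypothetical in-process channel pair standing in for a Worker channel:
// whatever one side posts is delivered to the other side's listeners.
function makeChannelPair() {
  const a = { listeners: [] };
  const b = { listeners: [] };
  a.postMessage = (msg) => b.listeners.forEach((fn) => fn({ data: msg }));
  b.postMessage = (msg) => a.listeners.forEach((fn) => fn({ data: msg }));
  a.addEventListener = (_type, fn) => a.listeners.push(fn);
  b.addEventListener = (_type, fn) => b.listeners.push(fn);
  return [a, b];
}

function makePortal(channel) {
  const local = new Map();   // objects exposed from this side
  const foreign = new Map(); // proxies for objects exposed by the other side
  const calls = new Map();   // pending call id -> resolve
  let counter = 0;
  channel.addEventListener("message", async ({ data }) => {
    if (data.type === "MP_SET") {
      // Build a proxy whose methods forward calls over the channel.
      const proxy = {};
      for (const method of data.methods) {
        proxy[method] = (...args) =>
          new Promise((resolve) => {
            const id = ++counter;
            calls.set(id, resolve);
            channel.postMessage({ type: "MP_CALL", object: data.object, method, id, args });
          });
      }
      const waiter = foreign.get(data.object);
      foreign.set(data.object, proxy);
      if (typeof waiter === "function") waiter(proxy); // someone was awaiting get()
    } else if (data.type === "MP_CALL") {
      // Invoke the local object and send the result back.
      const obj = local.get(data.object);
      const result = await obj[data.method](...data.args);
      channel.postMessage({ type: "MP_RETURN", id: data.id, result });
    } else if (data.type === "MP_RETURN") {
      const resolve = calls.get(data.id);
      calls.delete(data.id);
      resolve(data.result);
    }
  });
  return {
    set(name, obj) {
      local.set(name, obj);
      channel.postMessage({ type: "MP_SET", object: name, methods: Object.keys(obj) });
    },
    get(name) {
      if (foreign.has(name)) return Promise.resolve(foreign.get(name));
      return new Promise((resolve) => foreign.set(name, resolve));
    },
  };
}

const [mainSide, workerSide] = makeChannelPair();
const mainPortal = makePortal(mainSide);
const workerPortal = makePortal(workerSide);

// The "worker" exposes an object; the "main" side awaits a proxy for it.
workerPortal.set("math", { add: async (x, y) => x + y });
mainPortal.get("math")
  .then((math) => math.add(2, 3))
  .then((sum) => console.log(sum)); // logs 5
```

This is the same handshake the page relies on: `portal.get('git')` in the inline script resolves once the worker side has called `portal.set('git', git)`.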

@@ -0,0 +1,303 @@
importScripts(
  // Batched optimization
  "/static/js/lightning-fs.min.js?v=0.0.2.3",
  "https://cdn.jsdelivr.net/npm/isomorphic-git@1.7.4/index.umd.min.js",
  "https://cdn.jsdelivr.net/npm/isomorphic-git@1.7.4/http/web/index.umd.js",
  // Fixed a bug
  "/static/js/magic_portal.js"
);

const detect = () => {
  if (typeof window !== 'undefined' && !self.skipWaiting) {
    return 'window'
  } else if (typeof self !== 'undefined' && !self.skipWaiting) {
    return 'Worker'
  } else if (typeof self !== 'undefined' && self.skipWaiting) {
    return 'ServiceWorker'
  }
};

function basicAuth(username, token) {
  return "Basic " + btoa(username + ":" + token);
}

const fsName = 'logseq';
const createFS = () => new LightningFS(fsName);
let fs = createFS();
let pfs = fs.promises;

if (detect() === 'Worker') {
  const portal = new MagicPortal(self);
  portal.set('git', git);
  portal.set('fs', fs);
  portal.set('pfs', pfs);
  portal.set('gitHttp', GitHttp);
  portal.set('workerThread', {
    setConfig: function (dir, path, value) {
      return git.setConfig({
        fs,
        dir,
        path,
        value
      });
    },
    clone: function (dir, url, corsProxy, depth, branch, username, token) {
      return git.clone({
        fs,
        dir,
        http: GitHttp,
        url,
        corsProxy,
        ref: branch,
        singleBranch: true,
        depth,
        headers: {
          "Authorization": basicAuth(username, token)
        }
      });
    },
    fetch: function (dir, url, corsProxy, depth, branch, username, token) {
      return git.fetch({
        fs,
        dir,
        http: GitHttp,
        url,
        corsProxy,
        ref: branch,
        singleBranch: true,
        depth,
        headers: {
          "Authorization": basicAuth(username, token)
        }
      });
    },
    pull: function (dir, corsProxy, branch, username, token) {
      return git.pull({
        fs,
        dir,
        http: GitHttp,
        corsProxy,
        ref: branch,
        singleBranch: true,
        // fast: true,
        headers: {
          "Authorization": basicAuth(username, token)
        }
      });
    },
    push: function (dir, corsProxy, branch, force, username, token) {
      return git.push({
        fs,
        dir,
        http: GitHttp,
        ref: branch,
        corsProxy,
        remote: "origin",
        force,
        headers: {
          "Authorization": basicAuth(username, token)
        }
      });
    },
    merge: function (dir, branch) {
      return git.merge({
        fs,
        dir,
        ours: branch,
        theirs: "remotes/origin/" + branch,
        // fastForwardOnly: true
      });
    },
    checkout: function (dir, branch) {
      return git.checkout({
        fs,
        dir,
        ref: branch,
      });
    },
    log: function (dir, branch, depth) {
      return git.log({
        fs,
        dir,
        ref: branch,
        depth,
        singleBranch: true
      })
    },
    add: function (dir, file) {
      return git.add({
        fs,
        dir,
        filepath: file
      });
    },
    remove: function (dir, file) {
      return git.remove({
        fs,
        dir,
        filepath: file
      });
    },
    commit: function (dir, message, name, email, parent) {
      if (parent) {
        return git.commit({
          fs,
          dir,
          message,
          author: {name: name, email: email},
          parent: parent
        });
      } else {
        return git.commit({
          fs,
          dir,
          message,
          author: {name: name, email: email}
        });
      }
    },
    readCommit: function (dir, oid) {
      return git.readCommit({
        fs,
        dir,
        oid
      });
    },
    readBlob: function (dir, oid, path) {
      return git.readBlob({
        fs,
        dir,
        oid,
        path
      });
    },
    writeRef: function (dir, branch, oid) {
      return git.writeRef({
        fs,
        dir,
        ref: "refs/heads/" + branch,
        value: oid,
        force: true
      });
    },
    resolveRef: function (dir, ref) {
      return git.resolveRef({
        fs,
        dir,
        ref
      });
    },
    listFiles: function (dir, branch) {
      return git.listFiles({
        fs,
        dir,
        ref: branch
      });
    },
    // Named function expression so the recursive call below resolves even
    // though rimraf is defined as an object method.
    rimraf: async function rimraf (path) {
      // try {
      //   // First assume path is itself a file
      //   await pfs.unlink(path)
      //   // if that worked we're done
      //   return
      // } catch (err) {
      //   // Otherwise, path must be a directory
      //   if (err.code !== 'EISDIR') throw err
      // }
      // Knowing path is a directory,
      // first, assume everything inside path is a file.
      let files = await pfs.readdir(path);
      for (let file of files) {
        let child = path + '/' + file
        try {
          await pfs.unlink(child)
        } catch (err) {
          if (err.code !== 'EISDIR') throw err
        }
      }
      // Assume what's left are directories and recurse.
      let dirs = await pfs.readdir(path)
      for (let dir of dirs) {
        let child = path + '/' + dir
        await rimraf(child)
      }
      // Finally, delete the empty directory
      await pfs.rmdir(path)
    },
    getFileStateChanges: async function (commitHash1, commitHash2, dir) {
      return git.walk({
        fs,
        dir,
        trees: [git.TREE({ ref: commitHash1 }), git.TREE({ ref: commitHash2 })],
        map: async function (filepath, [A, B]) {
          var type = 'equal';
          if (A === null) {
            type = "add";
          }
          if (B === null) {
            type = "remove";
          }

          // ignore directories
          if (filepath === '.') {
            return
          }
          if ((A !== null && (await A.type()) === 'tree')
              || (B !== null && (await B.type()) === 'tree')) {
            return
          }

          // generate ids
          const Aoid = A !== null && await A.oid();
          const Boid = B !== null && await B.oid();

          if (type === "equal") {
            // determine modification type
            if (Aoid !== Boid) {
              type = 'modify'
            }
            if (Aoid === undefined) {
              type = 'add'
            }
            if (Boid === undefined) {
              type = 'remove'
            }
          }

          if (Aoid === undefined && Boid === undefined) {
            console.log('Something weird happened:')
            console.log(A)
            console.log(B)
          }

          return {
            path: `/${filepath}`,
            type: type,
          }
        },
      })
    },
    statusMatrix: async function (dir) {
      await git.statusMatrix({ fs, dir });
    },
    getChangedFiles: async function (dir) {
      try {
        const FILE = 0, HEAD = 1, WORKDIR = 2;

        let filenames = (await git.statusMatrix({ fs, dir }))
          .filter(row => row[HEAD] !== row[WORKDIR])
          .map(row => row[FILE]);

        return filenames;
      } catch (err) {
        console.error(err);
        return [];
      }
    }
  });
  // self.addEventListener("message", ({ data }) => console.log(data));
}
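For reference, `getChangedFiles` above keeps a row whenever its HEAD and WORKDIR columns differ. The same filter can be exercised on its own with a hand-written matrix in isomorphic-git's `statusMatrix` row shape `[filepath, HEAD, WORKDIR, STAGE]` (the file names here are hypothetical):

```javascript
// Hypothetical status matrix; rows follow isomorphic-git's
// [filepath, HEAD, WORKDIR, STAGE] convention.
const matrix = [
  ["notes/a.md", 1, 1, 1], // unmodified: HEAD === WORKDIR
  ["notes/b.md", 1, 2, 1], // modified:   HEAD !== WORKDIR
  ["notes/c.md", 0, 2, 0], // untracked:  HEAD !== WORKDIR
];

const FILE = 0, HEAD = 1, WORKDIR = 2;
const changed = matrix
  .filter(row => row[HEAD] !== row[WORKDIR])
  .map(row => row[FILE]);

console.log(changed); // ["notes/b.md", "notes/c.md"]
```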

@@ -26,6 +26,8 @@
{:before-load frontend.core/stop
 ;; after live-reloading finishes call this function
 :after-load frontend.core/start
 :http-root "public"
 :http-port 3001
 :preloads [devtools.preload]}}

:test
@@ -1,14 +0,0 @@
@echo off
SET ENVIRONMENT=dev
SET JWT_SECRET=4fa183cf1d28460498b13330835e80ad
SET COOKIE_SECRET=10a42ca724e34f4db6086a772d787034
SET DATABASE_URL=postgres://localhost:5432/logseq
SET GITHUB_APP2_ID=78728
SET GITHUB_APP2_KEY=xxxxxxxxxxxxxxxxxxxx
SET GITHUB_APP2_SECRET=xxxxxxxxxxxxxxxxxxxx
SET GITHUB_APP_PEM=
SET LOG_PATH=%AppData%\..\Local\Temp\logseq

pg_ctl start
start cmd.exe /k "java -Duser.timezone=UTC -jar logseq.jar"
yarn && yarn watch