Compare commits: `v0.5.0-alp`...`v0.5.0-alp` (68 commits)
| SHA1 |
|---|
| 4a597e0ba7 |
| d5f3e25bea |
| 5e4c1a8359 |
| d86e655ad2 |
| 80154bbf9f |
| be853ba2a7 |
| 4d3036d030 |
| ecb9b5e28f |
| 38e3c198f2 |
| 2f64501556 |
| 2c2554d73d |
| 69d1accf3f |
| 785bdb8ecb |
| 78a1947cec |
| 0ff59ecb4e |
| b58fed16b4 |
| 6719be02c3 |
| 8757834e07 |
| aa243d1b8a |
| aeb18eb124 |
| 6c3e118ee3 |
| 3c0fe4d684 |
| 12fd9aa1ef |
| 821122a33d |
| 0d9406d991 |
| 350eec3bc7 |
| e700b3105a |
| dd2a730b4a |
| c6766bbe77 |
| e5d3204b6c |
| 4767cbd12b |
| deb4118c5d |
| 4516df5aac |
| 663df7bdc2 |
| e81f0a4a95 |
| 38cd13dc0c |
| 14fd470363 |
| fc8d9dc1fe |
| 1659adb419 |
| 6490b77d4c |
| 23463b620e |
| 6bc331be75 |
| 87f6410877 |
| b1ddfc3a49 |
| d01e757d2f |
| e593ce0420 |
| 578abfabb3 |
| aa7b7e43ff |
| af4d4e0246 |
| fecb11cba4 |
| 614f886008 |
| 6fcb895d70 |
| 5a98ede45e |
| 779d462932 |
| e301116e87 |
| bd3a4a719d |
| 4cfdc72c00 |
| 3620a9d256 |
| f254a51d59 |
| 99bbe58255 |
| a400abff4c |
| 585806837e |
| 249aa999a3 |
| aae1d8b34f |
| 9d3638fa46 |
| 5b2a830d2d |
| b87943e39d |
| c421fd0b25 |
.cargo/config (new file, 2 lines)
@@ -0,0 +1,2 @@
+[target.x86_64-pc-windows-msvc]
+rustflags = ["-Ctarget-feature=+crt-static"]
.gitignore (vendored, 2 lines changed)
@@ -1,5 +1,5 @@
 /site
 /target
-/server/scratch
+/scratch-project
 **/*.rs.bk
 /generate-docs.run
@@ -25,14 +25,23 @@ matrix:
      - cd plugin
      - luacov-coveralls -e $TRAVIS_BUILD_DIR/lua_install

  - language: rust
    rust: 1.31.1
    cache: cargo

    script:
      - cargo test --verbose

  - language: rust
    rust: stable
    cache: cargo

    script:
      - cargo test --verbose

  - language: rust
    rust: beta
    cache: cargo

    script:
      - cargo test --verbose
@@ -1,6 +1,37 @@
-# Rojo Change Log
+# Rojo Changelog
 
-## Current master
+## [Unreleased]
+
+## [0.5.0 Alpha 4](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.4) (February 8, 2019)
+* Added support for nested partitions ([#102](https://github.com/LPGhatguy/rojo/issues/102))
+* Added support for 'transmuting' partitions ([#112](https://github.com/LPGhatguy/rojo/issues/112))
+* Added support for aliasing filesystem paths ([#105](https://github.com/LPGhatguy/rojo/issues/105))
+* Changed Windows builds to statically link the CRT ([#89](https://github.com/LPGhatguy/rojo/issues/89))
+
+## [0.5.0 Alpha 3](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.3) (February 1, 2019)
+* Changed default project file name from `roblox-project.json` to `default.project.json` ([#120](https://github.com/LPGhatguy/rojo/pull/120))
+    * The old file name will still be supported until 0.5.0 is fully released.
+* Added warning when loading project files that don't end in `.project.json`
+    * This new extension enables Rojo to distinguish project files from random JSON files, which is necessary to support nested projects.
+* Added new (empty) diagnostic page served from the server
+* Added better error messages for when a file is missing that's referenced by a Rojo project
+* Added support for visualization endpoints returning GraphViz source when Dot is not available
+* Fixed an in-memory filesystem regression introduced recently ([#119](https://github.com/LPGhatguy/rojo/pull/119))
+
+## [0.5.0 Alpha 2](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.2) (January 28, 2019)
+* Added support for `.model.json` files, compatible with 0.4.x
+* Fixed in-memory filesystem not handling out-of-order filesystem change events
+* Fixed long-polling error caused by a promise mixup ([#110](https://github.com/LPGhatguy/rojo/issues/110))
+
+## [0.5.0 Alpha 1](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.1) (January 25, 2019)
+* Changed plugin UI to be way prettier
+    * Thanks to [Reselim](https://github.com/Reselim) for the design!
+* Changed plugin error messages to be a little more useful
+* Removed unused 'Config' button in plugin UI
+* Fixed bug where bad server responses could cause the plugin to be in a bad state
+* Upgraded to rbx\_tree, rbx\_xml, and rbx\_binary 0.2.0, which dramatically expands the kinds of properties that Rojo can handle, especially in XML.
+
+## [0.5.0 Alpha 0](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.0) (January 14, 2019)
+* "Epiphany" rewrite, in progress since the beginning of time
+* New live sync protocol
+    * Uses HTTP long polling to reduce request count and improve responsiveness
@@ -25,36 +56,36 @@
 * Multiple places can be specified, like when building a multi-place game
 * Added support for specifying properties on services in project files
 
-## 0.4.13 (November 12, 2018)
+## [0.4.13](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.13) (November 12, 2018)
 * When `rojo.json` points to a file or directory that does not exist, Rojo now issues a warning instead of throwing an error and exiting
 
-## 0.4.12 (June 21, 2018)
+## [0.4.12](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.12) (June 21, 2018)
 * Fixed obscure assertion failure when renaming or deleting files ([#78](https://github.com/LPGhatguy/rojo/issues/78))
 * Added a `PluginAction` for the sync in command, which should help with some automation scripts ([#80](https://github.com/LPGhatguy/rojo/pull/80))
 
-## 0.4.11 (June 10, 2018)
+## [0.4.11](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.11) (June 10, 2018)
 * Defensively insert existing instances into RouteMap; should fix most duplication cases when syncing into existing trees.
 * Fixed incorrect synchronization from `Plugin:_pull` that would cause polling to create issues
 * Fixed incorrect file routes being assigned to `init.lua` and `init.model.json` files
 * Untangled route handling-internals slightly
 
-## 0.4.10 (June 2, 2018)
+## [0.4.10](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.10) (June 2, 2018)
 * Added support for `init.model.json` files, which enable versioning `Tool` instances (among other things) with Rojo. ([#66](https://github.com/LPGhatguy/rojo/issues/66))
 * Fixed obscure error when syncing into an invalid service.
 * Fixed multiple sync processes occurring when a server ID mismatch is detected.
 
-## 0.4.9 (May 26, 2018)
+## [0.4.9](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.9) (May 26, 2018)
 * Fixed warning when renaming or removing files that would sometimes corrupt the instance cache ([#72](https://github.com/LPGhatguy/rojo/pull/72))
 * JSON models are no longer as strict -- `Children` and `Properties` are now optional.
 
-## 0.4.8 (May 26, 2018)
+## [0.4.8](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.8) (May 26, 2018)
 * Hotfix to prevent errors from being thrown when objects managed by Rojo are deleted
 
-## 0.4.7 (May 25, 2018)
+## [0.4.7](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.7) (May 25, 2018)
 * Added icons to the Rojo plugin, made by [@Vorlias](https://github.com/Vorlias)! ([#70](https://github.com/LPGhatguy/rojo/pull/70))
 * Server will now issue a warning if no partitions are specified in `rojo serve` ([#40](https://github.com/LPGhatguy/rojo/issues/40))
 
-## 0.4.6 (May 21, 2018)
+## [0.4.6](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.6) (May 21, 2018)
 * Rojo handles being restarted by Roblox Studio more gracefully ([#67](https://github.com/LPGhatguy/rojo/issues/67))
 * Folders should no longer get collapsed when syncing occurs.
 * **Significant** robustness improvements with regards to caching.
@@ -62,7 +93,7 @@
     * If there are any bugs with script duplication or caching in the future, restarting the Rojo server process will fix them for that session.
 * Fixed message in plugin not being prefixed with `Rojo: `.
 
-## 0.4.5 (May 1, 2018)
+## [0.4.5](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.5) (May 1, 2018)
 * Rojo messages are now prefixed with `Rojo: ` to make them stand out in the output more.
 * Fixed server to notice file changes *much* more quickly. (200ms vs 1000ms)
 * Server now lists name of project when starting up.
@@ -70,23 +101,23 @@
 * Fixed multiple sync operations occurring at the same time. ([#61](https://github.com/LPGhatguy/rojo/issues/61))
 * Partitions targeting files directly now work as expected. ([#57](https://github.com/LPGhatguy/rojo/issues/57))
 
-## 0.4.4 (April 7, 2018)
+## [0.4.4](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.4) (April 7, 2018)
 * Fix small regression introduced in 0.4.3
 
-## 0.4.3 (April 7, 2018)
+## [0.4.3](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.3) (April 7, 2018)
 * Plugin now automatically selects `HttpService` if it determines that HTTP isn't enabled ([#58](https://github.com/LPGhatguy/rojo/pull/58))
 * Plugin now has much more robust handling and will wipe all state when the server changes.
     * This should fix issues that would otherwise be solved by restarting Roblox Studio.
 
-## 0.4.2 (April 4, 2018)
+## [0.4.2](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.2) (April 4, 2018)
 * Fixed final case of duplicated instance insertion, caused by reconciled instances not being inserted into `RouteMap`.
     * The reconciler is still not a perfect solution, especially if script instances get moved around without being destroyed. I don't think this can be fixed before a big refactor.
 
-## 0.4.1 (April 1, 2018)
+## [0.4.1](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.1) (April 1, 2018)
 * Merged plugin repository into main Rojo repository for easier tracking.
 * Improved `RouteMap` object tracking; this should fix some cases of duplicated instances being synced into the tree.
 
-## 0.4.0 (March 27, 2018)
+## [0.4.0](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.0) (March 27, 2018)
 * Protocol version 1, which shifts more responsibility onto the server
     * This is a **major breaking** change!
 * The server now has a concept of 'filter plugins', which transform data at various stages in the pipeline
@@ -94,36 +125,36 @@
 * Added `*.model.json` files, which let you embed small Roblox objects into your Rojo tree.
 * Improved error messages in some cases ([#46](https://github.com/LPGhatguy/rojo/issues/46))
 
-## 0.3.2 (December 20, 2017)
+## [0.3.2](https://github.com/LPGhatguy/rojo/releases/tag/v0.3.2) (December 20, 2017)
 * Fixed `rojo serve` failing to correctly construct an absolute root path when passed as an argument
 * Fixed intense CPU usage when running `rojo serve`
 
-## 0.3.1 (December 14, 2017)
+## [0.3.1](https://github.com/LPGhatguy/rojo/releases/tag/v0.3.1) (December 14, 2017)
 * Improved error reporting when invalid JSON is found in a `rojo.json` project
     * These messages are passed on from Serde
 
-## 0.3.0 (December 12, 2017)
+## [0.3.0](https://github.com/LPGhatguy/rojo/releases/tag/v0.3.0) (December 12, 2017)
 * Factored out the plugin into a separate repository
 * Fixed server when using a file as a partition
     * Previously, trailing slashes were put on the end of a partition even if the read request was an empty string. This broke file reading on Windows when a partition pointed to a file instead of a directory!
 * Started running automatic tests on Travis CI (#9)
 
-## 0.2.3 (December 4, 2017)
+## [0.2.3](https://github.com/LPGhatguy/rojo/releases/tag/v0.2.3) (December 4, 2017)
 * Plugin only release
 * Tightened `init` file rules to only match script files
     * Previously, Rojo would sometimes pick up the wrong file when syncing
 
-## 0.2.2 (December 1, 2017)
+## [0.2.2](https://github.com/LPGhatguy/rojo/releases/tag/v0.2.2) (December 1, 2017)
 * Plugin only release
 * Fixed broken reconciliation behavior with `init` files
 
-## 0.2.1 (December 1, 2017)
+## [0.2.1](https://github.com/LPGhatguy/rojo/releases/tag/v0.2.1) (December 1, 2017)
 * Plugin only release
 * Changes default port to 8000
 
-## 0.2.0 (December 1, 2017)
+## [0.2.0](https://github.com/LPGhatguy/rojo/releases/tag/v0.2.0) (December 1, 2017)
 * Support for `init.lua` like rbxfs and rbxpacker
 * More robust syncing with a new reconciler
 
-## 0.1.0 (November 29, 2017)
+## [0.1.0](https://github.com/LPGhatguy/rojo/releases/tag/v0.1.0) (November 29, 2017)
 * Initial release, functionally very similar to [rbxfs](https://github.com/LPGhatguy/rbxfs)
Cargo.lock (generated; 462 lines changed, diff collapsed)
README.md (24 lines changed)
@@ -12,7 +12,10 @@
     <img src="https://img.shields.io/crates/v/rojo.svg?label=version" alt="Latest server version" />
   </a>
   <a href="https://lpghatguy.github.io/rojo/0.4.x">
-    <img src="https://img.shields.io/badge/documentation-0.4.x-brightgreen.svg" alt="Rojo Documentation" />
+    <img src="https://img.shields.io/badge/docs-0.4.x-brightgreen.svg" alt="Rojo Documentation" />
   </a>
+  <a href="https://lpghatguy.github.io/rojo/0.5.x">
+    <img src="https://img.shields.io/badge/docs-0.5.x-brightgreen.svg" alt="Rojo Documentation" />
+  </a>
 </div>
@@ -28,17 +31,16 @@ Rojo is designed for **power users** who want to use the **best tools available**
 Rojo lets you:
 
 * Work on scripts from the filesystem, in your favorite editor
-* Version your place, library, or plugin using Git or another VCS
-* Sync JSON-format models from the filesystem into your game
+* Version your place, model, or plugin using Git or another VCS
+* Sync `rbxmx` and `rbxm` models into your game in real time
 * Package and deploy your project to Roblox.com from the command line
 
 Soon, Rojo will be able to:
 
-* Sync scripts from Roblox Studio to the filesystem
-* Compile MoonScript and sync it into Roblox Studio
-* Sync `rbxmx` models between the filesystem and Roblox Studio
-* Package projects into `rbxmx` files from the command line
+* Sync instances from Roblox Studio to the filesystem
+* Compile MoonScript and other custom things for your project
 
-## [Documentation](https://lpghatguy.github.io/rojo/0.4.x)
+## [Documentation](https://lpghatguy.github.io/rojo)
 You can also view the documentation by browsing the [docs](https://github.com/LPGhatguy/rojo/tree/master/docs) folder of the repository, but because it uses a number of Markdown extensions, it may not be very readable.
 
 ## Inspiration and Alternatives

@@ -58,11 +60,9 @@ Here are a few, if you're looking for alternatives or supplements to Rojo:
 If you use a plugin that _isn't_ Rojo for syncing code, open an issue and let me know why! I'd like Rojo to be the end-all tool so that people stop reinventing solutions to this problem.
 
 ## Contributing
-The `master` branch is a rewrite known as **Epiphany**. It includes a breaking change to the project configuration format and an infrastructure overhaul.
-
 Pull requests are welcome!
 
 All pull requests are run against a test suite on Travis CI. That test suite should always pass!
+Rojo supports Rust 1.31.1 and newer. Any changes to the minimum required compiler version require a _minor_ version bump.
 
 ## License
-Rojo is available under the terms of the Mozilla Public License, Version 2.0. See [LICENSE](LICENSE) for details.
+Rojo is available under the terms of the Mozilla Public License, Version 2.0. See [LICENSE.txt](LICENSE.txt) for details.
BIN  assets/round-rect-4px-radius.png (new file; 175 B)
@@ -1,3 +1,7 @@
+[TOC]
+
+## Creating the Rojo Project
+
 To use Rojo to build a place, you'll need to create a new project file, which tells Rojo how your project is structured on-disk and in Roblox.
 
 Create a new folder, then run `rojo init` inside that folder to initialize an empty project.

@@ -9,7 +13,7 @@ cd my-new-project
 rojo init
 ```
 
-Rojo will make a small project file in your directory, named `roblox-project.json`. It'll make sure that any code in the directory `src` will get put into `ReplicatedStorage.Source`.
+Rojo will make a small project file in your directory, named `default.project.json`. It'll make sure that any code in the directory `src` will get put into `ReplicatedStorage.Source`.
 
 Speaking of, let's make sure we create a directory named `src`, and maybe a Lua file inside of it:
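For context on the rename in the hunk above, a minimal `default.project.json` mapping a `src` directory to `ReplicatedStorage.Source` might look like this (a sketch; the exact schema may differ between 0.5.0 alphas):

```json
{
  "name": "my-new-project",
  "tree": {
    "$className": "DataModel",
    "ReplicatedStorage": {
      "$className": "ReplicatedStorage",
      "Source": {
        "$path": "src"
      }
    }
  }
}
```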
docs/extra.css (new file, 3 lines)
@@ -0,0 +1,3 @@
+.md-typeset__table {
+    width: 100%;
+}
@@ -1,25 +0,0 @@
-Rojo has two components:
-
-* The server, a binary written in Rust
-* The plugin, a Roblox Studio plugin written in Lua
-
-It's important that the plugin and server are compatible. The plugin will show errors in the Roblox Studio Output window if there is a version mismatch.
-
-## Installing the Server
-To install the server, either:
-
-* If you have Rust installed, use `cargo install rojo`
-* Or, download a pre-built Windows binary from [the GitHub releases page](https://github.com/LPGhatguy/rojo/releases)
-
-**The Rojo binary must be run from the command line, like Terminal on MacOS or `cmd.exe` on Windows. It's recommended that you put the Rojo binary on your `PATH` to make this easier.**
-
-## Installing the Plugin
-To install the plugin, either:
-
-* Install the plugin from the [Roblox plugin page](https://www.roblox.com/library/1211549683/Rojo).
-    * This gives you less control over what version you install -- you will always have the latest version.
-* Or, download the latest release from [the GitHub releases section](https://github.com/LPGhatguy/rojo/releases) and install it into your Roblox plugins folder
-    * You can open this folder by clicking the "Plugins Folder" button from the Plugins toolbar in Roblox Studio
-
-## Visual Studio Code Extension
-If you use Visual Studio Code on Windows, you can install [Evaera's unofficial Rojo extension](https://marketplace.visualstudio.com/items?itemName=evaera.vscode-rojo), which will install both halves of Rojo for you. It even has a nifty UI to add partitions and start/stop the Rojo server!
BIN  (deleted image; was 5.8 KiB)
BIN  (deleted image; was 17 KiB)
BIN  docs/images/plugins-folder-in-studio.png (new file; 19 KiB)
docs/images/sync-example-files.gv (new file, 17 lines)
@@ -0,0 +1,17 @@
+digraph "Sync Files" {
+    graph [
+        ranksep = "0.7",
+        nodesep = "0.5",
+    ];
+    node [
+        fontname = "monospace",
+        shape = "record",
+    ];
+
+    my_model [label = "MyModel"]
+    init_server [label = "init.server.lua"]
+    foo [label = "foo.lua"]
+
+    my_model -> init_server
+    my_model -> foo
+}
docs/images/sync-example-files.svg (new file, 38 lines; rendered image: 2.0 KiB)
@@ -0,0 +1,38 @@
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN"
+ "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
+<!-- Generated by graphviz version 2.38.0 (20140413.2041)
+ -->
+<!-- Title: Sync Files Pages: 1 -->
+<svg width="258pt" height="132pt"
+ viewBox="0.00 0.00 258.00 132.00" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
+<g id="graph0" class="graph" transform="scale(1 1) rotate(0) translate(4 128)">
+<title>Sync Files</title>
+<polygon fill="white" stroke="none" points="-4,4 -4,-128 254,-128 254,4 -4,4"/>
+<!-- my_model -->
+<g id="node1" class="node"><title>my_model</title>
+<polygon fill="none" stroke="black" points="104,-87.5 104,-123.5 178,-123.5 178,-87.5 104,-87.5"/>
+<text text-anchor="middle" x="141" y="-101.8" font-family="monospace" font-size="14.00">MyModel</text>
+</g>
+<!-- init_server -->
+<g id="node2" class="node"><title>init_server</title>
+<polygon fill="none" stroke="black" points="0,-0.5 0,-36.5 140,-36.5 140,-0.5 0,-0.5"/>
+<text text-anchor="middle" x="70" y="-14.8" font-family="monospace" font-size="14.00">init.server.lua</text>
+</g>
+<!-- my_model->init_server -->
+<g id="edge1" class="edge"><title>my_model->init_server</title>
+<path fill="none" stroke="black" d="M126.632,-87.299C116.335,-74.9713 102.308,-58.1787 90.7907,-44.3902"/>
+<polygon fill="black" stroke="black" points="93.4435,-42.1065 84.3465,-36.6754 88.0711,-46.594 93.4435,-42.1065"/>
+</g>
+<!-- foo -->
+<g id="node3" class="node"><title>foo</title>
+<polygon fill="none" stroke="black" points="176,-0.5 176,-36.5 250,-36.5 250,-0.5 176,-0.5"/>
+<text text-anchor="middle" x="213" y="-14.8" font-family="monospace" font-size="14.00">foo.lua</text>
+</g>
+<!-- my_model->foo -->
+<g id="edge2" class="edge"><title>my_model->foo</title>
+<path fill="none" stroke="black" d="M155.57,-87.299C166.013,-74.9713 180.237,-58.1787 191.917,-44.3902"/>
+<polygon fill="black" stroke="black" points="194.659,-46.5681 198.451,-36.6754 189.317,-42.0437 194.659,-46.5681"/>
+</g>
+</g>
+</svg>
docs/images/sync-example-instances.gv (new file, 15 lines)
@@ -0,0 +1,15 @@
+digraph "Sync Files" {
+    graph [
+        ranksep = "0.7",
+        nodesep = "0.5",
+    ];
+    node [
+        fontname = "monospace",
+        shape = "record",
+    ];
+
+    my_model [label = "MyModel (Script)"]
+    foo [label = "foo (ModuleScript)"]
+
+    my_model -> foo
+}
docs/images/sync-example-instances.svg (new file, 28 lines; rendered image: 1.4 KiB)
@@ -0,0 +1,28 @@
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN"
+ "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
+<!-- Generated by graphviz version 2.38.0 (20140413.2041)
+ -->
+<!-- Title: Sync Files Pages: 1 -->
+<svg width="173pt" height="132pt"
+ viewBox="0.00 0.00 173.00 132.00" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
+<g id="graph0" class="graph" transform="scale(1 1) rotate(0) translate(4 128)">
+<title>Sync Files</title>
+<polygon fill="white" stroke="none" points="-4,4 -4,-128 169,-128 169,4 -4,4"/>
+<!-- my_model -->
+<g id="node1" class="node"><title>my_model</title>
+<polygon fill="none" stroke="black" points="8,-87.5 8,-123.5 157,-123.5 157,-87.5 8,-87.5"/>
+<text text-anchor="middle" x="82.5" y="-101.8" font-family="monospace" font-size="14.00">MyModel (Script)</text>
+</g>
+<!-- foo -->
+<g id="node2" class="node"><title>foo</title>
+<polygon fill="none" stroke="black" points="0,-0.5 0,-36.5 165,-36.5 165,-0.5 0,-0.5"/>
+<text text-anchor="middle" x="82.5" y="-14.8" font-family="monospace" font-size="14.00">foo (ModuleScript)</text>
+</g>
+<!-- my_model->foo -->
+<g id="edge1" class="edge"><title>my_model->foo</title>
+<path fill="none" stroke="black" d="M82.5,-87.299C82.5,-75.6626 82.5,-60.0479 82.5,-46.7368"/>
+<polygon fill="black" stroke="black" points="86.0001,-46.6754 82.5,-36.6754 79.0001,-46.6755 86.0001,-46.6754"/>
+</g>
+</g>
+</svg>
docs/images/sync-example-json-model.gv (new file, 17 lines)
@@ -0,0 +1,17 @@
+digraph "Sync Files" {
+    graph [
+        ranksep = "0.7",
+        nodesep = "0.5",
+    ];
+    node [
+        fontname = "monospace",
+        shape = "record",
+    ];
+
+    model [label = "My Cool Model (Folder)"]
+    root_part [label = "RootPart (Part)"]
+    send_money [label = "SendMoney (RemoteEvent)"]
+
+    model -> root_part
+    model -> send_money
+}
docs/images/sync-example-json-model.svg (new file, 38 lines; rendered image: 2.1 KiB)
@@ -0,0 +1,38 @@
+<?xml version="1.0" encoding="UTF-8" standalone="no"?>
+<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN"
+ "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
+<!-- Generated by graphviz version 2.38.0 (20140413.2041)
+ -->
+<!-- Title: Sync Files Pages: 1 -->
+<svg width="390pt" height="132pt"
+ viewBox="0.00 0.00 390.00 132.00" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
+<g id="graph0" class="graph" transform="scale(1 1) rotate(0) translate(4 128)">
+<title>Sync Files</title>
+<polygon fill="white" stroke="none" points="-4,4 -4,-128 386,-128 386,4 -4,4"/>
+<!-- model -->
+<g id="node1" class="node"><title>model</title>
+<polygon fill="none" stroke="black" points="75,-87.5 75,-123.5 273,-123.5 273,-87.5 75,-87.5"/>
+<text text-anchor="middle" x="174" y="-101.8" font-family="monospace" font-size="14.00">My Cool Model (Folder)</text>
+</g>
+<!-- root_part -->
+<g id="node2" class="node"><title>root_part</title>
+<polygon fill="none" stroke="black" points="0,-0.5 0,-36.5 140,-36.5 140,-0.5 0,-0.5"/>
+<text text-anchor="middle" x="70" y="-14.8" font-family="monospace" font-size="14.00">RootPart (Part)</text>
+</g>
+<!-- model->root_part -->
+<g id="edge1" class="edge"><title>model->root_part</title>
+<path fill="none" stroke="black" d="M152.954,-87.299C137.448,-74.6257 116.168,-57.2335 99.0438,-43.2377"/>
+<polygon fill="black" stroke="black" points="100.972,-40.2938 91.0147,-36.6754 96.5426,-45.7138 100.972,-40.2938"/>
+</g>
+<!-- send_money -->
+<g id="node3" class="node"><title>send_money</title>
+<polygon fill="none" stroke="black" points="176,-0.5 176,-36.5 382,-36.5 382,-0.5 176,-0.5"/>
+<text text-anchor="middle" x="279" y="-14.8" font-family="monospace" font-size="14.00">SendMoney (RemoteEvent)</text>
+</g>
+<!-- model->send_money -->
+<g id="edge2" class="edge"><title>model->send_money</title>
+<path fill="none" stroke="black" d="M195.248,-87.299C210.904,-74.6257 232.388,-57.2335 249.677,-43.2377"/>
+<polygon fill="black" stroke="black" points="252.213,-45.6878 257.783,-36.6754 247.809,-40.2471 252.213,-45.6878"/>
+</g>
+</g>
+</svg>
BIN  (deleted image; was 1.9 KiB)
@@ -1,10 +1,10 @@
-This is the documentation home for Rojo.
+This is the documentation home for Rojo 0.5.x.
 
 Available versions of these docs:
 
 * [Latest version (currently 0.5.x)](https://lpghatguy.github.io/rojo)
 * [0.5.x](https://lpghatguy.github.io/rojo/0.5.x)
 * [0.4.x](https://lpghatguy.github.io/rojo/0.4.x)
 * [`master` branch](https://lpghatguy.github.io/rojo/master)
 
 **Rojo** is a flexible multi-tool designed for creating robust Roblox projects.
docs/installation.md (new file, 45 lines)
@@ -0,0 +1,45 @@
+[TOC]
+
+## Overview
+
+Rojo has two components:
+
+* The command line interface (CLI)
+* The Roblox Studio plugin
+
+!!! info
+    It's important that your installed version of the plugin and CLI are compatible.
+
+    The plugin will show errors in the Roblox Studio output window if there is a version mismatch.
+
+## Installing the CLI
+
+### Installing from GitHub
+If you're on Windows, there are pre-built binaries available from Rojo's [GitHub Releases page](https://github.com/LPGhatguy/rojo/releases).
+
+The Rojo CLI must be run from the command line, like Terminal.app on macOS or `cmd.exe` on Windows. It's recommended that you put the Rojo CLI executable on your `PATH` to make this easier.
+
+### Installing from Cargo
+If you have Rust installed, the easiest way to get Rojo is with Cargo!
+
+To install the latest 0.5.0 alpha, use:
+
+```sh
+cargo install rojo --version 0.5.0-alpha.3
+```
+
+## Installing the Plugin
+
+### Installing from GitHub
+The Rojo Roblox Studio plugin is available from Rojo's [GitHub Releases page](https://github.com/LPGhatguy/rojo/releases).
+
+Download the attached `rbxm` file and put it into your Roblox Studio plugins folder. You can find that folder by pressing **Plugins Folder** from your Plugins toolbar in Roblox Studio:
+
+![](
+{: align="center" }
+
+### Installing from Roblox.com
+Visit [Rojo's Roblox.com Plugin page](https://www.roblox.com/library/1997686364/Rojo-0-5-0-alpha-3) in Roblox Studio and press **Install**.
+
+## Visual Studio Code Extension
+If you use Visual Studio Code on Windows, you can install [Evaera's unofficial Rojo extension](https://marketplace.visualstudio.com/items?itemName=evaera.vscode-rojo), which will install both halves of Rojo for you. It even has a nifty UI to add partitions and start/stop the Rojo server!
docs/internals/overview.md (new file, 45 lines)
@@ -0,0 +1,45 @@
+This document aims to give a general overview of how Rojo works. It's intended for people who want to contribute to the project as well as anyone who's just curious how the tool works!
+
+[TOC]
+
+## CLI
+
+### RbxTree
+Rojo uses a library named [`rbx_tree`](https://github.com/LPGhatguy/rbx-tree) as its implementation of the Roblox DOM. It serves as a common format for serialization to all the formats Rojo supports!
+
+Rojo uses two related libraries to deserialize instances from Roblox's file formats, `rbx_xml` and `rbx_binary`.
+
+### In-Memory Filesystem (IMFS)
+Relevant source files:
+
+* [`server/src/imfs.rs`](https://github.com/LPGhatguy/rojo/blob/master/server/src/imfs.rs)
+* [`server/src/fs_watcher.rs`](https://github.com/LPGhatguy/rojo/blob/master/server/src/fs_watcher.rs)
+
+Rojo keeps an in-memory copy of all files that it needs to reason about. This enables taking fast, stateless, tear-free snapshots of files to turn them into instances.
+
+Keeping an in-memory copy of file contents will also enable Rojo to debounce changes that are caused by Rojo itself. This'll happen when two-way sync finally happens.
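The IMFS idea described above can be sketched in a few lines: a mirror of on-disk state that absorbs watcher events and can be copied out atomically. This is a toy Python illustration with invented names, not Rojo's actual `imfs.rs` API:

```python
# A toy in-memory filesystem: it mirrors disk state by absorbing change
# events, so snapshotting never touches the real disk.
class Imfs:
    def __init__(self):
        self.files = {}  # path -> contents

    # Watcher event handlers. Each keeps the mirror consistent even when
    # events arrive out of order: removing a path we never saw is a no-op.
    def on_created_or_modified(self, path, contents):
        self.files[path] = contents

    def on_removed(self, path):
        self.files.pop(path, None)

    def snapshot(self, prefix):
        # Copy out every file under `prefix` in one step, so later events
        # cannot tear the view that a consumer works from.
        return {p: c for p, c in self.files.items() if p.startswith(prefix)}

imfs = Imfs()
imfs.on_created_or_modified("src/init.lua", "return {}")
imfs.on_created_or_modified("src/foo.lua", "return 2")
imfs.on_removed("src/old.lua")  # out-of-order remove: harmless
print(sorted(imfs.snapshot("src/")))  # ['src/foo.lua', 'src/init.lua']
```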
|
||||
|
||||
### Snapshot Reconciler
|
||||
Relevant source files:
|
||||
|
||||
* [`server/src/snapshot_reconciler.rs`](https://github.com/LPGhatguy/rojo/blob/master/server/src/snapshot_reconciler.rs)
|
||||
* [`server/src/rbx_snapshot.rs`](https://github.com/LPGhatguy/rojo/blob/master/server/src/rbx_snapshot.rs)
|
||||
* [`server/src/rbx_session.rs`](https://github.com/LPGhatguy/rojo/blob/master/server/src/rbx_session.rs)
|
||||
|
||||
To simplify incremental updates of instances, Rojo generates lightweight snapshots describing how files map to instances. This means that Rojo can treat file change events like damage painting, rather than trying to surgically update individual instances.
|
||||
|
||||
This approach reduces the number of desynchronization bugs, reduces the complexity of important pieces of the codebase, and makes writing plugins a lot easier.
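The "damage painting" approach can be sketched as a plain tree diff: rebuild a snapshot of the affected subtree, compare it with the previous snapshot, and emit only the differences. This is an illustrative Python sketch under the assumption that snapshots are nested name-keyed dicts; Rojo's real snapshot types live in the Rust sources listed above.

```python
def diff_snapshots(old, new):
    """Diff two snapshot trees (nested dicts of name -> subtree) and
    return the flat list of changes needed to update the live tree."""
    changes = []
    for name in old.keys() - new.keys():
        changes.append(("remove", name))
    for name in new.keys() - old.keys():
        changes.append(("add", name, new[name]))
    for name in old.keys() & new.keys():
        if old[name] != new[name]:
            # "Damage paint": recurse into the changed subtree only.
            for op, child, *rest in diff_snapshots(old[name], new[name]):
                changes.append((op, f"{name}/{child}", *rest))
    return changes
```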
|
||||
|
||||
### HTTP API
|
||||
Relevant source files:
|
||||
|
||||
* [`server/src/web.rs`](https://github.com/LPGhatguy/rojo/blob/master/server/src/web.rs)
|
||||
|
||||
The Rojo live-sync server and Roblox Studio plugin communicate via HTTP.
|
||||
|
||||
Requests sent from the plugin to the server are regular HTTP requests.
|
||||
|
||||
Messages sent from the server to the plugin are delivered via HTTP long polling, an approach that uses long-lived HTTP requests that restart on timeout. Long polling has largely been replaced by WebSockets elsewhere, but Roblox doesn't support them.
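Stripped of HTTP details, long polling is just "request, and if the request times out, immediately request again". Here's an illustrative Python sketch with a pluggable `get` function; the plugin implements the same loop in Lua with promises, and `Timeout` and the names here are made up for the example.

```python
class Timeout(Exception):
    """Raised when a long-lived request expires without a message."""

def long_poll(get, handle_message):
    """Repeatedly issue a long-lived request. A timeout just means
    'no news yet', so restart the request instead of failing; a None
    response means the session has ended."""
    while True:
        try:
            message = get()
        except Timeout:
            continue  # the server held the request open too long; retry
        if message is None:
            return
        handle_message(message)
```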
|
||||
|
||||
## Roblox Studio Plugin
|
||||
TODO
|
||||
@@ -1,7 +1,9 @@
|
||||
Rojo underwent a large refactor during most of 2018 to enable a bunch of new features and lay groundwork for lots more in 2019. As such, Rojo **0.5.x** projects are not compatible with Rojo **0.4.x** projects.
|
||||
|
||||
[TOC]
|
||||
|
||||
## Supporting Both 0.4.x and 0.5.x
|
||||
Rojo 0.5.x uses a different name for its project format. While 0.4.x used `rojo.json`, 0.5.x uses `default.project.json`, which allows them to coexist.
|
||||
|
||||
If you aren't sure about upgrading or want to upgrade gradually, it's possible to keep both files in the same project without causing problems.
|
||||
|
||||
@@ -50,9 +52,6 @@ Metadata begins with a dollar sign (`$`), like `$className`. This is so that chi
|
||||
|
||||
All other values are considered children, where the key is the instance's name, and the value is an object, repeating the process.
|
||||
|
||||
## Migrating `.model.json` Files
|
||||
No upgrade path yet, stay tuned.
|
||||
|
||||
## Migrating Unknown Files
|
||||
If you used Rojo to sync in files as `StringValue` objects, you'll need to make sure those files end with the `.txt` extension to preserve this behavior in Rojo 0.5.x.
|
||||
|
||||
|
||||
100
docs/project-format.md
Normal file
@@ -0,0 +1,100 @@
|
||||
[TOC]
|
||||
|
||||
## Project File
|
||||
|
||||
Rojo projects are JSON files that have the `.project.json` extension. They have these fields:
|
||||
|
||||
* `name`: A string indicating the name of the project.
|
||||
* This is only used for diagnostics.
|
||||
* `tree`: An [Instance Description](#instance-description) describing the root instance of the project.
|
||||
|
||||
## Instance Description
|
||||
Instance Descriptions correspond one-to-one with the actual Roblox Instances in the project. They can be specified directly in the project file or be pulled from the filesystem.
|
||||
|
||||
* `$className`: The ClassName of the Instance being described.
|
||||
* Optional if `$path` is specified.
|
||||
* `$path`: The path on the filesystem to pull files from into the project.
|
||||
* Optional if `$className` is specified.
|
||||
* Paths are relative to the folder containing the project file.
|
||||
* `$properties`: Properties to apply to the instance. Values should be [Instance Property Values](#instance-property-value).
|
||||
* Optional
|
||||
* `$ignoreUnknownInstances`: Whether instances that Rojo doesn't know about should be left alone instead of deleted.
|
||||
* Optional
|
||||
* Default is `false` if `$path` is specified, otherwise `true`.
|
||||
|
||||
All other fields in an Instance Description are turned into instances whose name is the key. These values should also be Instance Descriptions!
|
||||
|
||||
Instance Descriptions are fairly verbose and strict. In the future, it'll be possible for Rojo to infer class names for known services like `Workspace`.
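Putting those fields together, a hand-written description that pulls one child from the filesystem and leaves manually-inserted instances alone might look like this fragment (the names here are made up for illustration):

```json
"ServerScriptService": {
    "$className": "ServerScriptService",
    "$ignoreUnknownInstances": true,

    "Server": {
        "$path": "src/server"
    }
}
```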
|
||||
|
||||
## Instance Property Value
|
||||
The shape of Instance Property Values is defined by the [rbx_tree](https://github.com/LPGhatguy/rbx-tree) library, so it uses slightly different conventions than the rest of Rojo.
|
||||
|
||||
Each value should be an object with the following required fields:
|
||||
|
||||
* `Type`: The type of property to represent.
|
||||
* [Supported types can be found here](https://github.com/LPGhatguy/rbx-tree#property-type-coverage).
|
||||
* `Value`: The value of the property.
|
||||
* The shape of this field depends on which property type is being used. `Vector3` and `Color3` values are both represented as a list of numbers, for example.
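For example, matching the shapes used in the sample projects later in these docs, a `Vector3` property is written with its type name and its components as a plain array:

```json
"Size": {
    "Type": "Vector3",
    "Value": [4, 4, 4]
}
```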
|
||||
|
||||
Instance Property Values are intentionally very strict. Rojo will eventually be able to infer types for you!
|
||||
|
||||
## Example Projects
|
||||
This project bundles up everything in the `src` directory. It'd be suitable for making a plugin or model:
|
||||
|
||||
```json
|
||||
{
|
||||
"name": "AwesomeLibrary",
|
||||
"tree": {
|
||||
"$path": "src"
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
This project describes the layout you might use if you were making the next hit simulator game, *Sisyphus Simulator*:
|
||||
|
||||
```json
|
||||
{
|
||||
"name": "Sisyphus Simulator",
|
||||
"tree": {
|
||||
"$className": "DataModel",
|
||||
|
||||
"HttpService": {
|
||||
"$className": "HttpService",
|
||||
"$properties": {
|
||||
"HttpEnabled": {
|
||||
"Type": "Bool",
|
||||
"Value": true
|
||||
}
|
||||
}
|
||||
},
|
||||
|
||||
"ReplicatedStorage": {
|
||||
"$className": "ReplicatedStorage",
|
||||
"$path": "src/ReplicatedStorage"
|
||||
},
|
||||
|
||||
"StarterPlayer": {
|
||||
"$className": "StarterPlayer",
|
||||
|
||||
"StarterPlayerScripts": {
|
||||
"$className": "StarterPlayerScripts",
|
||||
"$path": "src/StarterPlayerScripts"
|
||||
}
|
||||
},
|
||||
|
||||
"Workspace": {
|
||||
"$className": "Workspace",
|
||||
"$properties": {
|
||||
"Gravity": {
|
||||
"Type": "Float32",
|
||||
"Value": 67.3
|
||||
}
|
||||
},
|
||||
|
||||
"Terrain": {
|
||||
"$path": "Terrain.rbxm"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
@@ -1,5 +1,7 @@
|
||||
This page aims to describe how Rojo turns files on the filesystem into Roblox objects.
|
||||
|
||||
[TOC]
|
||||
|
||||
## Overview
|
||||
| File Name | Instance Type |
|
||||
| -------------- | ------------------- |
|
||||
@@ -9,6 +11,20 @@ This page aims to describe how Rojo turns files on the filesystem into Roblox ob
|
||||
| `*.lua` | `ModuleScript` |
|
||||
| `*.csv` | `LocalizationTable` |
|
||||
| `*.txt` | `StringValue` |
|
||||
| `*.model.json` | Any |
|
||||
| `*.rbxm` | Any |
|
||||
| `*.rbxmx` | Any |
|
||||
|
||||
## Limitations
|
||||
Not all property types can be synced by Rojo in real-time due to limitations of the Roblox Studio plugin API. In these cases, you can usually generate a place file and open it when you start working on a project.
|
||||
|
||||
Some common cases you might hit are:
|
||||
|
||||
* Binary data (Terrain, CSG, CollectionService tags)
|
||||
* `MeshPart.MeshId`
|
||||
* `HttpService.HttpEnabled`
|
||||
|
||||
For a list of all property types that Rojo can reason about, both when live-syncing and when building place files, look at [rbx_tree's type coverage chart](https://github.com/LPGhatguy/rbx-tree#property-type-coverage).
|
||||
|
||||
## Folders
|
||||
Any directory on the filesystem will turn into a `Folder` instance unless it contains an 'init' script, described below.
|
||||
@@ -20,16 +36,56 @@ If a directory contains a file named `init.server.lua`, `init.client.lua`, or `i
|
||||
|
||||
For example, these files:
|
||||
|
||||
* my-game
|
||||
* init.client.lua
|
||||
* foo.lua
|
||||

|
||||
{: align="center" }
|
||||
|
||||
Will turn into these instances in Roblox:
|
||||
|
||||

|
||||

|
||||
{: align="center" }
|
||||
|
||||
## Localization Tables
|
||||
Any CSV files are transformed into `LocalizationTable` instances. Rojo expects these files to follow the same format that Roblox does when importing and exporting localization information.
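As a rough illustration, an export-style localization CSV has a header row of fixed columns followed by one column per locale. The exact column set here is my reading of Roblox's format; check an actual export from Studio for the authoritative layout:

```csv
Key,Context,Example,Source,es
GREETING_KEY,,Shown when a player joins,Welcome!,¡Bienvenido!
```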
|
||||
|
||||
## Plain Text Files
|
||||
Plain text files (`.txt`) are transformed into `StringValue` instances. This is useful for bringing in text data that can be read by scripts at runtime.
|
||||
|
||||
## JSON Models
|
||||
Files ending in `.model.json` can be used to describe simple models. They're designed to be hand-written and are useful for instances like `RemoteEvent`.
|
||||
|
||||
A JSON model describing a folder containing a `Part` and a `RemoteEvent` could be described as:
|
||||
|
||||
```json
|
||||
{
|
||||
"Name": "My Cool Model",
|
||||
"ClassName": "Folder",
|
||||
"Children": [
|
||||
{
|
||||
"Name": "RootPart",
|
||||
"ClassName": "Part",
|
||||
"Properties": {
|
||||
"Size": {
|
||||
"Type": "Vector3",
|
||||
"Value": [4, 4, 4]
|
||||
}
|
||||
}
|
||||
},
|
||||
{
|
||||
"Name": "SendMoney",
|
||||
"ClassName": "RemoteEvent"
|
||||
}
|
||||
]
|
||||
}
|
||||
```
|
||||
|
||||
It would turn into instances in this shape:
|
||||
|
||||

|
||||
{: align="center" }
|
||||
|
||||
## Binary and XML Models
|
||||
Rojo supports both binary (`.rbxm`) and XML (`.rbxmx`) models generated by Roblox Studio or another tool.
|
||||
|
||||
Not all property types are supported for all formats!
|
||||
|
||||
For a rundown of supported types, check out [rbx_tree's type coverage chart](https://github.com/LPGhatguy/rbx-tree#property-type-coverage).
|
||||
@@ -1,16 +1,19 @@
|
||||
There are a number of existing plugins for Roblox that move code from the filesystem into Roblox.
|
||||
|
||||
Besides Rojo, you might consider:
|
||||
|
||||
* [rbxmk by Anaminus](https://github.com/anaminus/rbxmk)
|
||||
* [Rofresh by Osyris](https://github.com/osyrisrblx/rofresh)
|
||||
* [RbxRefresh by Osyris](https://github.com/osyrisrblx/RbxRefresh)
|
||||
* [Studio Bridge by Vocksel](https://github.com/vocksel/studio-bridge)
|
||||
* [Elixir by Vocksel](https://github.com/vocksel/elixir)
|
||||
* [RbxSync by evaera](https://github.com/evaera/RbxSync)
|
||||
* [CodeSync by MemoryPenguin](https://github.com/MemoryPenguin/CodeSync)
|
||||
* [rbx-exteditor by MemoryPenguin](https://github.com/MemoryPenguin/rbx-exteditor)
|
||||
|
||||
So why did I build Rojo?
|
||||
|
||||
Each of these tools solves what is essentially the same problem from a few different angles. The goal of Rojo is to take all of the lessons and ideas learned from these projects and build a tool that can solve this problem for good.
|
||||
|
||||
Additionally:
|
||||
|
||||
|
||||
@@ -3,23 +3,33 @@
|
||||
# Kludged documentation generator to support multiple versions.
|
||||
# To use, copy this file to `generate-docs.run` so that Git will leave it alone,
|
||||
# then run `generate-docs.run` in the root of the repository.
|
||||
|
||||
set -e
|
||||
|
||||
CURRENT_BRANCH=$(git rev-parse --abbrev-ref HEAD)
|
||||
REMOTE=$(git remote get-url origin)
|
||||
CHECKOUT="$(mktemp -d)"
|
||||
OUTPUT="$(pwd)/site"
|
||||
|
||||
if [ -d site ]
|
||||
then
|
||||
cd site
|
||||
git pull
|
||||
else
|
||||
git clone "$REMOTE" site
|
||||
cd site
|
||||
git checkout gh-pages
|
||||
fi
|
||||
|
||||
git clone "$REMOTE" "$CHECKOUT"
|
||||
cd "$CHECKOUT"
|
||||
|
||||
echo "Building master"
|
||||
git checkout master
|
||||
mkdocs build --site-dir "$OUTPUT"
|
||||
|
||||
echo "Building 0.5.x"
|
||||
mkdocs build --site-dir "$OUTPUT/0.5.x"
|
||||
|
||||
echo "Building 0.4.x"
|
||||
git checkout v0.4.x
|
||||
mkdocs build --site-dir "$OUTPUT/0.4.x"
|
||||
13
mkdocs.yml
@@ -11,11 +11,16 @@ theme:
|
||||
nav:
|
||||
- Home: index.md
|
||||
- Why Rojo?: why-rojo.md
|
||||
- Installation: installation.md
|
||||
- Creating a Place with Rojo: creating-a-place.md
|
||||
- Migrating from 0.4.x to 0.5.x: migrating-to-epiphany.md
|
||||
- Project Format: project-format.md
|
||||
- Sync Details: sync-details.md
|
||||
- Rojo Internals:
|
||||
- Internals Overview: internals/overview.md
|
||||
|
||||
extra_css:
|
||||
- extra.css
|
||||
|
||||
markdown_extensions:
|
||||
- attr_list
|
||||
|
||||
@@ -5,8 +5,10 @@
|
||||
|
||||
"ReplicatedStorage": {
|
||||
"$className": "ReplicatedStorage",
|
||||
|
||||
"Rojo": {
|
||||
"$className": "Folder",
|
||||
|
||||
"Plugin": {
|
||||
"$path": "src"
|
||||
},
|
||||
@@ -28,8 +30,19 @@
|
||||
}
|
||||
},
|
||||
|
||||
"HttpService": {
|
||||
"$className": "HttpService",
|
||||
"$properties": {
|
||||
"HttpEnabled": {
|
||||
"Type": "Bool",
|
||||
"Value": true
|
||||
}
|
||||
}
|
||||
},
|
||||
|
||||
"TestService": {
|
||||
"$className": "TestService",
|
||||
|
||||
"TestBootstrap": {
|
||||
"$path": "testBootstrap.server.lua"
|
||||
}
|
||||
@@ -11,6 +11,14 @@ ApiContext.__index = ApiContext
|
||||
-- TODO: Audit cases of errors and create enum values for each of them.
|
||||
ApiContext.Error = {
|
||||
ServerIdMismatch = "ServerIdMismatch",
|
||||
|
||||
-- The server gave an unexpected 400-category error, which may be the
|
||||
-- client's fault.
|
||||
ClientError = "ClientError",
|
||||
|
||||
-- The server gave an unexpected 500-category error, which may be the
|
||||
-- server's fault.
|
||||
ServerError = "ServerError",
|
||||
}
|
||||
|
||||
setmetatable(ApiContext.Error, {
|
||||
@@ -19,6 +27,18 @@ setmetatable(ApiContext.Error, {
|
||||
end
|
||||
})
|
||||
|
||||
local function rejectFailedRequests(response)
|
||||
if response.code >= 400 then
|
||||
if response.code < 500 then
|
||||
return Promise.reject(ApiContext.Error.ClientError)
|
||||
else
|
||||
return Promise.reject(ApiContext.Error.ServerError)
|
||||
end
|
||||
end
|
||||
|
||||
return response
|
||||
end
|
||||
|
||||
function ApiContext.new(baseUrl)
|
||||
assert(type(baseUrl) == "string")
|
||||
|
||||
@@ -43,6 +63,7 @@ function ApiContext:connect()
|
||||
local url = ("%s/api/rojo"):format(self.baseUrl)
|
||||
|
||||
return Http.get(url)
|
||||
:andThen(rejectFailedRequests)
|
||||
:andThen(function(response)
|
||||
local body = response:json()
|
||||
|
||||
@@ -102,9 +123,7 @@ function ApiContext:read(ids)
|
||||
local url = ("%s/api/read/%s"):format(self.baseUrl, table.concat(ids, ","))
|
||||
|
||||
return Http.get(url)
|
||||
:andThen(rejectFailedRequests)
|
||||
:andThen(function(response)
|
||||
local body = response:json()
|
||||
|
||||
@@ -121,14 +140,19 @@ end
|
||||
function ApiContext:retrieveMessages()
|
||||
local url = ("%s/api/subscribe/%s"):format(self.baseUrl, self.messageCursor)
|
||||
|
||||
local function sendRequest()
|
||||
return Http.get(url)
|
||||
:catch(function(err)
|
||||
if err.type == HttpError.Error.Timeout then
|
||||
return sendRequest()
|
||||
end
|
||||
|
||||
return Promise.reject(err)
|
||||
end)
|
||||
end
|
||||
|
||||
return sendRequest()
|
||||
:andThen(rejectFailedRequests)
|
||||
:andThen(function(response)
|
||||
local body = response:json()
|
||||
|
||||
|
||||
@@ -9,25 +9,16 @@ local Assets = {
|
||||
},
|
||||
},
|
||||
Slices = {
|
||||
RoundBox = {
|
||||
asset = "rbxassetid://2773204550",
|
||||
offset = Vector2.new(0, 0),
|
||||
size = Vector2.new(32, 32),
|
||||
center = Rect.new(4, 4, 4, 4),
|
||||
},
|
||||
},
|
||||
Images = {
|
||||
Logo = "rbxassetid://2773210620",
|
||||
},
|
||||
StartSession = "",
|
||||
SessionActive = "",
|
||||
Configure = "",
|
||||
|
||||
@@ -71,7 +71,6 @@ function App:init()
|
||||
})
|
||||
|
||||
self.connectButton = nil
|
||||
self.currentSession = nil
|
||||
|
||||
self.displayedVersion = DevSettings:isEnabled()
|
||||
@@ -84,7 +83,19 @@ function App:render()
|
||||
|
||||
if self.state.sessionStatus == SessionStatus.Connected then
|
||||
children = {
|
||||
ConnectionActivePanel = e(ConnectionActivePanel, {
|
||||
stopSession = function()
|
||||
Logging.trace("Disconnecting session")
|
||||
|
||||
self.currentSession:disconnect()
|
||||
self.currentSession = nil
|
||||
self:setState({
|
||||
sessionStatus = SessionStatus.Disconnected,
|
||||
})
|
||||
|
||||
Logging.trace("Session terminated by user")
|
||||
end,
|
||||
}),
|
||||
}
|
||||
elseif self.state.sessionStatus == SessionStatus.ConfiguringSession then
|
||||
children = {
|
||||
@@ -96,8 +107,7 @@ function App:render()
|
||||
address = address,
|
||||
port = port,
|
||||
onError = function(message)
|
||||
Logging.warn("Rojo session terminated because of an error:\n%s", tostring(message))
|
||||
self.currentSession = nil
|
||||
|
||||
self:setState({
|
||||
@@ -167,15 +177,6 @@ function App:didMount()
|
||||
})
|
||||
end
|
||||
end)
|
||||
|
||||
self.configButton = toolbar:CreateButton(
|
||||
"Configure",
|
||||
"Configure the Rojo plugin",
|
||||
Assets.Configure)
|
||||
self.configButton.ClickableWhenViewportHidden = false
|
||||
self.configButton.Click:Connect(function()
|
||||
self.configButton:SetActive(false)
|
||||
end)
|
||||
end
|
||||
|
||||
function App:didUpdate()
|
||||
|
||||
@@ -4,47 +4,45 @@ local Plugin = Rojo.Plugin
|
||||
local Roact = require(Rojo.Roact)
|
||||
|
||||
local Config = require(Plugin.Config)
|
||||
local Version = require(Plugin.Version)
|
||||
local Assets = require(Plugin.Assets)
|
||||
local Theme = require(Plugin.Theme)
|
||||
local joinBindings = require(Plugin.joinBindings)
|
||||
|
||||
local FitList = require(Plugin.Components.FitList)
|
||||
local FitText = require(Plugin.Components.FitText)
|
||||
local FormButton = require(Plugin.Components.FormButton)
|
||||
local FormTextInput = require(Plugin.Components.FormTextInput)
|
||||
|
||||
local WhiteCross = Assets.Sprites.WhiteCross
|
||||
local GrayBox = Assets.Slices.GrayBox
|
||||
local RoundBox = Assets.Slices.RoundBox
|
||||
|
||||
local e = Roact.createElement
|
||||
|
||||
local TEXT_COLOR = Color3.new(0.05, 0.05, 0.05)
|
||||
local FORM_TEXT_SIZE = 20
|
||||
|
||||
local ConnectPanel = Roact.Component:extend("ConnectPanel")
|
||||
|
||||
function ConnectPanel:init()
|
||||
self.labelSize, self.setLabelSize = Roact.createBinding(Vector2.new())
|
||||
self.footerSize, self.setFooterSize = Roact.createBinding(Vector2.new())
|
||||
self.footerVersionSize, self.setFooterVersionSize = Roact.createBinding(Vector2.new())
|
||||
|
||||
-- This is constructed in init because 'joinBindings' is a hack and we'd
|
||||
-- leak memory constructing it every render. When this kind of feature lands
|
||||
-- in Roact properly, we can do this inline in render without fear.
|
||||
self.footerRestSize = joinBindings(
|
||||
{
|
||||
self.footerSize,
|
||||
self.footerVersionSize,
|
||||
},
|
||||
function(container, other)
|
||||
return UDim2.new(0, container.X - other.X - 16, 0, 32)
|
||||
end
|
||||
)
|
||||
|
||||
self:setState({
|
||||
address = "",
|
||||
port = "",
|
||||
})
|
||||
end
|
||||
|
||||
|
||||
function ConnectPanel:render()
|
||||
local startSession = self.props.startSession
|
||||
local cancel = self.props.cancel
|
||||
@@ -52,11 +50,11 @@ function ConnectPanel:render()
|
||||
return e(FitList, {
|
||||
containerKind = "ImageLabel",
|
||||
containerProps = {
|
||||
Image = RoundBox.asset,
|
||||
ImageRectOffset = RoundBox.offset,
|
||||
ImageRectSize = RoundBox.size,
|
||||
SliceCenter = RoundBox.center,
|
||||
ScaleType = Enum.ScaleType.Slice,
|
||||
BackgroundTransparency = 1,
|
||||
Position = UDim2.new(0.5, 0, 0.5, 0),
|
||||
AnchorPoint = Vector2.new(0.5, 0.5),
|
||||
@@ -65,63 +63,20 @@ function ConnectPanel:render()
|
||||
HorizontalAlignment = Enum.HorizontalAlignment.Center,
|
||||
},
|
||||
}, {
|
||||
Inputs = e(FitList, {
|
||||
containerProps = {
|
||||
BackgroundTransparency = 1,
|
||||
LayoutOrder = 3,
|
||||
LayoutOrder = 1,
|
||||
},
|
||||
layoutProps = {
|
||||
FillDirection = Enum.FillDirection.Horizontal,
|
||||
Padding = UDim.new(0, 8),
|
||||
},
|
||||
paddingProps = {
|
||||
PaddingTop = UDim.new(0, 8),
|
||||
PaddingBottom = UDim.new(0, 8),
|
||||
PaddingLeft = UDim.new(0, 8),
|
||||
PaddingRight = UDim.new(0, 8),
|
||||
PaddingTop = UDim.new(0, 20),
|
||||
PaddingBottom = UDim.new(0, 10),
|
||||
PaddingLeft = UDim.new(0, 24),
|
||||
PaddingRight = UDim.new(0, 24),
|
||||
},
|
||||
}, {
|
||||
Address = e(FitList, {
|
||||
@@ -130,34 +85,25 @@ function ConnectPanel:render()
|
||||
BackgroundTransparency = 1,
|
||||
},
|
||||
layoutProps = {
|
||||
FillDirection = Enum.FillDirection.Horizontal,
|
||||
Padding = UDim.new(0, 8),
|
||||
Padding = UDim.new(0, 4),
|
||||
},
|
||||
}, {
|
||||
Label = e(FitText, {
|
||||
MinSize = Vector2.new(0, 24),
|
||||
Kind = "TextLabel",
|
||||
LayoutOrder = 1,
|
||||
BackgroundTransparency = 1,
|
||||
TextXAlignment = Enum.TextXAlignment.Left,
|
||||
Font = Theme.TitleFont,
|
||||
TextSize = 20,
|
||||
Text = "Address",
|
||||
TextColor3 = Theme.AccentColor,
|
||||
}),
|
||||
|
||||
Input = e(FormTextInput, {
|
||||
layoutOrder = 2,
|
||||
width = UDim.new(0, 220),
|
||||
value = self.state.address,
|
||||
placeholderValue = Config.defaultHost,
|
||||
onValueChange = function(newValue)
|
||||
self:setState({
|
||||
address = newValue,
|
||||
@@ -172,34 +118,25 @@ function ConnectPanel:render()
|
||||
BackgroundTransparency = 1,
|
||||
},
|
||||
layoutProps = {
|
||||
FillDirection = Enum.FillDirection.Horizontal,
|
||||
Padding = UDim.new(0, 8),
|
||||
Padding = UDim.new(0, 4),
|
||||
},
|
||||
}, {
|
||||
Label = e(FitText, {
|
||||
MinSize = Vector2.new(0, 24),
|
||||
Kind = "TextLabel",
|
||||
LayoutOrder = 1,
|
||||
BackgroundTransparency = 1,
|
||||
TextXAlignment = Enum.TextXAlignment.Left,
|
||||
Font = Theme.TitleFont,
|
||||
TextSize = 20,
|
||||
Text = "Port",
|
||||
TextColor3 = Theme.AccentColor,
|
||||
}),
|
||||
|
||||
Input = e(FormTextInput, {
|
||||
layoutOrder = 2,
|
||||
width = UDim.new(0, 80),
|
||||
value = self.state.port,
|
||||
placeholderValue = Config.defaultPort,
|
||||
onValueChange = function(newValue)
|
||||
self:setState({
|
||||
port = newValue,
|
||||
@@ -207,36 +144,117 @@ function ConnectPanel:render()
|
||||
end,
|
||||
}),
|
||||
}),
|
||||
}),
|
||||
|
||||
Buttons = e(FitList, {
|
||||
fitAxes = "Y",
|
||||
containerProps = {
|
||||
BackgroundTransparency = 1,
|
||||
LayoutOrder = 2,
|
||||
Size = UDim2.new(1, 0, 0, 0),
|
||||
},
|
||||
layoutProps = {
|
||||
FillDirection = Enum.FillDirection.Horizontal,
|
||||
HorizontalAlignment = Enum.HorizontalAlignment.Right,
|
||||
Padding = UDim.new(0, 8),
|
||||
},
|
||||
paddingProps = {
|
||||
PaddingTop = UDim.new(0, 0),
|
||||
PaddingBottom = UDim.new(0, 20),
|
||||
PaddingLeft = UDim.new(0, 24),
|
||||
PaddingRight = UDim.new(0, 24),
|
||||
},
|
||||
}, {
|
||||
e(FormButton, {
|
||||
layoutOrder = 1,
|
||||
text = "Cancel",
|
||||
onClick = function()
|
||||
if cancel ~= nil then
|
||||
cancel()
|
||||
end
|
||||
end,
|
||||
secondary = true,
|
||||
}),
|
||||
|
||||
e(FormButton, {
|
||||
layoutOrder = 2,
|
||||
text = "Connect",
|
||||
onClick = function()
|
||||
if startSession ~= nil then
|
||||
local address = self.state.address
|
||||
if address:len() == 0 then
|
||||
address = Config.defaultHost
|
||||
end
|
||||
|
||||
local port = self.state.port
|
||||
if port:len() == 0 then
|
||||
port = Config.defaultPort
|
||||
end
|
||||
|
||||
startSession(address, port)
|
||||
end
|
||||
end,
|
||||
}),
|
||||
}),
|
||||
|
||||
Footer = e(FitList, {
|
||||
fitAxes = "Y",
|
||||
containerKind = "ImageLabel",
|
||||
containerProps = {
|
||||
Image = RoundBox.asset,
|
||||
ImageRectOffset = RoundBox.offset + Vector2.new(0, RoundBox.size.Y / 2),
|
||||
ImageRectSize = RoundBox.size * Vector2.new(1, 0.5),
|
||||
SliceCenter = RoundBox.center,
|
||||
ScaleType = Enum.ScaleType.Slice,
|
||||
ImageColor3 = Theme.SecondaryColor,
|
||||
Size = UDim2.new(1, 0, 0, 0),
|
||||
LayoutOrder = 3,
|
||||
BackgroundTransparency = 1,
|
||||
|
||||
[Roact.Change.AbsoluteSize] = function(rbx)
|
||||
self.setFooterSize(rbx.AbsoluteSize)
|
||||
end,
|
||||
},
|
||||
layoutProps = {
|
||||
FillDirection = Enum.FillDirection.Horizontal,
|
||||
HorizontalAlignment = Enum.HorizontalAlignment.Center,
|
||||
VerticalAlignment = Enum.VerticalAlignment.Center,
|
||||
},
|
||||
paddingProps = {
|
||||
PaddingTop = UDim.new(0, 4),
|
||||
PaddingBottom = UDim.new(0, 4),
|
||||
PaddingLeft = UDim.new(0, 8),
|
||||
PaddingRight = UDim.new(0, 8),
|
||||
},
|
||||
}, {
|
||||
LogoContainer = e("Frame", {
|
||||
BackgroundTransparency = 1,
|
||||
|
||||
Size = self.footerRestSize,
|
||||
}, {
|
||||
Logo = e("ImageLabel", {
|
||||
Image = Assets.Images.Logo,
|
||||
Size = UDim2.new(0, 80, 0, 40),
|
||||
ScaleType = Enum.ScaleType.Fit,
|
||||
BackgroundTransparency = 1,
|
||||
Position = UDim2.new(0, 0, 1, -10),
|
||||
AnchorPoint = Vector2.new(0, 1),
|
||||
}),
|
||||
}),
|
||||
|
||||
Version = e(FitText, {
|
||||
Font = Theme.TitleFont,
|
||||
TextSize = 18,
|
||||
Text = Version.display(Config.version),
|
||||
TextXAlignment = Enum.TextXAlignment.Right,
|
||||
TextColor3 = Theme.LightTextColor,
|
||||
BackgroundTransparency = 1,
|
||||
|
||||
[Roact.Change.AbsoluteSize] = function(rbx)
|
||||
self.setFooterVersionSize(rbx.AbsoluteSize)
|
||||
end,
|
||||
}),
|
||||
}),
|
||||
})
|
||||
end
|
||||
|
||||
|
||||
@@ -1,38 +1,66 @@
|
||||
local Roact = require(script:FindFirstAncestor("Rojo").Roact)
|
||||
|
||||
local Plugin = script:FindFirstAncestor("Plugin")
|
||||
|
||||
local Theme = require(Plugin.Theme)
|
||||
local Assets = require(Plugin.Assets)
|
||||
|
||||
local FitList = require(Plugin.Components.FitList)
|
||||
local FitText = require(Plugin.Components.FitText)
|
||||
|
||||
local e = Roact.createElement
|
||||
|
||||
local RoundBox = Assets.Slices.RoundBox
|
||||
local WhiteCross = Assets.Sprites.WhiteCross
|
||||
|
||||
local function ConnectionActivePanel(props)
|
||||
local stopSession = props.stopSession
|
||||
|
||||
return e(FitList, {
|
||||
containerKind = "ImageButton",
|
||||
containerKind = "ImageLabel",
|
||||
containerProps = {
|
||||
Image = GrayBox.asset,
|
||||
ImageRectOffset = GrayBox.offset,
|
||||
ImageRectSize = GrayBox.size,
|
||||
SliceCenter = GrayBox.center,
|
||||
Image = RoundBox.asset,
|
||||
ImageRectOffset = RoundBox.offset + Vector2.new(0, RoundBox.size.Y / 2),
|
||||
ImageRectSize = RoundBox.size * Vector2.new(1, 0.5),
|
||||
SliceCenter = Rect.new(4, 4, 4, 4),
|
||||
ScaleType = Enum.ScaleType.Slice,
|
||||
BackgroundTransparency = 1,
|
||||
Position = UDim2.new(0.5, 0, 0, 0),
|
||||
AnchorPoint = Vector2.new(0.5, 0),
|
||||
},
|
||||
layoutProps = {
|
||||
FillDirection = Enum.FillDirection.Horizontal,
|
||||
VerticalAlignment = Enum.VerticalAlignment.Center,
|
||||
},
|
||||
}, {
|
||||
Text = e(FitText, {
|
||||
Padding = Vector2.new(12, 6),
|
||||
Font = Enum.Font.SourceSans,
|
||||
Font = Theme.ButtonFont,
|
||||
TextSize = 18,
|
||||
Text = "Rojo Connected",
|
||||
TextColor3 = Color3.new(0.05, 0.05, 0.05),
|
||||
TextColor3 = Theme.PrimaryColor,
|
||||
BackgroundTransparency = 1,
|
||||
}),
|
||||
|
||||
CloseContainer = e("ImageButton", {
|
||||
Size = UDim2.new(0, 30, 0, 30),
|
||||
BackgroundTransparency = 1,
|
||||
|
||||
[Roact.Event.Activated] = function()
|
||||
stopSession()
|
||||
end,
|
||||
}, {
|
||||
CloseImage = e("ImageLabel", {
|
||||
Size = UDim2.new(0, 16, 0, 16),
|
||||
Position = UDim2.new(0.5, 0, 0.5, 0),
|
||||
AnchorPoint = Vector2.new(0.5, 0.5),
|
||||
Image = WhiteCross.asset,
|
||||
ImageRectOffset = WhiteCross.offset,
|
||||
ImageRectSize = WhiteCross.size,
|
||||
ImageColor3 = Theme.PrimaryColor,
|
||||
BackgroundTransparency = 1,
|
||||
}),
|
||||
}),
|
||||
})
|
||||
end
|
||||
|
||||
|
||||
@@ -12,6 +12,7 @@ end

function FitList:render()
    local containerKind = self.props.containerKind or "Frame"
    local fitAxes = self.props.fitAxes or "XY"
    local containerProps = self.props.containerProps
    local layoutProps = self.props.layoutProps
    local paddingProps = self.props.paddingProps
@@ -25,15 +26,27 @@ function FitList:render()
        ["$Layout"] = e("UIListLayout", Dictionary.merge({
            SortOrder = Enum.SortOrder.LayoutOrder,
            [Roact.Change.AbsoluteContentSize] = function(instance)
                local size = instance.AbsoluteContentSize
                local contentSize = instance.AbsoluteContentSize

                if paddingProps ~= nil then
                    size = size + Vector2.new(
                    contentSize = contentSize + Vector2.new(
                        paddingProps.PaddingLeft.Offset + paddingProps.PaddingRight.Offset,
                        paddingProps.PaddingTop.Offset + paddingProps.PaddingBottom.Offset)
                end

                self.setSize(UDim2.new(0, size.X, 0, size.Y))
                local combinedSize

                if fitAxes == "X" then
                    combinedSize = UDim2.new(0, contentSize.X, containerProps.Size.Y.Scale, containerProps.Size.Y.Offset)
                elseif fitAxes == "Y" then
                    combinedSize = UDim2.new(containerProps.Size.X.Scale, containerProps.Size.X.Offset, 0, contentSize.Y)
                elseif fitAxes == "XY" then
                    combinedSize = UDim2.new(0, contentSize.X, 0, contentSize.Y)
                else
                    error("Invalid fitAxes value")
                end

                self.setSize(combinedSize)
            end,
        }, layoutProps)),
@@ -4,28 +4,41 @@ local Plugin = Rojo.Plugin
local Roact = require(Rojo.Roact)

local Assets = require(Plugin.Assets)
local Theme = require(Plugin.Theme)
local FitList = require(Plugin.Components.FitList)
local FitText = require(Plugin.Components.FitText)

local e = Roact.createElement

local GrayButton07 = Assets.Slices.GrayButton07
local RoundBox = Assets.Slices.RoundBox

local function FormButton(props)
    local text = props.text
    local layoutOrder = props.layoutOrder
    local onClick = props.onClick

    local textColor
    local backgroundColor

    if props.secondary then
        textColor = Theme.AccentColor
        backgroundColor = Theme.SecondaryColor
    else
        textColor = Theme.SecondaryColor
        backgroundColor = Theme.AccentColor
    end

    return e(FitList, {
        containerKind = "ImageButton",
        containerProps = {
            LayoutOrder = layoutOrder,
            BackgroundTransparency = 1,
            Image = GrayButton07.asset,
            ImageRectOffset = GrayButton07.offset,
            ImageRectSize = GrayButton07.size,
            Image = RoundBox.asset,
            ImageRectOffset = RoundBox.offset,
            ImageRectSize = RoundBox.size,
            SliceCenter = RoundBox.center,
            ScaleType = Enum.ScaleType.Slice,
            SliceCenter = GrayButton07.center,
            ImageColor3 = backgroundColor,

            [Roact.Event.Activated] = function()
                if onClick ~= nil then
@@ -37,10 +50,10 @@ local function FormButton(props)
        Text = e(FitText, {
            Kind = "TextLabel",
            Text = text,
            TextSize = 22,
            Font = Enum.Font.SourceSansBold,
            Padding = Vector2.new(14, 6),
            TextColor3 = Color3.new(0.05, 0.05, 0.05),
            TextSize = 18,
            TextColor3 = textColor,
            Font = Theme.ButtonFont,
            Padding = Vector2.new(16, 8),
            BackgroundTransparency = 1,
        }),
    })
@@ -4,42 +4,75 @@ local Plugin = Rojo.Plugin
local Roact = require(Rojo.Roact)

local Assets = require(Plugin.Assets)
local Theme = require(Plugin.Theme)

local e = Roact.createElement

local GrayBox = Assets.Slices.GrayBox
local RoundBox = Assets.Slices.RoundBox

local function FormTextInput(props)
    local value = props.value
    local onValueChange = props.onValueChange
    local layoutOrder = props.layoutOrder
    local size = props.size
local TEXT_SIZE = 22
local PADDING = 8

local FormTextInput = Roact.Component:extend("FormTextInput")

function FormTextInput:init()
    self:setState({
        focused = false,
    })
end

function FormTextInput:render()
    local value = self.props.value
    local placeholderValue = self.props.placeholderValue
    local onValueChange = self.props.onValueChange
    local layoutOrder = self.props.layoutOrder
    local width = self.props.width

    local shownPlaceholder
    if self.state.focused then
        shownPlaceholder = ""
    else
        shownPlaceholder = placeholderValue
    end

    return e("ImageLabel", {
        LayoutOrder = layoutOrder,
        Image = GrayBox.asset,
        ImageRectOffset = GrayBox.offset,
        ImageRectSize = GrayBox.size,
        Image = RoundBox.asset,
        ImageRectOffset = RoundBox.offset,
        ImageRectSize = RoundBox.size,
        ScaleType = Enum.ScaleType.Slice,
        SliceCenter = GrayBox.center,
        Size = size,
        SliceCenter = RoundBox.center,
        ImageColor3 = Theme.SecondaryColor,
        Size = UDim2.new(width.Scale, width.Offset, 0, TEXT_SIZE + PADDING * 2),
        BackgroundTransparency = 1,
    }, {
        InputInner = e("TextBox", {
            BackgroundTransparency = 1,
            Size = UDim2.new(1, -8, 1, -8),
            Size = UDim2.new(1, -PADDING * 2, 1, -PADDING * 2),
            Position = UDim2.new(0.5, 0, 0.5, 0),
            AnchorPoint = Vector2.new(0.5, 0.5),
            Font = Enum.Font.SourceSans,
            Font = Theme.InputFont,
            ClearTextOnFocus = false,
            TextXAlignment = Enum.TextXAlignment.Left,
            TextSize = 20,
            TextXAlignment = Enum.TextXAlignment.Center,
            TextSize = TEXT_SIZE,
            Text = value,
            TextColor3 = Color3.new(0.05, 0.05, 0.05),
            PlaceholderText = shownPlaceholder,
            PlaceholderColor3 = Theme.AccentLightColor,
            TextColor3 = Theme.AccentColor,

            [Roact.Change.Text] = function(rbx)
                onValueChange(rbx.Text)
            end,
            [Roact.Event.Focused] = function()
                self:setState({
                    focused = true,
                })
            end,
            [Roact.Event.FocusLost] = function()
                self:setState({
                    focused = false,
                })
            end,
        }),
    })
end
@@ -1,6 +1,6 @@
return {
    codename = "Epiphany",
    version = {0, 5, 0, "-alpha.0"},
    version = {0, 5, 0, "-alpha.4"},
    expectedServerVersionString = "0.5.0 or newer",
    protocolVersion = 2,
    defaultHost = "localhost",
@@ -1,12 +1,22 @@
local Config = require(script.Parent.Config)

local VALUES = {
    LogLevel = {
        type = "IntValue",
        defaultUserValue = 2,
        defaultDevValue = 3,
    },
}

local CONTAINER_NAME = "RojoDevSettings" .. Config.codename

local function getValueContainer()
    return game:FindFirstChild("RojoDev-" .. Config.codename)
    return game:FindFirstChild(CONTAINER_NAME)
end

local valueContainer = getValueContainer()

local function getValue(name)
local function getStoredValue(name)
    if valueContainer == nil then
        return nil
    end
@@ -20,7 +30,7 @@ local function getValue(name)
    return valueObject.Value
end

local function setValue(name, kind, value)
local function setStoredValue(name, kind, value)
    local object = valueContainer:FindFirstChild(name)

    if object == nil then
@@ -37,11 +47,13 @@ local function createAllValues()

    if valueContainer == nil then
        valueContainer = Instance.new("Folder")
        valueContainer.Name = "RojoDev-" .. Config.codename
        valueContainer.Name = CONTAINER_NAME
        valueContainer.Parent = game
    end

    setValue("LogLevel", "IntValue", getValue("LogLevel") or 2)
    for name, value in pairs(VALUES) do
        setStoredValue(name, value.type, value.defaultDevValue)
    end
end

_G[("ROJO_%s_DEV_CREATE"):format(Config.codename:upper())] = createAllValues
@@ -53,7 +65,7 @@ function DevSettings:isEnabled()
end

function DevSettings:getLogLevel()
    return getValue("LogLevel")
    return getStoredValue("LogLevel") or VALUES.LogLevel.defaultUserValue
end

return DevSettings
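The DevSettings change above moves each setting's defaults into a `VALUES` table and falls back to `defaultUserValue` whenever no stored override exists. A minimal Rust sketch of the same lookup-with-fallback pattern (the names here are illustrative, not part of Rojo):

```rust
use std::collections::HashMap;

// Hypothetical stand-in for the dev-settings container: a map from
// setting name to an overridden value, consulted before the default.
fn get_stored_value(store: &HashMap<&str, i32>, name: &str) -> Option<i32> {
    store.get(name).copied()
}

// Mirrors `getStoredValue("LogLevel") or VALUES.LogLevel.defaultUserValue`:
// use the stored override when present, the user-facing default otherwise.
fn get_log_level(store: &HashMap<&str, i32>, default_user_value: i32) -> i32 {
    get_stored_value(store, "LogLevel").unwrap_or(default_user_value)
}

fn main() {
    let mut store = HashMap::new();
    println!("{}", get_log_level(&store, 2)); // no override: falls back to 2
    store.insert("LogLevel", 3);
    println!("{}", get_log_level(&store, 2)); // override wins
}
```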
@@ -31,4 +31,4 @@ function HttpResponse:json()
    return HttpService:JSONDecode(self.body)
end

return HttpResponse
return HttpResponse
@@ -1,7 +1,5 @@
local DevSettings = require(script.Parent.DevSettings)

local testLogLevel = nil

local Level = {
    Error = 0,
    Warning = 1,
@@ -9,17 +7,14 @@ local Level = {
    Trace = 3,
}

local testLogLevel = nil

local function getLogLevel()
    if testLogLevel ~= nil then
        return testLogLevel
    end

    local devValue = DevSettings:getLogLevel()
    if devValue ~= nil then
        return devValue
    end

    return Level.Info
    return DevSettings:getLogLevel()
end

local function addTags(tag, message)
@@ -22,18 +22,18 @@ function Session.new(config)
    api:connect()
        :andThen(function()
            if self.disconnected then
                return Promise.resolve()
                return
            end

            return api:read({api.rootInstanceId})
        :andThen(function(response)
            if self.disconnected then
                return Promise.resolve()
            end
        end)
        :andThen(function(response)
            if self.disconnected then
                return
            end

            self.reconciler:reconcile(response.instances, api.rootInstanceId, game)
            return self:__processMessages()
        end)
            self.reconciler:reconcile(response.instances, api.rootInstanceId, game)
            return self:__processMessages()
        end)
        :catch(function(message)
            self.disconnected = true
plugin/src/Theme.lua (new file, +20)
@@ -0,0 +1,20 @@
local Theme = {
    ButtonFont = Enum.Font.GothamSemibold,
    InputFont = Enum.Font.Code,
    TitleFont = Enum.Font.GothamBold,
    MainFont = Enum.Font.Gotham,

    AccentColor = Color3.fromRGB(136, 0, 27),
    AccentLightColor = Color3.fromRGB(210, 145, 157),
    PrimaryColor = Color3.fromRGB(20, 20, 20),
    SecondaryColor = Color3.fromRGB(235, 235, 235),
    LightTextColor = Color3.fromRGB(140, 140, 140),
}

setmetatable(Theme, {
    __index = function(_, key)
        error(("%s is not a valid member of Theme"):format(key), 2)
    end
})

return Theme
plugin/src/joinBindings.lua (new file, +34)
@@ -0,0 +1,34 @@
--[[
    joinBindings is a crazy hack that allows combining multiple Roact bindings
    in the same spirit as `map`.

    It's implemented in terms of Roact internals that will probably break at
    some point; please don't do that or use this module in your own code!
]]

local Binding = require(script:FindFirstAncestor("Rojo").Roact.Binding)

local function evaluate(fun, bindings)
    local input = {}

    for index, binding in ipairs(bindings) do
        input[index] = binding:getValue()
    end

    return fun(unpack(input, 1, #bindings))
end

local function joinBindings(bindings, joinFunction)
    local initialValue = evaluate(joinFunction, bindings)
    local binding, setValue = Binding.create(initialValue)

    for _, binding in ipairs(bindings) do
        Binding.subscribe(binding, function()
            setValue(evaluate(joinFunction, bindings))
        end)
    end

    return binding
end

return joinBindings
@@ -1,6 +1,6 @@
[package]
name = "rojo"
version = "0.5.0-alpha.0"
version = "0.5.0-alpha.4"
authors = ["Lucien Greathouse <me@lpghatguy.com>"]
description = "A tool to create robust Roblox projects"
license = "MIT"
@@ -22,12 +22,15 @@ bundle-plugin = []
[dependencies]
clap = "2.27"
csv = "1.0"
env_logger = "0.5"
env_logger = "0.6"
failure = "0.1.3"
log = "0.4"
maplit = "1.0.1"
notify = "4.0"
rand = "0.4"
rbx_binary = "0.2.0"
rbx_tree = "0.2.0"
rbx_xml = "0.2.0"
regex = "1.0"
reqwest = "0.9.5"
rouille = "2.1"
@@ -35,11 +38,10 @@ serde = "1.0"
serde_derive = "1.0"
serde_json = "1.0"
uuid = { version = "0.7", features = ["v4", "serde"] }
rbx_tree = "0.1.0"
rbx_xml = "0.1.0"
rbx_binary = "0.1.0"

[dev-dependencies]
tempfile = "3.0"
walkdir = "2.1"
lazy_static = "1.2"
lazy_static = "1.2"
pretty_assertions = "0.5.1"
paste = "0.1"
server/assets/index.html (new file, +54)
@@ -0,0 +1,54 @@
<!DOCTYPE html>
<html>
<head>
    <title>Rojo</title>
    <style>
        * {
            margin: 0;
            padding: 0;
            font: inherit;
        }

        html {
            font-family: sans-serif;
            height: 100%;
        }

        body {
            height: 100%;
            display: flex;
            flex-direction: column;
            justify-content: center;
        }

        .main {
            padding: 1rem;
            text-align: center;
            margin: 0 auto;
            width: 100%;
            max-width: 60rem;
            background-color: #efefef;
            border: 1px solid #666;
            border-radius: 4px;
        }

        .title {
            font-size: 2rem;
            font-weight: bold;
        }

        .docs {
            font-size: 1.5rem;
            font-weight: bold;
        }
    </style>
</head>
<body>

    <div class="main">
        <h1 class="title">Rojo Live Sync is up and running!</h1>
        <a class="docs" href="https://lpghatguy.github.io/rojo">Rojo Documentation</a>
    </div>

</body>
</html>
@@ -1,12 +1,12 @@
#[macro_use] extern crate log;

use std::{
    path::{Path, PathBuf},
    env,
    panic,
    path::{Path, PathBuf},
    process,
};

use clap::clap_app;
use log::error;
use clap::{clap_app, ArgMatches};

use librojo::commands;

@@ -20,11 +20,16 @@ fn make_path_absolute(value: &Path) -> PathBuf {
}

fn main() {
    env_logger::Builder::from_default_env()
        .default_format_timestamp(false)
        .init();
    {
        let log_env = env_logger::Env::default()
            .default_filter_or("warn");

    let mut app = clap_app!(Rojo =>
        env_logger::Builder::from_env(log_env)
            .default_format_timestamp(false)
            .init();
    }

    let app = clap_app!(Rojo =>
        (version: env!("CARGO_PKG_VERSION"))
        (author: env!("CARGO_PKG_AUTHORS"))
        (about: env!("CARGO_PKG_DESCRIPTION"))
@@ -56,117 +61,144 @@ fn main() {
        )
    );

    // `get_matches` consumes self for some reason.
    let matches = app.clone().get_matches();
    let matches = app.get_matches();

    match matches.subcommand() {
        ("init", Some(sub_matches)) => {
            let fuzzy_project_path = make_path_absolute(Path::new(sub_matches.value_of("PATH").unwrap_or("")));
            let kind = sub_matches.value_of("kind");
    let result = panic::catch_unwind(|| match matches.subcommand() {
        ("init", Some(sub_matches)) => start_init(sub_matches),
        ("serve", Some(sub_matches)) => start_serve(sub_matches),
        ("build", Some(sub_matches)) => start_build(sub_matches),
        ("upload", Some(sub_matches)) => start_upload(sub_matches),
        _ => eprintln!("Usage: rojo <SUBCOMMAND>\nUse 'rojo help' for more help."),
    });

            let options = commands::InitOptions {
                fuzzy_project_path,
                kind,
            };
    if let Err(error) = result {
        let message = match error.downcast_ref::<&str>() {
            Some(message) => message.to_string(),
            None => match error.downcast_ref::<String>() {
                Some(message) => message.clone(),
                None => "<no message>".to_string(),
            },
        };

            match commands::init(&options) {
                Ok(_) => {},
                Err(e) => {
                    error!("{}", e);
                    process::exit(1);
                },
            }
        show_crash_message(&message);
        process::exit(1);
    }
}

fn show_crash_message(message: &str) {
    error!("Rojo crashed!");
    error!("This is a bug in Rojo.");
    error!("");
    error!("Please consider filing a bug: https://github.com/LPGhatguy/rojo/issues");
    error!("");
    error!("Details: {}", message);
}

fn start_init(sub_matches: &ArgMatches) {
    let fuzzy_project_path = make_path_absolute(Path::new(sub_matches.value_of("PATH").unwrap_or("")));
    let kind = sub_matches.value_of("kind");

    let options = commands::InitOptions {
        fuzzy_project_path,
        kind,
    };

    match commands::init(&options) {
        Ok(_) => {},
        Err(e) => {
            error!("{}", e);
            process::exit(1);
        },
        ("serve", Some(sub_matches)) => {
            let fuzzy_project_path = match sub_matches.value_of("PROJECT") {
                Some(v) => make_path_absolute(Path::new(v)),
                None => std::env::current_dir().unwrap(),
            };
    }
}

            let port = match sub_matches.value_of("port") {
                Some(v) => match v.parse::<u16>() {
                    Ok(port) => Some(port),
                    Err(_) => {
                        error!("Invalid port {}", v);
                        process::exit(1);
                    },
                },
                None => None,
            };
fn start_serve(sub_matches: &ArgMatches) {
    let fuzzy_project_path = match sub_matches.value_of("PROJECT") {
        Some(v) => make_path_absolute(Path::new(v)),
        None => std::env::current_dir().unwrap(),
    };

            let options = commands::ServeOptions {
                fuzzy_project_path,
                port,
            };

            match commands::serve(&options) {
                Ok(_) => {},
                Err(e) => {
                    error!("{}", e);
                    process::exit(1);
                },
            }
    let port = match sub_matches.value_of("port") {
        Some(v) => match v.parse::<u16>() {
            Ok(port) => Some(port),
            Err(_) => {
                error!("Invalid port {}", v);
                process::exit(1);
            },
        },
        ("build", Some(sub_matches)) => {
            let fuzzy_project_path = match sub_matches.value_of("PROJECT") {
                Some(v) => make_path_absolute(Path::new(v)),
                None => std::env::current_dir().unwrap(),
            };
        None => None,
    };

            let output_file = make_path_absolute(Path::new(sub_matches.value_of("output").unwrap()));
    let options = commands::ServeOptions {
        fuzzy_project_path,
        port,
    };

            let options = commands::BuildOptions {
                fuzzy_project_path,
                output_file,
                output_kind: None, // TODO: Accept from argument
            };

            match commands::build(&options) {
                Ok(_) => {},
                Err(e) => {
                    error!("{}", e);
                    process::exit(1);
                },
            }
    match commands::serve(&options) {
        Ok(_) => {},
        Err(e) => {
            error!("{}", e);
            process::exit(1);
        },
        ("upload", Some(sub_matches)) => {
            let fuzzy_project_path = match sub_matches.value_of("PROJECT") {
                Some(v) => make_path_absolute(Path::new(v)),
                None => std::env::current_dir().unwrap(),
            };
    }
}

            let kind = sub_matches.value_of("kind");
            let security_cookie = sub_matches.value_of("cookie").unwrap();
fn start_build(sub_matches: &ArgMatches) {
    let fuzzy_project_path = match sub_matches.value_of("PROJECT") {
        Some(v) => make_path_absolute(Path::new(v)),
        None => std::env::current_dir().unwrap(),
    };

            let asset_id: u64 = {
                let arg = sub_matches.value_of("asset_id").unwrap();
    let output_file = make_path_absolute(Path::new(sub_matches.value_of("output").unwrap()));

                match arg.parse() {
                    Ok(v) => v,
                    Err(_) => {
                        error!("Invalid place ID {}", arg);
                        process::exit(1);
                    },
                }
            };
    let options = commands::BuildOptions {
        fuzzy_project_path,
        output_file,
        output_kind: None, // TODO: Accept from argument
    };

            let options = commands::UploadOptions {
                fuzzy_project_path,
                security_cookie: security_cookie.to_string(),
                asset_id,
                kind,
            };

            match commands::upload(&options) {
                Ok(_) => {},
                Err(e) => {
                    error!("{}", e);
                    process::exit(1);
                },
            }
    match commands::build(&options) {
        Ok(_) => {},
        Err(e) => {
            error!("{}", e);
            process::exit(1);
        },
        _ => {
            app.print_help().expect("Could not print help text to stdout!");
        }
    }
}

fn start_upload(sub_matches: &ArgMatches) {
    let fuzzy_project_path = match sub_matches.value_of("PROJECT") {
        Some(v) => make_path_absolute(Path::new(v)),
        None => std::env::current_dir().unwrap(),
    };

    let kind = sub_matches.value_of("kind");
    let security_cookie = sub_matches.value_of("cookie").unwrap();

    let asset_id: u64 = {
        let arg = sub_matches.value_of("asset_id").unwrap();

        match arg.parse() {
            Ok(v) => v,
            Err(_) => {
                error!("Invalid place ID {}", arg);
                process::exit(1);
            },
        }
    };

    let options = commands::UploadOptions {
        fuzzy_project_path,
        security_cookie: security_cookie.to_string(),
        asset_id,
        kind,
    };

    match commands::upload(&options) {
        Ok(_) => {},
        Err(e) => {
            error!("{}", e);
            process::exit(1);
        },
    }
}
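The new `main` above wraps subcommand dispatch in `panic::catch_unwind` and digs the panic message out of the boxed payload, which in practice holds either a `&str` or a `String`. A self-contained sketch of that extraction, using only the standard library (the helper name is ours, not Rojo's):

```rust
use std::panic;

// Panic payloads are `Box<dyn Any + Send>`. `panic!("literal")` stores a
// `&'static str`, while `panic!("{}", value)` stores a `String`, so we try
// both downcasts, exactly like the diff does.
fn panic_message(error: Box<dyn std::any::Any + Send>) -> String {
    match error.downcast_ref::<&str>() {
        Some(message) => message.to_string(),
        None => match error.downcast_ref::<String>() {
            Some(message) => message.clone(),
            None => "<no message>".to_string(),
        },
    }
}

fn main() {
    // Silence the default panic hook so only our message is printed.
    panic::set_hook(Box::new(|_| {}));

    let result = panic::catch_unwind(|| panic!("something went wrong"));
    if let Err(error) = result {
        println!("Details: {}", panic_message(error));
    }
}
```

This is why the diff's `start_*` functions can call `process::exit(1)` freely: exiting never unwinds, so only genuine panics reach `show_crash_message`.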
@@ -4,12 +4,13 @@ use std::{
    io,
};

use log::info;
use failure::Fail;

use crate::{
    rbx_session::construct_oneoff_tree,
    project::{Project, ProjectLoadFuzzyError},
    imfs::Imfs,
    imfs::{Imfs, FsError},
};

#[derive(Debug, Clone, Copy, PartialEq, Eq)]
@@ -54,32 +55,19 @@ pub enum BuildError {
    XmlModelEncodeError(rbx_xml::EncodeError),

    #[fail(display = "Binary model file error")]
    BinaryModelEncodeError(rbx_binary::EncodeError)
    BinaryModelEncodeError(rbx_binary::EncodeError),

    #[fail(display = "{}", _0)]
    FsError(#[fail(cause)] FsError),
}

impl From<ProjectLoadFuzzyError> for BuildError {
    fn from(error: ProjectLoadFuzzyError) -> BuildError {
        BuildError::ProjectLoadError(error)
    }
}

impl From<io::Error> for BuildError {
    fn from(error: io::Error) -> BuildError {
        BuildError::IoError(error)
    }
}

impl From<rbx_xml::EncodeError> for BuildError {
    fn from(error: rbx_xml::EncodeError) -> BuildError {
        BuildError::XmlModelEncodeError(error)
    }
}

impl From<rbx_binary::EncodeError> for BuildError {
    fn from(error: rbx_binary::EncodeError) -> BuildError {
        BuildError::BinaryModelEncodeError(error)
    }
}
impl_from!(BuildError {
    ProjectLoadFuzzyError => ProjectLoadError,
    io::Error => IoError,
    rbx_xml::EncodeError => XmlModelEncodeError,
    rbx_binary::EncodeError => BinaryModelEncodeError,
    FsError => FsError,
});

pub fn build(options: &BuildOptions) -> Result<(), BuildError> {
    let output_kind = options.output_kind
@@ -91,6 +79,7 @@ pub fn build(options: &BuildOptions) -> Result<(), BuildError> {
    info!("Looking for project at {}", options.fuzzy_project_path.display());

    let project = Project::load_fuzzy(&options.fuzzy_project_path)?;
    project.check_compatibility();

    info!("Found project at {}", project.file_location.display());
    info!("Using project {:#?}", project);
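The repetitive `From` impls above collapse into an `impl_from!` macro call. The macro itself is defined elsewhere in the crate and is not shown in this diff; a minimal `macro_rules!` sketch with the same shape (our assumption of how it works, not Rojo's actual definition) looks like:

```rust
// Generates one `From<Source> for $error` impl per listed pair, wrapping
// the source error in the named enum variant. This replaces the
// hand-written impls that the diff deletes.
macro_rules! impl_from {
    ($error:ident { $($source:ty => $variant:ident,)* }) => {
        $(
            impl From<$source> for $error {
                fn from(error: $source) -> $error {
                    $error::$variant(error)
                }
            }
        )*
    };
}

#[derive(Debug)]
enum BuildError {
    IoError(std::io::Error),
    ParseError(std::num::ParseIntError),
}

impl_from!(BuildError {
    std::io::Error => IoError,
    std::num::ParseIntError => ParseError,
});

fn parse(input: &str) -> Result<i32, BuildError> {
    // `?` converts the error through the generated `From` impl.
    Ok(input.parse::<i32>()?)
}

fn main() {
    println!("{:?}", parse("17"));
    println!("{:?}", parse("nope"));
}
```

The payoff is that `?` keeps working across every wrapped error type while each new variant costs one line in the macro invocation instead of a five-line impl.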
@@ -15,11 +15,9 @@ pub enum InitError {
    ProjectInitError(#[fail(cause)] ProjectInitError)
}

impl From<ProjectInitError> for InitError {
    fn from(error: ProjectInitError) -> InitError {
        InitError::ProjectInitError(error)
    }
}
impl_from!(InitError {
    ProjectInitError => ProjectInitError,
});

#[derive(Debug)]
pub struct InitOptions<'a> {
@@ -3,12 +3,14 @@ use std::{
    sync::Arc,
};

use log::info;
use failure::Fail;

use crate::{
    project::{Project, ProjectLoadFuzzyError},
    web::Server,
    session::Session,
    imfs::FsError,
    live_session::LiveSession,
};

const DEFAULT_PORT: u16 = 34872;
@@ -23,24 +25,27 @@ pub struct ServeOptions {
pub enum ServeError {
    #[fail(display = "Project load error: {}", _0)]
    ProjectLoadError(#[fail(cause)] ProjectLoadFuzzyError),

    #[fail(display = "{}", _0)]
    FsError(#[fail(cause)] FsError),
}

impl From<ProjectLoadFuzzyError> for ServeError {
    fn from(error: ProjectLoadFuzzyError) -> ServeError {
        ServeError::ProjectLoadError(error)
    }
}
impl_from!(ServeError {
    ProjectLoadFuzzyError => ProjectLoadError,
    FsError => FsError,
});

pub fn serve(options: &ServeOptions) -> Result<(), ServeError> {
    info!("Looking for project at {}", options.fuzzy_project_path.display());

    let project = Arc::new(Project::load_fuzzy(&options.fuzzy_project_path)?);
    project.check_compatibility();

    info!("Found project at {}", project.file_location.display());
    info!("Using project {:#?}", project);

    let session = Arc::new(Session::new(Arc::clone(&project)).unwrap());
    let server = Server::new(Arc::clone(&session));
    let live_session = Arc::new(LiveSession::new(Arc::clone(&project))?);
    let server = Server::new(Arc::clone(&live_session));

    let port = options.port
        .or(project.serve_port)
@@ -3,6 +3,7 @@ use std::{
    io,
};

use log::info;
use failure::Fail;

use reqwest::header::{ACCEPT, USER_AGENT, CONTENT_TYPE, COOKIE};
@@ -10,7 +11,7 @@ use reqwest::header::{ACCEPT, USER_AGENT, CONTENT_TYPE, COOKIE};
use crate::{
    rbx_session::construct_oneoff_tree,
    project::{Project, ProjectLoadFuzzyError},
    imfs::Imfs,
    imfs::{Imfs, FsError},
};

#[derive(Debug, Fail)]
@@ -32,31 +33,18 @@ pub enum UploadError {

    #[fail(display = "XML model file error")]
    XmlModelEncodeError(rbx_xml::EncodeError),

    #[fail(display = "{}", _0)]
    FsError(#[fail(cause)] FsError),
}

impl From<ProjectLoadFuzzyError> for UploadError {
    fn from(error: ProjectLoadFuzzyError) -> UploadError {
        UploadError::ProjectLoadError(error)
    }
}

impl From<io::Error> for UploadError {
    fn from(error: io::Error) -> UploadError {
        UploadError::IoError(error)
    }
}

impl From<reqwest::Error> for UploadError {
    fn from(error: reqwest::Error) -> UploadError {
        UploadError::HttpError(error)
    }
}

impl From<rbx_xml::EncodeError> for UploadError {
    fn from(error: rbx_xml::EncodeError) -> UploadError {
        UploadError::XmlModelEncodeError(error)
    }
}
impl_from!(UploadError {
    ProjectLoadFuzzyError => ProjectLoadError,
    io::Error => IoError,
    reqwest::Error => HttpError,
    rbx_xml::EncodeError => XmlModelEncodeError,
    FsError => FsError,
});

#[derive(Debug)]
pub struct UploadOptions<'a> {
@@ -72,6 +60,7 @@ pub fn upload(options: &UploadOptions) -> Result<(), UploadError> {
    info!("Looking for project at {}", options.fuzzy_project_path.display());

    let project = Project::load_fuzzy(&options.fuzzy_project_path)?;
    project.check_compatibility();

    info!("Found project at {}", project.file_location.display());
    info!("Using project {:#?}", project);
@@ -1,9 +1,12 @@
use std::{
    sync::{mpsc, Arc, Mutex},
    time::Duration,
    path::Path,
    ops::Deref,
    thread,
};

use log::{warn, trace};
use notify::{
    self,
    DebouncedEvent,
@@ -19,97 +22,122 @@ use crate::{

const WATCH_TIMEOUT: Duration = Duration::from_millis(100);

fn handle_event(imfs: &Mutex<Imfs>, rbx_session: &Mutex<RbxSession>, event: DebouncedEvent) {
/// Watches for changes on the filesystem and links together the in-memory
/// filesystem and in-memory Roblox tree.
pub struct FsWatcher {
    watcher: RecommendedWatcher,
}

impl FsWatcher {
    /// Start a new FS watcher, watching all of the roots currently attached to
    /// the given Imfs.
    ///
    /// `rbx_session` is optional to make testing easier. If it isn't `None`,
    /// events will be passed to it after they're given to the Imfs.
    pub fn start(imfs: Arc<Mutex<Imfs>>, rbx_session: Option<Arc<Mutex<RbxSession>>>) -> FsWatcher {
        let (watch_tx, watch_rx) = mpsc::channel();

        let mut watcher = notify::watcher(watch_tx, WATCH_TIMEOUT)
            .expect("Could not create filesystem watcher");

        {
            let imfs = imfs.lock().unwrap();

            for root_path in imfs.get_roots() {
                trace!("Watching path {}", root_path.display());
                watcher.watch(root_path, RecursiveMode::Recursive)
                    .expect("Could not watch directory");
            }
        }

        {
            let imfs = Arc::clone(&imfs);
            let rbx_session = rbx_session.as_ref().map(Arc::clone);

            thread::spawn(move || {
                trace!("Watcher thread started");
                while let Ok(event) = watch_rx.recv() {
                    // handle_fs_event expects an Option<&Mutex<T>>, but we have
                    // an Option<Arc<Mutex<T>>>, so we coerce with Deref.
                    let session_ref = rbx_session.as_ref().map(Deref::deref);

                    handle_fs_event(&imfs, session_ref, event);
                }
                trace!("Watcher thread stopped");
            });
        }

        FsWatcher {
            watcher,
        }
    }

    pub fn stop_watching_path(&mut self, path: &Path) {
        match self.watcher.unwatch(path) {
            Ok(_) => {},
            Err(e) => {
                warn!("Could not unwatch path {}: {}", path.display(), e);
            },
        }
    }
}

fn handle_fs_event(imfs: &Mutex<Imfs>, rbx_session: Option<&Mutex<RbxSession>>, event: DebouncedEvent) {
    match event {
        DebouncedEvent::Create(path) => {
            trace!("Path created: {}", path.display());

            {
                let mut imfs = imfs.lock().unwrap();
                imfs.path_created(&path).unwrap();
            }

            {
            if let Some(rbx_session) = rbx_session {
                let mut rbx_session = rbx_session.lock().unwrap();
                rbx_session.path_created(&path);
            }
        },
        DebouncedEvent::Write(path) => {
            trace!("Path updated: {}", path.display());

            {
                let mut imfs = imfs.lock().unwrap();
                imfs.path_updated(&path).unwrap();
            }

            {
            if let Some(rbx_session) = rbx_session {
                let mut rbx_session = rbx_session.lock().unwrap();
                rbx_session.path_updated(&path);
            }
        },
        DebouncedEvent::Remove(path) => {
            trace!("Path removed: {}", path.display());

            {
                let mut imfs = imfs.lock().unwrap();
                imfs.path_removed(&path).unwrap();
            }

            {
            if let Some(rbx_session) = rbx_session {
                let mut rbx_session = rbx_session.lock().unwrap();
                rbx_session.path_removed(&path);
            }
        },
        DebouncedEvent::Rename(from_path, to_path) => {
            trace!("Path renamed: {} to {}", from_path.display(), to_path.display());

            {
                let mut imfs = imfs.lock().unwrap();
                imfs.path_moved(&from_path, &to_path).unwrap();
            }

            {
            if let Some(rbx_session) = rbx_session {
                let mut rbx_session = rbx_session.lock().unwrap();
                rbx_session.path_renamed(&from_path, &to_path);
            }
        },
        _ => {},
    }
}

/// Watches for changes on the filesystem and links together the in-memory
/// filesystem and in-memory Roblox tree.
pub struct FsWatcher {
    #[allow(unused)]
    watchers: Vec<RecommendedWatcher>,
}

impl FsWatcher {
    pub fn start(imfs: Arc<Mutex<Imfs>>, rbx_session: Arc<Mutex<RbxSession>>) -> FsWatcher {
        let mut watchers = Vec::new();

        {
            let imfs_temp = imfs.lock().unwrap();

            for root_path in imfs_temp.get_roots() {
                let (watch_tx, watch_rx) = mpsc::channel();

                let mut watcher = notify::watcher(watch_tx, WATCH_TIMEOUT)
                    .expect("Could not create `notify` watcher");

                watcher.watch(root_path, RecursiveMode::Recursive)
                    .expect("Could not watch directory");

                watchers.push(watcher);

                let imfs = Arc::clone(&imfs);
                let rbx_session = Arc::clone(&rbx_session);
                let root_path = root_path.clone();

                thread::spawn(move || {
                    info!("Watcher thread ({}) started", root_path.display());
                    while let Ok(event) = watch_rx.recv() {
                        handle_event(&imfs, &rbx_session, event);
                    }
                    info!("Watcher thread ({}) stopped", root_path.display());
                });
            }
        }

        FsWatcher {
            watchers,
        }
        other => {
            trace!("Unhandled FS event: {:?}", other);
        },
    }
}
@@ -1,30 +1,58 @@
use std::{
    collections::{HashMap, HashSet},
    path::{Path, PathBuf},
    path::{self, Path, PathBuf},
    fmt,
    fs,
    io,
};

use failure::Fail;
use serde_derive::{Serialize, Deserialize};

use crate::project::{Project, ProjectNode};

fn add_sync_points(imfs: &mut Imfs, project_node: &ProjectNode) -> io::Result<()> {
    match project_node {
        ProjectNode::Instance(node) => {
            for child in node.children.values() {
                add_sync_points(imfs, child)?;
            }
        },
        ProjectNode::SyncPoint(node) => {
            imfs.add_root(&node.path)?;
        },
/// A wrapper around io::Error that also attaches the path associated with the
/// error.
#[derive(Debug, Fail)]
pub struct FsError {
    #[fail(cause)]
    inner: io::Error,
    path: PathBuf,
}

impl FsError {
    fn new<P: Into<PathBuf>>(inner: io::Error, path: P) -> FsError {
        FsError {
            inner,
            path: path.into(),
        }
    }
}

impl fmt::Display for FsError {
    fn fmt(&self, output: &mut fmt::Formatter) -> fmt::Result {
        write!(output, "{}: {}", self.path.display(), self.inner)
    }
}

fn add_sync_points(imfs: &mut Imfs, node: &ProjectNode) -> Result<(), FsError> {
    if let Some(path) = &node.path {
        imfs.add_root(path)?;
    }

    for child in node.children.values() {
        add_sync_points(imfs, child)?;
    }

    Ok(())
}

/// The in-memory filesystem keeps a mirror of all files being watcher by Rojo
/// The in-memory filesystem keeps a mirror of all files being watched by Rojo
/// in order to deduplicate file changes in the case of bidirectional syncing
/// from Roblox Studio.
///
/// It also enables Rojo to quickly generate React-like snapshots to make
/// reasoning about instances and how they relate to files easier.
#[derive(Debug, Clone)]
pub struct Imfs {
    items: HashMap<PathBuf, ImfsItem>,
@@ -39,7 +67,7 @@ impl Imfs {
        }
    }

    pub fn add_roots_from_project(&mut self, project: &Project) -> io::Result<()> {
    pub fn add_roots_from_project(&mut self, project: &Project) -> Result<(), FsError> {
        add_sync_points(self, &project.tree)
    }

@@ -58,30 +86,42 @@ impl Imfs {
        self.items.get(path)
    }

    pub fn add_root(&mut self, path: &Path) -> io::Result<()> {
    pub fn add_root(&mut self, path: &Path) -> Result<(), FsError> {
        debug_assert!(path.is_absolute());
        debug_assert!(!self.is_within_roots(path));

        self.roots.insert(path.to_path_buf());

        self.read_from_disk(path)
        self.descend_and_read_from_disk(path)
    }

    pub fn path_created(&mut self, path: &Path) -> io::Result<()> {
    pub fn remove_root(&mut self, path: &Path) {
        debug_assert!(path.is_absolute());

        if self.roots.get(path).is_some() {
            self.remove_item(path);

            if let Some(parent_path) = path.parent() {
                self.unlink_child(parent_path, path);
            }
        }
    }

    pub fn path_created(&mut self, path: &Path) -> Result<(), FsError> {
        debug_assert!(path.is_absolute());
        debug_assert!(self.is_within_roots(path));

        self.read_from_disk(path)
        self.descend_and_read_from_disk(path)
    }

    pub fn path_updated(&mut self, path: &Path) -> io::Result<()> {
    pub fn path_updated(&mut self, path: &Path) -> Result<(), FsError> {
        debug_assert!(path.is_absolute());
        debug_assert!(self.is_within_roots(path));

        self.read_from_disk(path)
        self.descend_and_read_from_disk(path)
    }

    pub fn path_removed(&mut self, path: &Path) -> io::Result<()> {
    pub fn path_removed(&mut self, path: &Path) -> Result<(), FsError> {
        debug_assert!(path.is_absolute());
        debug_assert!(self.is_within_roots(path));

@@ -94,12 +134,7 @@ impl Imfs {
        Ok(())
    }

    pub fn path_moved(&mut self, from_path: &Path, to_path: &Path) -> io::Result<()> {
        debug_assert!(from_path.is_absolute());
        debug_assert!(self.is_within_roots(from_path));
        debug_assert!(to_path.is_absolute());
        debug_assert!(self.is_within_roots(to_path));

    pub fn path_moved(&mut self, from_path: &Path, to_path: &Path) -> Result<(), FsError> {
        self.path_removed(from_path)?;
        self.path_created(to_path)?;
        Ok(())
@@ -130,9 +165,7 @@ impl Imfs {
            Some(ImfsItem::Directory(directory)) => {
                directory.children.remove(child);
            },
            _ => {
                panic!("Tried to unlink child of path that wasn't a directory!");
            },
            _ => {},
        }
    }

@@ -151,11 +184,44 @@ impl Imfs {
        }
    }

    fn read_from_disk(&mut self, path: &Path) -> io::Result<()> {
        let metadata = fs::metadata(path)?;
    fn descend_and_read_from_disk(&mut self, path: &Path) -> Result<(), FsError> {
        let root_path = self.get_root_path(path)
            .expect("Tried to descend and read for path that wasn't within roots!");

        // If this path is a root, we should read the entire thing.
        if root_path == path {
            self.read_from_disk(path)?;
            return Ok(());
        }

        let relative_path = path.strip_prefix(root_path).unwrap();
        let mut current_path = root_path.to_path_buf();

        for component in relative_path.components() {
            match component {
                path::Component::Normal(name) => {
                    let next_path = current_path.join(name);

                    if self.items.contains_key(&next_path) {
                        current_path = next_path;
                    } else {
                        break;
                    }
                },
                _ => unreachable!(),
            }
        }

        self.read_from_disk(&current_path)
    }

    fn read_from_disk(&mut self, path: &Path) -> Result<(), FsError> {
        let metadata = fs::metadata(path)
            .map_err(|e| FsError::new(e, path))?;

        if metadata.is_file() {
            let contents = fs::read(path)?;
            let contents = fs::read(path)
                .map_err(|e| FsError::new(e, path))?;
            let item = ImfsItem::File(ImfsFile {
                path: path.to_path_buf(),
                contents,
@@ -176,8 +242,13 @@ impl Imfs {

            self.items.insert(path.to_path_buf(), item);

            for entry in fs::read_dir(path)? {
                let entry = entry?;
            let dir_children = fs::read_dir(path)
                .map_err(|e| FsError::new(e, path))?;

            for entry in dir_children {
                let entry = entry
                    .map_err(|e| FsError::new(e, path))?;

                let child_path = entry.path();

                self.read_from_disk(&child_path)?;
@@ -193,6 +264,16 @@ impl Imfs {
        }
    }

    fn get_root_path<'a>(&'a self, path: &Path) -> Option<&'a Path> {
        for root_path in &self.roots {
            if path.starts_with(root_path) {
                return Some(root_path)
            }
        }

        None
    }

    fn is_within_roots(&self, path: &Path) -> bool {
        for root_path in &self.roots {
            if path.starts_with(root_path) {

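The component walk in `descend_and_read_from_disk` above can be isolated: starting from a root, descend toward the target one component at a time and stop at the deepest path the in-memory tree already knows about; that's where re-reading from disk should begin. A std-only sketch, with a `HashSet` standing in for `Imfs::items`:

```rust
use std::collections::HashSet;
use std::path::{Component, Path, PathBuf};

// Walks from `root` toward `target`, returning the deepest path already
// present in `known` — mirroring how descend_and_read_from_disk picks the
// point to start re-reading from disk.
fn deepest_known(known: &HashSet<PathBuf>, root: &Path, target: &Path) -> PathBuf {
    let relative_path = target.strip_prefix(root).expect("target not under root");
    let mut current_path = root.to_path_buf();

    for component in relative_path.components() {
        match component {
            Component::Normal(name) => {
                let next_path = current_path.join(name);

                if known.contains(&next_path) {
                    current_path = next_path;
                } else {
                    break;
                }
            },
            _ => unreachable!(),
        }
    }

    current_path
}

fn main() {
    let root = PathBuf::from("/project");
    let mut known = HashSet::new();
    known.insert(root.join("src"));

    // "src" is known, but "src/new_dir" is not, so the walk stops at
    // /project/src: reading that directory will pick up the new children.
    let target = root.join("src").join("new_dir").join("a.lua");
    let start = deepest_known(&known, &root, &target);
    assert_eq!(start, root.join("src"));
}
```

Reading the deepest *known* ancestor rather than the new path directly means a single event for a freshly created nested directory still pulls in the whole subtree.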
server/src/impl_from.rs (new file, 18 lines)
@@ -0,0 +1,18 @@
/// Implements 'From' for a list of variants, intended for use with error enums
/// that are wrapping a number of errors from other methods.
#[macro_export]
macro_rules! impl_from {
    (
        $enum_name: ident {
            $($error_type: ty => $variant_name: ident),* $(,)*
        }
    ) => {
        $(
            impl From<$error_type> for $enum_name {
                fn from(error: $error_type) -> $enum_name {
                    $enum_name::$variant_name(error)
                }
            }
        )*
    }
}
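The macro above can be exercised on its own with a small error enum (`ExampleError` and `parse` here are illustrative, not part of Rojo): each listed `Type => Variant` pair expands to a `From` impl, which is what lets `?` convert the underlying errors automatically.

```rust
// Copy of the impl_from! macro defined in impl_from.rs above.
macro_rules! impl_from {
    (
        $enum_name: ident {
            $($error_type: ty => $variant_name: ident),* $(,)*
        }
    ) => {
        $(
            impl From<$error_type> for $enum_name {
                fn from(error: $error_type) -> $enum_name {
                    $enum_name::$variant_name(error)
                }
            }
        )*
    }
}

#[derive(Debug)]
enum ExampleError {
    Io(std::io::Error),
    Parse(std::num::ParseIntError),
}

impl_from!(ExampleError {
    std::io::Error => Io,
    std::num::ParseIntError => Parse,
});

fn parse(input: &str) -> Result<i32, ExampleError> {
    // `?` works because impl_from! generated From<ParseIntError>.
    Ok(input.trim().parse::<i32>()?)
}

fn main() {
    assert!(matches!(parse("42"), Ok(42)));
    assert!(matches!(parse("nope"), Err(ExampleError::Parse(_))));
}
```

This is exactly how `UploadError` uses it earlier in this diff: one invocation replaces a stack of hand-written `From` impls.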
@@ -1,23 +1,20 @@
// Macros
#[macro_use]
extern crate log;
pub mod impl_from;

#[macro_use]
extern crate serde_derive;

#[cfg(test)]
extern crate tempfile;

// pub mod roblox_studio;
// Other modules
pub mod commands;
pub mod fs_watcher;
pub mod imfs;
pub mod live_session;
pub mod message_queue;
pub mod path_map;
pub mod path_serializer;
pub mod project;
pub mod rbx_session;
pub mod rbx_snapshot;
pub mod session;
pub mod session_id;
pub mod snapshot_reconciler;
pub mod visualize;
pub mod web;
pub mod web_util;
@@ -1,19 +1,19 @@
use std::{
    sync::{Arc, Mutex},
    io,
};

use crate::{
    fs_watcher::FsWatcher,
    imfs::{Imfs, FsError},
    message_queue::MessageQueue,
    project::Project,
    imfs::Imfs,
    session_id::SessionId,
    rbx_session::RbxSession,
    rbx_snapshot::InstanceChanges,
    fs_watcher::FsWatcher,
    session_id::SessionId,
    snapshot_reconciler::InstanceChanges,
};

pub struct Session {
/// Contains all of the state for a Rojo live-sync session.
pub struct LiveSession {
    pub project: Arc<Project>,
    pub session_id: SessionId,
    pub message_queue: Arc<MessageQueue<InstanceChanges>>,
@@ -22,8 +22,8 @@ pub struct Session {
    _fs_watcher: FsWatcher,
}

impl Session {
    pub fn new(project: Arc<Project>) -> io::Result<Session> {
impl LiveSession {
    pub fn new(project: Arc<Project>) -> Result<LiveSession, FsError> {
        let imfs = {
            let mut imfs = Imfs::new();
            imfs.add_roots_from_project(&project)?;
@@ -40,12 +40,12 @@ impl Session {

        let fs_watcher = FsWatcher::start(
            Arc::clone(&imfs),
            Arc::clone(&rbx_session),
            Some(Arc::clone(&rbx_session)),
        );

        let session_id = SessionId::new();

        Ok(Session {
        Ok(LiveSession {
            project,
            session_id,
            message_queue,
@@ -19,6 +19,10 @@ pub fn get_listener_id() -> ListenerId {
    ListenerId(LAST_ID.fetch_add(1, Ordering::SeqCst))
}

/// A message queue with persistent history that can be subscribed to.
///
/// Definitely non-optimal, but a simple design that works well for the
/// synchronous web server Rojo uses, Rouille.
#[derive(Default)]
pub struct MessageQueue<T> {
    messages: RwLock<Vec<T>>,

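The hunk above only shows the struct, but the doc comment describes the design: keep the full history in a `RwLock<Vec<T>>` and let subscribers poll from a cursor. A minimal sketch of that idea — method names here (`push_messages`, `get_messages_since`) are illustrative, not necessarily Rojo's exact API:

```rust
use std::sync::RwLock;

// Sketch of a history-keeping message queue, assuming the shape the doc
// comment above describes.
#[derive(Default)]
struct MessageQueue<T> {
    messages: RwLock<Vec<T>>,
}

impl<T: Clone> MessageQueue<T> {
    fn push_messages(&self, new_messages: &[T]) {
        self.messages.write().unwrap().extend_from_slice(new_messages);
    }

    // Returns every message at index `cursor` or later, plus the new cursor,
    // so a synchronous web handler can poll for changes it hasn't seen yet.
    fn get_messages_since(&self, cursor: usize) -> (Vec<T>, usize) {
        let messages = self.messages.read().unwrap();
        (messages[cursor.min(messages.len())..].to_vec(), messages.len())
    }
}

fn main() {
    let queue = MessageQueue::default();
    queue.push_messages(&[1, 2, 3]);

    let (batch, cursor) = queue.get_messages_since(0);
    assert_eq!(batch, vec![1, 2, 3]);

    queue.push_messages(&[4]);
    let (batch, _) = queue.get_messages_since(cursor);
    assert_eq!(batch, vec![4]);
}
```

Keeping the whole history is what makes this "definitely non-optimal", but it means a late subscriber can always catch up from cursor zero.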
@@ -3,14 +3,18 @@ use std::{
    collections::{HashMap, HashSet},
};

use serde_derive::Serialize;
use log::warn;

#[derive(Debug, Serialize)]
struct PathMapNode<T> {
    value: T,
    children: HashSet<PathBuf>,
}

/// A map from paths to instance IDs, with a bit of additional data that enables
/// removing a path and all of its child paths from the tree more quickly.
/// A map from paths to another type, like instance IDs, with a bit of
/// additional data that enables removing a path and all of its child paths from
/// the tree more quickly.
#[derive(Debug, Serialize)]
pub struct PathMap<T> {
    nodes: HashMap<PathBuf, PathMapNode<T>>,
@@ -27,6 +31,10 @@ impl<T> PathMap<T> {
        self.nodes.get(path).map(|v| &v.value)
    }

    pub fn get_mut(&mut self, path: &Path) -> Option<&mut T> {
        self.nodes.get_mut(path).map(|v| &mut v.value)
    }

    pub fn insert(&mut self, path: PathBuf, value: T) {
        if let Some(parent_path) = path.parent() {
            if let Some(parent) = self.nodes.get_mut(parent_path) {
@@ -71,6 +79,14 @@ impl<T> PathMap<T> {
        Some(root_value)
    }

    /// Traverses the route between `start_path` and `target_path` and returns
    /// the path closest to `target_path` in the tree.
    ///
    /// This is useful when trying to determine what paths need to be marked as
    /// altered when a change to a path is registered. Depending on the order of
    /// FS events, a file remove event could be followed by that file's
    /// directory being removed, in which case we should process that
    /// directory's parent.
    pub fn descend(&self, start_path: &Path, target_path: &Path) -> PathBuf {
        let relative_path = target_path.strip_prefix(start_path)
            .expect("target_path did not begin with start_path");

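The "additional data" the doc comment mentions is each node's `children` set: because every insert records the new path under its parent, removing a subtree only has to follow those recorded links instead of scanning every key. A minimal sketch of that mechanism (the `remove` method here is illustrative, not Rojo's exact API):

```rust
use std::collections::{HashMap, HashSet};
use std::path::{Path, PathBuf};

struct PathMapNode<T> {
    value: T,
    children: HashSet<PathBuf>,
}

struct PathMap<T> {
    nodes: HashMap<PathBuf, PathMapNode<T>>,
}

impl<T> PathMap<T> {
    fn new() -> PathMap<T> {
        PathMap { nodes: HashMap::new() }
    }

    fn insert(&mut self, path: PathBuf, value: T) {
        // Record the new path in its parent's child set so removal can find
        // the whole subtree without scanning every key in the map.
        if let Some(parent_path) = path.parent() {
            if let Some(parent) = self.nodes.get_mut(parent_path) {
                parent.children.insert(path.clone());
            }
        }

        self.nodes.insert(path, PathMapNode {
            value,
            children: HashSet::new(),
        });
    }

    // Removes `root_path` and everything recorded beneath it, returning the
    // removed paths.
    fn remove(&mut self, root_path: &Path) -> Vec<PathBuf> {
        let mut removed = Vec::new();
        let mut to_visit = vec![root_path.to_path_buf()];

        while let Some(path) = to_visit.pop() {
            if let Some(node) = self.nodes.remove(&path) {
                to_visit.extend(node.children);
                removed.push(path);
            }
        }

        removed
    }
}

fn main() {
    let mut map = PathMap::new();
    map.insert(PathBuf::from("/a"), 1);
    map.insert(PathBuf::from("/a/b"), 2);
    map.insert(PathBuf::from("/a/b/c"), 3);

    let removed = map.remove(Path::new("/a"));
    assert_eq!(removed.len(), 3);
    assert!(map.nodes.is_empty());
}
```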
server/src/path_serializer.rs (new file, 69 lines)
@@ -0,0 +1,69 @@
//! path_serializer is used in cases where we need to serialize relative Path
//! and PathBuf objects in a way that's cross-platform.
//!
//! This is used for the snapshot testing system to make sure that snapshots
//! that reference local paths that are generated on Windows don't fail when run
//! in systems that use a different directory separator.
//!
//! To use, annotate your PathBuf or Option<PathBuf> field with the correct
//! serializer function:
//!
//! ```
//! # use std::path::PathBuf;
//! # use serde_derive::{Serialize, Deserialize};
//!
//! #[derive(Serialize, Deserialize)]
//! struct Mine {
//!     name: String,
//!
//!     // Use 'crate' instead of librojo if writing code inside Rojo
//!     #[serde(serialize_with = "librojo::path_serializer::serialize")]
//!     source_path: PathBuf,
//!
//!     #[serde(serialize_with = "librojo::path_serializer::serialize_option")]
//!     maybe_path: Option<PathBuf>,
//! }
//! ```
//!
//! **The methods in this module can only handle relative paths, since absolute
//! paths are never portable.**

use std::path::{Component, Path};

use serde::Serializer;

pub fn serialize_option<S, T>(maybe_path: &Option<T>, serializer: S) -> Result<S::Ok, S::Error>
    where S: Serializer,
          T: AsRef<Path>,
{
    match maybe_path {
        Some(path) => serialize(path, serializer),
        None => serializer.serialize_none()
    }
}

pub fn serialize<S, T>(path: T, serializer: S) -> Result<S::Ok, S::Error>
    where S: Serializer,
          T: AsRef<Path>,
{
    let path = path.as_ref();

    assert!(path.is_relative(), "path_serializer can only handle relative paths");

    let mut output = String::new();

    for component in path.components() {
        if !output.is_empty() {
            output.push('/');
        }

        match component {
            Component::CurDir => output.push('.'),
            Component::ParentDir => output.push_str(".."),
            Component::Normal(piece) => output.push_str(piece.to_str().unwrap()),
            _ => panic!("path_serializer cannot handle absolute path components"),
        }
    }

    serializer.serialize_str(&output)
}
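The serde plumbing aside, the portable behavior comes from the component-joining loop in `serialize` above. Pulled out as a plain function (`to_portable_string` is a name introduced here for illustration), it can be checked without serde:

```rust
use std::path::{Component, Path};

// The core of path_serializer::serialize above, without the Serializer:
// rebuild the relative path with '/' separators on every platform.
fn to_portable_string(path: &Path) -> String {
    assert!(path.is_relative(), "only relative paths are portable");

    let mut output = String::new();

    for component in path.components() {
        if !output.is_empty() {
            output.push('/');
        }

        match component {
            Component::CurDir => output.push('.'),
            Component::ParentDir => output.push_str(".."),
            Component::Normal(piece) => output.push_str(piece.to_str().unwrap()),
            _ => panic!("cannot handle absolute path components"),
        }
    }

    output
}

fn main() {
    // join() uses the native separator, but the output is always '/',
    // which is what keeps Windows-generated snapshots stable elsewhere.
    let path = Path::new("src").join("lib.rs");
    assert_eq!(to_portable_string(&path), "src/lib.rs");
    assert_eq!(to_portable_string(Path::new("../shared")), "../shared");
}
```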
@@ -6,90 +6,33 @@ use std::{
    path::{Path, PathBuf},
};

use maplit::hashmap;
use log::warn;
use failure::Fail;
use maplit::hashmap;
use rbx_tree::RbxValue;
use serde_derive::{Serialize, Deserialize};

pub static PROJECT_FILENAME: &'static str = "roblox-project.json";

// Serde is silly.
const fn yeah() -> bool {
    true
}

const fn is_true(value: &bool) -> bool {
    *value
}

#[derive(Debug, Serialize, Deserialize)]
#[serde(untagged)]
enum SourceProjectNode {
    Instance {
        #[serde(rename = "$className")]
        class_name: String,

        #[serde(rename = "$properties", default = "HashMap::new", skip_serializing_if = "HashMap::is_empty")]
        properties: HashMap<String, RbxValue>,

        #[serde(rename = "$ignoreUnknownInstances", default = "yeah", skip_serializing_if = "is_true")]
        ignore_unknown_instances: bool,

        #[serde(flatten)]
        children: HashMap<String, SourceProjectNode>,
    },
    SyncPoint {
        #[serde(rename = "$path")]
        path: String,
    }
}

impl SourceProjectNode {
    pub fn into_project_node(self, project_file_location: &Path) -> ProjectNode {
        match self {
            SourceProjectNode::Instance { class_name, mut children, properties, ignore_unknown_instances } => {
                let mut new_children = HashMap::new();

                for (node_name, node) in children.drain() {
                    new_children.insert(node_name, node.into_project_node(project_file_location));
                }

                ProjectNode::Instance(InstanceProjectNode {
                    class_name,
                    children: new_children,
                    properties,
                    metadata: InstanceProjectNodeMetadata {
                        ignore_unknown_instances,
                    },
                })
            },
            SourceProjectNode::SyncPoint { path: source_path } => {
                let path = if Path::new(&source_path).is_absolute() {
                    PathBuf::from(source_path)
                } else {
                    let project_folder_location = project_file_location.parent().unwrap();
                    project_folder_location.join(source_path)
                };

                ProjectNode::SyncPoint(SyncPointProjectNode {
                    path,
                })
            },
        }
    }
}
pub static PROJECT_FILENAME: &'static str = "default.project.json";
pub static COMPAT_PROJECT_FILENAME: &'static str = "roblox-project.json";

/// SourceProject is the format that users author projects on-disk. Since we
/// want to do things like transforming paths to be absolute before handing them
/// off to the rest of Rojo, we use this intermediate struct.
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
struct SourceProject {
    name: String,
    tree: SourceProjectNode,

    #[serde(skip_serializing_if = "Option::is_none")]
    serve_port: Option<u16>,

    #[serde(skip_serializing_if = "Option::is_none")]
    serve_place_ids: Option<HashSet<u64>>,
}

impl SourceProject {
    /// Consumes the SourceProject and yields a Project, ready for prime-time.
    pub fn into_project(self, project_file_location: &Path) -> Project {
        let tree = self.tree.into_project_node(project_file_location);

@@ -103,6 +46,56 @@ impl SourceProject {
    }
}

/// Similar to SourceProject, the structure of nodes in the project tree is
/// slightly different on-disk than how we want to handle them in the rest of
/// Rojo.
#[derive(Debug, Serialize, Deserialize)]
struct SourceProjectNode {
    #[serde(rename = "$className", skip_serializing_if = "Option::is_none")]
    class_name: Option<String>,

    #[serde(rename = "$properties", default = "HashMap::new", skip_serializing_if = "HashMap::is_empty")]
    properties: HashMap<String, RbxValue>,

    #[serde(rename = "$ignoreUnknownInstances", skip_serializing_if = "Option::is_none")]
    ignore_unknown_instances: Option<bool>,

    #[serde(rename = "$path", skip_serializing_if = "Option::is_none")]
    path: Option<String>,

    #[serde(flatten)]
    children: HashMap<String, SourceProjectNode>,
}

impl SourceProjectNode {
    /// Consumes the SourceProjectNode and turns it into a ProjectNode.
    pub fn into_project_node(mut self, project_file_location: &Path) -> ProjectNode {
        let children = self.children.drain()
            .map(|(key, value)| (key, value.into_project_node(project_file_location)))
            .collect();

        // Make sure that paths are absolute, transforming them by adding the
        // project folder if they're not already absolute.
        let path = self.path.as_ref().map(|source_path| {
            if Path::new(source_path).is_absolute() {
                PathBuf::from(source_path)
            } else {
                let project_folder_location = project_file_location.parent().unwrap();
                project_folder_location.join(source_path)
            }
        });

        ProjectNode {
            class_name: self.class_name,
            properties: self.properties,
            ignore_unknown_instances: self.ignore_unknown_instances,
            path,
            children,
        }
    }
}

/// Error returned by Project::load_exact
#[derive(Debug, Fail)]
pub enum ProjectLoadExactError {
    #[fail(display = "IO error: {}", _0)]
@@ -112,6 +105,7 @@ pub enum ProjectLoadExactError {
    JsonError(#[fail(cause)] serde_json::Error),
}

/// Error returned by Project::load_fuzzy
#[derive(Debug, Fail)]
pub enum ProjectLoadFuzzyError {
    #[fail(display = "Project not found")]
@@ -133,6 +127,7 @@ impl From<ProjectLoadExactError> for ProjectLoadFuzzyError {
    }
}

/// Error returned by Project::init_place and Project::init_model
#[derive(Debug, Fail)]
pub enum ProjectInitError {
    AlreadyExists(PathBuf),
@@ -150,6 +145,7 @@ impl fmt::Display for ProjectInitError {
    }
}

/// Error returned by Project::save
#[derive(Debug, Fail)]
pub enum ProjectSaveError {
    #[fail(display = "JSON error: {}", _0)]
@@ -159,75 +155,49 @@ pub enum ProjectSaveError {
    IoError(#[fail(cause)] io::Error),
}

#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct InstanceProjectNodeMetadata {
    pub ignore_unknown_instances: bool,
}
#[derive(Debug, Clone, PartialEq, Default, Serialize, Deserialize)]
pub struct ProjectNode {
    pub class_name: Option<String>,
    pub children: HashMap<String, ProjectNode>,
    pub properties: HashMap<String, RbxValue>,
    pub ignore_unknown_instances: Option<bool>,

impl Default for InstanceProjectNodeMetadata {
    fn default() -> InstanceProjectNodeMetadata {
        InstanceProjectNodeMetadata {
            ignore_unknown_instances: true,
        }
    }
}

#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(tag = "type")]
pub enum ProjectNode {
    Instance(InstanceProjectNode),
    SyncPoint(SyncPointProjectNode),
    #[serde(serialize_with = "crate::path_serializer::serialize_option")]
    pub path: Option<PathBuf>,
}

impl ProjectNode {
    fn to_source_node(&self, project_file_location: &Path) -> SourceProjectNode {
        match self {
            ProjectNode::Instance(node) => {
                let mut children = HashMap::new();
        let children = self.children.iter()
            .map(|(key, value)| (key.clone(), value.to_source_node(project_file_location)))
            .collect();

                for (key, child) in &node.children {
                    children.insert(key.clone(), child.to_source_node(project_file_location));
                }
        // If paths are relative to the project file, transform them to look
        // Unixy and write relative paths instead.
        //
        // This isn't perfect, since it means that paths like .. will stay as
        // absolute paths and make projects non-portable. Fixing this probably
        // means keeping the paths relative in the project format and making
        // everywhere else in Rojo do the resolution locally.
        let path = self.path.as_ref().map(|path| {
            let project_folder_location = project_file_location.parent().unwrap();

                SourceProjectNode::Instance {
                    class_name: node.class_name.clone(),
                    children,
                    properties: node.properties.clone(),
                    ignore_unknown_instances: node.metadata.ignore_unknown_instances,
                }
            },
            ProjectNode::SyncPoint(sync_node) => {
                let project_folder_location = project_file_location.parent().unwrap();
            match path.strip_prefix(project_folder_location) {
                Ok(stripped) => stripped.to_str().unwrap().replace("\\", "/"),
                Err(_) => format!("{}", path.display()),
            }
        });

                let friendly_path = match sync_node.path.strip_prefix(project_folder_location) {
                    Ok(stripped) => stripped.to_str().unwrap().replace("\\", "/"),
                    Err(_) => format!("{}", sync_node.path.display()),
                };

                SourceProjectNode::SyncPoint {
                    path: friendly_path,
                }
            },
        SourceProjectNode {
            class_name: self.class_name.clone(),
            properties: self.properties.clone(),
            ignore_unknown_instances: self.ignore_unknown_instances,
            children,
            path,
        }
    }
}

#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct InstanceProjectNode {
    pub class_name: String,
    pub children: HashMap<String, ProjectNode>,
    pub properties: HashMap<String, RbxValue>,
    pub metadata: InstanceProjectNodeMetadata,
}

#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct SyncPointProjectNode {
    pub path: PathBuf,
}

#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
pub struct Project {
    pub name: String,
@@ -247,33 +217,31 @@ impl Project {
        project_fuzzy_path.file_name().unwrap().to_str().unwrap()
    };

    let tree = ProjectNode::Instance(InstanceProjectNode {
        class_name: "DataModel".to_string(),
    let tree = ProjectNode {
        class_name: Some(String::from("DataModel")),
        children: hashmap! {
            String::from("ReplicatedStorage") => ProjectNode::Instance(InstanceProjectNode {
                class_name: String::from("ReplicatedStorage"),
            String::from("ReplicatedStorage") => ProjectNode {
                class_name: Some(String::from("ReplicatedStorage")),
                children: hashmap! {
                    String::from("Source") => ProjectNode::SyncPoint(SyncPointProjectNode {
                        path: project_folder_path.join("src"),
                    }),
                    String::from("Source") => ProjectNode {
                        path: Some(project_folder_path.join("src")),
                        ..Default::default()
                    },
                },
                properties: HashMap::new(),
                metadata: Default::default(),
            }),
            String::from("HttpService") => ProjectNode::Instance(InstanceProjectNode {
                class_name: String::from("HttpService"),
                children: HashMap::new(),
                ..Default::default()
            },
            String::from("HttpService") => ProjectNode {
                class_name: Some(String::from("HttpService")),
                properties: hashmap! {
                    String::from("HttpEnabled") => RbxValue::Bool {
                        value: true,
                    },
                },
                metadata: Default::default(),
            }),
                ..Default::default()
            },
        },
        properties: HashMap::new(),
        metadata: Default::default(),
    });
        ..Default::default()
    };

    let project = Project {
        name: project_name.to_string(),
@@ -298,9 +266,10 @@ impl Project {
        project_fuzzy_path.file_name().unwrap().to_str().unwrap()
    };

    let tree = ProjectNode::SyncPoint(SyncPointProjectNode {
        path: project_folder_path.join("src"),
    });
    let tree = ProjectNode {
        path: Some(project_folder_path.join("src")),
        ..Default::default()
    };

    let project = Project {
        name: project_name.to_string(),
@@ -340,17 +309,23 @@ impl Project {
    // TODO: Check for specific error kinds, convert 'not found' to Result.
    let location_metadata = fs::metadata(start_location).ok()?;

    // If this is a file, we should assume it's the config we want
    // If this is a file, assume it's the config the user was looking for.
    if location_metadata.is_file() {
        return Some(start_location.to_path_buf());
    } else if location_metadata.is_dir() {
        let with_file = start_location.join(PROJECT_FILENAME);

        if let Ok(with_file_metadata) = fs::metadata(&with_file) {
            if with_file_metadata.is_file() {
        if let Ok(file_metadata) = fs::metadata(&with_file) {
            if file_metadata.is_file() {
                return Some(with_file);
            } else {
                return None;
            }
        }

        let with_compat_file = start_location.join(COMPAT_PROJECT_FILENAME);

        if let Ok(file_metadata) = fs::metadata(&with_compat_file) {
            if file_metadata.is_file() {
                return Some(with_compat_file);
            }
        }
    }
@@ -389,6 +364,25 @@ impl Project {
    Ok(())
}

/// Checks if there are any compatibility issues with this project file and
/// warns the user if there are any.
pub fn check_compatibility(&self) {
    let file_name = self.file_location
        .file_name().unwrap()
        .to_str().expect("Project file path was not valid Unicode!");

    if file_name == COMPAT_PROJECT_FILENAME {
        warn!("Rojo's default project file name changed in 0.5.0-alpha3.");
        warn!("Support for the old project file name will be dropped before 0.5.0 releases.");
        warn!("Your project file is named {}", COMPAT_PROJECT_FILENAME);
        warn!("Rename your project file to {}", PROJECT_FILENAME);
    } else if !file_name.ends_with(".project.json") {
        warn!("Starting in Rojo 0.5.0-alpha3, it's recommended to give all project files the");
|
||||
warn!(".project.json extension. This helps Rojo differentiate project files from");
|
||||
warn!("other JSON files!");
|
||||
}
|
||||
}
|
||||
|
||||
fn to_source_project(&self) -> SourceProject {
|
||||
SourceProject {
|
||||
name: self.name.clone(),
|
||||
|
||||
@@ -1,33 +1,54 @@
use std::{
borrow::Cow,
collections::HashMap,
fmt,
collections::{HashSet, HashMap},
path::{Path, PathBuf},
str,
sync::{Arc, Mutex},
};

use failure::Fail;

use rbx_tree::{RbxTree, RbxInstanceProperties, RbxValue, RbxId};
use serde_derive::{Serialize, Deserialize};
use log::{info, trace};
use rbx_tree::{RbxTree, RbxId};

use crate::{
project::{Project, ProjectNode, InstanceProjectNodeMetadata},
project::{Project, ProjectNode},
message_queue::MessageQueue,
imfs::{Imfs, ImfsItem, ImfsFile},
imfs::{Imfs, ImfsItem},
path_map::PathMap,
rbx_snapshot::{RbxSnapshotInstance, InstanceChanges, snapshot_from_tree, reify_root, reconcile_subtree},
rbx_snapshot::{snapshot_project_tree, snapshot_project_node, snapshot_imfs_path},
snapshot_reconciler::{InstanceChanges, reify_root, reconcile_subtree},
};

const INIT_SCRIPT: &str = "init.lua";
const INIT_SERVER_SCRIPT: &str = "init.server.lua";
const INIT_CLIENT_SCRIPT: &str = "init.client.lua";

/// Either `source_path` or `project_definition` (or both) must be Some.
#[derive(Debug, Clone, PartialEq, Default, Serialize, Deserialize)]
pub struct MetadataPerInstance {
pub ignore_unknown_instances: bool,

/// The path on the filesystem that the instance was read from, if it came
/// from the filesystem.
#[serde(serialize_with = "crate::path_serializer::serialize_option")]
pub source_path: Option<PathBuf>,

/// Information about the instance that came from the project that defined
/// it, if that's where it was defined.
///
/// A key-value pair where the key should be the name of the instance and
/// the value is the ProjectNode from the instance's project.
pub project_definition: Option<(String, ProjectNode)>,
}

/// Contains all of the state needed to update an `RbxTree` in real time using
/// the in-memory filesystem, as well as messaging to Rojo clients what
/// instances have actually updated at any point.
pub struct RbxSession {
tree: RbxTree,
path_map: PathMap<RbxId>,
instance_metadata_map: HashMap<RbxId, InstanceProjectNodeMetadata>,
sync_point_names: HashMap<PathBuf, String>,

instances_per_path: PathMap<HashSet<RbxId>>,
metadata_per_instance: HashMap<RbxId, MetadataPerInstance>,
message_queue: Arc<MessageQueue<InstanceChanges>>,
imfs: Arc<Mutex<Imfs>>,
}
@@ -38,20 +59,18 @@ impl RbxSession {
imfs: Arc<Mutex<Imfs>>,
message_queue: Arc<MessageQueue<InstanceChanges>>,
) -> RbxSession {
let mut sync_point_names = HashMap::new();
let mut path_map = PathMap::new();
let mut instance_metadata_map = HashMap::new();
let mut instances_per_path = PathMap::new();
let mut metadata_per_instance = HashMap::new();

let tree = {
let temp_imfs = imfs.lock().unwrap();
construct_initial_tree(&project, &temp_imfs, &mut path_map, &mut instance_metadata_map, &mut sync_point_names)
reify_initial_tree(&project, &temp_imfs, &mut instances_per_path, &mut metadata_per_instance)
};

RbxSession {
tree,
path_map,
instance_metadata_map,
sync_point_names,
instances_per_path,
metadata_per_instance,
message_queue,
imfs,
}
@@ -68,8 +87,7 @@ impl RbxSession {
.expect("Path was outside in-memory filesystem roots");

// Find the closest instance in the tree that currently exists
let mut path_to_snapshot = self.path_map.descend(root_path, path);
let &instance_id = self.path_map.get(&path_to_snapshot).unwrap();
let mut path_to_snapshot = self.instances_per_path.descend(root_path, path);

// If this is a file that might affect its parent if modified, we
// should snapshot its parent instead.
@@ -82,27 +100,44 @@

trace!("Snapshotting path {}", path_to_snapshot.display());

let maybe_snapshot = snapshot_instances_from_imfs(&imfs, &path_to_snapshot, &mut self.sync_point_names)
.unwrap_or_else(|_| panic!("Could not generate instance snapshot for path {}", path_to_snapshot.display()));
let instances_at_path = self.instances_per_path.get(&path_to_snapshot)
.expect("Metadata did not exist for path")
.clone();

let snapshot = match maybe_snapshot {
Some(snapshot) => snapshot,
None => {
trace!("Path resulted in no snapshot being generated.");
return;
},
};
for instance_id in &instances_at_path {
let instance_metadata = self.metadata_per_instance.get(&instance_id)
.expect("Metadata for instance ID did not exist");

trace!("Snapshot: {:#?}", snapshot);
let maybe_snapshot = match &instance_metadata.project_definition {
Some((instance_name, project_node)) => {
snapshot_project_node(&imfs, &project_node, Cow::Owned(instance_name.clone()))
.unwrap_or_else(|_| panic!("Could not generate instance snapshot for path {}", path_to_snapshot.display()))
},
None => {
snapshot_imfs_path(&imfs, &path_to_snapshot, None)
.unwrap_or_else(|_| panic!("Could not generate instance snapshot for path {}", path_to_snapshot.display()))
},
};

reconcile_subtree(
&mut self.tree,
instance_id,
&snapshot,
&mut self.path_map,
&mut self.instance_metadata_map,
&mut changes,
);
let snapshot = match maybe_snapshot {
Some(snapshot) => snapshot,
None => {
trace!("Path resulted in no snapshot being generated.");
return;
},
};

trace!("Snapshot: {:#?}", snapshot);

reconcile_subtree(
&mut self.tree,
*instance_id,
&snapshot,
&mut self.instances_per_path,
&mut self.metadata_per_instance,
&mut changes,
);
}
}

if changes.is_empty() {
@@ -127,10 +162,14 @@ impl RbxSession {
// If the path doesn't exist or is a directory, we don't care if it
// updated
match imfs.get(path) {
Some(ImfsItem::Directory(_)) | None => {
Some(ImfsItem::Directory(_)) => {
trace!("Updated path was a directory, ignoring.");
return;
},
None => {
trace!("Updated path did not exist in IMFS, ignoring.");
return;
},
Some(ImfsItem::File(_)) => {},
}
}
@@ -140,13 +179,13 @@ impl RbxSession {

pub fn path_removed(&mut self, path: &Path) {
info!("Path removed: {}", path.display());
self.path_map.remove(path);
self.instances_per_path.remove(path);
self.path_created_or_updated(path);
}

pub fn path_renamed(&mut self, from_path: &Path, to_path: &Path) {
info!("Path renamed from {} to {}", from_path.display(), to_path.display());
self.path_map.remove(from_path);
self.instances_per_path.remove(from_path);
self.path_created_or_updated(from_path);
self.path_created_or_updated(to_path);
}
@@ -155,385 +194,29 @@ impl RbxSession {
&self.tree
}

pub fn get_instance_metadata(&self, id: RbxId) -> Option<&InstanceProjectNodeMetadata> {
self.instance_metadata_map.get(&id)
}

pub fn debug_get_path_map(&self) -> &PathMap<RbxId> {
&self.path_map
pub fn get_instance_metadata(&self, id: RbxId) -> Option<&MetadataPerInstance> {
self.metadata_per_instance.get(&id)
}
}

pub fn construct_oneoff_tree(project: &Project, imfs: &Imfs) -> RbxTree {
let mut path_map = PathMap::new();
let mut instance_metadata_map = HashMap::new();
let mut sync_point_names = HashMap::new();
construct_initial_tree(project, imfs, &mut path_map, &mut instance_metadata_map, &mut sync_point_names)
let mut instances_per_path = PathMap::new();
let mut metadata_per_instance = HashMap::new();
reify_initial_tree(project, imfs, &mut instances_per_path, &mut metadata_per_instance)
}

fn construct_initial_tree(
fn reify_initial_tree(
project: &Project,
imfs: &Imfs,
path_map: &mut PathMap<RbxId>,
instance_metadata_map: &mut HashMap<RbxId, InstanceProjectNodeMetadata>,
sync_point_names: &mut HashMap<PathBuf, String>,
instances_per_path: &mut PathMap<HashSet<RbxId>>,
metadata_per_instance: &mut HashMap<RbxId, MetadataPerInstance>,
) -> RbxTree {
let snapshot = construct_project_node(
imfs,
&project.name,
&project.tree,
sync_point_names,
);
let snapshot = snapshot_project_tree(imfs, project)
.expect("Could not snapshot project tree")
.expect("Project did not produce any instances");

let mut changes = InstanceChanges::default();
let tree = reify_root(&snapshot, path_map, instance_metadata_map, &mut changes);
let tree = reify_root(&snapshot, instances_per_path, metadata_per_instance, &mut changes);

tree
}

fn construct_project_node<'a>(
imfs: &'a Imfs,
instance_name: &'a str,
project_node: &'a ProjectNode,
sync_point_names: &mut HashMap<PathBuf, String>,
) -> RbxSnapshotInstance<'a> {
match project_node {
ProjectNode::Instance(node) => {
let mut children = Vec::new();

for (child_name, child_project_node) in &node.children {
children.push(construct_project_node(imfs, child_name, child_project_node, sync_point_names));
}

RbxSnapshotInstance {
class_name: Cow::Borrowed(&node.class_name),
name: Cow::Borrowed(instance_name),
properties: node.properties.clone(),
children,
source_path: None,
metadata: Some(node.metadata.clone()),
}
},
ProjectNode::SyncPoint(node) => {
// TODO: Propagate errors upward instead of dying
let mut snapshot = snapshot_instances_from_imfs(imfs, &node.path, sync_point_names)
.expect("Could not reify nodes from Imfs")
.expect("Sync point node did not result in an instance");

snapshot.name = Cow::Borrowed(instance_name);
sync_point_names.insert(node.path.clone(), instance_name.to_string());

snapshot
},
}
}

#[derive(Debug, Clone, Copy)]
enum FileType {
ModuleScript,
ServerScript,
ClientScript,
StringValue,
LocalizationTable,
XmlModel,
BinaryModel,
}

fn get_trailing<'a>(input: &'a str, trailer: &str) -> Option<&'a str> {
if input.ends_with(trailer) {
let end = input.len().saturating_sub(trailer.len());
Some(&input[..end])
} else {
None
}
}
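The `get_trailing` helper above is what lets file names be split into an instance name plus a known suffix. A standalone sketch of the same rule, re-implemented here for illustration (not tied to the crate's modules), shows why longer suffixes like `.server.lua` must be checked before `.lua`:

```rust
// Standalone re-implementation of the suffix-stripping rule: if `input`
// ends with `trailer`, return the prefix before it, otherwise None.
fn get_trailing<'a>(input: &'a str, trailer: &str) -> Option<&'a str> {
    if input.ends_with(trailer) {
        Some(&input[..input.len() - trailer.len()])
    } else {
        None
    }
}

fn main() {
    // ".server.lua" must be tried before ".lua", or server scripts would
    // match the plain ".lua" rule and get the wrong instance name.
    assert_eq!(get_trailing("combat.server.lua", ".server.lua"), Some("combat"));
    assert_eq!(get_trailing("combat.lua", ".lua"), Some("combat"));
    assert_eq!(get_trailing("combat.lua", ".server.lua"), None);
    println!("ok");
}
```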

fn classify_file(file: &ImfsFile) -> Option<(&str, FileType)> {
static EXTENSIONS_TO_TYPES: &[(&str, FileType)] = &[
(".server.lua", FileType::ServerScript),
(".client.lua", FileType::ClientScript),
(".lua", FileType::ModuleScript),
(".csv", FileType::LocalizationTable),
(".txt", FileType::StringValue),
(".rbxmx", FileType::XmlModel),
(".rbxm", FileType::BinaryModel),
];

let file_name = file.path.file_name()?.to_str()?;

for (extension, file_type) in EXTENSIONS_TO_TYPES {
if let Some(instance_name) = get_trailing(file_name, extension) {
return Some((instance_name, *file_type))
}
}

None
}

#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "PascalCase")]
struct LocalizationEntryCsv {
key: String,
context: String,
example: String,
source: String,
#[serde(flatten)]
values: HashMap<String, String>,
}

impl LocalizationEntryCsv {
fn to_json(self) -> LocalizationEntryJson {
LocalizationEntryJson {
key: self.key,
context: self.context,
example: self.example,
source: self.source,
values: self.values,
}
}
}

#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
struct LocalizationEntryJson {
key: String,
context: String,
example: String,
source: String,
values: HashMap<String, String>,
}

#[derive(Debug, Fail)]
enum SnapshotError {
DidNotExist(PathBuf),

// TODO: Add file path to the error message?
Utf8Error {
#[fail(cause)]
inner: str::Utf8Error,
path: PathBuf,
},

XmlModelDecodeError {
inner: rbx_xml::DecodeError,
path: PathBuf,
},

BinaryModelDecodeError {
inner: rbx_binary::DecodeError,
path: PathBuf,
},
}

impl fmt::Display for SnapshotError {
fn fmt(&self, output: &mut fmt::Formatter) -> fmt::Result {
match self {
SnapshotError::DidNotExist(path) => write!(output, "Path did not exist: {}", path.display()),
SnapshotError::Utf8Error { inner, path } => {
write!(output, "Invalid UTF-8: {} in path {}", inner, path.display())
},
SnapshotError::XmlModelDecodeError { inner, path } => {
write!(output, "Malformed rbxmx model: {:?} in path {}", inner, path.display())
},
SnapshotError::BinaryModelDecodeError { inner, path } => {
write!(output, "Malformed rbxm model: {:?} in path {}", inner, path.display())
},
}
}
}

fn snapshot_xml_model<'a>(
instance_name: Cow<'a, str>,
file: &ImfsFile,
) -> Result<Option<RbxSnapshotInstance<'a>>, SnapshotError> {
let mut temp_tree = RbxTree::new(RbxInstanceProperties {
name: "Temp".to_owned(),
class_name: "Folder".to_owned(),
properties: HashMap::new(),
});

let root_id = temp_tree.get_root_id();
rbx_xml::decode(&mut temp_tree, root_id, file.contents.as_slice())
.map_err(|inner| SnapshotError::XmlModelDecodeError {
inner,
path: file.path.clone(),
})?;

let root_instance = temp_tree.get_instance(root_id).unwrap();
let children = root_instance.get_children_ids();

match children.len() {
0 => Ok(None),
1 => {
let mut snapshot = snapshot_from_tree(&temp_tree, children[0]).unwrap();
snapshot.name = instance_name;
Ok(Some(snapshot))
},
_ => panic!("Rojo doesn't have support for model files with multiple roots yet"),
}
}

fn snapshot_binary_model<'a>(
instance_name: Cow<'a, str>,
file: &ImfsFile,
) -> Result<Option<RbxSnapshotInstance<'a>>, SnapshotError> {
let mut temp_tree = RbxTree::new(RbxInstanceProperties {
name: "Temp".to_owned(),
class_name: "Folder".to_owned(),
properties: HashMap::new(),
});

let root_id = temp_tree.get_root_id();
rbx_binary::decode(&mut temp_tree, root_id, file.contents.as_slice())
.map_err(|inner| SnapshotError::BinaryModelDecodeError {
inner,
path: file.path.clone(),
})?;

let root_instance = temp_tree.get_instance(root_id).unwrap();
let children = root_instance.get_children_ids();

match children.len() {
0 => Ok(None),
1 => {
let mut snapshot = snapshot_from_tree(&temp_tree, children[0]).unwrap();
snapshot.name = instance_name;
Ok(Some(snapshot))
},
_ => panic!("Rojo doesn't have support for model files with multiple roots yet"),
}
}

fn snapshot_instances_from_imfs<'a>(
imfs: &'a Imfs,
imfs_path: &Path,
sync_point_names: &HashMap<PathBuf, String>,
) -> Result<Option<RbxSnapshotInstance<'a>>, SnapshotError> {
match imfs.get(imfs_path) {
Some(ImfsItem::File(file)) => {
let (instance_name, file_type) = match classify_file(file) {
Some(info) => info,
None => return Ok(None),
};

let instance_name = if let Some(actual_name) = sync_point_names.get(imfs_path) {
Cow::Owned(actual_name.clone())
} else {
Cow::Borrowed(instance_name)
};

let class_name = match file_type {
FileType::ModuleScript => "ModuleScript",
FileType::ServerScript => "Script",
FileType::ClientScript => "LocalScript",
FileType::StringValue => "StringValue",
FileType::LocalizationTable => "LocalizationTable",
FileType::XmlModel => return snapshot_xml_model(instance_name, file),
FileType::BinaryModel => return snapshot_binary_model(instance_name, file),
};

let contents = str::from_utf8(&file.contents)
.map_err(|inner| SnapshotError::Utf8Error {
inner,
path: imfs_path.to_path_buf(),
})?;

let mut properties = HashMap::new();

match file_type {
FileType::ModuleScript | FileType::ServerScript | FileType::ClientScript => {
properties.insert(String::from("Source"), RbxValue::String {
value: contents.to_string(),
});
},
FileType::StringValue => {
properties.insert(String::from("Value"), RbxValue::String {
value: contents.to_string(),
});
},
FileType::LocalizationTable => {
let entries: Vec<LocalizationEntryJson> = csv::Reader::from_reader(contents.as_bytes())
.deserialize()
.map(|result| result.expect("Malformed localization table found!"))
.map(LocalizationEntryCsv::to_json)
.collect();

let table_contents = serde_json::to_string(&entries)
.expect("Could not encode JSON for localization table");

properties.insert(String::from("Contents"), RbxValue::String {
value: table_contents,
});
},
FileType::XmlModel | FileType::BinaryModel => unreachable!(),
}

Ok(Some(RbxSnapshotInstance {
name: instance_name,
class_name: Cow::Borrowed(class_name),
properties,
children: Vec::new(),
source_path: Some(file.path.clone()),
metadata: None,
}))
},
Some(ImfsItem::Directory(directory)) => {
// TODO: Expand init support to handle server and client scripts
let init_path = directory.path.join(INIT_SCRIPT);
let init_server_path = directory.path.join(INIT_SERVER_SCRIPT);
let init_client_path = directory.path.join(INIT_CLIENT_SCRIPT);

let mut instance = if directory.children.contains(&init_path) {
snapshot_instances_from_imfs(imfs, &init_path, sync_point_names)?
.expect("Could not snapshot instance from file that existed!")
} else if directory.children.contains(&init_server_path) {
snapshot_instances_from_imfs(imfs, &init_server_path, sync_point_names)?
.expect("Could not snapshot instance from file that existed!")
} else if directory.children.contains(&init_client_path) {
snapshot_instances_from_imfs(imfs, &init_client_path, sync_point_names)?
.expect("Could not snapshot instance from file that existed!")
} else {
RbxSnapshotInstance {
class_name: Cow::Borrowed("Folder"),
name: Cow::Borrowed(""),
properties: HashMap::new(),
children: Vec::new(),
source_path: Some(directory.path.clone()),
metadata: None,
}
};

// We have to be careful not to lose instance names that are
// specified in the project manifest. We store them in
// sync_point_names when the original tree is constructed.
instance.name = if let Some(actual_name) = sync_point_names.get(&directory.path) {
Cow::Owned(actual_name.clone())
} else {
Cow::Borrowed(directory.path
.file_name().expect("Could not extract file name")
.to_str().expect("Could not convert path to UTF-8"))
};

for child_path in &directory.children {
match child_path.file_name().unwrap().to_str().unwrap() {
INIT_SCRIPT | INIT_SERVER_SCRIPT | INIT_CLIENT_SCRIPT => {
// The existence of files with these names modifies the
// parent instance and is handled above, so we can skip
// them here.
},
_ => {
match snapshot_instances_from_imfs(imfs, child_path, sync_point_names)? {
Some(child) => {
instance.children.push(child);
},
None => {},
}
},
}
}

Ok(Some(instance))
},
None => Err(SnapshotError::DidNotExist(imfs_path.to_path_buf())),
}
}
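The directory branch above implements an init-file promotion rule: a directory containing `init.lua`, `init.server.lua`, or `init.client.lua` is represented by that script's instance rather than a plain Folder. A minimal sketch of just the class selection, using a hypothetical helper (not part of the crate) and the same checking order as the code above:

```rust
// Hypothetical helper illustrating the init-file promotion order:
// init.lua wins, then init.server.lua, then init.client.lua; otherwise
// the directory stays a Folder. Class names follow classify_file's
// mapping (.lua -> ModuleScript, .server.lua -> Script, .client.lua ->
// LocalScript).
fn directory_class(children: &[&str]) -> &'static str {
    if children.contains(&"init.lua") {
        "ModuleScript"
    } else if children.contains(&"init.server.lua") {
        "Script"
    } else if children.contains(&"init.client.lua") {
        "LocalScript"
    } else {
        "Folder"
    }
}

fn main() {
    assert_eq!(directory_class(&["init.lua", "other.lua"]), "ModuleScript");
    assert_eq!(directory_class(&["init.server.lua"]), "Script");
    assert_eq!(directory_class(&["a.lua", "b.lua"]), "Folder");
    println!("ok");
}
```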
|
||||
@@ -1,307 +1,575 @@
|
||||
//! Defines how Rojo transforms files into instances through the snapshot
|
||||
//! system.
|
||||
|
||||
use std::{
|
||||
str,
|
||||
borrow::Cow,
|
||||
collections::{HashMap, HashSet},
|
||||
collections::HashMap,
|
||||
fmt,
|
||||
path::PathBuf,
|
||||
path::{Path, PathBuf},
|
||||
str,
|
||||
};
|
||||
|
||||
use rbx_tree::{RbxTree, RbxId, RbxInstanceProperties, RbxValue};
|
||||
use failure::Fail;
|
||||
use log::info;
|
||||
use maplit::hashmap;
|
||||
use rbx_tree::{RbxTree, RbxValue, RbxInstanceProperties};
|
||||
use serde_derive::{Serialize, Deserialize};
|
||||
|
||||
use crate::{
|
||||
path_map::PathMap,
|
||||
project::InstanceProjectNodeMetadata,
|
||||
imfs::{
|
||||
Imfs,
|
||||
ImfsItem,
|
||||
ImfsFile,
|
||||
ImfsDirectory,
|
||||
},
|
||||
project::{
|
||||
Project,
|
||||
ProjectNode,
|
||||
},
|
||||
snapshot_reconciler::{
|
||||
RbxSnapshotInstance,
|
||||
snapshot_from_tree,
|
||||
},
|
||||
// TODO: Move MetadataPerInstance into this module?
|
||||
rbx_session::MetadataPerInstance,
|
||||
};
|
||||
|
||||
#[derive(Debug, Clone, Default, Serialize, Deserialize)]
|
||||
pub struct InstanceChanges {
|
||||
pub added: HashSet<RbxId>,
|
||||
pub removed: HashSet<RbxId>,
|
||||
pub updated: HashSet<RbxId>,
|
||||
const INIT_MODULE_NAME: &str = "init.lua";
|
||||
const INIT_SERVER_NAME: &str = "init.server.lua";
|
||||
const INIT_CLIENT_NAME: &str = "init.client.lua";
|
||||
|
||||
pub type SnapshotResult<'a> = Result<Option<RbxSnapshotInstance<'a>>, SnapshotError>;
|
||||
|
||||
#[derive(Debug, Fail)]
|
||||
pub enum SnapshotError {
|
||||
DidNotExist(PathBuf),
|
||||
|
||||
Utf8Error {
|
||||
#[fail(cause)]
|
||||
inner: str::Utf8Error,
|
||||
path: PathBuf,
|
||||
},
|
||||
|
||||
JsonModelDecodeError {
|
||||
#[fail(cause)]
|
||||
inner: serde_json::Error,
|
||||
path: PathBuf,
|
||||
},
|
||||
|
||||
XmlModelDecodeError {
|
||||
inner: rbx_xml::DecodeError,
|
||||
path: PathBuf,
|
||||
},
|
||||
|
||||
BinaryModelDecodeError {
|
||||
inner: rbx_binary::DecodeError,
|
||||
path: PathBuf,
|
||||
},
|
||||
|
||||
ProjectNodeUnusable,
|
||||
|
||||
ProjectNodeInvalidTransmute {
|
||||
partition_path: PathBuf,
|
||||
},
|
||||
}
|
||||
|
||||
impl fmt::Display for InstanceChanges {
|
||||
impl fmt::Display for SnapshotError {
|
||||
fn fmt(&self, output: &mut fmt::Formatter) -> fmt::Result {
|
||||
writeln!(output, "InstanceChanges {{")?;
|
||||
|
||||
if !self.added.is_empty() {
|
||||
writeln!(output, " Added:")?;
|
||||
for id in &self.added {
|
||||
writeln!(output, " {}", id)?;
|
||||
}
|
||||
match self {
|
||||
SnapshotError::DidNotExist(path) => write!(output, "Path did not exist: {}", path.display()),
|
||||
SnapshotError::Utf8Error { inner, path } => {
|
||||
write!(output, "Invalid UTF-8: {} in path {}", inner, path.display())
|
||||
},
|
||||
SnapshotError::JsonModelDecodeError { inner, path } => {
|
||||
write!(output, "Malformed .model.json model: {} in path {}", inner, path.display())
|
||||
},
|
||||
SnapshotError::XmlModelDecodeError { inner, path } => {
|
||||
write!(output, "Malformed rbxmx model: {:?} in path {}", inner, path.display())
|
||||
},
|
||||
SnapshotError::BinaryModelDecodeError { inner, path } => {
|
||||
write!(output, "Malformed rbxm model: {:?} in path {}", inner, path.display())
|
||||
},
|
||||
SnapshotError::ProjectNodeUnusable => {
|
||||
write!(output, "Rojo project nodes must specify either $path or $className.")
|
||||
},
|
||||
SnapshotError::ProjectNodeInvalidTransmute { partition_path } => {
|
||||
writeln!(output, "Rojo project nodes that specify both $path and $className require that the")?;
|
||||
writeln!(output, "instance produced by the files pointed to by $path has a ClassName of")?;
|
||||
writeln!(output, "Folder.")?;
|
||||
writeln!(output, "")?;
|
||||
writeln!(output, "Partition target ($path): {}", partition_path.display())
|
||||
},
|
||||
}
|
||||
|
||||
if !self.removed.is_empty() {
|
||||
writeln!(output, " Removed:")?;
|
||||
for id in &self.removed {
|
||||
writeln!(output, " {}", id)?;
|
||||
}
|
||||
}
|
||||
|
||||
if !self.updated.is_empty() {
|
||||
writeln!(output, " Updated:")?;
|
||||
for id in &self.updated {
|
||||
writeln!(output, " {}", id)?;
|
||||
}
|
||||
}
|
||||
|
||||
writeln!(output, "}}")
|
||||
}
|
||||
}
|
||||
|
||||
impl InstanceChanges {
|
||||
pub fn is_empty(&self) -> bool {
|
||||
self.added.is_empty() && self.removed.is_empty() && self.updated.is_empty()
|
||||
}
|
||||
pub fn snapshot_project_tree<'source>(
|
||||
imfs: &'source Imfs,
|
||||
project: &'source Project,
|
||||
) -> SnapshotResult<'source> {
|
||||
snapshot_project_node(imfs, &project.tree, Cow::Borrowed(&project.name))
|
||||
}
|
||||
|
||||
#[derive(Debug)]
|
||||
pub struct RbxSnapshotInstance<'a> {
|
||||
pub name: Cow<'a, str>,
|
||||
pub class_name: Cow<'a, str>,
|
||||
pub properties: HashMap<String, RbxValue>,
|
||||
pub children: Vec<RbxSnapshotInstance<'a>>,
|
||||
pub source_path: Option<PathBuf>,
|
||||
pub metadata: Option<InstanceProjectNodeMetadata>,
|
||||
}
|
||||
pub fn snapshot_project_node<'source>(
|
||||
imfs: &'source Imfs,
|
||||
node: &ProjectNode,
|
||||
instance_name: Cow<'source, str>,
|
||||
) -> SnapshotResult<'source> {
|
||||
let maybe_snapshot = match &node.path {
|
||||
Some(path) => snapshot_imfs_path(imfs, &path, Some(instance_name.clone()))?,
|
||||
None => match &node.class_name {
|
||||
Some(_class_name) => Some(RbxSnapshotInstance {
|
||||
name: instance_name.clone(),
|
||||
|
||||
pub fn snapshot_from_tree(tree: &RbxTree, id: RbxId) -> Option<RbxSnapshotInstance<'static>> {
|
||||
let instance = tree.get_instance(id)?;
|
||||
|
||||
let mut children = Vec::new();
|
||||
for &child_id in instance.get_children_ids() {
|
||||
children.push(snapshot_from_tree(tree, child_id)?);
|
||||
}
|
||||
|
||||
Some(RbxSnapshotInstance {
|
||||
name: Cow::Owned(instance.name.to_owned()),
|
||||
class_name: Cow::Owned(instance.class_name.to_owned()),
|
||||
properties: instance.properties.clone(),
|
||||
children,
|
||||
source_path: None,
|
||||
metadata: None,
|
||||
})
|
||||
}
|
||||
|
||||
pub fn reify_root(
|
||||
snapshot: &RbxSnapshotInstance,
|
||||
path_map: &mut PathMap<RbxId>,
|
||||
instance_metadata_map: &mut HashMap<RbxId, InstanceProjectNodeMetadata>,
|
||||
    changes: &mut InstanceChanges,
) -> RbxTree {
    let instance = reify_core(snapshot);
    let mut tree = RbxTree::new(instance);
    let root_id = tree.get_root_id();

    if let Some(source_path) = &snapshot.source_path {
        path_map.insert(source_path.clone(), root_id);
    }

    if let Some(metadata) = &snapshot.metadata {
        instance_metadata_map.insert(root_id, metadata.clone());
    }

    changes.added.insert(root_id);

    for child in &snapshot.children {
        reify_subtree(child, &mut tree, root_id, path_map, instance_metadata_map, changes);
    }

    tree
}

pub fn reify_subtree(
    snapshot: &RbxSnapshotInstance,
    tree: &mut RbxTree,
    parent_id: RbxId,
    path_map: &mut PathMap<RbxId>,
    instance_metadata_map: &mut HashMap<RbxId, InstanceProjectNodeMetadata>,
    changes: &mut InstanceChanges,
) {
    let instance = reify_core(snapshot);
    let id = tree.insert_instance(instance, parent_id);

    if let Some(source_path) = &snapshot.source_path {
        path_map.insert(source_path.clone(), id);
    }

    if let Some(metadata) = &snapshot.metadata {
        instance_metadata_map.insert(id, metadata.clone());
    }

    changes.added.insert(id);

    for child in &snapshot.children {
        reify_subtree(child, tree, id, path_map, instance_metadata_map, changes);
    }
}

pub fn reconcile_subtree(
    tree: &mut RbxTree,
    id: RbxId,
    snapshot: &RbxSnapshotInstance,
    path_map: &mut PathMap<RbxId>,
    instance_metadata_map: &mut HashMap<RbxId, InstanceProjectNodeMetadata>,
    changes: &mut InstanceChanges,
) {
    if let Some(source_path) = &snapshot.source_path {
        path_map.insert(source_path.clone(), id);
    }

    if let Some(metadata) = &snapshot.metadata {
        instance_metadata_map.insert(id, metadata.clone());
    }

    if reconcile_instance_properties(tree.get_instance_mut(id).unwrap(), snapshot) {
        changes.updated.insert(id);
    }

    reconcile_instance_children(tree, id, snapshot, path_map, instance_metadata_map, changes);
}

fn reify_core(snapshot: &RbxSnapshotInstance) -> RbxInstanceProperties {
    let mut properties = HashMap::new();

    for (key, value) in &snapshot.properties {
        properties.insert(key.clone(), value.clone());
    }

    let instance = RbxInstanceProperties {
        name: snapshot.name.to_string(),
        class_name: snapshot.class_name.to_string(),
        properties,
    };

    instance
}

            // These properties are replaced later in the function to
            // reduce code duplication.
            class_name: Cow::Borrowed("Folder"),
            properties: HashMap::new(),
            children: Vec::new(),
            metadata: MetadataPerInstance {
                source_path: None,
                ignore_unknown_instances: true,
                project_definition: None,
            },
        }),
        None => {
            return Err(SnapshotError::ProjectNodeUnusable);
        },
    },
};

    // If the snapshot resulted in no instances, like if it targets an unknown
    // file or an empty model file, we can early-return.
    //
    // In the future, we might want to issue a warning if the project also
    // specified fields like class_name, since the user will probably be
    // confused as to why nothing showed up in the tree.
    let mut snapshot = match maybe_snapshot {
        Some(snapshot) => snapshot,
        None => {
            // TODO: Return some other sort of marker here instead? If a node
            // transitions from None into Some, it's possible that configuration
            // from the ProjectNode might be lost since there's nowhere to put
            // it!
            return Ok(None);
        },
    };

    // Applies the class name specified in `class_name` from the project, if
    // it's set.
    if let Some(class_name) = &node.class_name {
        // This can only happen if `path` was specified in the project node and
        // that path represented a non-Folder instance.
        if snapshot.class_name != "Folder" {
            return Err(SnapshotError::ProjectNodeInvalidTransmute {
                partition_path: node.path.as_ref().unwrap().to_owned(),
            });
        }

        snapshot.class_name = Cow::Owned(class_name.to_owned());
    }

    for (child_name, child_project_node) in &node.children {
        if let Some(child) = snapshot_project_node(imfs, child_project_node, Cow::Owned(child_name.clone()))? {
            snapshot.children.push(child);
        }
    }

    for (key, value) in &node.properties {
        snapshot.properties.insert(key.clone(), value.clone());
    }

    if let Some(ignore_unknown_instances) = node.ignore_unknown_instances {
        snapshot.metadata.ignore_unknown_instances = ignore_unknown_instances;
    }

    snapshot.metadata.project_definition = Some((instance_name.into_owned(), node.clone()));

    Ok(Some(snapshot))
}

fn reconcile_instance_properties(instance: &mut RbxInstanceProperties, snapshot: &RbxSnapshotInstance) -> bool {
    let mut has_diffs = false;

    if instance.name != snapshot.name {
        instance.name = snapshot.name.to_string();
        has_diffs = true;
    }

    if instance.class_name != snapshot.class_name {
        instance.class_name = snapshot.class_name.to_string();
        has_diffs = true;
    }

    let mut property_updates = HashMap::new();

    for (key, instance_value) in &instance.properties {
        match snapshot.properties.get(key) {
            Some(snapshot_value) => {
                if snapshot_value != instance_value {
                    property_updates.insert(key.clone(), Some(snapshot_value.clone()));
                }
            },
            None => {
                property_updates.insert(key.clone(), None);
            },
        }
    }

    for (key, snapshot_value) in &snapshot.properties {
        if property_updates.contains_key(key) {
            continue;
        }

        match instance.properties.get(key) {
            Some(instance_value) => {
                if snapshot_value != instance_value {
                    property_updates.insert(key.clone(), Some(snapshot_value.clone()));
                }
            },
            None => {
                property_updates.insert(key.clone(), Some(snapshot_value.clone()));
            },
        }
    }

    has_diffs = has_diffs || !property_updates.is_empty();

    for (key, change) in property_updates.drain() {
        match change {
            Some(value) => instance.properties.insert(key, value),
            None => instance.properties.remove(&key),
        };
    }

    has_diffs
}

pub fn snapshot_imfs_path<'source>(
    imfs: &'source Imfs,
    path: &Path,
    instance_name: Option<Cow<'source, str>>,
) -> SnapshotResult<'source> {
    // If the given path doesn't exist in the in-memory filesystem, we consider
    // that an error.
    match imfs.get(path) {
        Some(imfs_item) => snapshot_imfs_item(imfs, imfs_item, instance_name),
        None => return Err(SnapshotError::DidNotExist(path.to_owned())),
    }
}

fn reconcile_instance_children(
    tree: &mut RbxTree,
    id: RbxId,
    snapshot: &RbxSnapshotInstance,
    path_map: &mut PathMap<RbxId>,
    instance_metadata_map: &mut HashMap<RbxId, InstanceProjectNodeMetadata>,
    changes: &mut InstanceChanges,
) {
    let mut visited_snapshot_indices = HashSet::new();

    let mut children_to_update: Vec<(RbxId, &RbxSnapshotInstance)> = Vec::new();
    let mut children_to_add: Vec<&RbxSnapshotInstance> = Vec::new();
    let mut children_to_remove: Vec<RbxId> = Vec::new();

    let children_ids = tree.get_instance(id).unwrap().get_children_ids();

    // Find all instances that were removed or updated, which we derive by
    // trying to pair up existing instances to snapshots.
    for &child_id in children_ids {
        let child_instance = tree.get_instance(child_id).unwrap();

        // Locate a matching snapshot for this instance
        let mut matching_snapshot = None;
        for (snapshot_index, child_snapshot) in snapshot.children.iter().enumerate() {
            if visited_snapshot_indices.contains(&snapshot_index) {
                continue;
            }

            // We assume that instances with the same name are probably pretty
            // similar. This heuristic is similar to React's reconciliation
            // strategy.
            if child_snapshot.name == child_instance.name {
                visited_snapshot_indices.insert(snapshot_index);
                matching_snapshot = Some(child_snapshot);
                break;
            }
        }

        match matching_snapshot {
            Some(child_snapshot) => {
                children_to_update.push((child_instance.get_id(), child_snapshot));
            },
            None => {
                children_to_remove.push(child_instance.get_id());
            },
        }
    }

    // Find all instances that were added, which is just the snapshots we didn't
    // match up to existing instances above.
    for (snapshot_index, child_snapshot) in snapshot.children.iter().enumerate() {
        if !visited_snapshot_indices.contains(&snapshot_index) {
            children_to_add.push(child_snapshot);
        }
    }

    for child_snapshot in &children_to_add {
        reify_subtree(child_snapshot, tree, id, path_map, instance_metadata_map, changes);
    }

    for child_id in &children_to_remove {
        if let Some(subtree) = tree.remove_instance(*child_id) {
            for id in subtree.iter_all_ids() {
                instance_metadata_map.remove(&id);
                changes.removed.insert(id);
            }
        }
    }

    for (child_id, child_snapshot) in &children_to_update {
        reconcile_subtree(tree, *child_id, child_snapshot, path_map, instance_metadata_map, changes);
    }
}

fn snapshot_imfs_item<'source>(
    imfs: &'source Imfs,
    item: &'source ImfsItem,
    instance_name: Option<Cow<'source, str>>,
) -> SnapshotResult<'source> {
    match item {
        ImfsItem::File(file) => snapshot_imfs_file(file, instance_name),
        ImfsItem::Directory(directory) => snapshot_imfs_directory(imfs, directory, instance_name),
    }
}

fn snapshot_imfs_directory<'source>(
    imfs: &'source Imfs,
    directory: &'source ImfsDirectory,
    instance_name: Option<Cow<'source, str>>,
) -> SnapshotResult<'source> {
    let init_path = directory.path.join(INIT_MODULE_NAME);
    let init_server_path = directory.path.join(INIT_SERVER_NAME);
    let init_client_path = directory.path.join(INIT_CLIENT_NAME);

    let snapshot_name = instance_name
        .unwrap_or_else(|| {
            Cow::Borrowed(directory.path
                .file_name().expect("Could not extract file name")
                .to_str().expect("Could not convert path to UTF-8"))
        });

    let mut snapshot = if directory.children.contains(&init_path) {
        snapshot_imfs_path(imfs, &init_path, Some(snapshot_name))?.unwrap()
    } else if directory.children.contains(&init_server_path) {
        snapshot_imfs_path(imfs, &init_server_path, Some(snapshot_name))?.unwrap()
    } else if directory.children.contains(&init_client_path) {
        snapshot_imfs_path(imfs, &init_client_path, Some(snapshot_name))?.unwrap()
    } else {
        RbxSnapshotInstance {
            class_name: Cow::Borrowed("Folder"),
            name: snapshot_name,
            properties: HashMap::new(),
            children: Vec::new(),
            metadata: MetadataPerInstance {
                source_path: None,
                ignore_unknown_instances: false,
                project_definition: None,
            },
        }
    };

    snapshot.metadata.source_path = Some(directory.path.to_owned());

    for child_path in &directory.children {
        let child_name = child_path
            .file_name().expect("Couldn't extract file name")
            .to_str().expect("Couldn't convert file name to UTF-8");

        match child_name {
            INIT_MODULE_NAME | INIT_SERVER_NAME | INIT_CLIENT_NAME => {
                // The existence of files with these names modifies the
                // parent instance and is handled above, so we can skip
                // them here.
            },
            _ => {
                if let Some(child) = snapshot_imfs_path(imfs, child_path, None)? {
                    snapshot.children.push(child);
                }
            },
        }
    }

    Ok(Some(snapshot))
}

fn snapshot_imfs_file<'source>(
    file: &'source ImfsFile,
    instance_name: Option<Cow<'source, str>>,
) -> SnapshotResult<'source> {
    let extension = file.path.extension()
        .map(|v| v.to_str().expect("Could not convert extension to UTF-8"));

    let mut maybe_snapshot = match extension {
        Some("lua") => snapshot_lua_file(file)?,
        Some("csv") => snapshot_csv_file(file)?,
        Some("txt") => snapshot_txt_file(file)?,
        Some("rbxmx") => snapshot_xml_model_file(file)?,
        Some("rbxm") => snapshot_binary_model_file(file)?,
        Some("json") => {
            let file_stem = file.path
                .file_stem().expect("Could not extract file stem")
                .to_str().expect("Could not convert path to UTF-8");

            if file_stem.ends_with(".model") {
                snapshot_json_model_file(file)?
            } else {
                None
            }
        },
        Some(_) | None => None,
    };

    if let Some(snapshot) = maybe_snapshot.as_mut() {
        // Carefully preserve name from project manifest if present.
        if let Some(snapshot_name) = instance_name {
            snapshot.name = snapshot_name;
        }
    } else {
        info!("File generated no snapshot: {}", file.path.display());
    }

    Ok(maybe_snapshot)
}

fn snapshot_lua_file<'source>(
    file: &'source ImfsFile,
) -> SnapshotResult<'source> {
    let file_stem = file.path
        .file_stem().expect("Could not extract file stem")
        .to_str().expect("Could not convert path to UTF-8");

    let (instance_name, class_name) = if let Some(name) = match_trailing(file_stem, ".server") {
        (name, "Script")
    } else if let Some(name) = match_trailing(file_stem, ".client") {
        (name, "LocalScript")
    } else {
        (file_stem, "ModuleScript")
    };

    let contents = str::from_utf8(&file.contents)
        .map_err(|inner| SnapshotError::Utf8Error {
            inner,
            path: file.path.to_path_buf(),
        })?;

    Ok(Some(RbxSnapshotInstance {
        name: Cow::Borrowed(instance_name),
        class_name: Cow::Borrowed(class_name),
        properties: hashmap! {
            "Source".to_owned() => RbxValue::String {
                value: contents.to_owned(),
            },
        },
        children: Vec::new(),
        metadata: MetadataPerInstance {
            source_path: Some(file.path.to_path_buf()),
            ignore_unknown_instances: false,
            project_definition: None,
        },
    }))
}

fn match_trailing<'a>(input: &'a str, trailer: &str) -> Option<&'a str> {
    if input.ends_with(trailer) {
        let end = input.len().saturating_sub(trailer.len());
        Some(&input[..end])
    } else {
        None
    }
}

fn snapshot_txt_file<'source>(
    file: &'source ImfsFile,
) -> SnapshotResult<'source> {
    let instance_name = file.path
        .file_stem().expect("Could not extract file stem")
        .to_str().expect("Could not convert path to UTF-8");

    let contents = str::from_utf8(&file.contents)
        .map_err(|inner| SnapshotError::Utf8Error {
            inner,
            path: file.path.to_path_buf(),
        })?;

    Ok(Some(RbxSnapshotInstance {
        name: Cow::Borrowed(instance_name),
        class_name: Cow::Borrowed("StringValue"),
        properties: hashmap! {
            "Value".to_owned() => RbxValue::String {
                value: contents.to_owned(),
            },
        },
        children: Vec::new(),
        metadata: MetadataPerInstance {
            source_path: Some(file.path.to_path_buf()),
            ignore_unknown_instances: false,
            project_definition: None,
        },
    }))
}

fn snapshot_csv_file<'source>(
    file: &'source ImfsFile,
) -> SnapshotResult<'source> {
    let instance_name = file.path
        .file_stem().expect("Could not extract file stem")
        .to_str().expect("Could not convert path to UTF-8");

    let entries: Vec<LocalizationEntryJson> = csv::Reader::from_reader(file.contents.as_slice())
        .deserialize()
        // TODO: Propagate error upward instead of panicking
        .map(|result| result.expect("Malformed localization table found!"))
        .map(LocalizationEntryCsv::to_json)
        .collect();

    let table_contents = serde_json::to_string(&entries)
        .expect("Could not encode JSON for localization table");

    Ok(Some(RbxSnapshotInstance {
        name: Cow::Borrowed(instance_name),
        class_name: Cow::Borrowed("LocalizationTable"),
        properties: hashmap! {
            "Contents".to_owned() => RbxValue::String {
                value: table_contents,
            },
        },
        children: Vec::new(),
        metadata: MetadataPerInstance {
            source_path: Some(file.path.to_path_buf()),
            ignore_unknown_instances: false,
            project_definition: None,
        },
    }))
}

#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "PascalCase")]
struct LocalizationEntryCsv {
    key: String,
    context: String,
    example: String,
    source: String,
    #[serde(flatten)]
    values: HashMap<String, String>,
}

impl LocalizationEntryCsv {
    fn to_json(self) -> LocalizationEntryJson {
        LocalizationEntryJson {
            key: self.key,
            context: self.context,
            example: self.example,
            source: self.source,
            values: self.values,
        }
    }
}

#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
struct LocalizationEntryJson {
    key: String,
    context: String,
    example: String,
    source: String,
    values: HashMap<String, String>,
}

fn snapshot_json_model_file<'source>(
    file: &'source ImfsFile,
) -> SnapshotResult<'source> {
    let contents = str::from_utf8(&file.contents)
        .map_err(|inner| SnapshotError::Utf8Error {
            inner,
            path: file.path.to_owned(),
        })?;

    let json_instance: JsonModelInstance = serde_json::from_str(contents)
        .map_err(|inner| SnapshotError::JsonModelDecodeError {
            inner,
            path: file.path.to_owned(),
        })?;

    let mut snapshot = json_instance.into_snapshot();
    snapshot.metadata.source_path = Some(file.path.to_owned());

    Ok(Some(snapshot))
}

#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "PascalCase")]
struct JsonModelInstance {
    name: String,
    class_name: String,

    #[serde(default = "Vec::new", skip_serializing_if = "Vec::is_empty")]
    children: Vec<JsonModelInstance>,

    #[serde(default = "HashMap::new", skip_serializing_if = "HashMap::is_empty")]
    properties: HashMap<String, RbxValue>,
}

impl JsonModelInstance {
    fn into_snapshot(mut self) -> RbxSnapshotInstance<'static> {
        let children = self.children
            .drain(..)
            .map(JsonModelInstance::into_snapshot)
            .collect();

        RbxSnapshotInstance {
            name: Cow::Owned(self.name),
            class_name: Cow::Owned(self.class_name),
            properties: self.properties,
            children,
            metadata: Default::default(),
        }
    }
}

fn snapshot_xml_model_file<'source>(
    file: &'source ImfsFile,
) -> SnapshotResult<'source> {
    let instance_name = file.path
        .file_stem().expect("Could not extract file stem")
        .to_str().expect("Could not convert path to UTF-8");

    let mut temp_tree = RbxTree::new(RbxInstanceProperties {
        name: "Temp".to_owned(),
        class_name: "Folder".to_owned(),
        properties: HashMap::new(),
    });

    let root_id = temp_tree.get_root_id();
    rbx_xml::decode(&mut temp_tree, root_id, file.contents.as_slice())
        .map_err(|inner| SnapshotError::XmlModelDecodeError {
            inner,
            path: file.path.clone(),
        })?;

    let root_instance = temp_tree.get_instance(root_id).unwrap();
    let children = root_instance.get_children_ids();

    match children.len() {
        0 => Ok(None),
        1 => {
            let mut snapshot = snapshot_from_tree(&temp_tree, children[0]).unwrap();
            snapshot.name = Cow::Borrowed(instance_name);
            Ok(Some(snapshot))
        },
        _ => panic!("Rojo doesn't have support for model files with multiple roots yet"),
    }
}

fn snapshot_binary_model_file<'source>(
    file: &'source ImfsFile,
) -> SnapshotResult<'source> {
    let instance_name = file.path
        .file_stem().expect("Could not extract file stem")
        .to_str().expect("Could not convert path to UTF-8");

    let mut temp_tree = RbxTree::new(RbxInstanceProperties {
        name: "Temp".to_owned(),
        class_name: "Folder".to_owned(),
        properties: HashMap::new(),
    });

    let root_id = temp_tree.get_root_id();
    rbx_binary::decode(&mut temp_tree, root_id, file.contents.as_slice())
        .map_err(|inner| SnapshotError::BinaryModelDecodeError {
            inner,
            path: file.path.clone(),
        })?;

    let root_instance = temp_tree.get_instance(root_id).unwrap();
    let children = root_instance.get_children_ids();

    match children.len() {
        0 => Ok(None),
        1 => {
            let mut snapshot = snapshot_from_tree(&temp_tree, children[0]).unwrap();
            snapshot.name = Cow::Borrowed(instance_name);
            Ok(Some(snapshot))
        },
        _ => panic!("Rojo doesn't have support for model files with multiple roots yet"),
    }
}

@@ -1,63 +0,0 @@
//! Interactions with Roblox Studio's installation, including its location and
//! mechanisms like PluginSettings.

#![allow(dead_code)]

use std::path::PathBuf;

#[cfg(all(not(debug_assertions), not(feature = "bundle-plugin")))]
compile_error!("`bundle-plugin` feature must be set for release builds.");

#[cfg(feature = "bundle-plugin")]
static PLUGIN_RBXM: &'static [u8] = include_bytes!("../target/plugin.rbxmx");

#[cfg(target_os = "windows")]
pub fn get_install_location() -> Option<PathBuf> {
    use std::env;

    let local_app_data = env::var("LocalAppData").ok()?;
    let mut location = PathBuf::from(local_app_data);

    location.push("Roblox");

    Some(location)
}

#[cfg(target_os = "macos")]
pub fn get_install_location() -> Option<PathBuf> {
    unimplemented!();
}

#[cfg(not(any(target_os = "windows", target_os = "macos")))]
pub fn get_install_location() -> Option<PathBuf> {
    // Roblox Studio doesn't install on any other platforms!
    None
}

pub fn get_plugin_location() -> Option<PathBuf> {
    let mut location = get_install_location()?;

    location.push("Plugins/Rojo.rbxmx");

    Some(location)
}

#[cfg(feature = "bundle-plugin")]
pub fn install_bundled_plugin() -> Option<()> {
    use std::fs::File;
    use std::io::Write;

    info!("Installing plugin...");

    let mut file = File::create(get_plugin_location()?).ok()?;
    file.write_all(PLUGIN_RBXM).ok()?;

    Some(())
}

#[cfg(not(feature = "bundle-plugin"))]
pub fn install_bundled_plugin() -> Option<()> {
    info!("Skipping plugin installation, bundle-plugin not set.");

    Some(())
}

server/src/snapshot_reconciler.rs (new file, 338 lines)
@@ -0,0 +338 @@
//! Defines the snapshot subsystem of Rojo, which defines a lightweight instance
//! representation (`RbxSnapshotInstance`) and a system to incrementally update
//! an `RbxTree` based on snapshots.

use std::{
    borrow::Cow,
    cmp::Ordering,
    collections::{HashMap, HashSet},
    fmt,
    str,
};

use rbx_tree::{RbxTree, RbxId, RbxInstanceProperties, RbxValue};
use serde_derive::{Serialize, Deserialize};

use crate::{
    path_map::PathMap,
    rbx_session::MetadataPerInstance,
};

/// Contains all of the IDs that were modified when the snapshot reconciler
/// applied an update.
#[derive(Debug, Clone, Default, Serialize, Deserialize)]
pub struct InstanceChanges {
    pub added: HashSet<RbxId>,
    pub removed: HashSet<RbxId>,
    pub updated: HashSet<RbxId>,
}

impl fmt::Display for InstanceChanges {
    fn fmt(&self, output: &mut fmt::Formatter) -> fmt::Result {
        writeln!(output, "InstanceChanges {{")?;

        if !self.added.is_empty() {
            writeln!(output, "    Added:")?;
            for id in &self.added {
                writeln!(output, "        {}", id)?;
            }
        }

        if !self.removed.is_empty() {
            writeln!(output, "    Removed:")?;
            for id in &self.removed {
                writeln!(output, "        {}", id)?;
            }
        }

        if !self.updated.is_empty() {
            writeln!(output, "    Updated:")?;
            for id in &self.updated {
                writeln!(output, "        {}", id)?;
            }
        }

        writeln!(output, "}}")
    }
}

impl InstanceChanges {
    pub fn is_empty(&self) -> bool {
        self.added.is_empty() && self.removed.is_empty() && self.updated.is_empty()
    }
}

/// A lightweight, hierarchical representation of an instance that can be
/// applied to the tree.
#[derive(Debug, PartialEq, Serialize, Deserialize)]
pub struct RbxSnapshotInstance<'a> {
    pub name: Cow<'a, str>,
    pub class_name: Cow<'a, str>,
    pub properties: HashMap<String, RbxValue>,
    pub children: Vec<RbxSnapshotInstance<'a>>,
    pub metadata: MetadataPerInstance,
}

impl<'a> PartialOrd for RbxSnapshotInstance<'a> {
    fn partial_cmp(&self, other: &RbxSnapshotInstance) -> Option<Ordering> {
        Some(self.name.cmp(&other.name)
            .then(self.class_name.cmp(&other.class_name)))
    }
}

/// Generates an `RbxSnapshotInstance` from an existing `RbxTree` and an ID to
/// use as the root of the snapshot.
///
/// This is used to transform instances created by rbx_xml and rbx_binary into
/// snapshots that can be applied to the tree to reduce instance churn.
pub fn snapshot_from_tree(tree: &RbxTree, id: RbxId) -> Option<RbxSnapshotInstance<'static>> {
    let instance = tree.get_instance(id)?;

    let mut children = Vec::new();
    for &child_id in instance.get_children_ids() {
        children.push(snapshot_from_tree(tree, child_id)?);
    }

    Some(RbxSnapshotInstance {
        name: Cow::Owned(instance.name.to_owned()),
        class_name: Cow::Owned(instance.class_name.to_owned()),
        properties: instance.properties.clone(),
        children,
        metadata: MetadataPerInstance {
            source_path: None,
            ignore_unknown_instances: false,
            project_definition: None,
        },
    })
}

/// Constructs a new `RbxTree` out of a snapshot and places to attach metadata.
pub fn reify_root(
    snapshot: &RbxSnapshotInstance,
    instance_per_path: &mut PathMap<HashSet<RbxId>>,
    metadata_per_instance: &mut HashMap<RbxId, MetadataPerInstance>,
    changes: &mut InstanceChanges,
) -> RbxTree {
    let instance = reify_core(snapshot);
    let mut tree = RbxTree::new(instance);
    let id = tree.get_root_id();

    reify_metadata(snapshot, id, instance_per_path, metadata_per_instance);

    changes.added.insert(id);

    for child in &snapshot.children {
        reify_subtree(child, &mut tree, id, instance_per_path, metadata_per_instance, changes);
    }

    tree
}

/// Adds instances to a portion of the given `RbxTree`, used for when new
/// instances are created.
pub fn reify_subtree(
    snapshot: &RbxSnapshotInstance,
    tree: &mut RbxTree,
    parent_id: RbxId,
    instance_per_path: &mut PathMap<HashSet<RbxId>>,
    metadata_per_instance: &mut HashMap<RbxId, MetadataPerInstance>,
    changes: &mut InstanceChanges,
) {
    let instance = reify_core(snapshot);
    let id = tree.insert_instance(instance, parent_id);

    reify_metadata(snapshot, id, instance_per_path, metadata_per_instance);

    changes.added.insert(id);

    for child in &snapshot.children {
        reify_subtree(child, tree, id, instance_per_path, metadata_per_instance, changes);
    }
}

fn reify_metadata(
    snapshot: &RbxSnapshotInstance,
    instance_id: RbxId,
    instance_per_path: &mut PathMap<HashSet<RbxId>>,
    metadata_per_instance: &mut HashMap<RbxId, MetadataPerInstance>,
) {
    if let Some(source_path) = &snapshot.metadata.source_path {
        let path_metadata = match instance_per_path.get_mut(&source_path) {
            Some(v) => v,
            None => {
                instance_per_path.insert(source_path.clone(), Default::default());
                instance_per_path.get_mut(&source_path).unwrap()
            },
        };

        path_metadata.insert(instance_id);
    }

    metadata_per_instance.insert(instance_id, snapshot.metadata.clone());
}

/// Updates existing instances in an existing `RbxTree`, potentially adding,
/// updating, or removing children and properties.
pub fn reconcile_subtree(
    tree: &mut RbxTree,
    id: RbxId,
    snapshot: &RbxSnapshotInstance,
    instance_per_path: &mut PathMap<HashSet<RbxId>>,
    metadata_per_instance: &mut HashMap<RbxId, MetadataPerInstance>,
    changes: &mut InstanceChanges,
) {
    reify_metadata(snapshot, id, instance_per_path, metadata_per_instance);

    if reconcile_instance_properties(tree.get_instance_mut(id).unwrap(), snapshot) {
        changes.updated.insert(id);
    }

    reconcile_instance_children(tree, id, snapshot, instance_per_path, metadata_per_instance, changes);
}

fn reify_core(snapshot: &RbxSnapshotInstance) -> RbxInstanceProperties {
    let mut properties = HashMap::new();

    for (key, value) in &snapshot.properties {
        properties.insert(key.clone(), value.clone());
    }

    let instance = RbxInstanceProperties {
        name: snapshot.name.to_string(),
        class_name: snapshot.class_name.to_string(),
        properties,
    };

    instance
}

fn reconcile_instance_properties(instance: &mut RbxInstanceProperties, snapshot: &RbxSnapshotInstance) -> bool {
    let mut has_diffs = false;

    if instance.name != snapshot.name {
        instance.name = snapshot.name.to_string();
        has_diffs = true;
    }

    if instance.class_name != snapshot.class_name {
        instance.class_name = snapshot.class_name.to_string();
        has_diffs = true;
    }

    let mut property_updates = HashMap::new();

    for (key, instance_value) in &instance.properties {
        match snapshot.properties.get(key) {
            Some(snapshot_value) => {
                if snapshot_value != instance_value {
                    property_updates.insert(key.clone(), Some(snapshot_value.clone()));
                }
            },
            None => {
                property_updates.insert(key.clone(), None);
            },
        }
    }

    for (key, snapshot_value) in &snapshot.properties {
        if property_updates.contains_key(key) {
            continue;
        }

        match instance.properties.get(key) {
            Some(instance_value) => {
                if snapshot_value != instance_value {
                    property_updates.insert(key.clone(), Some(snapshot_value.clone()));
                }
            },
            None => {
                property_updates.insert(key.clone(), Some(snapshot_value.clone()));
            },
        }
    }

    has_diffs = has_diffs || !property_updates.is_empty();

    for (key, change) in property_updates.drain() {
        match change {
            Some(value) => instance.properties.insert(key, value),
            None => instance.properties.remove(&key),
        };
    }

    has_diffs
}

fn reconcile_instance_children(
    tree: &mut RbxTree,
    id: RbxId,
    snapshot: &RbxSnapshotInstance,
    instance_per_path: &mut PathMap<HashSet<RbxId>>,
    metadata_per_instance: &mut HashMap<RbxId, MetadataPerInstance>,
    changes: &mut InstanceChanges,
) {
    let mut visited_snapshot_indices = HashSet::new();

    let mut children_to_update: Vec<(RbxId, &RbxSnapshotInstance)> = Vec::new();
    let mut children_to_add: Vec<&RbxSnapshotInstance> = Vec::new();
    let mut children_to_remove: Vec<RbxId> = Vec::new();

    let children_ids = tree.get_instance(id).unwrap().get_children_ids();

    // Find all instances that were removed or updated, which we derive by
    // trying to pair up existing instances to snapshots.
    for &child_id in children_ids {
        let child_instance = tree.get_instance(child_id).unwrap();

        // Locate a matching snapshot for this instance
        let mut matching_snapshot = None;
        for (snapshot_index, child_snapshot) in snapshot.children.iter().enumerate() {
            if visited_snapshot_indices.contains(&snapshot_index) {
                continue;
            }

            // We assume that instances with the same name are probably pretty
            // similar. This heuristic is similar to React's reconciliation
            // strategy.
            if child_snapshot.name == child_instance.name {
                visited_snapshot_indices.insert(snapshot_index);
                matching_snapshot = Some(child_snapshot);
                break;
            }
        }

        match matching_snapshot {
            Some(child_snapshot) => {
                children_to_update.push((child_instance.get_id(), child_snapshot));
            },
            None => {
                children_to_remove.push(child_instance.get_id());
            },
        }
    }
|
||||
|
||||
// Find all instancs that were added, which is just the snapshots we didn't
|
||||
// match up to existing instances above.
|
||||
for (snapshot_index, child_snapshot) in snapshot.children.iter().enumerate() {
|
||||
if !visited_snapshot_indices.contains(&snapshot_index) {
|
||||
children_to_add.push(child_snapshot);
|
||||
}
|
||||
}
|
||||
|
||||
for child_snapshot in &children_to_add {
|
||||
reify_subtree(child_snapshot, tree, id, instance_per_path, metadata_per_instance, changes);
|
||||
}
|
||||
|
||||
for child_id in &children_to_remove {
|
||||
if let Some(subtree) = tree.remove_instance(*child_id) {
|
||||
for id in subtree.iter_all_ids() {
|
||||
metadata_per_instance.remove(&id);
|
||||
changes.removed.insert(id);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
for (child_id, child_snapshot) in &children_to_update {
|
||||
reconcile_subtree(tree, *child_id, child_snapshot, instance_per_path, metadata_per_instance, changes);
|
||||
}
|
||||
}
|
||||
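The name-matching heuristic in `reconcile_instance_children` can be distilled into a standalone sketch. `pair_children` and the plain string names below are hypothetical stand-ins for the real instance and snapshot types:

```rust
use std::collections::HashSet;

/// Pair existing children against snapshot children by name; the first
/// unvisited snapshot with a matching name wins, mirroring React-style
/// reconciliation. Returns (pairs to update, indices to remove, indices to add).
fn pair_children(
    existing: &[&str],
    snapshots: &[&str],
) -> (Vec<(usize, usize)>, Vec<usize>, Vec<usize>) {
    let mut visited = HashSet::new();
    let mut to_update = Vec::new();
    let mut to_remove = Vec::new();

    for (i, name) in existing.iter().enumerate() {
        let mut matched = None;
        for (j, snapshot_name) in snapshots.iter().enumerate() {
            if !visited.contains(&j) && snapshot_name == name {
                visited.insert(j);
                matched = Some(j);
                break;
            }
        }

        match matched {
            // Same name on both sides: assume it's the same instance.
            Some(j) => to_update.push((i, j)),
            // No snapshot claimed this instance: it was removed.
            None => to_remove.push(i),
        }
    }

    // Snapshots that never paired up are additions.
    let to_add = (0..snapshots.len())
        .filter(|j| !visited.contains(j))
        .collect();

    (to_update, to_remove, to_add)
}

fn main() {
    // "B" exists on both sides; "A" was removed; "C" is new.
    let (update, remove, add) = pair_children(&["A", "B"], &["B", "C"]);
    assert_eq!(update, vec![(1, 0)]);
    assert_eq!(remove, vec![0]);
    assert_eq!(add, vec![1]);
}
```

Because pairing is purely by name, two same-named siblings can swap identities between reconciliations; the heuristic trades that edge case for simplicity.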
@@ -5,11 +5,13 @@ use std::{
process::{Command, Stdio},
};

use log::warn;
use rbx_tree::RbxId;

use crate::{
imfs::{Imfs, ImfsItem},
rbx_session::RbxSession,
web::PublicInstanceMetadata,
};

static GRAPHVIZ_HEADER: &str = r#"
@@ -25,13 +27,22 @@ digraph RojoTree {
];
"#;

pub fn graphviz_to_svg(source: &str) -> String {
let mut child = Command::new("dot")
/// Compiles DOT source to SVG by invoking dot on the command line.
pub fn graphviz_to_svg(source: &str) -> Option<String> {
let command = Command::new("dot")
.arg("-Tsvg")
.stdin(Stdio::piped())
.stdout(Stdio::piped())
.spawn()
.expect("Failed to spawn GraphViz process -- make sure it's installed in order to use /api/visualize");
.spawn();

let mut child = match command {
Ok(child) => child,
Err(_) => {
warn!("Failed to spawn GraphViz process to visualize current state.");
warn!("If you want pretty graphs, install GraphViz and make sure 'dot' is on your PATH!");
return None;
},
};

{
let stdin = child.stdin.as_mut().expect("Failed to open stdin");
@@ -39,9 +50,10 @@ pub fn graphviz_to_svg(source: &str) -> String {
}

let output = child.wait_with_output().expect("Failed to read stdout");
String::from_utf8(output.stdout).expect("Failed to parse stdout as UTF-8")
Some(String::from_utf8(output.stdout).expect("Failed to parse stdout as UTF-8"))
}
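The graceful-degradation pattern this change introduces (return `None` instead of panicking when the external binary is missing) can be sketched on its own. `run_filter` and its names are hypothetical, not Rojo's actual API:

```rust
use std::io::Write;
use std::process::{Command, Stdio};

/// Run an external filter program over `input`, returning None instead of
/// panicking when the binary is not installed. The real code invokes
/// GraphViz's `dot -Tsvg` this way.
fn run_filter(program: &str, args: &[&str], input: &str) -> Option<String> {
    let mut child = Command::new(program)
        .args(args)
        .stdin(Stdio::piped())
        .stdout(Stdio::piped())
        .spawn()
        .ok()?; // e.g. the binary is missing from PATH

    child
        .stdin
        .as_mut()?
        .write_all(input.as_bytes())
        .ok()?;

    // wait_with_output closes the child's stdin before waiting, so the
    // filter sees EOF and can finish.
    let output = child.wait_with_output().ok()?;
    String::from_utf8(output.stdout).ok()
}

fn main() {
    // A program name that should not exist anywhere:
    assert_eq!(run_filter("definitely-not-a-real-binary-xyz", &[], "hi"), None);

    // `cat` echoes stdin back on most Unix systems; skip silently if absent.
    if let Some(out) = run_filter("cat", &[], "hello") {
        assert_eq!(out, "hello");
    }
}
```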

/// A Display wrapper struct to visualize an RbxSession as SVG.
pub struct VisualizeRbxSession<'a>(pub &'a RbxSession);

impl<'a> fmt::Display for VisualizeRbxSession<'a> {
@@ -61,9 +73,10 @@ fn visualize_rbx_node(session: &RbxSession, id: RbxId, output: &mut fmt::Formatt

let mut node_label = format!("{}|{}|{}", node.name, node.class_name, id);

if let Some(metadata) = session.get_instance_metadata(id) {
if let Some(session_metadata) = session.get_instance_metadata(id) {
let metadata = PublicInstanceMetadata::from_session_metadata(session_metadata);
node_label.push('|');
node_label.push_str(&serde_json::to_string(metadata).unwrap());
node_label.push_str(&serde_json::to_string(&metadata).unwrap());
}

node_label = node_label
@@ -81,6 +94,7 @@ fn visualize_rbx_node(session: &RbxSession, id: RbxId, output: &mut fmt::Formatt
Ok(())
}

/// A Display wrapper struct to visualize an Imfs as SVG.
pub struct VisualizeImfs<'a>(pub &'a Imfs);

impl<'a> fmt::Display for VisualizeImfs<'a> {
@@ -1,9 +1,14 @@
//! Defines Rojo's web interface that all clients use to communicate with a
//! running live-sync session.

use std::{
borrow::Cow,
collections::{HashMap, HashSet},
sync::{mpsc, Arc},
};

use serde_derive::{Serialize, Deserialize};
use log::trace;
use rouille::{
self,
router,
@@ -13,13 +18,30 @@ use rouille::{
use rbx_tree::{RbxId, RbxInstance};

use crate::{
session::Session,
live_session::LiveSession,
session_id::SessionId,
project::InstanceProjectNodeMetadata,
rbx_snapshot::InstanceChanges,
snapshot_reconciler::InstanceChanges,
visualize::{VisualizeRbxSession, VisualizeImfs, graphviz_to_svg},
rbx_session::{MetadataPerInstance},
};

static HOME_CONTENT: &str = include_str!("../assets/index.html");

/// Contains the instance metadata relevant to Rojo clients.
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
pub struct PublicInstanceMetadata {
ignore_unknown_instances: bool,
}

impl PublicInstanceMetadata {
pub fn from_session_metadata(meta: &MetadataPerInstance) -> PublicInstanceMetadata {
PublicInstanceMetadata {
ignore_unknown_instances: meta.ignore_unknown_instances,
}
}
}

/// Used to attach metadata specific to Rojo to instances, which come from the
/// rbx_tree crate.
///
@@ -31,7 +53,7 @@ pub struct InstanceWithMetadata<'a> {
pub instance: Cow<'a, RbxInstance>,

#[serde(rename = "Metadata")]
pub metadata: Option<Cow<'a, InstanceProjectNodeMetadata>>,
pub metadata: Option<PublicInstanceMetadata>,
}

#[derive(Debug, Serialize, Deserialize)]
@@ -61,14 +83,14 @@ pub struct SubscribeResponse<'a> {
}

pub struct Server {
session: Arc<Session>,
live_session: Arc<LiveSession>,
server_version: &'static str,
}

impl Server {
pub fn new(session: Arc<Session>) -> Server {
pub fn new(live_session: Arc<LiveSession>) -> Server {
Server {
session,
live_session,
server_version: env!("CARGO_PKG_VERSION"),
}
}
@@ -79,136 +101,28 @@ impl Server {

router!(request,
(GET) (/) => {
Response::text("Rojo is up and running!")
self.handle_home()
},

(GET) (/api/rojo) => {
// Get a summary of information about the server.

let rbx_session = self.session.rbx_session.lock().unwrap();
let tree = rbx_session.get_tree();

Response::json(&ServerInfoResponse {
server_version: self.server_version,
protocol_version: 2,
session_id: self.session.session_id,
expected_place_ids: self.session.project.serve_place_ids.clone(),
root_instance_id: tree.get_root_id(),
})
self.handle_api_rojo()
},

(GET) (/api/subscribe/{ cursor: u32 }) => {
// Retrieve any messages past the given cursor index, and if
// there weren't any, subscribe to receive any new messages.

let message_queue = Arc::clone(&self.session.message_queue);

// Did the client miss any messages since the last subscribe?
{
let (new_cursor, new_messages) = message_queue.get_messages_since(cursor);

if !new_messages.is_empty() {
return Response::json(&SubscribeResponse {
session_id: self.session.session_id,
messages: Cow::Borrowed(&new_messages),
message_cursor: new_cursor,
})
}
}

let (tx, rx) = mpsc::channel();

let sender_id = message_queue.subscribe(tx);

match rx.recv() {
Ok(_) => (),
Err(_) => return Response::text("error!").with_status_code(500),
}

message_queue.unsubscribe(sender_id);

{
let (new_cursor, new_messages) = message_queue.get_messages_since(cursor);

return Response::json(&SubscribeResponse {
session_id: self.session.session_id,
messages: Cow::Owned(new_messages),
message_cursor: new_cursor,
})
}
self.handle_api_subscribe(cursor)
},

(GET) (/api/read/{ id_list: String }) => {
let message_queue = Arc::clone(&self.session.message_queue);

let requested_ids: Option<Vec<RbxId>> = id_list
.split(',')
.map(RbxId::parse_str)
.collect();

let requested_ids = match requested_ids {
Some(id) => id,
None => return rouille::Response::text("Malformed ID list").with_status_code(400),
};

let rbx_session = self.session.rbx_session.lock().unwrap();
let tree = rbx_session.get_tree();

let message_cursor = message_queue.get_message_cursor();

let mut instances = HashMap::new();

for &requested_id in &requested_ids {
if let Some(instance) = tree.get_instance(requested_id) {
let metadata = rbx_session.get_instance_metadata(requested_id)
.map(Cow::Borrowed);

instances.insert(instance.get_id(), InstanceWithMetadata {
instance: Cow::Borrowed(instance),
metadata,
});

for descendant in tree.descendants(requested_id) {
let descendant_meta = rbx_session.get_instance_metadata(descendant.get_id())
.map(Cow::Borrowed);

instances.insert(descendant.get_id(), InstanceWithMetadata {
instance: Cow::Borrowed(descendant),
metadata: descendant_meta,
});
}
}
}

Response::json(&ReadResponse {
session_id: self.session.session_id,
message_cursor,
instances,
})
self.handle_api_read(requested_ids)
},

(GET) (/visualize/rbx) => {
let rbx_session = self.session.rbx_session.lock().unwrap();

let dot_source = format!("{}", VisualizeRbxSession(&rbx_session));

Response::svg(graphviz_to_svg(&dot_source))
self.handle_visualize_rbx()
},

(GET) (/visualize/imfs) => {
let imfs = self.session.imfs.lock().unwrap();

let dot_source = format!("{}", VisualizeImfs(&imfs));

Response::svg(graphviz_to_svg(&dot_source))
self.handle_visualize_imfs()
},

(GET) (/visualize/path_map) => {
let rbx_session = self.session.rbx_session.lock().unwrap();

Response::json(&rbx_session.debug_get_path_map())
},

_ => Response::empty_404()
)
}
@@ -218,4 +132,126 @@ impl Server {

rouille::start_server(address, move |request| self.handle_request(request));
}

fn handle_home(&self) -> Response {
Response::html(HOME_CONTENT)
}

/// Get a summary of information about the server
fn handle_api_rojo(&self) -> Response {
let rbx_session = self.live_session.rbx_session.lock().unwrap();
let tree = rbx_session.get_tree();

Response::json(&ServerInfoResponse {
server_version: self.server_version,
protocol_version: 2,
session_id: self.live_session.session_id,
expected_place_ids: self.live_session.project.serve_place_ids.clone(),
root_instance_id: tree.get_root_id(),
})
}

/// Retrieve any messages past the given cursor index, and if
/// there weren't any, subscribe to receive any new messages.
fn handle_api_subscribe(&self, cursor: u32) -> Response {
let message_queue = Arc::clone(&self.live_session.message_queue);

// Did the client miss any messages since the last subscribe?
{
let (new_cursor, new_messages) = message_queue.get_messages_since(cursor);

if !new_messages.is_empty() {
return Response::json(&SubscribeResponse {
session_id: self.live_session.session_id,
messages: Cow::Borrowed(&new_messages),
message_cursor: new_cursor,
})
}
}

let (tx, rx) = mpsc::channel();

let sender_id = message_queue.subscribe(tx);

match rx.recv() {
Ok(_) => (),
Err(_) => return Response::text("error!").with_status_code(500),
}

message_queue.unsubscribe(sender_id);

{
let (new_cursor, new_messages) = message_queue.get_messages_since(cursor);

return Response::json(&SubscribeResponse {
session_id: self.live_session.session_id,
messages: Cow::Owned(new_messages),
message_cursor: new_cursor,
})
}
}
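The subscribe endpoint's long-polling scheme rests on a cursor that is just an index into an append-only message log. A minimal sketch of that idea follows; `MessageQueue` here is a toy stand-in, not Rojo's real message queue API:

```rust
use std::sync::Mutex;

/// Toy cursor-based message queue: the cursor a client holds is simply how
/// many messages it has already seen.
struct MessageQueue {
    messages: Mutex<Vec<String>>,
}

impl MessageQueue {
    fn new() -> Self {
        MessageQueue { messages: Mutex::new(Vec::new()) }
    }

    fn push(&self, message: String) {
        self.messages.lock().unwrap().push(message);
    }

    /// Returns every message past `cursor`, plus the new cursor value.
    fn get_messages_since(&self, cursor: u32) -> (u32, Vec<String>) {
        let messages = self.messages.lock().unwrap();
        let new = messages
            .get(cursor as usize..)
            .unwrap_or(&[])
            .to_vec();
        (messages.len() as u32, new)
    }
}

fn main() {
    let queue = MessageQueue::new();
    queue.push("a".to_string());
    queue.push("b".to_string());

    // A client that last saw cursor 0 missed both messages.
    let (cursor, missed) = queue.get_messages_since(0);
    assert_eq!(cursor, 2);
    assert_eq!(missed, vec!["a".to_string(), "b".to_string()]);

    // A client already at the latest cursor sees nothing new; the real
    // server would then subscribe and block until the next push.
    let (_, none_new) = queue.get_messages_since(cursor);
    assert!(none_new.is_empty());
}
```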

fn handle_api_read(&self, requested_ids: Option<Vec<RbxId>>) -> Response {
let message_queue = Arc::clone(&self.live_session.message_queue);

let requested_ids = match requested_ids {
Some(id) => id,
None => return rouille::Response::text("Malformed ID list").with_status_code(400),
};

let rbx_session = self.live_session.rbx_session.lock().unwrap();
let tree = rbx_session.get_tree();

let message_cursor = message_queue.get_message_cursor();

let mut instances = HashMap::new();

for &requested_id in &requested_ids {
if let Some(instance) = tree.get_instance(requested_id) {
let metadata = rbx_session.get_instance_metadata(requested_id)
.map(PublicInstanceMetadata::from_session_metadata);

instances.insert(instance.get_id(), InstanceWithMetadata {
instance: Cow::Borrowed(instance),
metadata,
});

for descendant in tree.descendants(requested_id) {
let descendant_meta = rbx_session.get_instance_metadata(descendant.get_id())
.map(PublicInstanceMetadata::from_session_metadata);

instances.insert(descendant.get_id(), InstanceWithMetadata {
instance: Cow::Borrowed(descendant),
metadata: descendant_meta,
});
}
}
}

Response::json(&ReadResponse {
session_id: self.live_session.session_id,
message_cursor,
instances,
})
}

fn handle_visualize_rbx(&self) -> Response {
let rbx_session = self.live_session.rbx_session.lock().unwrap();
let dot_source = format!("{}", VisualizeRbxSession(&rbx_session));

match graphviz_to_svg(&dot_source) {
Some(svg) => Response::svg(svg),
None => Response::text(dot_source),
}
}

fn handle_visualize_imfs(&self) -> Response {
let imfs = self.live_session.imfs.lock().unwrap();
let dot_source = format!("{}", VisualizeImfs(&imfs));

match graphviz_to_svg(&dot_source) {
Some(svg) => Response::svg(svg),
None => Response::text(dot_source),
}
}
}
@@ -1,18 +0,0 @@
#!/bin/sh

set -e

if [ ! -d "../test-projects/$1" ]
then
    echo "Pick a project that exists!"
    exit 1
fi

if [ -d "scratch" ]
then
    rm -rf scratch
fi

mkdir -p scratch
cp -r "../test-projects/$1" scratch
cargo run -- serve "scratch/$1"
@@ -1,10 +1,10 @@
use std::{
collections::{HashMap, HashSet},
io,
fs,
path::PathBuf,
};

use failure::Error;
use tempfile::{TempDir, tempdir};

use librojo::{
@@ -19,7 +19,7 @@ enum FsEvent {
Moved(PathBuf, PathBuf),
}

fn send_events(imfs: &mut Imfs, events: &[FsEvent]) -> io::Result<()> {
fn send_events(imfs: &mut Imfs, events: &[FsEvent]) -> Result<(), Error> {
for event in events {
match event {
FsEvent::Created(path) => imfs.path_created(path)?,
@@ -56,7 +56,7 @@ fn check_expected(real: &Imfs, expected: &ExpectedImfs) {
}
}

fn base_tree() -> io::Result<(TempDir, Imfs, ExpectedImfs, TestResources)> {
fn base_tree() -> Result<(TempDir, Imfs, ExpectedImfs, TestResources), Error> {
let root = tempdir()?;

let foo_path = root.path().join("foo");
@@ -125,7 +125,7 @@ fn base_tree() -> io::Result<(TempDir, Imfs, ExpectedImfs, TestResources)> {
}

#[test]
fn initial_read() -> io::Result<()> {
fn initial_read() -> Result<(), Error> {
let (_root, imfs, expected_imfs, _resources) = base_tree()?;

check_expected(&imfs, &expected_imfs);
@@ -134,7 +134,7 @@ fn initial_read() -> io::Result<()> {
}

#[test]
fn adding_files() -> io::Result<()> {
fn adding_files() -> Result<(), Error> {
let (root, mut imfs, mut expected_imfs, resources) = base_tree()?;

check_expected(&imfs, &expected_imfs);
@@ -178,7 +178,7 @@ fn adding_files() -> io::Result<()> {
}

#[test]
fn adding_folder() -> io::Result<()> {
fn adding_folder() -> Result<(), Error> {
let (root, imfs, mut expected_imfs, _resources) = base_tree()?;

check_expected(&imfs, &expected_imfs);
@@ -232,6 +232,16 @@ fn adding_folder() -> io::Result<()> {
FsEvent::Created(file1_path.clone()),
FsEvent::Created(file2_path.clone()),
],
vec![
FsEvent::Created(file1_path.clone()),
FsEvent::Created(file2_path.clone()),
FsEvent::Created(folder_path.clone()),
],
vec![
FsEvent::Created(file1_path.clone()),
FsEvent::Created(folder_path.clone()),
FsEvent::Created(file2_path.clone()),
],
];

for events in &possible_event_sequences {
@@ -245,7 +255,36 @@ fn adding_folder() -> io::Result<()> {
}

#[test]
fn removing_file() -> io::Result<()> {
fn updating_files() -> Result<(), Error> {
let (_root, mut imfs, mut expected_imfs, resources) = base_tree()?;

check_expected(&imfs, &expected_imfs);

fs::write(&resources.bar_path, b"bar updated")?;
fs::write(&resources.baz_path, b"baz updated")?;

imfs.path_updated(&resources.bar_path)?;
imfs.path_updated(&resources.baz_path)?;

let bar_updated_item = ImfsItem::File(ImfsFile {
path: resources.bar_path.clone(),
contents: b"bar updated".to_vec(),
});
let baz_updated_item = ImfsItem::File(ImfsFile {
path: resources.baz_path.clone(),
contents: b"baz updated".to_vec(),
});

expected_imfs.items.insert(resources.bar_path.clone(), bar_updated_item);
expected_imfs.items.insert(resources.baz_path.clone(), baz_updated_item);

check_expected(&imfs, &expected_imfs);

Ok(())
}

#[test]
fn removing_file() -> Result<(), Error> {
let (root, mut imfs, mut expected_imfs, resources) = base_tree()?;

check_expected(&imfs, &expected_imfs);
@@ -269,7 +308,7 @@ fn removing_file() -> io::Result<()> {
}

#[test]
fn removing_folder() -> io::Result<()> {
fn removing_folder() -> Result<(), Error> {
let (root, imfs, mut expected_imfs, resources) = base_tree()?;

check_expected(&imfs, &expected_imfs);
@@ -294,6 +333,10 @@ fn removing_folder() -> io::Result<()> {
FsEvent::Removed(resources.baz_path.clone()),
FsEvent::Removed(resources.foo_path.clone()),
],
vec![
FsEvent::Removed(resources.foo_path.clone()),
FsEvent::Removed(resources.baz_path.clone()),
],
];

for events in &possible_event_sequences {
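These tests all follow one pattern: feed synthetic filesystem events into the in-memory filesystem (Imfs), then compare it against a hand-built expected state. A minimal sketch of that pattern, with a toy HashMap-backed store standing in for the real Imfs and FsEvent types:

```rust
use std::collections::HashMap;
use std::path::PathBuf;

/// Toy stand-in for the real Imfs: a map from path to file contents,
/// mutated by created/updated/removed events.
#[derive(Debug, Default, PartialEq)]
struct ToyImfs {
    items: HashMap<PathBuf, Vec<u8>>,
}

enum FsEvent {
    Created(PathBuf, Vec<u8>),
    Updated(PathBuf, Vec<u8>),
    Removed(PathBuf),
}

impl ToyImfs {
    fn apply(&mut self, event: FsEvent) {
        match event {
            // Create and update both just overwrite the stored contents.
            FsEvent::Created(path, contents) | FsEvent::Updated(path, contents) => {
                self.items.insert(path, contents);
            }
            FsEvent::Removed(path) => {
                self.items.remove(&path);
            }
        }
    }
}

fn main() {
    let mut imfs = ToyImfs::default();
    imfs.apply(FsEvent::Created("foo/bar.txt".into(), b"bar".to_vec()));
    imfs.apply(FsEvent::Updated("foo/bar.txt".into(), b"bar updated".to_vec()));

    // Compare the whole state against an expected snapshot, like
    // check_expected does in the real tests.
    let mut expected = ToyImfs::default();
    expected.items.insert("foo/bar.txt".into(), b"bar updated".to_vec());
    assert_eq!(imfs, expected);

    imfs.apply(FsEvent::Removed("foo/bar.txt".into()));
    assert!(imfs.items.is_empty());
}
```

Running every permutation of an event sequence, as `possible_event_sequences` does above, checks that the final state is order-independent.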
@@ -1,16 +1,15 @@
#[macro_use] extern crate lazy_static;

extern crate librojo;

use std::{
collections::HashMap,
path::{Path, PathBuf},
};

use pretty_assertions::assert_eq;
use rbx_tree::RbxValue;

use librojo::{
project::{Project, ProjectNode, InstanceProjectNode, SyncPointProjectNode},
project::{Project, ProjectNode},
};

lazy_static! {
@@ -21,7 +20,7 @@ lazy_static! {

#[test]
fn empty() {
let project_file_location = TEST_PROJECTS_ROOT.join("empty/roblox-project.json");
let project_file_location = TEST_PROJECTS_ROOT.join("empty/default.project.json");
let project = Project::load_exact(&project_file_location).unwrap();

assert_eq!(project.name, "empty");
@@ -29,7 +28,7 @@ fn empty() {

#[test]
fn empty_fuzzy_file() {
let project_file_location = TEST_PROJECTS_ROOT.join("empty/roblox-project.json");
let project_file_location = TEST_PROJECTS_ROOT.join("empty/default.project.json");
let project = Project::load_fuzzy(&project_file_location).unwrap();

assert_eq!(project.name, "empty");
@@ -44,54 +43,52 @@ fn empty_fuzzy_folder() {
}

#[test]
fn single_sync_point() {
let project_file_location = TEST_PROJECTS_ROOT.join("single-sync-point/roblox-project.json");
let project = Project::load_exact(&project_file_location).unwrap();
fn single_partition_game() {
let project_location = TEST_PROJECTS_ROOT.join("single_partition_game");
let project = Project::load_fuzzy(&project_location).unwrap();

let expected_project = {
let foo = ProjectNode::SyncPoint(SyncPointProjectNode {
path: project_file_location.parent().unwrap().join("lib"),
});
let foo = ProjectNode {
path: Some(project_location.join("lib")),
..Default::default()
};

let mut replicated_storage_children = HashMap::new();
replicated_storage_children.insert("Foo".to_string(), foo);

let replicated_storage = ProjectNode::Instance(InstanceProjectNode {
class_name: "ReplicatedStorage".to_string(),
let replicated_storage = ProjectNode {
class_name: Some(String::from("ReplicatedStorage")),
children: replicated_storage_children,
properties: HashMap::new(),
metadata: Default::default(),
});
..Default::default()
};

let mut http_service_properties = HashMap::new();
http_service_properties.insert("HttpEnabled".to_string(), RbxValue::Bool {
value: true,
});

let http_service = ProjectNode::Instance(InstanceProjectNode {
class_name: "HttpService".to_string(),
children: HashMap::new(),
let http_service = ProjectNode {
class_name: Some(String::from("HttpService")),
properties: http_service_properties,
metadata: Default::default(),
});
..Default::default()
};

let mut root_children = HashMap::new();
root_children.insert("ReplicatedStorage".to_string(), replicated_storage);
root_children.insert("HttpService".to_string(), http_service);

let root_node = ProjectNode::Instance(InstanceProjectNode {
class_name: "DataModel".to_string(),
let root_node = ProjectNode {
class_name: Some(String::from("DataModel")),
children: root_children,
properties: HashMap::new(),
metadata: Default::default(),
});
..Default::default()
};

Project {
name: "single-sync-point".to_string(),
tree: root_node,
serve_port: None,
serve_place_ids: None,
file_location: project_file_location.clone(),
file_location: project_location.join("default.project.json"),
}
};

@@ -99,9 +96,17 @@ fn single_sync_point() {
}

#[test]
fn test_model() {
let project_file_location = TEST_PROJECTS_ROOT.join("test-model/roblox-project.json");
let project = Project::load_exact(&project_file_location).unwrap();
fn single_partition_model() {
let project_file_location = TEST_PROJECTS_ROOT.join("single_partition_model");
let project = Project::load_fuzzy(&project_file_location).unwrap();

assert_eq!(project.name, "test-model");
}

#[test]
fn composing_models() {
let project_file_location = TEST_PROJECTS_ROOT.join("composing_models");
let project = Project::load_fuzzy(&project_file_location).unwrap();

assert_eq!(project.name, "composing-models");
}
server/tests/snapshots.rs (new file, 124 lines)
@@ -0,0 +1,124 @@
use std::{
    fs::{self, File},
    path::{Path, PathBuf},
};

use pretty_assertions::assert_eq;

use librojo::{
    imfs::Imfs,
    project::{Project, ProjectNode},
    rbx_snapshot::snapshot_project_tree,
    snapshot_reconciler::{RbxSnapshotInstance},
};

macro_rules! generate_snapshot_tests {
    ($($name: ident),*) => {
        $(
            paste::item! {
                #[test]
                fn [<snapshot_ $name>]() {
                    let tests_folder = Path::new(env!("CARGO_MANIFEST_DIR")).join("../test-projects");
                    let project_folder = tests_folder.join(stringify!($name));
                    run_snapshot_test(&project_folder);
                }
            }
        )*
    };
}

generate_snapshot_tests!(
    empty,
    nested_partitions,
    single_partition_game,
    single_partition_model,
    transmute_partition
);
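The macro leans on the `paste` crate because `macro_rules!` cannot natively concatenate identifiers like `snapshot_` and `empty` into `snapshot_empty`. A reduced sketch of the same generate-one-function-per-name pattern, using caller-supplied names directly instead of concatenation (the names and values here are illustrative):

```rust
// One function is generated per `name => value` pair handed to the macro,
// mirroring how generate_snapshot_tests! stamps out one #[test] per project.
macro_rules! generate_checks {
    ($($name:ident => $value:expr),* $(,)?) => {
        $(
            fn $name() -> i32 {
                $value
            }
        )*
    };
}

generate_checks!(
    empty => 0,
    nested_partitions => 2,
);

fn main() {
    assert_eq!(empty(), 0);
    assert_eq!(nested_partitions(), 2);
}
```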

const SNAPSHOT_EXPECTED_NAME: &str = "expected-snapshot.json";

fn run_snapshot_test(path: &Path) {
    println!("Running snapshot from project: {}", path.display());

    let project = Project::load_fuzzy(path)
        .expect("Couldn't load project file for snapshot test");

    let mut imfs = Imfs::new();
    imfs.add_roots_from_project(&project)
        .expect("Could not add IMFS roots to snapshot project");

    let mut snapshot = snapshot_project_tree(&imfs, &project)
        .expect("Could not generate snapshot for snapshot test");

    if let Some(snapshot) = snapshot.as_mut() {
        anonymize_snapshot(path, snapshot);
    }

    match read_expected_snapshot(path) {
        Some(expected_snapshot) => assert_eq!(snapshot, expected_snapshot),
        None => write_expected_snapshot(path, &snapshot),
    }
}

/// Snapshots contain absolute paths, which simplifies much of Rojo.
///
/// For saving snapshots to the disk, we should strip off the project folder
/// path to make them machine-independent. This doesn't work for paths that fall
/// outside of the project folder, but that's okay here.
///
/// We also need to sort children, since Rojo tends to enumerate the filesystem
/// in an unpredictable order.
fn anonymize_snapshot(project_folder_path: &Path, snapshot: &mut RbxSnapshotInstance) {
    match snapshot.metadata.source_path.as_mut() {
        Some(path) => *path = anonymize_path(project_folder_path, path),
        None => {},
    }

    match snapshot.metadata.project_definition.as_mut() {
        Some((_, project_node)) => anonymize_project_node(project_folder_path, project_node),
        None => {},
    }

    snapshot.children.sort_by(|a, b| a.partial_cmp(b).unwrap());

    for child in snapshot.children.iter_mut() {
        anonymize_snapshot(project_folder_path, child);
    }
}

fn anonymize_project_node(project_folder_path: &Path, project_node: &mut ProjectNode) {
    match project_node.path.as_mut() {
        Some(path) => *path = anonymize_path(project_folder_path, path),
        None => {},
    }

    for child_node in project_node.children.values_mut() {
        anonymize_project_node(project_folder_path, child_node);
    }
}

fn anonymize_path(project_folder_path: &Path, path: &Path) -> PathBuf {
    if path.is_absolute() {
        path.strip_prefix(project_folder_path)
            .expect("Could not anonymize absolute path")
            .to_path_buf()
    } else {
        path.to_path_buf()
    }
}
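The anonymization step boils down to `Path::strip_prefix` on absolute paths. A small usage sketch of that behavior (paths here are illustrative Unix paths):

```rust
use std::path::{Path, PathBuf};

/// Absolute paths inside the project folder become relative; everything
/// else passes through unchanged, matching anonymize_path above.
fn make_machine_independent(project_root: &Path, path: &Path) -> PathBuf {
    if path.is_absolute() {
        path.strip_prefix(project_root)
            .expect("path should live inside the project folder")
            .to_path_buf()
    } else {
        path.to_path_buf()
    }
}

fn main() {
    let root = Path::new("/home/user/project");

    // An absolute path under the project root is relativized.
    assert_eq!(
        make_machine_independent(root, Path::new("/home/user/project/src/init.lua")),
        PathBuf::from("src/init.lua")
    );

    // Relative paths are already machine-independent.
    assert_eq!(
        make_machine_independent(root, Path::new("src/init.lua")),
        PathBuf::from("src/init.lua")
    );
}
```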

fn read_expected_snapshot(path: &Path) -> Option<Option<RbxSnapshotInstance<'static>>> {
    let contents = fs::read(path.join(SNAPSHOT_EXPECTED_NAME)).ok()?;
    let snapshot: Option<RbxSnapshotInstance<'static>> = serde_json::from_slice(&contents)
        .expect("Could not deserialize snapshot");

    Some(snapshot)
}

fn write_expected_snapshot(path: &Path, snapshot: &Option<RbxSnapshotInstance>) {
    let mut file = File::create(path.join(SNAPSHOT_EXPECTED_NAME))
        .expect("Could not open file to write snapshot");

    serde_json::to_writer_pretty(&mut file, snapshot)
        .expect("Could not serialize snapshot to file");
}
test-projects/composing_models/src/Remotes.model.json (new file, 14 lines)
@@ -0,0 +1,14 @@
{
  "Name": "All my Remote Events",
  "ClassName": "Folder",
  "Children": [
    {
      "Name": "SendMoney",
      "ClassName": "RemoteEvent"
    },
    {
      "Name": "SendItems",
      "ClassName": "RemoteEvent"
    }
  ]
}
@@ -5,12 +5,10 @@
<Item class="Script" referent="RBX634A9A9988354E4B9D971B2A4DEBD26E">
<Properties>
<bool name="Disabled">false</bool>
<Content name="LinkedSource"><null></null></Content>
<string name="Name">Lone Script</string>
<string name="ScriptGuid">{C62CD9FB-FF28-4FD9-9712-AD28A1E92C84}</string>
<ProtectedString name="Source"><![CDATA[print("Hello world!")
]]></ProtectedString>
<BinaryString name="Tags"></BinaryString>
<string name="Source"><![CDATA[print("Hello world!")
]]></string>
</Properties>
</Item>
</roblox>
test-projects/empty/expected-snapshot.json (new file, 20 lines)
@@ -0,0 +1,20 @@
{
  "name": "empty",
  "class_name": "DataModel",
  "properties": {},
  "children": [],
  "metadata": {
    "ignore_unknown_instances": true,
    "source_path": null,
    "project_definition": [
      "empty",
      {
        "class_name": "DataModel",
        "children": {},
        "properties": {},
        "ignore_unknown_instances": null,
        "path": null
      }
    ]
  }
}
test-projects/missing_files/default.project.json (new file, 6 lines)
@@ -0,0 +1,6 @@
{
  "name": "missing-files",
  "tree": {
    "$path": "does-not-exist"
  }
}
test-projects/nested_partitions/default.project.json (new file, 9 lines)
@@ -0,0 +1,9 @@
{
  "name": "nested-partitions",
  "tree": {
    "$path": "outer",
    "inner": {
      "$path": "inner"
    }
  }
}
test-projects/nested_partitions/expected-snapshot.json (new file, 82 lines)
@@ -0,0 +1,82 @@
{
  "name": "nested-partitions",
  "class_name": "Folder",
  "properties": {},
  "children": [
    {
      "name": "inner",
      "class_name": "Folder",
      "properties": {},
      "children": [
        {
          "name": "hello",
          "class_name": "ModuleScript",
          "properties": {
            "Source": {
              "Type": "String",
              "Value": "-- inner/hello.lua"
            }
          },
          "children": [],
          "metadata": {
            "ignore_unknown_instances": false,
            "source_path": "inner/hello.lua",
            "project_definition": null
          }
        }
      ],
      "metadata": {
        "ignore_unknown_instances": false,
        "source_path": "inner",
        "project_definition": [
          "inner",
          {
            "class_name": null,
            "children": {},
            "properties": {},
            "ignore_unknown_instances": null,
            "path": "inner"
          }
        ]
      }
    },
    {
      "name": "world",
      "class_name": "ModuleScript",
      "properties": {
        "Source": {
          "Type": "String",
          "Value": "-- outer/world.lua"
        }
      },
      "children": [],
      "metadata": {
        "ignore_unknown_instances": false,
        "source_path": "outer/world.lua",
        "project_definition": null
      }
    }
  ],
  "metadata": {
    "ignore_unknown_instances": false,
    "source_path": "outer",
    "project_definition": [
      "nested-partitions",
      {
        "class_name": null,
        "children": {
          "inner": {
            "class_name": null,
            "children": {},
            "properties": {},
            "ignore_unknown_instances": null,
            "path": "inner"
          }
        },
        "properties": {},
        "ignore_unknown_instances": null,
        "path": "outer"
      }
    ]
  }
}
test-projects/nested_partitions/inner/hello.lua (Normal file, 1 line)
@@ -0,0 +1 @@
-- inner/hello.lua
test-projects/nested_partitions/outer/world.lua (Normal file, 1 line)
@@ -0,0 +1 @@
-- outer/world.lua
@@ -1,6 +0,0 @@
Key,Context,Example,Source,es-es,de
,ClickableGroup:BuilderGui:TextLabel,You got 22 hearts!,You got {1} hearts!,,
,,"Team ""Red"" wins!","Team ""{1}"" wins!","¡Gana el equipo ""{1}""!","¡Gana el equipo ""{1}""!"
,Frame:TextLabel,,"{1} killed {2}, with a {3}","{1} mató a {2} con
una escopeta","{1} mató a {2} con
una escopeta"
test-projects/single_partition_game/expected-snapshot.json (Normal file, 161 lines)
@@ -0,0 +1,161 @@
{
  "name": "single-sync-point",
  "class_name": "DataModel",
  "properties": {},
  "children": [
    {
      "name": "HttpService",
      "class_name": "HttpService",
      "properties": {
        "HttpEnabled": {
          "Type": "Bool",
          "Value": true
        }
      },
      "children": [],
      "metadata": {
        "ignore_unknown_instances": true,
        "source_path": null,
        "project_definition": [
          "HttpService",
          {
            "class_name": "HttpService",
            "children": {},
            "properties": {
              "HttpEnabled": {
                "Type": "Bool",
                "Value": true
              }
            },
            "ignore_unknown_instances": null,
            "path": null
          }
        ]
      }
    },
    {
      "name": "ReplicatedStorage",
      "class_name": "ReplicatedStorage",
      "properties": {},
      "children": [
        {
          "name": "Foo",
          "class_name": "Folder",
          "properties": {},
          "children": [
            {
              "name": "foo",
              "class_name": "StringValue",
              "properties": {
                "Value": {
                  "Type": "String",
                  "Value": "Hello world, from foo.txt"
                }
              },
              "children": [],
              "metadata": {
                "ignore_unknown_instances": false,
                "source_path": "lib/foo.txt",
                "project_definition": null
              }
            },
            {
              "name": "main",
              "class_name": "ModuleScript",
              "properties": {
                "Source": {
                  "Type": "String",
                  "Value": "-- hello, from main"
                }
              },
              "children": [],
              "metadata": {
                "ignore_unknown_instances": false,
                "source_path": "lib/main.lua",
                "project_definition": null
              }
            }
          ],
          "metadata": {
            "ignore_unknown_instances": false,
            "source_path": "lib",
            "project_definition": [
              "Foo",
              {
                "class_name": null,
                "children": {},
                "properties": {},
                "ignore_unknown_instances": null,
                "path": "lib"
              }
            ]
          }
        }
      ],
      "metadata": {
        "ignore_unknown_instances": true,
        "source_path": null,
        "project_definition": [
          "ReplicatedStorage",
          {
            "class_name": "ReplicatedStorage",
            "children": {
              "Foo": {
                "class_name": null,
                "children": {},
                "properties": {},
                "ignore_unknown_instances": null,
                "path": "lib"
              }
            },
            "properties": {},
            "ignore_unknown_instances": null,
            "path": null
          }
        ]
      }
    }
  ],
  "metadata": {
    "ignore_unknown_instances": true,
    "source_path": null,
    "project_definition": [
      "single-sync-point",
      {
        "class_name": "DataModel",
        "children": {
          "HttpService": {
            "class_name": "HttpService",
            "children": {},
            "properties": {
              "HttpEnabled": {
                "Type": "Bool",
                "Value": true
              }
            },
            "ignore_unknown_instances": null,
            "path": null
          },
          "ReplicatedStorage": {
            "class_name": "ReplicatedStorage",
            "children": {
              "Foo": {
                "class_name": null,
                "children": {},
                "properties": {},
                "ignore_unknown_instances": null,
                "path": "lib"
              }
            },
            "properties": {},
            "ignore_unknown_instances": null,
            "path": null
          }
        },
        "properties": {},
        "ignore_unknown_instances": null,
        "path": null
      }
    ]
  }
}
test-projects/single_partition_model/expected-snapshot.json (Normal file, 53 lines)
@@ -0,0 +1,53 @@
{
  "name": "test-model",
  "class_name": "Folder",
  "properties": {},
  "children": [
    {
      "name": "main",
      "class_name": "Script",
      "properties": {
        "Source": {
          "Type": "String",
          "Value": "local other = require(script.Parent.other)\n\nprint(other)"
        }
      },
      "children": [],
      "metadata": {
        "ignore_unknown_instances": false,
        "source_path": "src/main.server.lua",
        "project_definition": null
      }
    },
    {
      "name": "other",
      "class_name": "ModuleScript",
      "properties": {
        "Source": {
          "Type": "String",
          "Value": "return \"Hello, world!\""
        }
      },
      "children": [],
      "metadata": {
        "ignore_unknown_instances": false,
        "source_path": "src/other.lua",
        "project_definition": null
      }
    }
  ],
  "metadata": {
    "ignore_unknown_instances": false,
    "source_path": "src",
    "project_definition": [
      "test-model",
      {
        "class_name": null,
        "children": {},
        "properties": {},
        "ignore_unknown_instances": null,
        "path": "src"
      }
    ]
  }
}
@@ -0,0 +1 @@
-- ReplicatedStorage/hello.lua
test-projects/transmute_partition/default.project.json (Normal file, 11 lines)
@@ -0,0 +1,11 @@
{
  "name": "transmute-partition",
  "tree": {
    "$className": "DataModel",

    "ReplicatedStorage": {
      "$className": "ReplicatedStorage",
      "$path": "ReplicatedStorage"
    }
  }
}
test-projects/transmute_partition/expected-snapshot.json (Normal file, 66 lines)
@@ -0,0 +1,66 @@
{
  "name": "transmute-partition",
  "class_name": "DataModel",
  "properties": {},
  "children": [
    {
      "name": "ReplicatedStorage",
      "class_name": "ReplicatedStorage",
      "properties": {},
      "children": [
        {
          "name": "hello",
          "class_name": "ModuleScript",
          "properties": {
            "Source": {
              "Type": "String",
              "Value": "-- ReplicatedStorage/hello.lua"
            }
          },
          "children": [],
          "metadata": {
            "ignore_unknown_instances": false,
            "source_path": "ReplicatedStorage/hello.lua",
            "project_definition": null
          }
        }
      ],
      "metadata": {
        "ignore_unknown_instances": false,
        "source_path": "ReplicatedStorage",
        "project_definition": [
          "ReplicatedStorage",
          {
            "class_name": "ReplicatedStorage",
            "children": {},
            "properties": {},
            "ignore_unknown_instances": null,
            "path": "ReplicatedStorage"
          }
        ]
      }
    }
  ],
  "metadata": {
    "ignore_unknown_instances": true,
    "source_path": null,
    "project_definition": [
      "transmute-partition",
      {
        "class_name": "DataModel",
        "children": {
          "ReplicatedStorage": {
            "class_name": "ReplicatedStorage",
            "children": {},
            "properties": {},
            "ignore_unknown_instances": null,
            "path": "ReplicatedStorage"
          }
        },
        "properties": {},
        "ignore_unknown_instances": null,
        "path": null
      }
    ]
  }
}
test-scratch-project (Normal file, 21 lines)
@@ -0,0 +1,21 @@
#!/bin/sh

# Copies a project from 'test-projects' into a folder that can be messed with
# without accidentally checking the results into version control.

set -e

if [ ! -d "test-projects/$1" ]
then
    echo "Pick a project that exists!"
    exit 1
fi

if [ -d "scratch-project/$1" ]
then
    rm -rf "scratch-project/$1"
fi

mkdir -p scratch-project
cp -r "test-projects/$1" scratch-project
cargo run -- serve "scratch-project/$1"
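A sketch of what running the script does on disk, minus the final `cargo run -- serve` step (the `nested_partitions` fixture name is taken from the diffs above; creating it with `mkdir -p` here is only a stand-in so the sketch runs outside the repository):

```shell
#!/bin/sh
set -e

# Stand-in fixture so this sketch works outside the real repo checkout.
mkdir -p test-projects/nested_partitions

# What 'test-scratch-project nested_partitions' does before serving:
# make the scratch area, then copy the fixture into it.
mkdir -p scratch-project
rm -rf scratch-project/nested_partitions
cp -r test-projects/nested_partitions scratch-project

ls scratch-project
```

From here the real script hands the copied directory to `cargo run -- serve`, so any edits made while testing land in `scratch-project/`, which `.gitignore` excludes.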