Compare commits: v0.5.0-alp ... v0.5.0-alp (57 commits)
Commits (SHA1):

69d1accf3f, 785bdb8ecb, 78a1947cec, 0ff59ecb4e, b58fed16b4, 6719be02c3, 8757834e07, aa243d1b8a, aeb18eb124, 6c3e118ee3, 3c0fe4d684, 12fd9aa1ef, 821122a33d, 0d9406d991, 350eec3bc7, e700b3105a, dd2a730b4a, c6766bbe77, e5d3204b6c, 4767cbd12b, deb4118c5d, 4516df5aac, 663df7bdc2, e81f0a4a95, 38cd13dc0c, 14fd470363, fc8d9dc1fe, 1659adb419, 6490b77d4c, 23463b620e, 6bc331be75, 87f6410877, b1ddfc3a49, d01e757d2f, e593ce0420, 578abfabb3, aa7b7e43ff, af4d4e0246, fecb11cba4, 614f886008, 6fcb895d70, 5a98ede45e, 779d462932, e301116e87, bd3a4a719d, 4cfdc72c00, 3620a9d256, f254a51d59, 99bbe58255, a400abff4c, 585806837e, 249aa999a3, aae1d8b34f, 9d3638fa46, 5b2a830d2d, b87943e39d, c421fd0b25
.gitignore (vendored, 2 changes)

```diff
@@ -1,5 +1,5 @@
 /site
 /target
-/server/scratch
+/scratch-project
 **/*.rs.bk
 /generate-docs.run
```
CHANGELOG.md

```diff
@@ -1,6 +1,31 @@
-# Rojo Change Log
+# Rojo Changelog
 
-## Current master
+## [Unreleased]
 
+## [0.5.0 Alpha 3](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.3) (February 1, 2019)
+* Changed default project file name from `roblox-project.json` to `default.project.json` ([#120](https://github.com/LPGhatguy/rojo/pull/120))
+    * The old file name will still be supported until 0.5.0 is fully released.
+* Added warning when loading project files that don't end in `.project.json`
+    * This new extension enables Rojo to distinguish project files from random JSON files, which is necessary to support nested projects.
+* Added new (empty) diagnostic page served from the server
+* Added better error messages for when a file is missing that's referenced by a Rojo project
+* Added support for visualization endpoints returning GraphViz source when Dot is not available
+* Fixed an in-memory filesystem regression introduced recently ([#119](https://github.com/LPGhatguy/rojo/pull/119))
+
+## [0.5.0 Alpha 2](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.2) (January 28, 2019)
+* Added support for `.model.json` files, compatible with 0.4.x
+* Fixed in-memory filesystem not handling out-of-order filesystem change events
+* Fixed long-polling error caused by a promise mixup ([#110](https://github.com/LPGhatguy/rojo/issues/110))
+
+## [0.5.0 Alpha 1](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.1) (January 25, 2019)
+* Changed plugin UI to be way prettier
+    * Thanks to [Reselim](https://github.com/Reselim) for the design!
+* Changed plugin error messages to be a little more useful
+* Removed unused 'Config' button in plugin UI
+* Fixed bug where bad server responses could cause the plugin to be in a bad state
+* Upgraded to rbx_tree, rbx_xml, and rbx_binary 0.2.0, which dramatically expands the kinds of properties that Rojo can handle, especially in XML.
+
+## [0.5.0 Alpha 0](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.0) (January 14, 2019)
 * "Epiphany" rewrite, in progress since the beginning of time
 * New live sync protocol
     * Uses HTTP long polling to reduce request count and improve responsiveness
```
```diff
@@ -25,36 +50,36 @@
     * Multiple places can be specified, like when building a multi-place game
 * Added support for specifying properties on services in project files
 
-## 0.4.13 (November 12, 2018)
+## [0.4.13](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.13) (November 12, 2018)
 * When `rojo.json` points to a file or directory that does not exist, Rojo now issues a warning instead of throwing an error and exiting
 
-## 0.4.12 (June 21, 2018)
+## [0.4.12](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.12) (June 21, 2018)
 * Fixed obscure assertion failure when renaming or deleting files ([#78](https://github.com/LPGhatguy/rojo/issues/78))
 * Added a `PluginAction` for the sync in command, which should help with some automation scripts ([#80](https://github.com/LPGhatguy/rojo/pull/80))
 
-## 0.4.11 (June 10, 2018)
+## [0.4.11](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.11) (June 10, 2018)
 * Defensively insert existing instances into RouteMap; should fix most duplication cases when syncing into existing trees.
 * Fixed incorrect synchronization from `Plugin:_pull` that would cause polling to create issues
 * Fixed incorrect file routes being assigned to `init.lua` and `init.model.json` files
 * Untangled route-handling internals slightly
 
-## 0.4.10 (June 2, 2018)
+## [0.4.10](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.10) (June 2, 2018)
 * Added support for `init.model.json` files, which enable versioning `Tool` instances (among other things) with Rojo. ([#66](https://github.com/LPGhatguy/rojo/issues/66))
 * Fixed obscure error when syncing into an invalid service.
 * Fixed multiple sync processes occurring when a server ID mismatch is detected.
 
-## 0.4.9 (May 26, 2018)
+## [0.4.9](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.9) (May 26, 2018)
 * Fixed warning when renaming or removing files that would sometimes corrupt the instance cache ([#72](https://github.com/LPGhatguy/rojo/pull/72))
 * JSON models are no longer as strict -- `Children` and `Properties` are now optional.
 
-## 0.4.8 (May 26, 2018)
+## [0.4.8](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.8) (May 26, 2018)
 * Hotfix to prevent errors from being thrown when objects managed by Rojo are deleted
 
-## 0.4.7 (May 25, 2018)
+## [0.4.7](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.7) (May 25, 2018)
 * Added icons to the Rojo plugin, made by [@Vorlias](https://github.com/Vorlias)! ([#70](https://github.com/LPGhatguy/rojo/pull/70))
 * Server will now issue a warning if no partitions are specified in `rojo serve` ([#40](https://github.com/LPGhatguy/rojo/issues/40))
 
-## 0.4.6 (May 21, 2018)
+## [0.4.6](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.6) (May 21, 2018)
 * Rojo handles being restarted by Roblox Studio more gracefully ([#67](https://github.com/LPGhatguy/rojo/issues/67))
 * Folders should no longer get collapsed when syncing occurs.
 * **Significant** robustness improvements with regards to caching.
```
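The 0.5.0 Alpha 0 entry above mentions support for specifying properties on services in project files. A minimal sketch of what such a project file can look like — the `$properties` and `$path` field names, the project name, and the chosen services are assumptions for illustration, not taken from this diff; only the `$className` metadata convention is described elsewhere on this page:

```json
{
    "name": "illustrative-project",
    "tree": {
        "$className": "DataModel",
        "HttpService": {
            "$className": "HttpService",
            "$properties": {
                "HttpEnabled": true
            }
        },
        "ReplicatedStorage": {
            "$className": "ReplicatedStorage",
            "Source": {
                "$path": "src"
            }
        }
    }
}
```

In a sketch like this, a property such as `HttpEnabled` would be set on the service itself when the tree is built, while `$path`-style entries would pull children in from the filesystem.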
```diff
@@ -62,7 +87,7 @@
     * If there are any bugs with script duplication or caching in the future, restarting the Rojo server process will fix them for that session.
 * Fixed message in plugin not being prefixed with `Rojo: `.
 
-## 0.4.5 (May 1, 2018)
+## [0.4.5](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.5) (May 1, 2018)
 * Rojo messages are now prefixed with `Rojo: ` to make them stand out in the output more.
 * Fixed server to notice file changes *much* more quickly. (200ms vs 1000ms)
 * Server now lists name of project when starting up.
```
```diff
@@ -70,23 +95,23 @@
 * Fixed multiple sync operations occurring at the same time. ([#61](https://github.com/LPGhatguy/rojo/issues/61))
 * Partitions targeting files directly now work as expected. ([#57](https://github.com/LPGhatguy/rojo/issues/57))
 
-## 0.4.4 (April 7, 2018)
+## [0.4.4](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.4) (April 7, 2018)
 * Fix small regression introduced in 0.4.3
 
-## 0.4.3 (April 7, 2018)
+## [0.4.3](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.3) (April 7, 2018)
 * Plugin now automatically selects `HttpService` if it determines that HTTP isn't enabled ([#58](https://github.com/LPGhatguy/rojo/pull/58))
 * Plugin now has much more robust handling and will wipe all state when the server changes.
     * This should fix issues that would otherwise be solved by restarting Roblox Studio.
 
-## 0.4.2 (April 4, 2018)
+## [0.4.2](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.2) (April 4, 2018)
 * Fixed final case of duplicated instance insertion, caused by reconciled instances not being inserted into `RouteMap`.
     * The reconciler is still not a perfect solution, especially if script instances get moved around without being destroyed. I don't think this can be fixed before a big refactor.
 
-## 0.4.1 (April 1, 2018)
+## [0.4.1](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.1) (April 1, 2018)
 * Merged plugin repository into main Rojo repository for easier tracking.
 * Improved `RouteMap` object tracking; this should fix some cases of duplicated instances being synced into the tree.
 
-## 0.4.0 (March 27, 2018)
+## [0.4.0](https://github.com/LPGhatguy/rojo/releases/tag/v0.4.0) (March 27, 2018)
 * Protocol version 1, which shifts more responsibility onto the server
     * This is a **major breaking** change!
     * The server now has a concept of 'filter plugins', which transform data at various stages in the pipeline
```
```diff
@@ -94,36 +119,36 @@
 * Added `*.model.json` files, which let you embed small Roblox objects into your Rojo tree.
 * Improved error messages in some cases ([#46](https://github.com/LPGhatguy/rojo/issues/46))
 
-## 0.3.2 (December 20, 2017)
+## [0.3.2](https://github.com/LPGhatguy/rojo/releases/tag/v0.3.2) (December 20, 2017)
 * Fixed `rojo serve` failing to correctly construct an absolute root path when passed as an argument
 * Fixed intense CPU usage when running `rojo serve`
 
-## 0.3.1 (December 14, 2017)
+## [0.3.1](https://github.com/LPGhatguy/rojo/releases/tag/v0.3.1) (December 14, 2017)
 * Improved error reporting when invalid JSON is found in a `rojo.json` project
     * These messages are passed on from Serde
 
-## 0.3.0 (December 12, 2017)
+## [0.3.0](https://github.com/LPGhatguy/rojo/releases/tag/v0.3.0) (December 12, 2017)
 * Factored out the plugin into a separate repository
 * Fixed server when using a file as a partition
     * Previously, trailing slashes were put on the end of a partition even if the read request was an empty string. This broke file reading on Windows when a partition pointed to a file instead of a directory!
 * Started running automatic tests on Travis CI (#9)
 
-## 0.2.3 (December 4, 2017)
+## [0.2.3](https://github.com/LPGhatguy/rojo/releases/tag/v0.2.3) (December 4, 2017)
 * Plugin only release
 * Tightened `init` file rules to only match script files
     * Previously, Rojo would sometimes pick up the wrong file when syncing
 
-## 0.2.2 (December 1, 2017)
+## [0.2.2](https://github.com/LPGhatguy/rojo/releases/tag/v0.2.2) (December 1, 2017)
 * Plugin only release
 * Fixed broken reconciliation behavior with `init` files
 
-## 0.2.1 (December 1, 2017)
+## [0.2.1](https://github.com/LPGhatguy/rojo/releases/tag/v0.2.1) (December 1, 2017)
 * Plugin only release
 * Changes default port to 8000
 
-## 0.2.0 (December 1, 2017)
+## [0.2.0](https://github.com/LPGhatguy/rojo/releases/tag/v0.2.0) (December 1, 2017)
 * Support for `init.lua` like rbxfs and rbxpacker
 * More robust syncing with a new reconciler
 
-## 0.1.0 (November 29, 2017)
+## [0.1.0](https://github.com/LPGhatguy/rojo/releases/tag/v0.1.0) (November 29, 2017)
 * Initial release, functionally very similar to [rbxfs](https://github.com/LPGhatguy/rbxfs)
```
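Several 0.3.x and 0.4.x entries above refer to partitions configured in `rojo.json`. A hedged sketch of what a 0.4-era `rojo.json` could look like — the `servePort`, `partitions`, `path`, and `target` field names and the project name are assumptions for illustration (the changelog only confirms that partitions exist, that they can target files or directories, and that the default port became 8000 in 0.2.1):

```json
{
    "name": "illustrative-project",
    "servePort": 8000,
    "partitions": {
        "src": {
            "path": "src",
            "target": "ReplicatedStorage.Source"
        }
    }
}
```

Each partition maps a filesystem path to a location in the Roblox instance tree, which is why a missing `path` produced the warning described in the 0.4.13 entry.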
Cargo.lock (generated, 411 changes)

README.md (22 changes)
```diff
@@ -12,7 +12,10 @@
         <img src="https://img.shields.io/crates/v/rojo.svg?label=version" alt="Latest server version" />
     </a>
     <a href="https://lpghatguy.github.io/rojo/0.4.x">
-        <img src="https://img.shields.io/badge/documentation-0.4.x-brightgreen.svg" alt="Rojo Documentation" />
+        <img src="https://img.shields.io/badge/docs-0.4.x-brightgreen.svg" alt="Rojo Documentation" />
+    </a>
+    <a href="https://lpghatguy.github.io/rojo/0.5.x">
+        <img src="https://img.shields.io/badge/docs-0.5.x-brightgreen.svg" alt="Rojo Documentation" />
     </a>
 </div>
 
```
```diff
@@ -28,17 +31,16 @@ Rojo is designed for **power users** who want to use the **best tools available**
 Rojo lets you:
 
 * Work on scripts from the filesystem, in your favorite editor
-* Version your place, library, or plugin using Git or another VCS
-* Sync JSON-format models from the filesystem into your game
+* Version your place, model, or plugin using Git or another VCS
+* Sync `rbxmx` and `rbxm` models into your game in real time
+* Package and deploy your project to Roblox.com from the command line
 
 Soon, Rojo will be able to:
 
-* Sync scripts from Roblox Studio to the filesystem
-* Compile MoonScript and sync it into Roblox Studio
-* Sync `rbxmx` models between the filesystem and Roblox Studio
-* Package projects into `rbxmx` files from the command line
+* Sync instances from Roblox Studio to the filesystem
+* Compile MoonScript and other custom things for your project
 
-## [Documentation](https://lpghatguy.github.io/rojo/0.4.x)
+## [Documentation](https://lpghatguy.github.io/rojo)
 You can also view the documentation by browsing the [docs](https://github.com/LPGhatguy/rojo/tree/master/docs) folder of the repository, but because it uses a number of Markdown extensions, it may not be very readable.
 
 ## Inspiration and Alternatives
```
```diff
@@ -58,11 +60,9 @@ Here are a few, if you're looking for alternatives or supplements to Rojo:
 If you use a plugin that _isn't_ Rojo for syncing code, open an issue and let me know why! I'd like Rojo to be the end-all tool so that people stop reinventing solutions to this problem.
 
 ## Contributing
-The `master` branch is a rewrite known as **Epiphany**. It includes a breaking change to the project configuration format and an infrastructure overhaul.
-
 Pull requests are welcome!
 
 All pull requests are run against a test suite on Travis CI. That test suite should always pass!
 
 ## License
-Rojo is available under the terms of the Mozilla Public License, Version 2.0. See [LICENSE](LICENSE) for details.
+Rojo is available under the terms of the Mozilla Public License, Version 2.0. See [LICENSE.txt](LICENSE.txt) for details.
```
BIN assets/round-rect-4px-radius.png (new file; 175 B)
docs/extra.css (new file, 3 lines)

```diff
@@ -0,0 +1,3 @@
+.md-typeset__table {
+    width: 100%;
+}
```
BIN (image removed; was 5.8 KiB)

BIN (image removed; was 17 KiB)
docs/images/sync-example-files.gv (new file, 17 lines)

```diff
@@ -0,0 +1,17 @@
+digraph "Sync Files" {
+    graph [
+        ranksep = "0.7",
+        nodesep = "0.5",
+    ];
+    node [
+        fontname = "monospace",
+        shape = "record",
+    ];
+
+    my_model [label = "MyModel"]
+    init_server [label = "init.server.lua"]
+    foo [label = "foo.lua"]
+
+    my_model -> init_server
+    my_model -> foo
+}
```
docs/images/sync-example-files.svg (new file, 38 lines, 2.0 KiB; Graphviz 2.38.0-generated SVG of the "Sync Files" graph: MyModel -> init.server.lua, MyModel -> foo.lua)
docs/images/sync-example-instances.gv (new file, 15 lines)

```diff
@@ -0,0 +1,15 @@
+digraph "Sync Files" {
+    graph [
+        ranksep = "0.7",
+        nodesep = "0.5",
+    ];
+    node [
+        fontname = "monospace",
+        shape = "record",
+    ];
+
+    my_model [label = "MyModel (Script)"]
+    foo [label = "foo (ModuleScript)"]
+
+    my_model -> foo
+}
```
docs/images/sync-example-instances.svg (new file, 28 lines, 1.4 KiB; Graphviz 2.38.0-generated SVG of the "Sync Files" graph: MyModel (Script) -> foo (ModuleScript))
docs/images/sync-example-json-model.gv (new file, 17 lines)

```diff
@@ -0,0 +1,17 @@
+digraph "Sync Files" {
+    graph [
+        ranksep = "0.7",
+        nodesep = "0.5",
+    ];
+    node [
+        fontname = "monospace",
+        shape = "record",
+    ];
+
+    model [label = "My Cool Model (Folder)"]
+    root_part [label = "RootPart (Part)"]
+    send_money [label = "SendMoney (RemoteEvent)"]
+
+    model -> root_part
+    model -> send_money
+}
```
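The graph above depicts the instance tree a JSON model produces. A hedged sketch of a `*.model.json` file that could describe that tree — the `Name`/`ClassName`/`Children` field names follow the 0.4.x JSON model format only as far as the changelog describes it (which notes that `Children` and `Properties` are optional), so treat them as assumptions:

```json
{
    "Name": "My Cool Model",
    "ClassName": "Folder",
    "Children": [
        {
            "Name": "RootPart",
            "ClassName": "Part"
        },
        {
            "Name": "SendMoney",
            "ClassName": "RemoteEvent"
        }
    ]
}
```

Omitting `Children` on the two leaf objects here would be equally valid, since the changelog says that field is optional.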
docs/images/sync-example-json-model.svg (new file, 38 lines, 2.1 KiB; Graphviz 2.38.0-generated SVG of the "Sync Files" graph: My Cool Model (Folder) -> RootPart (Part), My Cool Model (Folder) -> SendMoney (RemoteEvent))

BIN (image removed; was 1.9 KiB)
```diff
@@ -2,9 +2,9 @@ This is the documentation home for Rojo.
 
 Available versions of these docs:
 
+* [Latest version (currently 0.5.x)](https://lpghatguy.github.io/rojo)
 * [0.5.x](https://lpghatguy.github.io/rojo/0.5.x)
 * [0.4.x](https://lpghatguy.github.io/rojo/0.4.x)
-* [`master` branch](https://lpghatguy.github.io/rojo/master)
 
 **Rojo** is a flexible multi-tool designed for creating robust Roblox projects.
 
```
```diff
@@ -50,9 +50,6 @@ Metadata begins with a dollar sign (`$`), like `$className`. This is so that chi
 
 All other values are considered children, where the key is the instance's name, and the value is an object, repeating the process.
 
-## Migrating `.model.json` Files
-No upgrade path yet, stay tuned.
-
 ## Migrating Unknown Files
 If you used Rojo to sync in files as `StringValue` objects, you'll need to make sure those files end with the `txt` extension to preserve this in Rojo 0.5.x.
 
```
|||||||
@@ -9,6 +9,20 @@ This page aims to describe how Rojo turns files on the filesystem into Roblox objects.

| `*.lua` | `ModuleScript` |
| `*.csv` | `LocalizationTable` |
| `*.txt` | `StringValue` |
| `*.model.json` | Any |
| `*.rbxm` | Any |
| `*.rbxmx` | Any |

## Limitations

Not all property types can be synced by Rojo in real-time due to limitations of the Roblox Studio plugin API. In these cases, you can usually generate a place file and open it when you start working on a project.

Some common cases you might hit are:

* Binary data (Terrain, CSG, CollectionService tags)
* `MeshPart.MeshId`
* `HttpService.HttpEnabled`

For a list of all property types that Rojo can reason about, both when live-syncing and when building place files, look at [rbx_tree's type coverage documentation](https://github.com/LPGhatguy/rbx-tree/tree/master/rbx_tree#coverage).

## Folders

Any directory on the filesystem will turn into a `Folder` instance unless it contains an 'init' script, described below.
@@ -20,16 +34,68 @@ If a directory contains a file named `init.server.lua`, `init.client.lua`, or `i

For example, these files:

<div align="center">
<a href="../images/sync-example-files.svg">
<img src="../images/sync-example-files.svg" alt="Tree of files on disk" />
</a>
</div>

Will turn into these instances in Roblox:

<div align="center">
<a href="../images/sync-example-instances.svg">
<img src="../images/sync-example-instances.svg" alt="Tree of instances in Roblox" />
</a>
</div>

## Localization Tables

Any CSV files are transformed into `LocalizationTable` instances. Rojo expects these files to follow the same format that Roblox does when importing and exporting localization information.
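The docs don't include a sample table here, but Roblox's localization export format uses a `Key`, `Source`, `Context`, and `Example` column followed by one column per locale. A minimal sketch (the entries themselves are hypothetical):

```csv
Key,Source,Context,Example,es
GREETING,Hello!,,Shown on spawn,¡Hola!
FAREWELL,Goodbye!,,Shown on exit,¡Adiós!
```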
## Plain Text Files

Plain text files (`.txt`) are transformed into `StringValue` instances. This is useful for bringing in text data that can be read by scripts at runtime.
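As a concrete illustration (not from the original docs; the file name and location are hypothetical), a file `motd.txt` synced into `ReplicatedStorage` could be read by a script like this:

```lua
-- Hypothetical layout: motd.txt synced under ReplicatedStorage,
-- producing a StringValue instance named "motd".
local ReplicatedStorage = game:GetService("ReplicatedStorage")

local motd = ReplicatedStorage:WaitForChild("motd")
print(motd.Value) -- the text file's contents
```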
## JSON Models
|
||||||
|
Files ending in `.model.json` can be used to describe simple models. They're designed to be hand-written and are useful for instances like `RemoteEvent`.
|
||||||
|
|
||||||
|
A JSON model describing a folder containing a `Part` and a `RemoteEvent` could be described as:
|
||||||
|
|
||||||
|
```json
|
||||||
|
{
|
||||||
|
"Name": "My Cool Model",
|
||||||
|
"ClassName": "Folder",
|
||||||
|
"Children": [
|
||||||
|
{
|
||||||
|
"Name": "RootPart",
|
||||||
|
"ClassName": "Part",
|
||||||
|
"Properties": {
|
||||||
|
"Size": {
|
||||||
|
"Type": "Vector3",
|
||||||
|
"Value": [4, 4, 4]
|
||||||
|
}
|
||||||
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"Name": "SendMoney",
|
||||||
|
"ClassName": "RemoteEvent"
|
||||||
|
}
|
||||||
|
]
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
It would turn into instances in this shape:
|
||||||
|
|
||||||
|
<div align="center">
|
||||||
|
<a href="../images/sync-example-json-model.svg">
|
||||||
|
<img src="../images/sync-example-json-model.svg" alt="Tree of instances in Roblox" />
|
||||||
|
</a>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
## Binary and XML Models
|
||||||
|
Rojo supports both binary (`.rbxm`) and XML (`.rbxmx`) models generated by Roblox Studio or another tool.
|
||||||
|
|
||||||
|
Not all property types are supported!
|
||||||
|
|
||||||
|
For a rundown of supported types, see:
|
||||||
|
|
||||||
|
* [rbxm Type Coverage](https://github.com/LPGhatguy/rbx-tree/tree/master/rbx_binary#coverage)
|
||||||
|
* [rbxmx Type Coverage](https://github.com/LPGhatguy/rbx-tree/tree/master/rbx_xml#coverage)
|
||||||
@@ -3,23 +3,33 @@

# Kludged documentation generator to support multiple versions.
# Make sure the `site` folder is a checkout of this repository's `gh-pages`
# branch.
# To use, copy this file to `generate-docs.run` so that Git will leave it alone,
# then run `generate-docs.run` in the root of the repository.

set -e

REMOTE=$(git remote get-url origin)
CHECKOUT="$(mktemp -d)"
OUTPUT="$(pwd)/site"

if [ -d site ]
then
	cd site
	git pull
else
	git clone "$REMOTE" site
	cd site
	git checkout gh-pages
fi

git clone "$REMOTE" "$CHECKOUT"
cd "$CHECKOUT"

echo "Building master"
git checkout master
mkdocs build --site-dir "$OUTPUT"

echo "Building 0.5.x"
mkdocs build --site-dir "$OUTPUT/0.5.x"

echo "Building 0.4.x"
git checkout v0.4.x
mkdocs build --site-dir "$OUTPUT/0.4.x"
@@ -17,6 +17,9 @@ nav:
  - Sync Details: sync-details.md
  - Migrating from 0.4.x to 0.5.x: migrating-to-epiphany.md

extra_css:
  - extra.css

markdown_extensions:
  - attr_list
  - admonition
@@ -5,8 +5,10 @@

    "ReplicatedStorage": {
        "$className": "ReplicatedStorage",

        "Rojo": {
            "$className": "Folder",

            "Plugin": {
                "$path": "src"
            },

@@ -28,8 +30,19 @@
        }
    },

    "HttpService": {
        "$className": "HttpService",
        "$properties": {
            "HttpEnabled": {
                "Type": "Bool",
                "Value": true
            }
        }
    },

    "TestService": {
        "$className": "TestService",

        "TestBootstrap": {
            "$path": "testBootstrap.server.lua"
        }
@@ -11,6 +11,14 @@ ApiContext.__index = ApiContext

-- TODO: Audit cases of errors and create enum values for each of them.
ApiContext.Error = {
	ServerIdMismatch = "ServerIdMismatch",

	-- The server gave an unexpected 400-category error, which may be the
	-- client's fault.
	ClientError = "ClientError",

	-- The server gave an unexpected 500-category error, which may be the
	-- server's fault.
	ServerError = "ServerError",
}

setmetatable(ApiContext.Error, {
@@ -19,6 +27,18 @@ setmetatable(ApiContext.Error, {
	end
})

local function rejectFailedRequests(response)
	if response.code >= 400 then
		if response.code < 500 then
			return Promise.reject(ApiContext.Error.ClientError)
		else
			return Promise.reject(ApiContext.Error.ServerError)
		end
	end

	return response
end

function ApiContext.new(baseUrl)
	assert(type(baseUrl) == "string")
@@ -43,6 +63,7 @@ function ApiContext:connect()
	local url = ("%s/api/rojo"):format(self.baseUrl)

	return Http.get(url)
		:andThen(rejectFailedRequests)
		:andThen(function(response)
			local body = response:json()
@@ -102,9 +123,7 @@ function ApiContext:read(ids)
	local url = ("%s/api/read/%s"):format(self.baseUrl, table.concat(ids, ","))

	return Http.get(url)
		:andThen(rejectFailedRequests)
		:andThen(function(response)
			local body = response:json()
@@ -121,14 +140,19 @@ end
function ApiContext:retrieveMessages()
	local url = ("%s/api/subscribe/%s"):format(self.baseUrl, self.messageCursor)

	local function sendRequest()
		return Http.get(url)
			:catch(function(err)
				if err.type == HttpError.Error.Timeout then
					return sendRequest()
				end

				return Promise.reject(err)
			end)
	end

	return sendRequest()
		:andThen(rejectFailedRequests)
		:andThen(function(response)
			local body = response:json()
@@ -9,25 +9,16 @@ local Assets = {
		},
	},
	Slices = {
		RoundBox = {
			asset = "rbxassetid://2773204550",
			offset = Vector2.new(0, 0),
			size = Vector2.new(32, 32),
			center = Rect.new(4, 4, 4, 4),
		},
	},
	Images = {
		Logo = "rbxassetid://2773210620",
	},
	StartSession = "",
	SessionActive = "",
	Configure = "",
@@ -71,7 +71,6 @@ function App:init()
	})

	self.connectButton = nil
	self.currentSession = nil

	self.displayedVersion = DevSettings:isEnabled()
@@ -84,7 +83,19 @@ function App:render()

	if self.state.sessionStatus == SessionStatus.Connected then
		children = {
			ConnectionActivePanel = e(ConnectionActivePanel, {
				stopSession = function()
					Logging.trace("Disconnecting session")

					self.currentSession:disconnect()
					self.currentSession = nil
					self:setState({
						sessionStatus = SessionStatus.Disconnected,
					})

					Logging.trace("Session terminated by user")
				end,
			}),
		}
	elseif self.state.sessionStatus == SessionStatus.ConfiguringSession then
		children = {
@@ -96,8 +107,7 @@ function App:render()
			address = address,
			port = port,
			onError = function(message)
				Logging.warn("Rojo session terminated because of an error:\n%s", tostring(message))
				self.currentSession = nil

				self:setState({
@@ -167,15 +177,6 @@ function App:didMount()
			})
		end
	end)
end

function App:didUpdate()
@@ -4,47 +4,45 @@ local Plugin = Rojo.Plugin
local Roact = require(Rojo.Roact)

local Config = require(Plugin.Config)
local Version = require(Plugin.Version)
local Assets = require(Plugin.Assets)
local Theme = require(Plugin.Theme)
local joinBindings = require(Plugin.joinBindings)

local FitList = require(Plugin.Components.FitList)
local FitText = require(Plugin.Components.FitText)
local FormButton = require(Plugin.Components.FormButton)
local FormTextInput = require(Plugin.Components.FormTextInput)

local RoundBox = Assets.Slices.RoundBox

local e = Roact.createElement

local ConnectPanel = Roact.Component:extend("ConnectPanel")

function ConnectPanel:init()
	self.footerSize, self.setFooterSize = Roact.createBinding(Vector2.new())
	self.footerVersionSize, self.setFooterVersionSize = Roact.createBinding(Vector2.new())

	-- This is constructed in init because 'joinBindings' is a hack and we'd
	-- leak memory constructing it every render. When this kind of feature lands
	-- in Roact properly, we can do this inline in render without fear.
	self.footerRestSize = joinBindings(
		{
			self.footerSize,
			self.footerVersionSize,
		},
		function(container, other)
			return UDim2.new(0, container.X - other.X - 16, 0, 32)
		end
	)

	self:setState({
		address = "",
		port = "",
	})
end

function ConnectPanel:render()
	local startSession = self.props.startSession
	local cancel = self.props.cancel
@@ -52,11 +50,11 @@ function ConnectPanel:render()
	return e(FitList, {
		containerKind = "ImageLabel",
		containerProps = {
			Image = RoundBox.asset,
			ImageRectOffset = RoundBox.offset,
			ImageRectSize = RoundBox.size,
			SliceCenter = RoundBox.center,
			ScaleType = Enum.ScaleType.Slice,
			BackgroundTransparency = 1,
			Position = UDim2.new(0.5, 0, 0.5, 0),
			AnchorPoint = Vector2.new(0.5, 0.5),
@@ -65,63 +63,20 @@ function ConnectPanel:render()
			HorizontalAlignment = Enum.HorizontalAlignment.Center,
		},
	}, {
		Inputs = e(FitList, {
			containerProps = {
				BackgroundTransparency = 1,
				LayoutOrder = 1,
			},
			layoutProps = {
				FillDirection = Enum.FillDirection.Horizontal,
				Padding = UDim.new(0, 8),
			},
			paddingProps = {
				PaddingTop = UDim.new(0, 20),
				PaddingBottom = UDim.new(0, 10),
				PaddingLeft = UDim.new(0, 24),
				PaddingRight = UDim.new(0, 24),
			},
		}, {
			Address = e(FitList, {
@@ -130,34 +85,25 @@ function ConnectPanel:render()
				BackgroundTransparency = 1,
			},
			layoutProps = {
				Padding = UDim.new(0, 4),
			},
		}, {
			Label = e(FitText, {
				Kind = "TextLabel",
				LayoutOrder = 1,
				BackgroundTransparency = 1,
				TextXAlignment = Enum.TextXAlignment.Left,
				Font = Theme.TitleFont,
				TextSize = 20,
				Text = "Address",
				TextColor3 = Theme.AccentColor,
			}),

			Input = e(FormTextInput, {
				layoutOrder = 2,
				width = UDim.new(0, 220),
				value = self.state.address,
				placeholderValue = Config.defaultHost,
				onValueChange = function(newValue)
					self:setState({
						address = newValue,
@@ -172,34 +118,25 @@ function ConnectPanel:render()
				BackgroundTransparency = 1,
			},
			layoutProps = {
				Padding = UDim.new(0, 4),
			},
		}, {
			Label = e(FitText, {
				Kind = "TextLabel",
				LayoutOrder = 1,
				BackgroundTransparency = 1,
				TextXAlignment = Enum.TextXAlignment.Left,
				Font = Theme.TitleFont,
				TextSize = 20,
				Text = "Port",
				TextColor3 = Theme.AccentColor,
			}),

			Input = e(FormTextInput, {
				layoutOrder = 2,
				width = UDim.new(0, 80),
				value = self.state.port,
				placeholderValue = Config.defaultPort,
				onValueChange = function(newValue)
					self:setState({
						port = newValue,
@@ -207,36 +144,117 @@ function ConnectPanel:render()
					end,
				}),
			}),
		}),

		Buttons = e(FitList, {
			fitAxes = "Y",
			containerProps = {
				BackgroundTransparency = 1,
				LayoutOrder = 2,
				Size = UDim2.new(1, 0, 0, 0),
			},
			layoutProps = {
				FillDirection = Enum.FillDirection.Horizontal,
				HorizontalAlignment = Enum.HorizontalAlignment.Right,
				Padding = UDim.new(0, 8),
			},
			paddingProps = {
				PaddingTop = UDim.new(0, 0),
				PaddingBottom = UDim.new(0, 20),
				PaddingLeft = UDim.new(0, 24),
				PaddingRight = UDim.new(0, 24),
			},
		}, {
			e(FormButton, {
				layoutOrder = 1,
				text = "Cancel",
				onClick = function()
					if cancel ~= nil then
						cancel()
					end
				end,
				secondary = true,
			}),

			e(FormButton, {
				layoutOrder = 2,
				text = "Connect",
				onClick = function()
					if startSession ~= nil then
						local address = self.state.address
						if address:len() == 0 then
							address = Config.defaultHost
						end

						local port = self.state.port
						if port:len() == 0 then
							port = Config.defaultPort
						end

						startSession(address, port)
					end
				end,
			}),
		}),

		Footer = e(FitList, {
			fitAxes = "Y",
			containerKind = "ImageLabel",
			containerProps = {
				Image = RoundBox.asset,
				ImageRectOffset = RoundBox.offset + Vector2.new(0, RoundBox.size.Y / 2),
				ImageRectSize = RoundBox.size * Vector2.new(1, 0.5),
				SliceCenter = RoundBox.center,
				ScaleType = Enum.ScaleType.Slice,
				ImageColor3 = Theme.SecondaryColor,
				Size = UDim2.new(1, 0, 0, 0),
				LayoutOrder = 3,
				BackgroundTransparency = 1,

				[Roact.Change.AbsoluteSize] = function(rbx)
					self.setFooterSize(rbx.AbsoluteSize)
				end,
			},
			layoutProps = {
				FillDirection = Enum.FillDirection.Horizontal,
				HorizontalAlignment = Enum.HorizontalAlignment.Center,
				VerticalAlignment = Enum.VerticalAlignment.Center,
			},
			paddingProps = {
				PaddingTop = UDim.new(0, 4),
				PaddingBottom = UDim.new(0, 4),
				PaddingLeft = UDim.new(0, 8),
				PaddingRight = UDim.new(0, 8),
			},
		}, {
			LogoContainer = e("Frame", {
				BackgroundTransparency = 1,

				Size = self.footerRestSize,
			}, {
				Logo = e("ImageLabel", {
					Image = Assets.Images.Logo,
					Size = UDim2.new(0, 80, 0, 40),
					ScaleType = Enum.ScaleType.Fit,
					BackgroundTransparency = 1,
					Position = UDim2.new(0, 0, 1, -10),
					AnchorPoint = Vector2.new(0, 1),
				}),
			}),

			Version = e(FitText, {
				Font = Theme.TitleFont,
				TextSize = 18,
				Text = Version.display(Config.version),
				TextXAlignment = Enum.TextXAlignment.Right,
				TextColor3 = Theme.LightTextColor,
				BackgroundTransparency = 1,

				[Roact.Change.AbsoluteSize] = function(rbx)
					self.setFooterVersionSize(rbx.AbsoluteSize)
				end,
			}),
		}),
	})
end
@@ -1,38 +1,66 @@
local Roact = require(script:FindFirstAncestor("Rojo").Roact)

local Plugin = script:FindFirstAncestor("Plugin")

local Theme = require(Plugin.Theme)
local Assets = require(Plugin.Assets)

local FitList = require(Plugin.Components.FitList)
local FitText = require(Plugin.Components.FitText)

local e = Roact.createElement

local RoundBox = Assets.Slices.RoundBox
local WhiteCross = Assets.Sprites.WhiteCross

local function ConnectionActivePanel(props)
	local stopSession = props.stopSession

	return e(FitList, {
		containerKind = "ImageLabel",
		containerProps = {
			Image = RoundBox.asset,
			ImageRectOffset = RoundBox.offset + Vector2.new(0, RoundBox.size.Y / 2),
			ImageRectSize = RoundBox.size * Vector2.new(1, 0.5),
			SliceCenter = Rect.new(4, 4, 4, 4),
			ScaleType = Enum.ScaleType.Slice,
			BackgroundTransparency = 1,
			Position = UDim2.new(0.5, 0, 0, 0),
			AnchorPoint = Vector2.new(0.5, 0),
		},
		layoutProps = {
			FillDirection = Enum.FillDirection.Horizontal,
			VerticalAlignment = Enum.VerticalAlignment.Center,
		},
	}, {
		Text = e(FitText, {
			Padding = Vector2.new(12, 6),
			Font = Theme.ButtonFont,
			TextSize = 18,
			Text = "Rojo Connected",
			TextColor3 = Theme.PrimaryColor,
			BackgroundTransparency = 1,
		}),

		CloseContainer = e("ImageButton", {
			Size = UDim2.new(0, 30, 0, 30),
			BackgroundTransparency = 1,

			[Roact.Event.Activated] = function()
				stopSession()
			end,
		}, {
			CloseImage = e("ImageLabel", {
				Size = UDim2.new(0, 16, 0, 16),
				Position = UDim2.new(0.5, 0, 0.5, 0),
				AnchorPoint = Vector2.new(0.5, 0.5),
				Image = WhiteCross.asset,
				ImageRectOffset = WhiteCross.offset,
				ImageRectSize = WhiteCross.size,
				ImageColor3 = Theme.PrimaryColor,
				BackgroundTransparency = 1,
			}),
		}),
	})
end
@@ -12,6 +12,7 @@ end

 function FitList:render()
     local containerKind = self.props.containerKind or "Frame"
+    local fitAxes = self.props.fitAxes or "XY"
     local containerProps = self.props.containerProps
     local layoutProps = self.props.layoutProps
     local paddingProps = self.props.paddingProps
@@ -25,15 +26,27 @@ function FitList:render()
         ["$Layout"] = e("UIListLayout", Dictionary.merge({
             SortOrder = Enum.SortOrder.LayoutOrder,
             [Roact.Change.AbsoluteContentSize] = function(instance)
-                local size = instance.AbsoluteContentSize
+                local contentSize = instance.AbsoluteContentSize

                 if paddingProps ~= nil then
-                    size = size + Vector2.new(
+                    contentSize = contentSize + Vector2.new(
                         paddingProps.PaddingLeft.Offset + paddingProps.PaddingRight.Offset,
                         paddingProps.PaddingTop.Offset + paddingProps.PaddingBottom.Offset)
                 end

-                self.setSize(UDim2.new(0, size.X, 0, size.Y))
+                local combinedSize
+
+                if fitAxes == "X" then
+                    combinedSize = UDim2.new(0, contentSize.X, containerProps.Size.Y.Scale, containerProps.Size.Y.Offset)
+                elseif fitAxes == "Y" then
+                    combinedSize = UDim2.new(containerProps.Size.X.Scale, containerProps.Size.X.Offset, 0, contentSize.Y)
+                elseif fitAxes == "XY" then
+                    combinedSize = UDim2.new(0, contentSize.X, 0, contentSize.Y)
+                else
+                    error("Invalid fitAxes value")
+                end
+
+                self.setSize(combinedSize)
             end,
         }, layoutProps)),

@@ -4,28 +4,41 @@ local Plugin = Rojo.Plugin
 local Roact = require(Rojo.Roact)

 local Assets = require(Plugin.Assets)
+local Theme = require(Plugin.Theme)
 local FitList = require(Plugin.Components.FitList)
 local FitText = require(Plugin.Components.FitText)

 local e = Roact.createElement

-local GrayButton07 = Assets.Slices.GrayButton07
+local RoundBox = Assets.Slices.RoundBox

 local function FormButton(props)
     local text = props.text
     local layoutOrder = props.layoutOrder
     local onClick = props.onClick

+    local textColor
+    local backgroundColor
+
+    if props.secondary then
+        textColor = Theme.AccentColor
+        backgroundColor = Theme.SecondaryColor
+    else
+        textColor = Theme.SecondaryColor
+        backgroundColor = Theme.AccentColor
+    end
+
     return e(FitList, {
         containerKind = "ImageButton",
         containerProps = {
             LayoutOrder = layoutOrder,
             BackgroundTransparency = 1,
-            Image = GrayButton07.asset,
-            ImageRectOffset = GrayButton07.offset,
-            ImageRectSize = GrayButton07.size,
+            Image = RoundBox.asset,
+            ImageRectOffset = RoundBox.offset,
+            ImageRectSize = RoundBox.size,
+            SliceCenter = RoundBox.center,
             ScaleType = Enum.ScaleType.Slice,
-            SliceCenter = GrayButton07.center,
+            ImageColor3 = backgroundColor,

             [Roact.Event.Activated] = function()
                 if onClick ~= nil then
@@ -37,10 +50,10 @@ local function FormButton(props)
         Text = e(FitText, {
             Kind = "TextLabel",
             Text = text,
-            TextSize = 22,
-            Font = Enum.Font.SourceSansBold,
-            Padding = Vector2.new(14, 6),
-            TextColor3 = Color3.new(0.05, 0.05, 0.05),
+            TextSize = 18,
+            TextColor3 = textColor,
+            Font = Theme.ButtonFont,
+            Padding = Vector2.new(16, 8),
             BackgroundTransparency = 1,
         }),
     })
@@ -4,42 +4,75 @@ local Plugin = Rojo.Plugin
 local Roact = require(Rojo.Roact)

 local Assets = require(Plugin.Assets)
+local Theme = require(Plugin.Theme)

 local e = Roact.createElement

-local GrayBox = Assets.Slices.GrayBox
+local RoundBox = Assets.Slices.RoundBox

-local function FormTextInput(props)
-    local value = props.value
-    local onValueChange = props.onValueChange
-    local layoutOrder = props.layoutOrder
-    local size = props.size
+local TEXT_SIZE = 22
+local PADDING = 8
+
+local FormTextInput = Roact.Component:extend("FormTextInput")
+
+function FormTextInput:init()
+    self:setState({
+        focused = false,
+    })
+end
+
+function FormTextInput:render()
+    local value = self.props.value
+    local placeholderValue = self.props.placeholderValue
+    local onValueChange = self.props.onValueChange
+    local layoutOrder = self.props.layoutOrder
+    local width = self.props.width
+
+    local shownPlaceholder
+    if self.state.focused then
+        shownPlaceholder = ""
+    else
+        shownPlaceholder = placeholderValue
+    end

     return e("ImageLabel", {
         LayoutOrder = layoutOrder,
-        Image = GrayBox.asset,
-        ImageRectOffset = GrayBox.offset,
-        ImageRectSize = GrayBox.size,
+        Image = RoundBox.asset,
+        ImageRectOffset = RoundBox.offset,
+        ImageRectSize = RoundBox.size,
         ScaleType = Enum.ScaleType.Slice,
-        SliceCenter = GrayBox.center,
-        Size = size,
+        SliceCenter = RoundBox.center,
+        ImageColor3 = Theme.SecondaryColor,
+        Size = UDim2.new(width.Scale, width.Offset, 0, TEXT_SIZE + PADDING * 2),
         BackgroundTransparency = 1,
     }, {
         InputInner = e("TextBox", {
             BackgroundTransparency = 1,
-            Size = UDim2.new(1, -8, 1, -8),
+            Size = UDim2.new(1, -PADDING * 2, 1, -PADDING * 2),
             Position = UDim2.new(0.5, 0, 0.5, 0),
             AnchorPoint = Vector2.new(0.5, 0.5),
-            Font = Enum.Font.SourceSans,
+            Font = Theme.InputFont,
             ClearTextOnFocus = false,
-            TextXAlignment = Enum.TextXAlignment.Left,
-            TextSize = 20,
+            TextXAlignment = Enum.TextXAlignment.Center,
+            TextSize = TEXT_SIZE,
             Text = value,
-            TextColor3 = Color3.new(0.05, 0.05, 0.05),
+            PlaceholderText = shownPlaceholder,
+            PlaceholderColor3 = Theme.AccentLightColor,
+            TextColor3 = Theme.AccentColor,

             [Roact.Change.Text] = function(rbx)
                 onValueChange(rbx.Text)
             end,
+            [Roact.Event.Focused] = function()
+                self:setState({
+                    focused = true,
+                })
+            end,
+            [Roact.Event.FocusLost] = function()
+                self:setState({
+                    focused = false,
+                })
+            end,
         }),
     })
 end
@@ -1,6 +1,6 @@
 return {
     codename = "Epiphany",
-    version = {0, 5, 0, "-alpha.0"},
+    version = {0, 5, 0, "-alpha.3"},
     expectedServerVersionString = "0.5.0 or newer",
     protocolVersion = 2,
     defaultHost = "localhost",
@@ -1,12 +1,22 @@
 local Config = require(script.Parent.Config)

+local VALUES = {
+    LogLevel = {
+        type = "IntValue",
+        defaultUserValue = 2,
+        defaultDevValue = 3,
+    },
+}
+
+local CONTAINER_NAME = "RojoDevSettings" .. Config.codename
+
 local function getValueContainer()
-    return game:FindFirstChild("RojoDev-" .. Config.codename)
+    return game:FindFirstChild(CONTAINER_NAME)
 end

 local valueContainer = getValueContainer()

-local function getValue(name)
+local function getStoredValue(name)
     if valueContainer == nil then
         return nil
     end
@@ -20,7 +30,7 @@ local function getValue(name)
     return valueObject.Value
 end

-local function setValue(name, kind, value)
+local function setStoredValue(name, kind, value)
     local object = valueContainer:FindFirstChild(name)

     if object == nil then
@@ -37,11 +47,13 @@ local function createAllValues()

     if valueContainer == nil then
         valueContainer = Instance.new("Folder")
-        valueContainer.Name = "RojoDev-" .. Config.codename
+        valueContainer.Name = CONTAINER_NAME
        valueContainer.Parent = game
     end

-    setValue("LogLevel", "IntValue", getValue("LogLevel") or 2)
+    for name, value in pairs(VALUES) do
+        setStoredValue(name, value.type, value.defaultDevValue)
+    end
 end

 _G[("ROJO_%s_DEV_CREATE"):format(Config.codename:upper())] = createAllValues
@@ -53,7 +65,7 @@ function DevSettings:isEnabled()
 end

 function DevSettings:getLogLevel()
-    return getValue("LogLevel")
+    return getStoredValue("LogLevel") or VALUES.LogLevel.defaultUserValue
 end

 return DevSettings
@@ -31,4 +31,4 @@ function HttpResponse:json()
     return HttpService:JSONDecode(self.body)
 end

 return HttpResponse
@@ -1,7 +1,5 @@
 local DevSettings = require(script.Parent.DevSettings)

-local testLogLevel = nil
-
 local Level = {
     Error = 0,
     Warning = 1,
@@ -9,17 +7,14 @@ local Level = {
     Trace = 3,
 }

+local testLogLevel = nil
+
 local function getLogLevel()
     if testLogLevel ~= nil then
         return testLogLevel
     end

-    local devValue = DevSettings:getLogLevel()
-    if devValue ~= nil then
-        return devValue
-    end
-
-    return Level.Info
+    return DevSettings:getLogLevel()
 end

 local function addTags(tag, message)
@@ -22,18 +22,18 @@ function Session.new(config)
     api:connect()
         :andThen(function()
             if self.disconnected then
-                return Promise.resolve()
+                return
             end

             return api:read({api.rootInstanceId})
-                :andThen(function(response)
-                    if self.disconnected then
-                        return Promise.resolve()
-                    end
-
-                    self.reconciler:reconcile(response.instances, api.rootInstanceId, game)
-                    return self:__processMessages()
-                end)
+        end)
+        :andThen(function(response)
+            if self.disconnected then
+                return
+            end
+
+            self.reconciler:reconcile(response.instances, api.rootInstanceId, game)
+            return self:__processMessages()
         end)
         :catch(function(message)
             self.disconnected = true
plugin/src/Theme.lua (new file, 20 lines)
@@ -0,0 +1,20 @@
+local Theme = {
+    ButtonFont = Enum.Font.GothamSemibold,
+    InputFont = Enum.Font.Code,
+    TitleFont = Enum.Font.GothamBold,
+    MainFont = Enum.Font.Gotham,
+
+    AccentColor = Color3.fromRGB(136, 0, 27),
+    AccentLightColor = Color3.fromRGB(210, 145, 157),
+    PrimaryColor = Color3.fromRGB(20, 20, 20),
+    SecondaryColor = Color3.fromRGB(235, 235, 235),
+    LightTextColor = Color3.fromRGB(140, 140, 140),
+}
+
+setmetatable(Theme, {
+    __index = function(_, key)
+        error(("%s is not a valid member of Theme"):format(key), 2)
+    end
+})
+
+return Theme
plugin/src/joinBindings.lua (new file, 34 lines)
@@ -0,0 +1,34 @@
+--[[
+    joinBindings is a crazy hack that allows combining multiple Roact bindings
+    in the same spirit as `map`.
+
+    It's implemented in terms of Roact internals that will probably break at
+    some point; please don't do that or use this module in your own code!
+]]
+
+local Binding = require(script:FindFirstAncestor("Rojo").Roact.Binding)
+
+local function evaluate(fun, bindings)
+    local input = {}
+
+    for index, binding in ipairs(bindings) do
+        input[index] = binding:getValue()
+    end
+
+    return fun(unpack(input, 1, #bindings))
+end
+
+local function joinBindings(bindings, joinFunction)
+    local initialValue = evaluate(joinFunction, bindings)
+    local binding, setValue = Binding.create(initialValue)
+
+    for _, binding in ipairs(bindings) do
+        Binding.subscribe(binding, function()
+            setValue(evaluate(joinFunction, bindings))
+        end)
+    end
+
+    return binding
+end
+
+return joinBindings
@@ -1,6 +1,6 @@
 [package]
 name = "rojo"
-version = "0.5.0-alpha.0"
+version = "0.5.0-alpha.3"
 authors = ["Lucien Greathouse <me@lpghatguy.com>"]
 description = "A tool to create robust Roblox projects"
 license = "MIT"
@@ -22,7 +22,7 @@ bundle-plugin = []
 [dependencies]
 clap = "2.27"
 csv = "1.0"
-env_logger = "0.5"
+env_logger = "0.6"
 failure = "0.1.3"
 log = "0.4"
 maplit = "1.0.1"
@@ -35,9 +35,9 @@ serde = "1.0"
 serde_derive = "1.0"
 serde_json = "1.0"
 uuid = { version = "0.7", features = ["v4", "serde"] }
-rbx_tree = "0.1.0"
-rbx_xml = "0.1.0"
-rbx_binary = "0.1.0"
+rbx_tree = "0.2.0"
+rbx_xml = "0.2.0"
+rbx_binary = "0.2.0"

 [dev-dependencies]
 tempfile = "3.0"
server/assets/index.html (new file, 54 lines)
@@ -0,0 +1,54 @@
+<!DOCTYPE html>
+<html>
+<head>
+    <title>Rojo</title>
+    <style>
+        * {
+            margin: 0;
+            padding: 0;
+            font: inherit;
+        }
+
+        html {
+            font-family: sans-serif;
+            height: 100%;
+        }
+
+        body {
+            height: 100%;
+            display: flex;
+            flex-direction: column;
+            justify-content: center;
+        }
+
+        .main {
+            padding: 1rem;
+            text-align: center;
+            margin: 0 auto;
+            width: 100%;
+            max-width: 60rem;
+            background-color: #efefef;
+            border: 1px solid #666;
+            border-radius: 4px;
+        }
+
+        .title {
+            font-size: 2rem;
+            font-weight: bold;
+        }
+
+        .docs {
+            font-size: 1.5rem;
+            font-weight: bold;
+        }
+    </style>
+</head>
+<body>
+
+    <div class="main">
+        <h1 class="title">Rojo Live Sync is up and running!</h1>
+        <a class="docs" href="https://lpghatguy.github.io/rojo">Rojo Documentation</a>
+    </div>
+
+</body>
+</html>
@@ -1,12 +1,12 @@
-#[macro_use] extern crate log;
-
 use std::{
-    path::{Path, PathBuf},
     env,
+    panic,
+    path::{Path, PathBuf},
     process,
 };

-use clap::clap_app;
+use log::error;
+use clap::{clap_app, ArgMatches};

 use librojo::commands;

@@ -20,11 +20,16 @@ fn make_path_absolute(value: &Path) -> PathBuf {
 }

 fn main() {
-    env_logger::Builder::from_default_env()
-        .default_format_timestamp(false)
-        .init();
+    {
+        let log_env = env_logger::Env::default()
+            .default_filter_or("warn");
+
+        env_logger::Builder::from_env(log_env)
+            .default_format_timestamp(false)
+            .init();
+    }

-    let mut app = clap_app!(Rojo =>
+    let app = clap_app!(Rojo =>
         (version: env!("CARGO_PKG_VERSION"))
         (author: env!("CARGO_PKG_AUTHORS"))
         (about: env!("CARGO_PKG_DESCRIPTION"))
@@ -56,117 +61,144 @@ fn main() {
         )
     );

-    // `get_matches` consumes self for some reason.
-    let matches = app.clone().get_matches();
-
-    match matches.subcommand() {
-        ("init", Some(sub_matches)) => {
-            let fuzzy_project_path = make_path_absolute(Path::new(sub_matches.value_of("PATH").unwrap_or("")));
-            let kind = sub_matches.value_of("kind");
-
-            let options = commands::InitOptions {
-                fuzzy_project_path,
-                kind,
-            };
-
-            match commands::init(&options) {
-                Ok(_) => {},
-                Err(e) => {
-                    error!("{}", e);
-                    process::exit(1);
-                },
-            }
-        },
-        ("serve", Some(sub_matches)) => {
-            let fuzzy_project_path = match sub_matches.value_of("PROJECT") {
-                Some(v) => make_path_absolute(Path::new(v)),
-                None => std::env::current_dir().unwrap(),
-            };
-
-            let port = match sub_matches.value_of("port") {
-                Some(v) => match v.parse::<u16>() {
-                    Ok(port) => Some(port),
-                    Err(_) => {
-                        error!("Invalid port {}", v);
-                        process::exit(1);
-                    },
-                },
-                None => None,
-            };
-
-            let options = commands::ServeOptions {
-                fuzzy_project_path,
-                port,
-            };
-
-            match commands::serve(&options) {
-                Ok(_) => {},
-                Err(e) => {
-                    error!("{}", e);
-                    process::exit(1);
-                },
-            }
-        },
-        ("build", Some(sub_matches)) => {
-            let fuzzy_project_path = match sub_matches.value_of("PROJECT") {
-                Some(v) => make_path_absolute(Path::new(v)),
-                None => std::env::current_dir().unwrap(),
-            };
-
-            let output_file = make_path_absolute(Path::new(sub_matches.value_of("output").unwrap()));
-
-            let options = commands::BuildOptions {
-                fuzzy_project_path,
-                output_file,
-                output_kind: None, // TODO: Accept from argument
-            };
-
-            match commands::build(&options) {
-                Ok(_) => {},
-                Err(e) => {
-                    error!("{}", e);
-                    process::exit(1);
-                },
-            }
-        },
-        ("upload", Some(sub_matches)) => {
-            let fuzzy_project_path = match sub_matches.value_of("PROJECT") {
-                Some(v) => make_path_absolute(Path::new(v)),
-                None => std::env::current_dir().unwrap(),
-            };
-
-            let kind = sub_matches.value_of("kind");
-            let security_cookie = sub_matches.value_of("cookie").unwrap();
-
-            let asset_id: u64 = {
-                let arg = sub_matches.value_of("asset_id").unwrap();
-
-                match arg.parse() {
-                    Ok(v) => v,
-                    Err(_) => {
-                        error!("Invalid place ID {}", arg);
-                        process::exit(1);
-                    },
-                }
-            };
-
-            let options = commands::UploadOptions {
-                fuzzy_project_path,
-                security_cookie: security_cookie.to_string(),
-                asset_id,
-                kind,
-            };
-
-            match commands::upload(&options) {
-                Ok(_) => {},
-                Err(e) => {
-                    error!("{}", e);
-                    process::exit(1);
-                },
-            }
-        },
-        _ => {
-            app.print_help().expect("Could not print help text to stdout!");
+    let matches = app.get_matches();
+
+    let result = panic::catch_unwind(|| match matches.subcommand() {
+        ("init", Some(sub_matches)) => start_init(sub_matches),
+        ("serve", Some(sub_matches)) => start_serve(sub_matches),
+        ("build", Some(sub_matches)) => start_build(sub_matches),
+        ("upload", Some(sub_matches)) => start_upload(sub_matches),
+        _ => eprintln!("Usage: rojo <SUBCOMMAND>\nUse 'rojo help' for more help."),
+    });
+
+    if let Err(error) = result {
+        let message = match error.downcast_ref::<&str>() {
+            Some(message) => message.to_string(),
+            None => match error.downcast_ref::<String>() {
+                Some(message) => message.clone(),
+                None => "<no message>".to_string(),
+            },
+        };
+
+        show_crash_message(&message);
+        process::exit(1);
+    }
+}
+
+fn show_crash_message(message: &str) {
+    error!("Rojo crashed!");
+    error!("This is a bug in Rojo.");
+    error!("");
+    error!("Please consider filing a bug: https://github.com/LPGhatguy/rojo/issues");
+    error!("");
+    error!("Details: {}", message);
+}
+
+fn start_init(sub_matches: &ArgMatches) {
+    let fuzzy_project_path = make_path_absolute(Path::new(sub_matches.value_of("PATH").unwrap_or("")));
+    let kind = sub_matches.value_of("kind");
+
+    let options = commands::InitOptions {
+        fuzzy_project_path,
+        kind,
+    };
+
+    match commands::init(&options) {
+        Ok(_) => {},
+        Err(e) => {
+            error!("{}", e);
+            process::exit(1);
+        },
+    }
+}
+
+fn start_serve(sub_matches: &ArgMatches) {
+    let fuzzy_project_path = match sub_matches.value_of("PROJECT") {
+        Some(v) => make_path_absolute(Path::new(v)),
+        None => std::env::current_dir().unwrap(),
+    };
+
+    let port = match sub_matches.value_of("port") {
+        Some(v) => match v.parse::<u16>() {
+            Ok(port) => Some(port),
+            Err(_) => {
+                error!("Invalid port {}", v);
+                process::exit(1);
+            },
+        },
+        None => None,
+    };
+
+    let options = commands::ServeOptions {
+        fuzzy_project_path,
+        port,
+    };
+
+    match commands::serve(&options) {
+        Ok(_) => {},
+        Err(e) => {
+            error!("{}", e);
+            process::exit(1);
+        },
+    }
+}
+
+fn start_build(sub_matches: &ArgMatches) {
+    let fuzzy_project_path = match sub_matches.value_of("PROJECT") {
+        Some(v) => make_path_absolute(Path::new(v)),
+        None => std::env::current_dir().unwrap(),
+    };
+
+    let output_file = make_path_absolute(Path::new(sub_matches.value_of("output").unwrap()));
+
+    let options = commands::BuildOptions {
+        fuzzy_project_path,
+        output_file,
+        output_kind: None, // TODO: Accept from argument
+    };
+
+    match commands::build(&options) {
+        Ok(_) => {},
+        Err(e) => {
+            error!("{}", e);
+            process::exit(1);
+        },
+    }
+}
+
+fn start_upload(sub_matches: &ArgMatches) {
+    let fuzzy_project_path = match sub_matches.value_of("PROJECT") {
+        Some(v) => make_path_absolute(Path::new(v)),
+        None => std::env::current_dir().unwrap(),
+    };
+
+    let kind = sub_matches.value_of("kind");
+    let security_cookie = sub_matches.value_of("cookie").unwrap();
+
+    let asset_id: u64 = {
+        let arg = sub_matches.value_of("asset_id").unwrap();
+
+        match arg.parse() {
+            Ok(v) => v,
+            Err(_) => {
+                error!("Invalid place ID {}", arg);
+                process::exit(1);
+            },
+        }
+    };
+
+    let options = commands::UploadOptions {
+        fuzzy_project_path,
+        security_cookie: security_cookie.to_string(),
+        asset_id,
+        kind,
+    };
+
+    match commands::upload(&options) {
+        Ok(_) => {},
+        Err(e) => {
+            error!("{}", e);
+            process::exit(1);
         },
     }
 }
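The new `main` above wraps subcommand dispatch in `panic::catch_unwind` and downcasts the panic payload to recover a printable message. That payload-recovery pattern can be sketched standalone with only the standard library (`panic_message` is a hypothetical helper name, not part of the Rojo source):

```rust
use std::any::Any;
use std::panic;

// Recover a printable message from a panic payload. The payload is usually
// a `&str` (from `panic!("literal")`) or a `String` (from `panic!("fmt {}", x)`).
fn panic_message(error: &(dyn Any + Send)) -> String {
    match error.downcast_ref::<&str>() {
        Some(message) => message.to_string(),
        None => match error.downcast_ref::<String>() {
            Some(message) => message.clone(),
            None => "<no message>".to_string(),
        },
    }
}

fn main() {
    // Silence the default panic hook so only our own report is printed.
    panic::set_hook(Box::new(|_| {}));

    let result = panic::catch_unwind(|| {
        panic!("invalid port {}", 99999);
    });

    if let Err(error) = result {
        println!("crashed: {}", panic_message(&*error));
    }
}
```

Downcasting to both `&str` and `String` matters because the two `panic!` forms produce different payload types; a formatted panic would slip past a `&str`-only downcast.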
@@ -4,12 +4,13 @@ use std::{
     io,
 };

+use log::info;
 use failure::Fail;

 use crate::{
     rbx_session::construct_oneoff_tree,
     project::{Project, ProjectLoadFuzzyError},
-    imfs::Imfs,
+    imfs::{Imfs, FsError},
 };

 #[derive(Debug, Clone, Copy, PartialEq, Eq)]
@@ -54,32 +55,19 @@ pub enum BuildError {
     XmlModelEncodeError(rbx_xml::EncodeError),

     #[fail(display = "Binary model file error")]
-    BinaryModelEncodeError(rbx_binary::EncodeError)
+    BinaryModelEncodeError(rbx_binary::EncodeError),
+
+    #[fail(display = "{}", _0)]
+    FsError(#[fail(cause)] FsError),
 }

-impl From<ProjectLoadFuzzyError> for BuildError {
-    fn from(error: ProjectLoadFuzzyError) -> BuildError {
-        BuildError::ProjectLoadError(error)
-    }
-}
-
-impl From<io::Error> for BuildError {
-    fn from(error: io::Error) -> BuildError {
-        BuildError::IoError(error)
-    }
-}
-
-impl From<rbx_xml::EncodeError> for BuildError {
-    fn from(error: rbx_xml::EncodeError) -> BuildError {
-        BuildError::XmlModelEncodeError(error)
-    }
-}
-
-impl From<rbx_binary::EncodeError> for BuildError {
-    fn from(error: rbx_binary::EncodeError) -> BuildError {
-        BuildError::BinaryModelEncodeError(error)
-    }
-}
+impl_from!(BuildError {
+    ProjectLoadFuzzyError => ProjectLoadError,
+    io::Error => IoError,
+    rbx_xml::EncodeError => XmlModelEncodeError,
+    rbx_binary::EncodeError => BinaryModelEncodeError,
+    FsError => FsError,
+});

 pub fn build(options: &BuildOptions) -> Result<(), BuildError> {
     let output_kind = options.output_kind
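The `impl_from!` invocation above replaces four hand-written `From` impls, but the macro's definition is not part of this diff. A plausible reconstruction of such a macro follows; the definition and the demo error type below are assumptions for illustration, not the actual Rojo source:

```rust
// Hypothetical impl_from! macro: for each `SourceType => Variant` pair,
// generate an `impl From<SourceType>` that wraps the value in that variant,
// so `?` can convert source errors into the enum automatically.
macro_rules! impl_from {
    ($enum_name:ident {
        $($source:ty => $variant:ident,)*
    }) => {
        $(
            impl From<$source> for $enum_name {
                fn from(error: $source) -> $enum_name {
                    $enum_name::$variant(error)
                }
            }
        )*
    };
}

// Demo enum standing in for BuildError; variants chosen from std so the
// sketch is self-contained.
#[derive(Debug)]
pub enum DemoError {
    IoError(std::io::Error),
    ParseError(std::num::ParseIntError),
}

impl_from!(DemoError {
    std::io::Error => IoError,
    std::num::ParseIntError => ParseError,
});

fn main() {
    // The generated From impl converts a ParseIntError into DemoError.
    let error: DemoError = "not a number".parse::<u32>().unwrap_err().into();
    println!("{:?}", error);
}
```

Collapsing the conversions into a declarative list keeps each error enum's wiring in one place, which is presumably why the diff introduces the macro across `BuildError`, `InitError`, and `ServeError`.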
@@ -91,6 +79,7 @@ pub fn build(options: &BuildOptions) -> Result<(), BuildError> {
     info!("Looking for project at {}", options.fuzzy_project_path.display());

     let project = Project::load_fuzzy(&options.fuzzy_project_path)?;
+    project.check_compatibility();

     info!("Found project at {}", project.file_location.display());
     info!("Using project {:#?}", project);
@@ -15,11 +15,9 @@ pub enum InitError {
     ProjectInitError(#[fail(cause)] ProjectInitError)
 }

-impl From<ProjectInitError> for InitError {
-    fn from(error: ProjectInitError) -> InitError {
-        InitError::ProjectInitError(error)
-    }
-}
+impl_from!(InitError {
+    ProjectInitError => ProjectInitError,
+});

 #[derive(Debug)]
 pub struct InitOptions<'a> {
@@ -3,12 +3,14 @@ use std::{
     sync::Arc,
 };

+use log::info;
 use failure::Fail;

 use crate::{
     project::{Project, ProjectLoadFuzzyError},
     web::Server,
-    session::Session,
+    imfs::FsError,
+    live_session::LiveSession,
 };

 const DEFAULT_PORT: u16 = 34872;
@@ -23,24 +25,27 @@ pub struct ServeOptions {
 pub enum ServeError {
     #[fail(display = "Project load error: {}", _0)]
     ProjectLoadError(#[fail(cause)] ProjectLoadFuzzyError),
+
+    #[fail(display = "{}", _0)]
+    FsError(#[fail(cause)] FsError),
 }

-impl From<ProjectLoadFuzzyError> for ServeError {
-    fn from(error: ProjectLoadFuzzyError) -> ServeError {
-        ServeError::ProjectLoadError(error)
-    }
-}
+impl_from!(ServeError {
+    ProjectLoadFuzzyError => ProjectLoadError,
+    FsError => FsError,
+});

 pub fn serve(options: &ServeOptions) -> Result<(), ServeError> {
     info!("Looking for project at {}", options.fuzzy_project_path.display());

     let project = Arc::new(Project::load_fuzzy(&options.fuzzy_project_path)?);
+    project.check_compatibility();

     info!("Found project at {}", project.file_location.display());
     info!("Using project {:#?}", project);

-    let session = Arc::new(Session::new(Arc::clone(&project)).unwrap());
-    let server = Server::new(Arc::clone(&session));
+    let live_session = Arc::new(LiveSession::new(Arc::clone(&project))?);
+    let server = Server::new(Arc::clone(&live_session));

     let port = options.port
         .or(project.serve_port)
```diff
@@ -3,6 +3,7 @@ use std::{
     io,
 };
 
+use log::info;
 use failure::Fail;
 
 use reqwest::header::{ACCEPT, USER_AGENT, CONTENT_TYPE, COOKIE};
@@ -10,7 +11,7 @@ use reqwest::header::{ACCEPT, USER_AGENT, CONTENT_TYPE, COOKIE};
 use crate::{
     rbx_session::construct_oneoff_tree,
     project::{Project, ProjectLoadFuzzyError},
-    imfs::Imfs,
+    imfs::{Imfs, FsError},
 };
 
 #[derive(Debug, Fail)]
@@ -32,31 +33,18 @@ pub enum UploadError {
 
     #[fail(display = "XML model file error")]
     XmlModelEncodeError(rbx_xml::EncodeError),
+
+    #[fail(display = "{}", _0)]
+    FsError(#[fail(cause)] FsError),
 }
 
-impl From<ProjectLoadFuzzyError> for UploadError {
-    fn from(error: ProjectLoadFuzzyError) -> UploadError {
-        UploadError::ProjectLoadError(error)
-    }
-}
-
-impl From<io::Error> for UploadError {
-    fn from(error: io::Error) -> UploadError {
-        UploadError::IoError(error)
-    }
-}
-
-impl From<reqwest::Error> for UploadError {
-    fn from(error: reqwest::Error) -> UploadError {
-        UploadError::HttpError(error)
-    }
-}
-
-impl From<rbx_xml::EncodeError> for UploadError {
-    fn from(error: rbx_xml::EncodeError) -> UploadError {
-        UploadError::XmlModelEncodeError(error)
-    }
-}
+impl_from!(UploadError {
+    ProjectLoadFuzzyError => ProjectLoadError,
+    io::Error => IoError,
+    reqwest::Error => HttpError,
+    rbx_xml::EncodeError => XmlModelEncodeError,
+    FsError => FsError,
+});
 
 #[derive(Debug)]
 pub struct UploadOptions<'a> {
@@ -72,6 +60,7 @@ pub fn upload(options: &UploadOptions) -> Result<(), UploadError> {
     info!("Looking for project at {}", options.fuzzy_project_path.display());
 
     let project = Project::load_fuzzy(&options.fuzzy_project_path)?;
+    project.check_compatibility();
 
     info!("Found project at {}", project.file_location.display());
     info!("Using project {:#?}", project);
```
```diff
@@ -1,9 +1,12 @@
 use std::{
     sync::{mpsc, Arc, Mutex},
     time::Duration,
+    path::Path,
+    ops::Deref,
     thread,
 };
 
+use log::{warn, trace};
 use notify::{
     self,
     DebouncedEvent,
@@ -19,97 +22,121 @@ use crate::{
 
 const WATCH_TIMEOUT: Duration = Duration::from_millis(100);
 
-fn handle_event(imfs: &Mutex<Imfs>, rbx_session: &Mutex<RbxSession>, event: DebouncedEvent) {
+/// Watches for changes on the filesystem and links together the in-memory
+/// filesystem and in-memory Roblox tree.
+pub struct FsWatcher {
+    watcher: RecommendedWatcher,
+}
+
+impl FsWatcher {
+    /// Start a new FS watcher, watching all of the roots currently attached to
+    /// the given Imfs.
+    ///
+    /// `rbx_session` is optional to make testing easier. If it isn't `None`,
+    /// events will be passed to it after they're given to the Imfs.
+    pub fn start(imfs: Arc<Mutex<Imfs>>, rbx_session: Option<Arc<Mutex<RbxSession>>>) -> FsWatcher {
+        let (watch_tx, watch_rx) = mpsc::channel();
+
+        let mut watcher = notify::watcher(watch_tx, WATCH_TIMEOUT)
+            .expect("Could not create filesystem watcher");
+
+        {
+            let imfs = imfs.lock().unwrap();
+
+            for root_path in imfs.get_roots() {
+                watcher.watch(root_path, RecursiveMode::Recursive)
+                    .expect("Could not watch directory");
+            }
+        }
+
+        {
+            let imfs = Arc::clone(&imfs);
+            let rbx_session = rbx_session.as_ref().map(Arc::clone);
+
+            thread::spawn(move || {
+                trace!("Watcher thread started");
+                while let Ok(event) = watch_rx.recv() {
+                    // handle_fs_event expects an Option<&Mutex<T>>, but we have
+                    // an Option<Arc<Mutex<T>>>, so we coerce with Deref.
+                    let session_ref = rbx_session.as_ref().map(Deref::deref);
+
+                    handle_fs_event(&imfs, session_ref, event);
+                }
+                trace!("Watcher thread stopped");
+            });
+        }
+
+        FsWatcher {
+            watcher,
+        }
+    }
+
+    pub fn stop_watching_path(&mut self, path: &Path) {
+        match self.watcher.unwatch(path) {
+            Ok(_) => {},
+            Err(e) => {
+                warn!("Could not unwatch path {}: {}", path.display(), e);
+            },
+        }
+    }
+}
+
+fn handle_fs_event(imfs: &Mutex<Imfs>, rbx_session: Option<&Mutex<RbxSession>>, event: DebouncedEvent) {
     match event {
         DebouncedEvent::Create(path) => {
+            trace!("Path created: {}", path.display());
+
             {
                 let mut imfs = imfs.lock().unwrap();
                 imfs.path_created(&path).unwrap();
             }
 
-            {
+            if let Some(rbx_session) = rbx_session {
                 let mut rbx_session = rbx_session.lock().unwrap();
                 rbx_session.path_created(&path);
             }
         },
         DebouncedEvent::Write(path) => {
+            trace!("Path created: {}", path.display());
+
             {
                 let mut imfs = imfs.lock().unwrap();
                 imfs.path_updated(&path).unwrap();
             }
 
-            {
+            if let Some(rbx_session) = rbx_session {
                 let mut rbx_session = rbx_session.lock().unwrap();
                 rbx_session.path_updated(&path);
             }
         },
         DebouncedEvent::Remove(path) => {
+            trace!("Path removed: {}", path.display());
+
             {
                 let mut imfs = imfs.lock().unwrap();
                 imfs.path_removed(&path).unwrap();
             }
 
-            {
+            if let Some(rbx_session) = rbx_session {
                 let mut rbx_session = rbx_session.lock().unwrap();
                 rbx_session.path_removed(&path);
             }
         },
         DebouncedEvent::Rename(from_path, to_path) => {
+            trace!("Path renamed: {} to {}", from_path.display(), to_path.display());
+
             {
                 let mut imfs = imfs.lock().unwrap();
                 imfs.path_moved(&from_path, &to_path).unwrap();
             }
 
-            {
+            if let Some(rbx_session) = rbx_session {
                 let mut rbx_session = rbx_session.lock().unwrap();
                 rbx_session.path_renamed(&from_path, &to_path);
             }
         },
-        _ => {},
-    }
-}
-
-/// Watches for changes on the filesystem and links together the in-memory
-/// filesystem and in-memory Roblox tree.
-pub struct FsWatcher {
-    #[allow(unused)]
-    watchers: Vec<RecommendedWatcher>,
-}
-
-impl FsWatcher {
-    pub fn start(imfs: Arc<Mutex<Imfs>>, rbx_session: Arc<Mutex<RbxSession>>) -> FsWatcher {
-        let mut watchers = Vec::new();
-
-        {
-            let imfs_temp = imfs.lock().unwrap();
-
-            for root_path in imfs_temp.get_roots() {
-                let (watch_tx, watch_rx) = mpsc::channel();
-
-                let mut watcher = notify::watcher(watch_tx, WATCH_TIMEOUT)
-                    .expect("Could not create `notify` watcher");
-
-                watcher.watch(root_path, RecursiveMode::Recursive)
-                    .expect("Could not watch directory");
-
-                watchers.push(watcher);
-
-                let imfs = Arc::clone(&imfs);
-                let rbx_session = Arc::clone(&rbx_session);
-                let root_path = root_path.clone();
-
-                thread::spawn(move || {
-                    info!("Watcher thread ({}) started", root_path.display());
-                    while let Ok(event) = watch_rx.recv() {
-                        handle_event(&imfs, &rbx_session, event);
-                    }
-                    info!("Watcher thread ({}) stopped", root_path.display());
-                });
-            }
-        }
-
-        FsWatcher {
-            watchers,
-        }
+        other => {
+            trace!("Unhandled FS event: {:?}", other);
+        },
     }
 }
```
```diff
@@ -1,13 +1,41 @@
 use std::{
     collections::{HashMap, HashSet},
-    path::{Path, PathBuf},
+    path::{self, Path, PathBuf},
+    fmt,
     fs,
     io,
 };
 
+use failure::Fail;
+use serde_derive::{Serialize, Deserialize};
+
 use crate::project::{Project, ProjectNode};
 
-fn add_sync_points(imfs: &mut Imfs, project_node: &ProjectNode) -> io::Result<()> {
+/// A wrapper around io::Error that also attaches the path associated with the
+/// error.
+#[derive(Debug, Fail)]
+pub struct FsError {
+    #[fail(cause)]
+    inner: io::Error,
+    path: PathBuf,
+}
+
+impl FsError {
+    fn new<P: Into<PathBuf>>(inner: io::Error, path: P) -> FsError {
+        FsError {
+            inner,
+            path: path.into(),
+        }
+    }
+}
+
+impl fmt::Display for FsError {
+    fn fmt(&self, output: &mut fmt::Formatter) -> fmt::Result {
+        write!(output, "{}: {}", self.path.display(), self.inner)
+    }
+}
+
+fn add_sync_points(imfs: &mut Imfs, project_node: &ProjectNode) -> Result<(), FsError> {
     match project_node {
         ProjectNode::Instance(node) => {
             for child in node.children.values() {
```
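The `FsError` wrapper introduced in this hunk follows a common pattern: a bare `io::Error` doesn't say which path failed, so the wrapper attaches one for diagnostics. A minimal standalone sketch of the same idea, using plain `std::fmt::Display` instead of the `failure` derive (the `read_file` helper is illustrative, not part of Rojo):

```rust
use std::{fmt, io, path::{Path, PathBuf}};

/// Mirrors the shape of the FsError added above: an io::Error plus the
/// path that produced it, so error messages name the offending file.
#[derive(Debug)]
pub struct FsError {
    inner: io::Error,
    path: PathBuf,
}

impl FsError {
    fn new<P: Into<PathBuf>>(inner: io::Error, path: P) -> FsError {
        FsError { inner, path: path.into() }
    }
}

impl fmt::Display for FsError {
    fn fmt(&self, output: &mut fmt::Formatter) -> fmt::Result {
        // Same format as the diff: "<path>: <io error>"
        write!(output, "{}: {}", self.path.display(), self.inner)
    }
}

// Hypothetical caller: every fs call maps its io::Error into FsError,
// tagging it with the path, exactly as read_from_disk does in the diff.
fn read_file(path: &Path) -> Result<Vec<u8>, FsError> {
    std::fs::read(path).map_err(|e| FsError::new(e, path))
}

fn main() {
    let err = read_file(Path::new("/definitely/missing")).unwrap_err();
    // The rendered message now starts with the path that failed.
    assert!(err.to_string().starts_with("/definitely/missing"));
    println!("ok");
}
```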
```diff
@@ -22,9 +50,12 @@ fn add_sync_points(imfs: &mut Imfs, project_node: &ProjectNode) -> io::Result<()
     Ok(())
 }
 
-/// The in-memory filesystem keeps a mirror of all files being watcher by Rojo
+/// The in-memory filesystem keeps a mirror of all files being watched by Rojo
 /// in order to deduplicate file changes in the case of bidirectional syncing
 /// from Roblox Studio.
+///
+/// It also enables Rojo to quickly generate React-like snapshots to make
+/// reasoning about instances and how they relate to files easier.
 #[derive(Debug, Clone)]
 pub struct Imfs {
     items: HashMap<PathBuf, ImfsItem>,
@@ -39,7 +70,7 @@ impl Imfs {
         }
     }
 
-    pub fn add_roots_from_project(&mut self, project: &Project) -> io::Result<()> {
+    pub fn add_roots_from_project(&mut self, project: &Project) -> Result<(), FsError> {
         add_sync_points(self, &project.tree)
     }
 
@@ -58,30 +89,42 @@ impl Imfs {
         self.items.get(path)
     }
 
-    pub fn add_root(&mut self, path: &Path) -> io::Result<()> {
+    pub fn add_root(&mut self, path: &Path) -> Result<(), FsError> {
         debug_assert!(path.is_absolute());
         debug_assert!(!self.is_within_roots(path));
 
         self.roots.insert(path.to_path_buf());
 
-        self.read_from_disk(path)
+        self.descend_and_read_from_disk(path)
     }
 
-    pub fn path_created(&mut self, path: &Path) -> io::Result<()> {
+    pub fn remove_root(&mut self, path: &Path) {
+        debug_assert!(path.is_absolute());
+
+        if self.roots.get(path).is_some() {
+            self.remove_item(path);
+
+            if let Some(parent_path) = path.parent() {
+                self.unlink_child(parent_path, path);
+            }
+        }
+    }
+
+    pub fn path_created(&mut self, path: &Path) -> Result<(), FsError> {
         debug_assert!(path.is_absolute());
         debug_assert!(self.is_within_roots(path));
 
-        self.read_from_disk(path)
+        self.descend_and_read_from_disk(path)
     }
 
-    pub fn path_updated(&mut self, path: &Path) -> io::Result<()> {
+    pub fn path_updated(&mut self, path: &Path) -> Result<(), FsError> {
         debug_assert!(path.is_absolute());
         debug_assert!(self.is_within_roots(path));
 
-        self.read_from_disk(path)
+        self.descend_and_read_from_disk(path)
     }
 
-    pub fn path_removed(&mut self, path: &Path) -> io::Result<()> {
+    pub fn path_removed(&mut self, path: &Path) -> Result<(), FsError> {
         debug_assert!(path.is_absolute());
         debug_assert!(self.is_within_roots(path));
 
@@ -94,12 +137,7 @@ impl Imfs {
         Ok(())
     }
 
-    pub fn path_moved(&mut self, from_path: &Path, to_path: &Path) -> io::Result<()> {
-        debug_assert!(from_path.is_absolute());
-        debug_assert!(self.is_within_roots(from_path));
-        debug_assert!(to_path.is_absolute());
-        debug_assert!(self.is_within_roots(to_path));
-
+    pub fn path_moved(&mut self, from_path: &Path, to_path: &Path) -> Result<(), FsError> {
         self.path_removed(from_path)?;
         self.path_created(to_path)?;
         Ok(())
@@ -130,9 +168,7 @@ impl Imfs {
             Some(ImfsItem::Directory(directory)) => {
                 directory.children.remove(child);
             },
-            _ => {
-                panic!("Tried to unlink child of path that wasn't a directory!");
-            },
+            _ => {},
         }
     }
 
@@ -151,11 +187,44 @@ impl Imfs {
         }
     }
 
-    fn read_from_disk(&mut self, path: &Path) -> io::Result<()> {
-        let metadata = fs::metadata(path)?;
+    fn descend_and_read_from_disk(&mut self, path: &Path) -> Result<(), FsError> {
+        let root_path = self.get_root_path(path)
+            .expect("Tried to descent and read for path that wasn't within roots!");
+
+        // If this path is a root, we should read the entire thing.
+        if root_path == path {
+            self.read_from_disk(path)?;
+            return Ok(());
+        }
+
+        let relative_path = path.strip_prefix(root_path).unwrap();
+        let mut current_path = root_path.to_path_buf();
+
+        for component in relative_path.components() {
+            match component {
+                path::Component::Normal(name) => {
+                    let next_path = current_path.join(name);
+
+                    if self.items.contains_key(&next_path) {
+                        current_path = next_path;
+                    } else {
+                        break;
+                    }
+                },
+                _ => unreachable!(),
+            }
+        }
+
+        self.read_from_disk(&current_path)
+    }
+
+    fn read_from_disk(&mut self, path: &Path) -> Result<(), FsError> {
+        let metadata = fs::metadata(path)
+            .map_err(|e| FsError::new(e, path))?;
 
         if metadata.is_file() {
-            let contents = fs::read(path)?;
+            let contents = fs::read(path)
+                .map_err(|e| FsError::new(e, path))?;
             let item = ImfsItem::File(ImfsFile {
                 path: path.to_path_buf(),
                 contents,
@@ -176,8 +245,13 @@ impl Imfs {
 
             self.items.insert(path.to_path_buf(), item);
 
-            for entry in fs::read_dir(path)? {
-                let entry = entry?;
+            let dir_children = fs::read_dir(path)
+                .map_err(|e| FsError::new(e, path))?;
+
+            for entry in dir_children {
+                let entry = entry
+                    .map_err(|e| FsError::new(e, path))?;
+
                 let child_path = entry.path();
 
                 self.read_from_disk(&child_path)?;
@@ -193,6 +267,16 @@ impl Imfs {
         }
     }
 
+    fn get_root_path<'a>(&'a self, path: &Path) -> Option<&'a Path> {
+        for root_path in &self.roots {
+            if path.starts_with(root_path) {
+                return Some(root_path)
+            }
+        }
+
+        None
+    }
+
     fn is_within_roots(&self, path: &Path) -> bool {
         for root_path in &self.roots {
             if path.starts_with(root_path) {
```
server/src/impl_from.rs (new file, 18 additions)

```diff
@@ -0,0 +1,18 @@
+/// Implements 'From' for a list of variants, intended for use with error enums
+/// that are wrapping a number of errors from other methods.
+#[macro_export]
+macro_rules! impl_from {
+    (
+        $enum_name: ident {
+            $($error_type: ty => $variant_name: ident),* $(,)*
+        }
+    ) => {
+        $(
+            impl From<$error_type> for $enum_name {
+                fn from(error: $error_type) -> $enum_name {
+                    $enum_name::$variant_name(error)
+                }
+            }
+        )*
+    }
+}
```
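This macro replaces the repetitive hand-written `From` impls removed throughout this commit, so `?` keeps working in functions returning the wrapper enum. A self-contained sketch of how it is used, with a hypothetical `DemoError` enum standing in for Rojo's error types:

```rust
// Same macro shape as introduced in server/src/impl_from.rs above.
macro_rules! impl_from {
    (
        $enum_name: ident {
            $($error_type: ty => $variant_name: ident),* $(,)*
        }
    ) => {
        $(
            impl From<$error_type> for $enum_name {
                fn from(error: $error_type) -> $enum_name {
                    $enum_name::$variant_name(error)
                }
            }
        )*
    }
}

// Hypothetical error enum, analogous to ServeError / UploadError.
#[derive(Debug)]
enum DemoError {
    Io(std::io::Error),
    Parse(std::num::ParseIntError),
}

// Expands to one `impl From<...> for DemoError` per mapping.
impl_from!(DemoError {
    std::io::Error => Io,
    std::num::ParseIntError => Parse,
});

fn parse(input: &str) -> Result<i32, DemoError> {
    // `?` works because the macro generated From<ParseIntError>.
    Ok(input.trim().parse::<i32>()?)
}

fn main() {
    assert!(matches!(parse("42"), Ok(42)));
    assert!(matches!(parse("not a number"), Err(DemoError::Parse(_))));
    println!("ok");
}
```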
```diff
@@ -1,13 +1,8 @@
+// Macros
 #[macro_use]
-extern crate log;
+pub mod impl_from;
 
-#[macro_use]
-extern crate serde_derive;
-
-#[cfg(test)]
-extern crate tempfile;
-
-// pub mod roblox_studio;
+// Other modules
 pub mod commands;
 pub mod fs_watcher;
 pub mod imfs;
@@ -16,8 +11,9 @@ pub mod path_map;
 pub mod project;
 pub mod rbx_session;
 pub mod rbx_snapshot;
-pub mod session;
+pub mod live_session;
 pub mod session_id;
+pub mod snapshot_reconciler;
 pub mod visualize;
 pub mod web;
 pub mod web_util;
```
```diff
@@ -1,19 +1,19 @@
 use std::{
     sync::{Arc, Mutex},
-    io,
 };
 
 use crate::{
+    fs_watcher::FsWatcher,
+    imfs::{Imfs, FsError},
     message_queue::MessageQueue,
     project::Project,
-    imfs::Imfs,
-    session_id::SessionId,
     rbx_session::RbxSession,
-    rbx_snapshot::InstanceChanges,
-    fs_watcher::FsWatcher,
+    session_id::SessionId,
+    snapshot_reconciler::InstanceChanges,
 };
 
-pub struct Session {
+/// Contains all of the state for a Rojo live-sync session.
+pub struct LiveSession {
     pub project: Arc<Project>,
     pub session_id: SessionId,
     pub message_queue: Arc<MessageQueue<InstanceChanges>>,
@@ -22,8 +22,8 @@ pub struct Session {
     _fs_watcher: FsWatcher,
 }
 
-impl Session {
-    pub fn new(project: Arc<Project>) -> io::Result<Session> {
+impl LiveSession {
+    pub fn new(project: Arc<Project>) -> Result<LiveSession, FsError> {
         let imfs = {
             let mut imfs = Imfs::new();
             imfs.add_roots_from_project(&project)?;
@@ -40,12 +40,12 @@ impl Session {
 
         let fs_watcher = FsWatcher::start(
             Arc::clone(&imfs),
-            Arc::clone(&rbx_session),
+            Some(Arc::clone(&rbx_session)),
         );
 
         let session_id = SessionId::new();
 
-        Ok(Session {
+        Ok(LiveSession {
             project,
             session_id,
             message_queue,
```
```diff
@@ -19,6 +19,10 @@ pub fn get_listener_id() -> ListenerId {
     ListenerId(LAST_ID.fetch_add(1, Ordering::SeqCst))
 }
 
+/// A message queue with persistent history that can be subscribed to.
+///
+/// Definitely non-optimal, but a simple design that works well for the
+/// synchronous web server Rojo uses, Rouille.
+#[derive(Default)]
 pub struct MessageQueue<T> {
     messages: RwLock<Vec<T>>,
```
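A rough sketch of the persistent-history design that doc comment describes: messages accumulate in an `RwLock<Vec<T>>` and subscribers poll for everything past a cursor. The cursor-based API here is an assumption for illustration only, not Rojo's actual interface:

```rust
use std::sync::RwLock;

// Illustrative stand-in for the MessageQueue above: history never
// shrinks, so a reader can always catch up from any earlier point.
#[derive(Default)]
struct MessageQueue<T> {
    messages: RwLock<Vec<T>>,
}

impl<T: Clone> MessageQueue<T> {
    fn push_messages(&self, new_messages: &[T]) {
        self.messages.write().unwrap().extend_from_slice(new_messages);
    }

    /// Hypothetical polling helper: returns every message at or after
    /// `cursor`, plus the cursor to use on the next poll.
    fn get_messages_since(&self, cursor: usize) -> (Vec<T>, usize) {
        let messages = self.messages.read().unwrap();
        let start = cursor.min(messages.len());
        (messages[start..].to_vec(), messages.len())
    }
}

fn main() {
    let queue: MessageQueue<&str> = MessageQueue::default();
    queue.push_messages(&["created /a", "updated /b"]);

    let (batch, cursor) = queue.get_messages_since(0);
    assert_eq!(batch, vec!["created /a", "updated /b"]);

    // A later poll only sees what arrived after the cursor.
    queue.push_messages(&["removed /c"]);
    let (batch, _) = queue.get_messages_since(cursor);
    assert_eq!(batch, vec!["removed /c"]);
    println!("ok");
}
```

This fits a synchronous server like Rouille because each request can block briefly, read under the lock, and return immediately with whatever history it missed.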
```diff
@@ -1,16 +1,21 @@
 use std::{
+    collections::hash_map,
     path::{self, Path, PathBuf},
     collections::{HashMap, HashSet},
 };
 
+use serde_derive::Serialize;
+use log::warn;
+
 #[derive(Debug, Serialize)]
 struct PathMapNode<T> {
     value: T,
     children: HashSet<PathBuf>,
 }
 
-/// A map from paths to instance IDs, with a bit of additional data that enables
-/// removing a path and all of its child paths from the tree more quickly.
+/// A map from paths to another type, like instance IDs, with a bit of
+/// additional data that enables removing a path and all of its child paths from
+/// the tree more quickly.
 #[derive(Debug, Serialize)]
 pub struct PathMap<T> {
     nodes: HashMap<PathBuf, PathMapNode<T>>,
@@ -27,6 +32,16 @@ impl<T> PathMap<T> {
         self.nodes.get(path).map(|v| &v.value)
     }
 
+    pub fn get_mut(&mut self, path: &Path) -> Option<&mut T> {
+        self.nodes.get_mut(path).map(|v| &mut v.value)
+    }
+
+    pub fn entry<'a>(&'a mut self, path: PathBuf) -> Entry<'a, T> {
+        Entry {
+            internal: self.nodes.entry(path),
+        }
+    }
+
     pub fn insert(&mut self, path: PathBuf, value: T) {
         if let Some(parent_path) = path.parent() {
             if let Some(parent) = self.nodes.get_mut(parent_path) {
@@ -71,6 +86,14 @@ impl<T> PathMap<T> {
         Some(root_value)
     }
 
+    /// Traverses the route between `start_path` and `target_path` and returns
+    /// the path closest to `target_path` in the tree.
+    ///
+    /// This is useful when trying to determine what paths need to be marked as
+    /// altered when a change to a path is registered. Depending on the order of
+    /// FS events, a file remove event could be followed by that file's
+    /// directory being removed, in which case we should process that
+    /// directory's parent.
     pub fn descend(&self, start_path: &Path, target_path: &Path) -> PathBuf {
         let relative_path = target_path.strip_prefix(start_path)
             .expect("target_path did not begin with start_path");
@@ -93,4 +116,28 @@ impl<T> PathMap<T> {
 
         current_path
     }
+}
+
+pub struct Entry<'a, T> {
+    internal: hash_map::Entry<'a, PathBuf, PathMapNode<T>>,
+}
+
+impl<'a, T> Entry<'a, T> {
+    pub fn or_insert(self, value: T) -> &'a mut T {
+        &mut self.internal.or_insert(PathMapNode {
+            value,
+            children: HashSet::new(),
+        }).value
+    }
+}
+
+impl<'a, T> Entry<'a, T>
+    where T: Default
+{
+    pub fn or_default(self) -> &'a mut T {
+        &mut self.internal.or_insert(PathMapNode {
+            value: Default::default(),
+            children: HashSet::new(),
+        }).value
+    }
 }
```
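The `entry` API added above mirrors `std::collections::hash_map::Entry`: look a path up once and either get the existing value or insert a default in the same operation. A minimal sketch of the `or_default` usage pattern over a plain `HashMap` (this stripped-down `PathMap` stand-in omits the child-path bookkeeping the real type keeps):

```rust
use std::collections::HashMap;
use std::path::{Path, PathBuf};

// Illustrative stand-in: values keyed by path, no parent/child links.
struct PathMap<T> {
    nodes: HashMap<PathBuf, T>,
}

impl<T: Default> PathMap<T> {
    fn new() -> Self {
        PathMap { nodes: HashMap::new() }
    }

    /// Equivalent of `map.entry(path).or_default()`: returns a mutable
    /// reference, inserting `T::default()` the first time a path is seen.
    fn entry_or_default(&mut self, path: PathBuf) -> &mut T {
        self.nodes.entry(path).or_insert_with(T::default)
    }
}

fn main() {
    let mut map: PathMap<Vec<&str>> = PathMap::new();
    // First call inserts an empty Vec; second call reuses it.
    map.entry_or_default(PathBuf::from("/a")).push("first");
    map.entry_or_default(PathBuf::from("/a")).push("second");
    assert_eq!(map.nodes[Path::new("/a")], vec!["first", "second"]);
    println!("ok");
}
```

The win over `get_mut` followed by `insert` is a single hash lookup and no awkward borrow dance when the key may be absent.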
@@ -6,13 +6,17 @@ use std::{
|
|||||||
path::{Path, PathBuf},
|
path::{Path, PathBuf},
|
||||||
};
|
};
|
||||||
|
|
||||||
use maplit::hashmap;
|
use log::warn;
|
||||||
use failure::Fail;
|
use failure::Fail;
|
||||||
|
use maplit::hashmap;
|
||||||
use rbx_tree::RbxValue;
|
use rbx_tree::RbxValue;
|
||||||
|
use serde_derive::{Serialize, Deserialize};
|
||||||
|
|
||||||
pub static PROJECT_FILENAME: &'static str = "roblox-project.json";
|
pub static PROJECT_FILENAME: &'static str = "default.project.json";
|
||||||
|
pub static COMPAT_PROJECT_FILENAME: &'static str = "roblox-project.json";
|
||||||
|
|
||||||
// Serde is silly.
|
// Methods used for Serde's default value system, which doesn't support using
|
||||||
|
// value literals directly, only functions that return values.
|
||||||
const fn yeah() -> bool {
|
const fn yeah() -> bool {
|
||||||
true
|
true
|
||||||
}
|
}
|
||||||
@@ -21,6 +25,40 @@ const fn is_true(value: &bool) -> bool {
|
|||||||
*value
|
*value
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/// SourceProject is the format that users author projects on-disk. Since we
|
||||||
|
/// want to do things like transforming paths to be absolute before handing them
|
||||||
|
/// off to the rest of Rojo, we use this intermediate struct.
|
||||||
|
#[derive(Debug, Serialize, Deserialize)]
|
||||||
|
#[serde(rename_all = "camelCase")]
|
||||||
|
struct SourceProject {
|
||||||
|
name: String,
|
||||||
|
tree: SourceProjectNode,
|
||||||
|
|
||||||
|
#[serde(skip_serializing_if = "Option::is_none")]
|
||||||
|
serve_port: Option<u16>,
|
||||||
|
|
||||||
|
#[serde(skip_serializing_if = "Option::is_none")]
|
||||||
|
serve_place_ids: Option<HashSet<u64>>,
|
||||||
|
}
|
||||||
|
|
||||||
|
impl SourceProject {
|
||||||
|
/// Consumes the SourceProject and yields a Project, ready for prime-time.
|
||||||
|
pub fn into_project(self, project_file_location: &Path) -> Project {
|
||||||
|
let tree = self.tree.into_project_node(project_file_location);
|
||||||
|
|
||||||
|
Project {
|
||||||
|
name: self.name,
|
||||||
|
tree,
|
||||||
|
serve_port: self.serve_port,
|
||||||
|
serve_place_ids: self.serve_place_ids,
|
||||||
|
file_location: PathBuf::from(project_file_location),
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
/// Similar to SourceProject, the structure of nodes in the project tree is
|
||||||
|
/// slightly different on-disk than how we want to handle them in the rest of
|
||||||
|
/// Rojo.
|
||||||
#[derive(Debug, Serialize, Deserialize)]
|
#[derive(Debug, Serialize, Deserialize)]
|
||||||
#[serde(untagged)]
|
#[serde(untagged)]
|
||||||
enum SourceProjectNode {
|
enum SourceProjectNode {
|
||||||
@@ -44,6 +82,7 @@ enum SourceProjectNode {
 }
 
 impl SourceProjectNode {
+    /// Consumes the SourceProjectNode and turns it into a ProjectNode.
     pub fn into_project_node(self, project_file_location: &Path) -> ProjectNode {
         match self {
             SourceProjectNode::Instance { class_name, mut children, properties, ignore_unknown_instances } => {
@@ -78,31 +117,7 @@ impl SourceProjectNode {
         }
     }
 }
 
-#[derive(Debug, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-struct SourceProject {
-    name: String,
-    tree: SourceProjectNode,
-    #[serde(skip_serializing_if = "Option::is_none")]
-    serve_port: Option<u16>,
-    #[serde(skip_serializing_if = "Option::is_none")]
-    serve_place_ids: Option<HashSet<u64>>,
-}
-
-impl SourceProject {
-    pub fn into_project(self, project_file_location: &Path) -> Project {
-        let tree = self.tree.into_project_node(project_file_location);
-
-        Project {
-            name: self.name,
-            tree,
-            serve_port: self.serve_port,
-            serve_place_ids: self.serve_place_ids,
-            file_location: PathBuf::from(project_file_location),
-        }
-    }
-}
-
+/// Error returned by Project::load_exact
 #[derive(Debug, Fail)]
 pub enum ProjectLoadExactError {
     #[fail(display = "IO error: {}", _0)]
@@ -112,6 +127,7 @@ pub enum ProjectLoadExactError {
     JsonError(#[fail(cause)] serde_json::Error),
 }
 
+/// Error returned by Project::load_fuzzy
 #[derive(Debug, Fail)]
 pub enum ProjectLoadFuzzyError {
     #[fail(display = "Project not found")]
@@ -133,6 +149,7 @@ impl From<ProjectLoadExactError> for ProjectLoadFuzzyError {
     }
 }
 
+/// Error returned by Project::init_place and Project::init_model
 #[derive(Debug, Fail)]
 pub enum ProjectInitError {
     AlreadyExists(PathBuf),
@@ -150,6 +167,7 @@ impl fmt::Display for ProjectInitError {
     }
 }
 
+/// Error returned by Project::save
 #[derive(Debug, Fail)]
 pub enum ProjectSaveError {
     #[fail(display = "JSON error: {}", _0)]
@@ -340,17 +358,23 @@ impl Project {
         // TODO: Check for specific error kinds, convert 'not found' to Result.
         let location_metadata = fs::metadata(start_location).ok()?;
 
-        // If this is a file, we should assume it's the config we want
+        // If this is a file, assume it's the config the user was looking for.
         if location_metadata.is_file() {
             return Some(start_location.to_path_buf());
         } else if location_metadata.is_dir() {
             let with_file = start_location.join(PROJECT_FILENAME);
 
-            if let Ok(with_file_metadata) = fs::metadata(&with_file) {
-                if with_file_metadata.is_file() {
+            if let Ok(file_metadata) = fs::metadata(&with_file) {
+                if file_metadata.is_file() {
                     return Some(with_file);
-                } else {
-                    return None;
+                }
+            }
 
+            let with_compat_file = start_location.join(COMPAT_PROJECT_FILENAME);
+
+            if let Ok(file_metadata) = fs::metadata(&with_compat_file) {
+                if file_metadata.is_file() {
+                    return Some(with_compat_file);
                 }
             }
         }
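The lookup order the hunk above establishes — take a file path as-is, otherwise probe a directory first for the new project file name and then for the legacy one — can be sketched standalone. This is a minimal sketch, not the project's actual code: `locate_project_file` is an invented name, and the two constants are taken from the changelog's description of the rename (`default.project.json` replacing `roblox-project.json`).

```rust
use std::fs;
use std::path::{Path, PathBuf};

// File names per the 0.5.0-alpha.3 changelog; the real constants live elsewhere.
const PROJECT_FILENAME: &str = "default.project.json";
const COMPAT_PROJECT_FILENAME: &str = "roblox-project.json";

/// Hypothetical standalone version of the lookup in the diff above:
/// a file is returned directly, a directory is probed for the current
/// name first and the legacy name second.
fn locate_project_file(start_location: &Path) -> Option<PathBuf> {
    let location_metadata = fs::metadata(start_location).ok()?;

    if location_metadata.is_file() {
        return Some(start_location.to_path_buf());
    }

    for name in [PROJECT_FILENAME, COMPAT_PROJECT_FILENAME] {
        let candidate = start_location.join(name);
        if fs::metadata(&candidate).map(|m| m.is_file()).unwrap_or(false) {
            return Some(candidate);
        }
    }

    None
}
```

Checking the new name before the compat name is what keeps a renamed project file authoritative even while the legacy file still exists on disk.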
@@ -389,6 +413,25 @@ impl Project {
         Ok(())
     }
 
+    /// Checks if there are any compatibility issues with this project file and
+    /// warns the user if there are any.
+    pub fn check_compatibility(&self) {
+        let file_name = self.file_location
+            .file_name().unwrap()
+            .to_str().expect("Project file path was not valid Unicode!");
+
+        if file_name == COMPAT_PROJECT_FILENAME {
+            warn!("Rojo's default project file name changed in 0.5.0-alpha3.");
+            warn!("Support for the old project file name will be dropped before 0.5.0 releases.");
+            warn!("Your project file is named {}", COMPAT_PROJECT_FILENAME);
+            warn!("Rename your project file to {}", PROJECT_FILENAME);
+        } else if !file_name.ends_with(".project.json") {
+            warn!("Starting in Rojo 0.5.0-alpha3, it's recommended to give all project files the");
+            warn!(".project.json extension. This helps Rojo differentiate project files from");
+            warn!("other JSON files!");
+        }
+    }
+
     fn to_source_project(&self) -> SourceProject {
         SourceProject {
             name: self.name.clone(),
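The `check_compatibility` warnings added above reduce to a three-way classification of the project file name. A minimal sketch of that decision, pulled out of the logging: the enum and function names here are invented for illustration and do not appear in the diff.

```rust
// File names per the 0.5.0-alpha.3 changelog.
const PROJECT_FILENAME: &str = "default.project.json";
const COMPAT_PROJECT_FILENAME: &str = "roblox-project.json";

/// Hypothetical classification mirroring check_compatibility's branches.
#[derive(Debug, PartialEq)]
enum NameCompat {
    /// Name ends in `.project.json`; no warning is emitted.
    Current,
    /// The pre-alpha.3 default name; a deprecation warning is emitted.
    LegacyName,
    /// Any other name; a recommendation to use `.project.json` is emitted.
    MissingExtension,
}

fn classify_project_file_name(file_name: &str) -> NameCompat {
    if file_name == COMPAT_PROJECT_FILENAME {
        NameCompat::LegacyName
    } else if !file_name.ends_with(".project.json") {
        NameCompat::MissingExtension
    } else {
        NameCompat::Current
    }
}
```

Note that the legacy-name check runs first: `roblox-project.json` does not end in `.project.json`, so reversing the branches would misreport it as merely missing the extension.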
@@ -1,33 +1,47 @@
 use std::{
     borrow::Cow,
     collections::HashMap,
-    fmt,
     path::{Path, PathBuf},
     str,
     sync::{Arc, Mutex},
 };
 
-use failure::Fail;
-use rbx_tree::{RbxTree, RbxInstanceProperties, RbxValue, RbxId};
+use serde_derive::{Serialize, Deserialize};
+use log::{info, trace};
+use rbx_tree::{RbxTree, RbxId};
 
 use crate::{
-    project::{Project, ProjectNode, InstanceProjectNodeMetadata},
+    project::Project,
     message_queue::MessageQueue,
-    imfs::{Imfs, ImfsItem, ImfsFile},
+    imfs::{Imfs, ImfsItem},
     path_map::PathMap,
-    rbx_snapshot::{RbxSnapshotInstance, InstanceChanges, snapshot_from_tree, reify_root, reconcile_subtree},
+    rbx_snapshot::{SnapshotContext, snapshot_project_tree, snapshot_imfs_path},
+    snapshot_reconciler::{InstanceChanges, reify_root, reconcile_subtree},
 };
 
 const INIT_SCRIPT: &str = "init.lua";
 const INIT_SERVER_SCRIPT: &str = "init.server.lua";
 const INIT_CLIENT_SCRIPT: &str = "init.client.lua";
 
+#[derive(Debug, Clone, Default, Serialize, Deserialize)]
+pub struct MetadataPerPath {
+    pub instance_id: Option<RbxId>,
+    pub instance_name: Option<String>,
+}
+
+#[derive(Debug, Clone, Default, Serialize, Deserialize)]
+pub struct MetadataPerInstance {
+    pub source_path: Option<PathBuf>,
+    pub ignore_unknown_instances: bool,
+}
+
 pub struct RbxSession {
     tree: RbxTree,
-    path_map: PathMap<RbxId>,
-    instance_metadata_map: HashMap<RbxId, InstanceProjectNodeMetadata>,
-    sync_point_names: HashMap<PathBuf, String>,
+    // TODO(#105): Change metadata_per_path to PathMap<Vec<MetadataPerPath>> for
+    // path aliasing.
+    metadata_per_path: PathMap<MetadataPerPath>,
+    metadata_per_instance: HashMap<RbxId, MetadataPerInstance>,
     message_queue: Arc<MessageQueue<InstanceChanges>>,
     imfs: Arc<Mutex<Imfs>>,
 }
@@ -38,20 +52,18 @@ impl RbxSession {
         imfs: Arc<Mutex<Imfs>>,
         message_queue: Arc<MessageQueue<InstanceChanges>>,
     ) -> RbxSession {
-        let mut sync_point_names = HashMap::new();
-        let mut path_map = PathMap::new();
-        let mut instance_metadata_map = HashMap::new();
+        let mut metadata_per_path = PathMap::new();
+        let mut metadata_per_instance = HashMap::new();
 
         let tree = {
             let temp_imfs = imfs.lock().unwrap();
-            construct_initial_tree(&project, &temp_imfs, &mut path_map, &mut instance_metadata_map, &mut sync_point_names)
+            reify_initial_tree(&project, &temp_imfs, &mut metadata_per_path, &mut metadata_per_instance)
         };
 
         RbxSession {
             tree,
-            path_map,
-            instance_metadata_map,
-            sync_point_names,
+            metadata_per_path,
+            metadata_per_instance,
             message_queue,
             imfs,
         }
@@ -68,8 +80,7 @@ impl RbxSession {
             .expect("Path was outside in-memory filesystem roots");
 
         // Find the closest instance in the tree that currently exists
-        let mut path_to_snapshot = self.path_map.descend(root_path, path);
-        let &instance_id = self.path_map.get(&path_to_snapshot).unwrap();
+        let mut path_to_snapshot = self.metadata_per_path.descend(root_path, path);
 
         // If this is a file that might affect its parent if modified, we
         // should snapshot its parent instead.
@@ -82,7 +93,22 @@ impl RbxSession {
 
         trace!("Snapshotting path {}", path_to_snapshot.display());
 
-        let maybe_snapshot = snapshot_instances_from_imfs(&imfs, &path_to_snapshot, &mut self.sync_point_names)
+        let path_metadata = self.metadata_per_path.get(&path_to_snapshot).unwrap();
+
+        trace!("Metadata for path: {:?}", path_metadata);
+
+        let instance_id = path_metadata.instance_id
+            .expect("Instance did not exist in tree");
+
+        // If this instance is a sync point, pull its name out of our
+        // per-path metadata store.
+        let instance_name = path_metadata.instance_name.as_ref()
+            .map(|value| Cow::Owned(value.to_owned()));
+
+        let mut context = SnapshotContext {
+            metadata_per_path: &mut self.metadata_per_path,
+        };
+        let maybe_snapshot = snapshot_imfs_path(&imfs, &mut context, &path_to_snapshot, instance_name)
             .unwrap_or_else(|_| panic!("Could not generate instance snapshot for path {}", path_to_snapshot.display()));
 
         let snapshot = match maybe_snapshot {
@@ -99,8 +125,8 @@ impl RbxSession {
             &mut self.tree,
             instance_id,
             &snapshot,
-            &mut self.path_map,
-            &mut self.instance_metadata_map,
+            &mut self.metadata_per_path,
+            &mut self.metadata_per_instance,
             &mut changes,
         );
     }
@@ -127,10 +153,14 @@ impl RbxSession {
         // If the path doesn't exist or is a directory, we don't care if it
         // updated
         match imfs.get(path) {
-            Some(ImfsItem::Directory(_)) | None => {
+            Some(ImfsItem::Directory(_)) => {
                 trace!("Updated path was a directory, ignoring.");
                 return;
             },
+            None => {
+                trace!("Updated path did not exist in IMFS, ignoring.");
+                return;
+            },
             Some(ImfsItem::File(_)) => {},
         }
     }
@@ -140,13 +170,13 @@ impl RbxSession {
 
     pub fn path_removed(&mut self, path: &Path) {
         info!("Path removed: {}", path.display());
-        self.path_map.remove(path);
+        self.metadata_per_path.remove(path);
         self.path_created_or_updated(path);
     }
 
     pub fn path_renamed(&mut self, from_path: &Path, to_path: &Path) {
         info!("Path renamed from {} to {}", from_path.display(), to_path.display());
-        self.path_map.remove(from_path);
+        self.metadata_per_path.remove(from_path);
         self.path_created_or_updated(from_path);
         self.path_created_or_updated(to_path);
     }
@@ -155,385 +185,36 @@ impl RbxSession {
         &self.tree
     }
 
-    pub fn get_instance_metadata(&self, id: RbxId) -> Option<&InstanceProjectNodeMetadata> {
-        self.instance_metadata_map.get(&id)
+    pub fn get_instance_metadata(&self, id: RbxId) -> Option<&MetadataPerInstance> {
+        self.metadata_per_instance.get(&id)
     }
 
-    pub fn debug_get_path_map(&self) -> &PathMap<RbxId> {
-        &self.path_map
+    pub fn debug_get_metadata_per_path(&self) -> &PathMap<MetadataPerPath> {
+        &self.metadata_per_path
     }
 }
 
 pub fn construct_oneoff_tree(project: &Project, imfs: &Imfs) -> RbxTree {
-    let mut path_map = PathMap::new();
-    let mut instance_metadata_map = HashMap::new();
-    let mut sync_point_names = HashMap::new();
-    construct_initial_tree(project, imfs, &mut path_map, &mut instance_metadata_map, &mut sync_point_names)
+    let mut metadata_per_path = PathMap::new();
+    let mut metadata_per_instance = HashMap::new();
+    reify_initial_tree(project, imfs, &mut metadata_per_path, &mut metadata_per_instance)
 }
 
-fn construct_initial_tree(
+fn reify_initial_tree(
     project: &Project,
     imfs: &Imfs,
-    path_map: &mut PathMap<RbxId>,
-    instance_metadata_map: &mut HashMap<RbxId, InstanceProjectNodeMetadata>,
-    sync_point_names: &mut HashMap<PathBuf, String>,
+    metadata_per_path: &mut PathMap<MetadataPerPath>,
+    metadata_per_instance: &mut HashMap<RbxId, MetadataPerInstance>,
 ) -> RbxTree {
-    let snapshot = construct_project_node(
-        imfs,
-        &project.name,
-        &project.tree,
-        sync_point_names,
-    );
+    let mut context = SnapshotContext {
+        metadata_per_path,
+    };
+    let snapshot = snapshot_project_tree(imfs, &mut context, project)
+        .expect("Could not snapshot project tree")
+        .expect("Project did not produce any instances");
 
     let mut changes = InstanceChanges::default();
-    let tree = reify_root(&snapshot, path_map, instance_metadata_map, &mut changes);
+    let tree = reify_root(&snapshot, metadata_per_path, metadata_per_instance, &mut changes);
 
     tree
 }
 
-fn construct_project_node<'a>(
-    imfs: &'a Imfs,
-    instance_name: &'a str,
-    project_node: &'a ProjectNode,
-    sync_point_names: &mut HashMap<PathBuf, String>,
-) -> RbxSnapshotInstance<'a> {
-    match project_node {
-        ProjectNode::Instance(node) => {
-            let mut children = Vec::new();
-
-            for (child_name, child_project_node) in &node.children {
-                children.push(construct_project_node(imfs, child_name, child_project_node, sync_point_names));
-            }
-
-            RbxSnapshotInstance {
-                class_name: Cow::Borrowed(&node.class_name),
-                name: Cow::Borrowed(instance_name),
-                properties: node.properties.clone(),
-                children,
-                source_path: None,
-                metadata: Some(node.metadata.clone()),
-            }
-        },
-        ProjectNode::SyncPoint(node) => {
-            // TODO: Propagate errors upward instead of dying
-            let mut snapshot = snapshot_instances_from_imfs(imfs, &node.path, sync_point_names)
-                .expect("Could not reify nodes from Imfs")
-                .expect("Sync point node did not result in an instance");
-
-            snapshot.name = Cow::Borrowed(instance_name);
-            sync_point_names.insert(node.path.clone(), instance_name.to_string());
-
-            snapshot
-        },
-    }
-}
-
-#[derive(Debug, Clone, Copy)]
-enum FileType {
-    ModuleScript,
-    ServerScript,
-    ClientScript,
-    StringValue,
-    LocalizationTable,
-    XmlModel,
-    BinaryModel,
-}
-
-fn get_trailing<'a>(input: &'a str, trailer: &str) -> Option<&'a str> {
-    if input.ends_with(trailer) {
-        let end = input.len().saturating_sub(trailer.len());
-        Some(&input[..end])
-    } else {
-        None
-    }
-}
-
-fn classify_file(file: &ImfsFile) -> Option<(&str, FileType)> {
-    static EXTENSIONS_TO_TYPES: &[(&str, FileType)] = &[
-        (".server.lua", FileType::ServerScript),
-        (".client.lua", FileType::ClientScript),
-        (".lua", FileType::ModuleScript),
-        (".csv", FileType::LocalizationTable),
-        (".txt", FileType::StringValue),
-        (".rbxmx", FileType::XmlModel),
-        (".rbxm", FileType::BinaryModel),
-    ];
-
-    let file_name = file.path.file_name()?.to_str()?;
-
-    for (extension, file_type) in EXTENSIONS_TO_TYPES {
-        if let Some(instance_name) = get_trailing(file_name, extension) {
-            return Some((instance_name, *file_type))
-        }
-    }
-
-    None
-}
-
-#[derive(Debug, Serialize, Deserialize)]
-#[serde(rename_all = "PascalCase")]
-struct LocalizationEntryCsv {
-    key: String,
-    context: String,
-    example: String,
-    source: String,
-    #[serde(flatten)]
-    values: HashMap<String, String>,
-}
-
-impl LocalizationEntryCsv {
-    fn to_json(self) -> LocalizationEntryJson {
-        LocalizationEntryJson {
-            key: self.key,
-            context: self.context,
-            example: self.example,
-            source: self.source,
-            values: self.values,
-        }
-    }
-}
-
-#[derive(Debug, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-struct LocalizationEntryJson {
-    key: String,
-    context: String,
-    example: String,
-    source: String,
-    values: HashMap<String, String>,
-}
-
-#[derive(Debug, Fail)]
-enum SnapshotError {
-    DidNotExist(PathBuf),
-
-    // TODO: Add file path to the error message?
-    Utf8Error {
-        #[fail(cause)]
-        inner: str::Utf8Error,
-        path: PathBuf,
-    },
-
-    XmlModelDecodeError {
-        inner: rbx_xml::DecodeError,
-        path: PathBuf,
-    },
-
-    BinaryModelDecodeError {
-        inner: rbx_binary::DecodeError,
-        path: PathBuf,
-    },
-}
-
-impl fmt::Display for SnapshotError {
-    fn fmt(&self, output: &mut fmt::Formatter) -> fmt::Result {
-        match self {
-            SnapshotError::DidNotExist(path) => write!(output, "Path did not exist: {}", path.display()),
-            SnapshotError::Utf8Error { inner, path } => {
-                write!(output, "Invalid UTF-8: {} in path {}", inner, path.display())
-            },
-            SnapshotError::XmlModelDecodeError { inner, path } => {
-                write!(output, "Malformed rbxmx model: {:?} in path {}", inner, path.display())
-            },
-            SnapshotError::BinaryModelDecodeError { inner, path } => {
-                write!(output, "Malformed rbxm model: {:?} in path {}", inner, path.display())
-            },
-        }
-    }
-}
-
-fn snapshot_xml_model<'a>(
-    instance_name: Cow<'a, str>,
-    file: &ImfsFile,
-) -> Result<Option<RbxSnapshotInstance<'a>>, SnapshotError> {
-    let mut temp_tree = RbxTree::new(RbxInstanceProperties {
-        name: "Temp".to_owned(),
-        class_name: "Folder".to_owned(),
-        properties: HashMap::new(),
-    });
-
-    let root_id = temp_tree.get_root_id();
-    rbx_xml::decode(&mut temp_tree, root_id, file.contents.as_slice())
-        .map_err(|inner| SnapshotError::XmlModelDecodeError {
-            inner,
-            path: file.path.clone(),
-        })?;
-
-    let root_instance = temp_tree.get_instance(root_id).unwrap();
-    let children = root_instance.get_children_ids();
-
-    match children.len() {
-        0 => Ok(None),
-        1 => {
-            let mut snapshot = snapshot_from_tree(&temp_tree, children[0]).unwrap();
-            snapshot.name = instance_name;
-            Ok(Some(snapshot))
-        },
-        _ => panic!("Rojo doesn't have support for model files with multiple roots yet"),
-    }
-}
-
-fn snapshot_binary_model<'a>(
-    instance_name: Cow<'a, str>,
-    file: &ImfsFile,
-) -> Result<Option<RbxSnapshotInstance<'a>>, SnapshotError> {
-    let mut temp_tree = RbxTree::new(RbxInstanceProperties {
-        name: "Temp".to_owned(),
-        class_name: "Folder".to_owned(),
-        properties: HashMap::new(),
-    });
-
-    let root_id = temp_tree.get_root_id();
-    rbx_binary::decode(&mut temp_tree, root_id, file.contents.as_slice())
-        .map_err(|inner| SnapshotError::BinaryModelDecodeError {
-            inner,
-            path: file.path.clone(),
-        })?;
-
-    let root_instance = temp_tree.get_instance(root_id).unwrap();
-    let children = root_instance.get_children_ids();
-
-    match children.len() {
-        0 => Ok(None),
-        1 => {
-            let mut snapshot = snapshot_from_tree(&temp_tree, children[0]).unwrap();
-            snapshot.name = instance_name;
-            Ok(Some(snapshot))
-        },
-        _ => panic!("Rojo doesn't have support for model files with multiple roots yet"),
-    }
-}
-
-fn snapshot_instances_from_imfs<'a>(
-    imfs: &'a Imfs,
-    imfs_path: &Path,
-    sync_point_names: &HashMap<PathBuf, String>,
-) -> Result<Option<RbxSnapshotInstance<'a>>, SnapshotError> {
-    match imfs.get(imfs_path) {
-        Some(ImfsItem::File(file)) => {
-            let (instance_name, file_type) = match classify_file(file) {
-                Some(info) => info,
-                None => return Ok(None),
-            };
-
-            let instance_name = if let Some(actual_name) = sync_point_names.get(imfs_path) {
-                Cow::Owned(actual_name.clone())
-            } else {
-                Cow::Borrowed(instance_name)
-            };
-
-            let class_name = match file_type {
-                FileType::ModuleScript => "ModuleScript",
-                FileType::ServerScript => "Script",
-                FileType::ClientScript => "LocalScript",
-                FileType::StringValue => "StringValue",
-                FileType::LocalizationTable => "LocalizationTable",
-                FileType::XmlModel => return snapshot_xml_model(instance_name, file),
-                FileType::BinaryModel => return snapshot_binary_model(instance_name, file),
-            };
-
-            let contents = str::from_utf8(&file.contents)
-                .map_err(|inner| SnapshotError::Utf8Error {
-                    inner,
-                    path: imfs_path.to_path_buf(),
-                })?;
-
-            let mut properties = HashMap::new();
-
-            match file_type {
-                FileType::ModuleScript | FileType::ServerScript | FileType::ClientScript => {
-                    properties.insert(String::from("Source"), RbxValue::String {
-                        value: contents.to_string(),
-                    });
-                },
-                FileType::StringValue => {
-                    properties.insert(String::from("Value"), RbxValue::String {
-                        value: contents.to_string(),
-                    });
-                },
-                FileType::LocalizationTable => {
-                    let entries: Vec<LocalizationEntryJson> = csv::Reader::from_reader(contents.as_bytes())
-                        .deserialize()
-                        .map(|result| result.expect("Malformed localization table found!"))
-                        .map(LocalizationEntryCsv::to_json)
-                        .collect();
-
-                    let table_contents = serde_json::to_string(&entries)
-                        .expect("Could not encode JSON for localization table");
-
-                    properties.insert(String::from("Contents"), RbxValue::String {
-                        value: table_contents,
-                    });
-                },
-                FileType::XmlModel | FileType::BinaryModel => unreachable!(),
-            }
-
-            Ok(Some(RbxSnapshotInstance {
-                name: instance_name,
-                class_name: Cow::Borrowed(class_name),
-                properties,
-                children: Vec::new(),
-                source_path: Some(file.path.clone()),
-                metadata: None,
-            }))
-        },
-        Some(ImfsItem::Directory(directory)) => {
-            // TODO: Expand init support to handle server and client scripts
-            let init_path = directory.path.join(INIT_SCRIPT);
-            let init_server_path = directory.path.join(INIT_SERVER_SCRIPT);
-            let init_client_path = directory.path.join(INIT_CLIENT_SCRIPT);
-
-            let mut instance = if directory.children.contains(&init_path) {
-                snapshot_instances_from_imfs(imfs, &init_path, sync_point_names)?
-                    .expect("Could not snapshot instance from file that existed!")
-            } else if directory.children.contains(&init_server_path) {
-                snapshot_instances_from_imfs(imfs, &init_server_path, sync_point_names)?
-                    .expect("Could not snapshot instance from file that existed!")
-            } else if directory.children.contains(&init_client_path) {
-                snapshot_instances_from_imfs(imfs, &init_client_path, sync_point_names)?
-                    .expect("Could not snapshot instance from file that existed!")
-            } else {
-                RbxSnapshotInstance {
-                    class_name: Cow::Borrowed("Folder"),
-                    name: Cow::Borrowed(""),
-                    properties: HashMap::new(),
-                    children: Vec::new(),
-                    source_path: Some(directory.path.clone()),
-                    metadata: None,
-                }
-            };
-
-            // We have to be careful not to lose instance names that are
-            // specified in the project manifest. We store them in
-            // sync_point_names when the original tree is constructed.
-            instance.name = if let Some(actual_name) = sync_point_names.get(&directory.path) {
-                Cow::Owned(actual_name.clone())
-            } else {
-                Cow::Borrowed(directory.path
-                    .file_name().expect("Could not extract file name")
-                    .to_str().expect("Could not convert path to UTF-8"))
-            };
-
-            for child_path in &directory.children {
-                match child_path.file_name().unwrap().to_str().unwrap() {
-                    INIT_SCRIPT | INIT_SERVER_SCRIPT | INIT_CLIENT_SCRIPT => {
-                        // The existence of files with these names modifies the
-                        // parent instance and is handled above, so we can skip
-                        // them here.
-                    },
-                    _ => {
-                        match snapshot_instances_from_imfs(imfs, child_path, sync_point_names)? {
-                            Some(child) => {
-                                instance.children.push(child);
-                            },
-                            None => {},
-                        }
-                    },
-                }
-            }
-
-            Ok(Some(instance))
-        },
-        None => Err(SnapshotError::DidNotExist(imfs_path.to_path_buf())),
-    }
 }
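The `classify_file` logic removed above (and relocated by this refactor into the new snapshot module) hinges on a small suffix-stripping helper plus an ordered extension table. A standalone sketch of that idea, with assumptions flagged: `classify_extension` is an invented stand-in for `classify_file`, the table is trimmed to the text-based entries, and it maps straight to Roblox class names instead of the diff's `FileType` enum.

```rust
/// Strips `trailer` from the end of `input`, returning the remaining prefix;
/// same behavior as the get_trailing helper in the code removed above.
fn get_trailing<'a>(input: &'a str, trailer: &str) -> Option<&'a str> {
    if input.ends_with(trailer) {
        let end = input.len().saturating_sub(trailer.len());
        Some(&input[..end])
    } else {
        None
    }
}

/// Hypothetical condensed classifier: returns (instance name, class name).
fn classify_extension(file_name: &str) -> Option<(&str, &'static str)> {
    // Order matters: ".server.lua" and ".client.lua" must be tried
    // before the bare ".lua" suffix, or every script would classify
    // as a ModuleScript.
    static EXTENSIONS_TO_CLASSES: &[(&str, &'static str)] = &[
        (".server.lua", "Script"),
        (".client.lua", "LocalScript"),
        (".lua", "ModuleScript"),
        (".csv", "LocalizationTable"),
        (".txt", "StringValue"),
    ];

    for (extension, class_name) in EXTENSIONS_TO_CLASSES {
        if let Some(instance_name) = get_trailing(file_name, extension) {
            return Some((instance_name, class_name));
        }
    }

    None
}
```

Because the compound suffixes are checked first, `foo.server.lua` yields the instance name `foo` rather than `foo.server`.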
@@ -1,307 +1,544 @@
|
|||||||
use std::{
|
use std::{
|
||||||
str,
|
|
||||||
borrow::Cow,
|
borrow::Cow,
|
||||||
collections::{HashMap, HashSet},
|
collections::HashMap,
|
||||||
fmt,
|
fmt,
|
||||||
path::PathBuf,
|
path::{Path, PathBuf},
|
||||||
|
str,
|
||||||
};
|
};
|
||||||
|
|
||||||
use rbx_tree::{RbxTree, RbxId, RbxInstanceProperties, RbxValue};
|
use failure::Fail;
use log::info;
use maplit::hashmap;
use rbx_tree::{RbxTree, RbxValue, RbxInstanceProperties};
use serde_derive::{Serialize, Deserialize};

use crate::{
    imfs::{
        Imfs,
        ImfsItem,
        ImfsFile,
        ImfsDirectory,
    },
    project::{
        Project,
        ProjectNode,
        InstanceProjectNode,
        SyncPointProjectNode,
    },
    snapshot_reconciler::{
        RbxSnapshotInstance,
        snapshot_from_tree,
    },
    path_map::PathMap,
    // TODO: Move MetadataPerPath into this module?
    rbx_session::{MetadataPerPath, MetadataPerInstance},
};

const INIT_MODULE_NAME: &str = "init.lua";
const INIT_SERVER_NAME: &str = "init.server.lua";
const INIT_CLIENT_NAME: &str = "init.client.lua";

pub type SnapshotResult<'a> = Result<Option<RbxSnapshotInstance<'a>>, SnapshotError>;

pub struct SnapshotContext<'meta> {
    pub metadata_per_path: &'meta mut PathMap<MetadataPerPath>,
}

#[derive(Debug, Fail)]
pub enum SnapshotError {
    DidNotExist(PathBuf),

    Utf8Error {
        #[fail(cause)]
        inner: str::Utf8Error,
        path: PathBuf,
    },

    JsonModelDecodeError {
        inner: serde_json::Error,
        path: PathBuf,
    },

    XmlModelDecodeError {
        inner: rbx_xml::DecodeError,
        path: PathBuf,
    },

    BinaryModelDecodeError {
        inner: rbx_binary::DecodeError,
        path: PathBuf,
    },
}

impl fmt::Display for SnapshotError {
    fn fmt(&self, output: &mut fmt::Formatter) -> fmt::Result {
        match self {
            SnapshotError::DidNotExist(path) => write!(output, "Path did not exist: {}", path.display()),
            SnapshotError::Utf8Error { inner, path } => {
                write!(output, "Invalid UTF-8: {} in path {}", inner, path.display())
            },
            SnapshotError::JsonModelDecodeError { inner, path } => {
                write!(output, "Malformed .model.json model: {:?} in path {}", inner, path.display())
            },
            SnapshotError::XmlModelDecodeError { inner, path } => {
                write!(output, "Malformed rbxmx model: {:?} in path {}", inner, path.display())
            },
            SnapshotError::BinaryModelDecodeError { inner, path } => {
                write!(output, "Malformed rbxm model: {:?} in path {}", inner, path.display())
            },
        }
    }
}
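The hand-written `Display` implementation above follows the usual pattern for `failure`-based error enums: one match arm per variant, each producing a human-readable message. A minimal, dependency-free sketch of the same pattern (the `DemoError` type is hypothetical, not part of Rojo):

```rust
use std::fmt;
use std::path::PathBuf;

// Hypothetical error type mirroring the shape of a snapshot error,
// without the `failure` derive so it compiles with only the standard library.
#[derive(Debug)]
enum DemoError {
    DidNotExist(PathBuf),
}

impl fmt::Display for DemoError {
    fn fmt(&self, output: &mut fmt::Formatter) -> fmt::Result {
        match self {
            DemoError::DidNotExist(path) => {
                write!(output, "Path did not exist: {}", path.display())
            },
        }
    }
}

fn main() {
    let err = DemoError::DidNotExist(PathBuf::from("src/missing.lua"));
    // Display gives the friendly message; Debug keeps the raw structure.
    println!("{}", err);
}
```

Keeping `Display` manual while deriving `Debug` lets error messages stay user-facing without losing the structural form for logs.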
pub fn snapshot_project_tree<'source>(
    imfs: &'source Imfs,
    context: &mut SnapshotContext,
    project: &'source Project,
) -> SnapshotResult<'source> {
    snapshot_project_node(imfs, context, &project.tree, Cow::Borrowed(&project.name))
}

fn snapshot_project_node<'source>(
    imfs: &'source Imfs,
    context: &mut SnapshotContext,
    node: &'source ProjectNode,
    instance_name: Cow<'source, str>,
) -> SnapshotResult<'source> {
    match node {
        ProjectNode::Instance(instance_node) => snapshot_instance_node(imfs, context, instance_node, instance_name),
        ProjectNode::SyncPoint(sync_node) => snapshot_sync_point_node(imfs, context, sync_node, instance_name),
    }
}

fn snapshot_instance_node<'source>(
    imfs: &'source Imfs,
    context: &mut SnapshotContext,
    node: &'source InstanceProjectNode,
    instance_name: Cow<'source, str>,
) -> SnapshotResult<'source> {
    let mut children = Vec::new();

    for (child_name, child_project_node) in &node.children {
        if let Some(child) = snapshot_project_node(imfs, context, child_project_node, Cow::Borrowed(child_name))? {
            children.push(child);
        }
    }

    Ok(Some(RbxSnapshotInstance {
        class_name: Cow::Borrowed(&node.class_name),
        name: instance_name,
        properties: node.properties.clone(),
        children,
        metadata: MetadataPerInstance {
            source_path: None,
            ignore_unknown_instances: node.metadata.ignore_unknown_instances,
        },
    }))
}
fn snapshot_sync_point_node<'source>(
    imfs: &'source Imfs,
    context: &mut SnapshotContext,
    node: &'source SyncPointProjectNode,
    instance_name: Cow<'source, str>,
) -> SnapshotResult<'source> {
    let maybe_snapshot = snapshot_imfs_path(imfs, context, &node.path, Some(instance_name))?;

    // If the snapshot resulted in no instances, like if it targets an unknown
    // file or an empty model file, we can early-return.
    let snapshot = match maybe_snapshot {
        Some(snapshot) => snapshot,
        None => return Ok(None),
    };

    // Otherwise, we can log the name of the sync point we just snapshotted.
    let path_meta = context.metadata_per_path.entry(node.path.to_owned()).or_default();
    path_meta.instance_name = Some(snapshot.name.clone().into_owned());

    Ok(Some(snapshot))
}
pub fn snapshot_imfs_path<'source>(
    imfs: &'source Imfs,
    context: &mut SnapshotContext,
    path: &Path,
    instance_name: Option<Cow<'source, str>>,
) -> SnapshotResult<'source> {
    // If the given path doesn't exist in the in-memory filesystem, we consider
    // that an error.
    match imfs.get(path) {
        Some(imfs_item) => snapshot_imfs_item(imfs, context, imfs_item, instance_name),
        None => Err(SnapshotError::DidNotExist(path.to_owned())),
    }
}
fn snapshot_imfs_item<'source>(
    imfs: &'source Imfs,
    context: &mut SnapshotContext,
    item: &'source ImfsItem,
    instance_name: Option<Cow<'source, str>>,
) -> SnapshotResult<'source> {
    match item {
        ImfsItem::File(file) => snapshot_imfs_file(file, instance_name),
        ImfsItem::Directory(directory) => snapshot_imfs_directory(imfs, context, directory, instance_name),
    }
}
fn snapshot_imfs_directory<'source>(
    imfs: &'source Imfs,
    context: &mut SnapshotContext,
    directory: &'source ImfsDirectory,
    instance_name: Option<Cow<'source, str>>,
) -> SnapshotResult<'source> {
    let init_path = directory.path.join(INIT_MODULE_NAME);
    let init_server_path = directory.path.join(INIT_SERVER_NAME);
    let init_client_path = directory.path.join(INIT_CLIENT_NAME);

    let snapshot_name = instance_name
        .unwrap_or_else(|| {
            Cow::Borrowed(directory.path
                .file_name().expect("Could not extract file name")
                .to_str().expect("Could not convert path to UTF-8"))
        });

    let mut snapshot = if directory.children.contains(&init_path) {
        snapshot_imfs_path(imfs, context, &init_path, Some(snapshot_name))?.unwrap()
    } else if directory.children.contains(&init_server_path) {
        snapshot_imfs_path(imfs, context, &init_server_path, Some(snapshot_name))?.unwrap()
    } else if directory.children.contains(&init_client_path) {
        snapshot_imfs_path(imfs, context, &init_client_path, Some(snapshot_name))?.unwrap()
    } else {
        RbxSnapshotInstance {
            class_name: Cow::Borrowed("Folder"),
            name: snapshot_name,
            properties: HashMap::new(),
            children: Vec::new(),
            metadata: MetadataPerInstance {
                source_path: None,
                ignore_unknown_instances: false,
            },
        }
    };

    snapshot.metadata.source_path = Some(directory.path.to_owned());

    for child_path in &directory.children {
        let child_name = child_path
            .file_name().expect("Couldn't extract file name")
            .to_str().expect("Couldn't convert file name to UTF-8");

        match child_name {
            INIT_MODULE_NAME | INIT_SERVER_NAME | INIT_CLIENT_NAME => {
                // The existence of files with these names modifies the
                // parent instance and is handled above, so we can skip
                // them here.
            },
            _ => {
                if let Some(child) = snapshot_imfs_path(imfs, context, child_path, None)? {
                    snapshot.children.push(child);
                }
            },
        }
    }

    Ok(Some(snapshot))
}
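The init-file convention implemented above determines what class a directory turns into: a directory containing one of the special `init.*` files takes its class from that file instead of becoming a plain `Folder`. A dependency-free sketch of just that decision (the `directory_class` helper is illustrative, not Rojo's API; the real code also renames the init script to the directory's name and merges in the remaining children):

```rust
// Simplified sketch of the init-file convention: the class comes from
// whichever special file is present, falling back to a Folder.
fn directory_class(children: &[&str]) -> &'static str {
    if children.contains(&"init.lua") {
        "ModuleScript"
    } else if children.contains(&"init.server.lua") {
        "Script"
    } else if children.contains(&"init.client.lua") {
        "LocalScript"
    } else {
        "Folder"
    }
}

fn main() {
    // A directory with init.server.lua becomes a Script holding its siblings.
    println!("{}", directory_class(&["init.server.lua", "helper.lua"]));
    // A directory with no init file stays a plain Folder.
    println!("{}", directory_class(&["readme.txt"]));
}
```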
fn snapshot_imfs_file<'source>(
    file: &'source ImfsFile,
    instance_name: Option<Cow<'source, str>>,
) -> SnapshotResult<'source> {
    let extension = file.path.extension()
        .map(|v| v.to_str().expect("Could not convert extension to UTF-8"));

    let mut maybe_snapshot = match extension {
        Some("lua") => snapshot_lua_file(file)?,
        Some("csv") => snapshot_csv_file(file)?,
        Some("txt") => snapshot_txt_file(file)?,
        Some("rbxmx") => snapshot_xml_model_file(file)?,
        Some("rbxm") => snapshot_binary_model_file(file)?,
        Some("json") => {
            let file_stem = file.path
                .file_stem().expect("Could not extract file stem")
                .to_str().expect("Could not convert path to UTF-8");

            if file_stem.ends_with(".model") {
                snapshot_json_model_file(file)?
            } else {
                None
            }
        },
        Some(_) | None => None,
    };

    if let Some(snapshot) = maybe_snapshot.as_mut() {
        // Carefully preserve name from project manifest if present.
        if let Some(snapshot_name) = instance_name {
            snapshot.name = snapshot_name;
        }
    } else {
        info!("File generated no snapshot: {}", file.path.display());
    }

    Ok(maybe_snapshot)
}
fn snapshot_lua_file<'source>(
    file: &'source ImfsFile,
) -> SnapshotResult<'source> {
    let file_stem = file.path
        .file_stem().expect("Could not extract file stem")
        .to_str().expect("Could not convert path to UTF-8");

    let (instance_name, class_name) = if let Some(name) = match_trailing(file_stem, ".server") {
        (name, "Script")
    } else if let Some(name) = match_trailing(file_stem, ".client") {
        (name, "LocalScript")
    } else {
        (file_stem, "ModuleScript")
    };

    let contents = str::from_utf8(&file.contents)
        .map_err(|inner| SnapshotError::Utf8Error {
            inner,
            path: file.path.to_path_buf(),
        })?;

    Ok(Some(RbxSnapshotInstance {
        name: Cow::Borrowed(instance_name),
        class_name: Cow::Borrowed(class_name),
        properties: hashmap! {
            "Source".to_owned() => RbxValue::String {
                value: contents.to_owned(),
            },
        },
        children: Vec::new(),
        metadata: MetadataPerInstance {
            source_path: Some(file.path.to_path_buf()),
            ignore_unknown_instances: false,
        },
    }))
}
fn match_trailing<'a>(input: &'a str, trailer: &str) -> Option<&'a str> {
    if input.ends_with(trailer) {
        let end = input.len().saturating_sub(trailer.len());
        Some(&input[..end])
    } else {
        None
    }
}
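Pulled out on its own, the `.server`/`.client` suffix rule that drives Lua file classification can be exercised directly. `classify` below is an illustrative wrapper around a copy of `match_trailing`, not part of the codebase:

```rust
// Copy of match_trailing from the snapshot code: strips a trailing marker
// like ".server" from a file stem if it is present.
fn match_trailing<'a>(input: &'a str, trailer: &str) -> Option<&'a str> {
    if input.ends_with(trailer) {
        let end = input.len().saturating_sub(trailer.len());
        Some(&input[..end])
    } else {
        None
    }
}

// Illustrative wrapper: maps a file stem to (instance name, Roblox class),
// mirroring the branch in snapshot_lua_file.
fn classify(file_stem: &str) -> (String, &'static str) {
    if let Some(name) = match_trailing(file_stem, ".server") {
        (name.to_owned(), "Script")
    } else if let Some(name) = match_trailing(file_stem, ".client") {
        (name.to_owned(), "LocalScript")
    } else {
        (file_stem.to_owned(), "ModuleScript")
    }
}

fn main() {
    println!("{:?}", classify("shop.server"));
    println!("{:?}", classify("ui.client"));
    println!("{:?}", classify("util"));
}
```

So `shop.server.lua` becomes a `Script` named `shop`, while a bare `util.lua` stays a `ModuleScript`.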
fn snapshot_txt_file<'source>(
    file: &'source ImfsFile,
) -> SnapshotResult<'source> {
    let instance_name = file.path
        .file_stem().expect("Could not extract file stem")
        .to_str().expect("Could not convert path to UTF-8");

    let contents = str::from_utf8(&file.contents)
        .map_err(|inner| SnapshotError::Utf8Error {
            inner,
            path: file.path.to_path_buf(),
        })?;

    Ok(Some(RbxSnapshotInstance {
        name: Cow::Borrowed(instance_name),
        class_name: Cow::Borrowed("StringValue"),
        properties: hashmap! {
            "Value".to_owned() => RbxValue::String {
                value: contents.to_owned(),
            },
        },
        children: Vec::new(),
        metadata: MetadataPerInstance {
            source_path: Some(file.path.to_path_buf()),
            ignore_unknown_instances: false,
        },
    }))
}
fn snapshot_csv_file<'source>(
    file: &'source ImfsFile,
) -> SnapshotResult<'source> {
    let instance_name = file.path
        .file_stem().expect("Could not extract file stem")
        .to_str().expect("Could not convert path to UTF-8");

    let entries: Vec<LocalizationEntryJson> = csv::Reader::from_reader(file.contents.as_slice())
        .deserialize()
        // TODO: Propagate error upward instead of panicking
        .map(|result| result.expect("Malformed localization table found!"))
        .map(LocalizationEntryCsv::to_json)
        .collect();

    let table_contents = serde_json::to_string(&entries)
        .expect("Could not encode JSON for localization table");

    Ok(Some(RbxSnapshotInstance {
        name: Cow::Borrowed(instance_name),
        class_name: Cow::Borrowed("LocalizationTable"),
        properties: hashmap! {
            "Contents".to_owned() => RbxValue::String {
                value: table_contents,
            },
        },
        children: Vec::new(),
        metadata: MetadataPerInstance {
            source_path: Some(file.path.to_path_buf()),
            ignore_unknown_instances: false,
        },
    }))
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "PascalCase")]
struct LocalizationEntryCsv {
    key: String,
    context: String,
    example: String,
    source: String,

    #[serde(flatten)]
    values: HashMap<String, String>,
}

impl LocalizationEntryCsv {
    fn to_json(self) -> LocalizationEntryJson {
        LocalizationEntryJson {
            key: self.key,
            context: self.context,
            example: self.example,
            source: self.source,
            values: self.values,
        }
    }
}

#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
struct LocalizationEntryJson {
    key: String,
    context: String,
    example: String,
    source: String,
    values: HashMap<String, String>,
}
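The two structs above exist only to rename fields: the CSV side reads PascalCase headers (`Key`, `Context`, ...) and the JSON written into the LocalizationTable's `Contents` property uses camelCase keys (`key`, `context`, ...), while the flattened `values` map carries locale columns through unchanged. A dependency-free sketch of the header rename that serde's `rename_all` attributes perform for the fixed fields (illustrative only):

```rust
// Lowercase the first character of a PascalCase header to get the
// camelCase JSON key, e.g. "Key" -> "key". The locale columns captured by
// the flattened `values` map are not renamed in the real code.
fn csv_header_to_json_key(header: &str) -> String {
    let mut chars = header.chars();
    match chars.next() {
        Some(first) => first.to_lowercase().collect::<String>() + chars.as_str(),
        None => String::new(),
    }
}

fn main() {
    println!("{}", csv_header_to_json_key("Key"));
    println!("{}", csv_header_to_json_key("Example"));
}
```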
fn snapshot_json_model_file<'source>(
    file: &'source ImfsFile,
) -> SnapshotResult<'source> {
    let contents = str::from_utf8(&file.contents)
        .map_err(|inner| SnapshotError::Utf8Error {
            inner,
            path: file.path.to_owned(),
        })?;

    let json_instance: JsonModelInstance = serde_json::from_str(contents)
        .map_err(|inner| SnapshotError::JsonModelDecodeError {
            inner,
            path: file.path.to_owned(),
        })?;

    let mut snapshot = json_instance.into_snapshot();
    snapshot.metadata.source_path = Some(file.path.to_owned());

    Ok(Some(snapshot))
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "PascalCase")]
struct JsonModelInstance {
    name: String,
    class_name: String,

    #[serde(default = "Vec::new", skip_serializing_if = "Vec::is_empty")]
    children: Vec<JsonModelInstance>,

    #[serde(default = "HashMap::new", skip_serializing_if = "HashMap::is_empty")]
    properties: HashMap<String, RbxValue>,
}

impl JsonModelInstance {
    fn into_snapshot(mut self) -> RbxSnapshotInstance<'static> {
        let children = self.children
            .drain(..)
            .map(JsonModelInstance::into_snapshot)
            .collect();

        RbxSnapshotInstance {
            name: Cow::Owned(self.name),
            class_name: Cow::Owned(self.class_name),
            properties: self.properties,
            children,
            metadata: Default::default(),
        }
    }
}
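For reference, a minimal document in the shape `JsonModelInstance` deserializes: PascalCase keys per the `rename_all` attribute, with `Children` and `Properties` optional thanks to their `default` attributes. The instance names and classes here are illustrative, and the encoding of entries inside `Properties` follows rbx_tree's `RbxValue` serialization, which is not shown:

```json
{
  "Name": "Greeter",
  "ClassName": "Folder",
  "Children": [
    {
      "Name": "Message",
      "ClassName": "StringValue"
    }
  ]
}
```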
fn snapshot_xml_model_file<'source>(
    file: &'source ImfsFile,
) -> SnapshotResult<'source> {
    let instance_name = file.path
        .file_stem().expect("Could not extract file stem")
        .to_str().expect("Could not convert path to UTF-8");

    let mut temp_tree = RbxTree::new(RbxInstanceProperties {
        name: "Temp".to_owned(),
        class_name: "Folder".to_owned(),
        properties: HashMap::new(),
    });

    let root_id = temp_tree.get_root_id();
    rbx_xml::decode(&mut temp_tree, root_id, file.contents.as_slice())
        .map_err(|inner| SnapshotError::XmlModelDecodeError {
            inner,
            path: file.path.clone(),
        })?;

    let root_instance = temp_tree.get_instance(root_id).unwrap();
    let children = root_instance.get_children_ids();

    match children.len() {
        0 => Ok(None),
        1 => {
            let mut snapshot = snapshot_from_tree(&temp_tree, children[0]).unwrap();
            snapshot.name = Cow::Borrowed(instance_name);
            Ok(Some(snapshot))
        },
        _ => panic!("Rojo doesn't have support for model files with multiple roots yet"),
    }
}
fn snapshot_binary_model_file<'source>(
    file: &'source ImfsFile,
) -> SnapshotResult<'source> {
    let instance_name = file.path
        .file_stem().expect("Could not extract file stem")
        .to_str().expect("Could not convert path to UTF-8");

    let mut temp_tree = RbxTree::new(RbxInstanceProperties {
        name: "Temp".to_owned(),
        class_name: "Folder".to_owned(),
        properties: HashMap::new(),
    });

    let root_id = temp_tree.get_root_id();
    rbx_binary::decode(&mut temp_tree, root_id, file.contents.as_slice())
        .map_err(|inner| SnapshotError::BinaryModelDecodeError {
            inner,
            path: file.path.clone(),
        })?;

    let root_instance = temp_tree.get_instance(root_id).unwrap();
    let children = root_instance.get_children_ids();

    match children.len() {
        0 => Ok(None),
        1 => {
            let mut snapshot = snapshot_from_tree(&temp_tree, children[0]).unwrap();
            snapshot.name = Cow::Borrowed(instance_name);
            Ok(Some(snapshot))
        },
        _ => panic!("Rojo doesn't have support for model files with multiple roots yet"),
    }
}
@@ -1,63 +0,0 @@
//! Interactions with Roblox Studio's installation, including its location and
//! mechanisms like PluginSettings.

#![allow(dead_code)]

use std::path::PathBuf;

#[cfg(all(not(debug_assertions), not(feature = "bundle-plugin")))]
compile_error!("`bundle-plugin` feature must be set for release builds.");

#[cfg(feature = "bundle-plugin")]
static PLUGIN_RBXM: &'static [u8] = include_bytes!("../target/plugin.rbxmx");

#[cfg(target_os = "windows")]
pub fn get_install_location() -> Option<PathBuf> {
    use std::env;

    let local_app_data = env::var("LocalAppData").ok()?;
    let mut location = PathBuf::from(local_app_data);

    location.push("Roblox");

    Some(location)
}

#[cfg(target_os = "macos")]
pub fn get_install_location() -> Option<PathBuf> {
    unimplemented!();
}

#[cfg(not(any(target_os = "windows", target_os = "macos")))]
pub fn get_install_location() -> Option<PathBuf> {
    // Roblox Studio doesn't install on any other platforms!
    None
}

pub fn get_plugin_location() -> Option<PathBuf> {
    let mut location = get_install_location()?;

    location.push("Plugins/Rojo.rbxmx");

    Some(location)
}

#[cfg(feature = "bundle-plugin")]
pub fn install_bundled_plugin() -> Option<()> {
    use std::fs::File;
    use std::io::Write;

    info!("Installing plugin...");

    let mut file = File::create(get_plugin_location()?).ok()?;
    file.write_all(PLUGIN_RBXM).ok()?;

    Some(())
}

#[cfg(not(feature = "bundle-plugin"))]
pub fn install_bundled_plugin() -> Option<()> {
    info!("Skipping plugin installation, bundle-plugin not set.");

    Some(())
}
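The removed module's path handling was plain `PathBuf` composition: take the install directory, then push the plugin's relative path onto it. A standalone sketch of that step (`plugin_location` is a hypothetical stand-in, not the removed API):

```rust
use std::path::PathBuf;

// Hypothetical stand-in for the removed get_plugin_location: joins the
// install directory with the plugin file's relative path.
fn plugin_location(install_location: PathBuf) -> PathBuf {
    let mut location = install_location;
    location.push("Plugins/Rojo.rbxmx");
    location
}

fn main() {
    let loc = plugin_location(PathBuf::from("/home/user/Roblox"));
    println!("{}", loc.display());
}
```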
server/src/snapshot_reconciler.rs (new file, 305 lines)
@@ -0,0 +1,305 @@
use std::{
    str,
    borrow::Cow,
    collections::{HashMap, HashSet},
    fmt,
};

use rbx_tree::{RbxTree, RbxId, RbxInstanceProperties, RbxValue};
use serde_derive::{Serialize, Deserialize};

use crate::{
    path_map::PathMap,
    rbx_session::{MetadataPerPath, MetadataPerInstance},
};

#[derive(Debug, Clone, Default, Serialize, Deserialize)]
pub struct InstanceChanges {
    pub added: HashSet<RbxId>,
    pub removed: HashSet<RbxId>,
    pub updated: HashSet<RbxId>,
}

impl fmt::Display for InstanceChanges {
    fn fmt(&self, output: &mut fmt::Formatter) -> fmt::Result {
        writeln!(output, "InstanceChanges {{")?;

        if !self.added.is_empty() {
            writeln!(output, "    Added:")?;
            for id in &self.added {
                writeln!(output, "        {}", id)?;
            }
        }

        if !self.removed.is_empty() {
            writeln!(output, "    Removed:")?;
            for id in &self.removed {
                writeln!(output, "        {}", id)?;
            }
        }

        if !self.updated.is_empty() {
            writeln!(output, "    Updated:")?;
            for id in &self.updated {
                writeln!(output, "        {}", id)?;
            }
        }

        writeln!(output, "}}")
    }
}

impl InstanceChanges {
    pub fn is_empty(&self) -> bool {
        self.added.is_empty() && self.removed.is_empty() && self.updated.is_empty()
    }
}

#[derive(Debug)]
pub struct RbxSnapshotInstance<'a> {
    pub name: Cow<'a, str>,
    pub class_name: Cow<'a, str>,
    pub properties: HashMap<String, RbxValue>,
    pub children: Vec<RbxSnapshotInstance<'a>>,
    pub metadata: MetadataPerInstance,
}

pub fn snapshot_from_tree(tree: &RbxTree, id: RbxId) -> Option<RbxSnapshotInstance<'static>> {
    let instance = tree.get_instance(id)?;

    let mut children = Vec::new();
    for &child_id in instance.get_children_ids() {
        children.push(snapshot_from_tree(tree, child_id)?);
    }

    Some(RbxSnapshotInstance {
        name: Cow::Owned(instance.name.to_owned()),
        class_name: Cow::Owned(instance.class_name.to_owned()),
        properties: instance.properties.clone(),
        children,
        metadata: MetadataPerInstance {
            source_path: None,
            ignore_unknown_instances: false,
        },
    })
}
pub fn reify_root(
|
||||||
|
snapshot: &RbxSnapshotInstance,
|
||||||
|
metadata_per_path: &mut PathMap<MetadataPerPath>,
|
||||||
|
instance_metadata_map: &mut HashMap<RbxId, MetadataPerInstance>,
|
||||||
|
changes: &mut InstanceChanges,
|
||||||
|
) -> RbxTree {
|
||||||
|
let instance = reify_core(snapshot);
|
||||||
|
let mut tree = RbxTree::new(instance);
|
||||||
|
let root_id = tree.get_root_id();
|
||||||
|
|
||||||
|
if let Some(source_path) = &snapshot.metadata.source_path {
|
||||||
|
let path_meta = metadata_per_path.entry(source_path.to_owned()).or_default();
|
||||||
|
path_meta.instance_id = Some(root_id);
|
||||||
|
}
|
||||||
|
|
||||||
|
instance_metadata_map.insert(root_id, snapshot.metadata.clone());
|
||||||
|
|
||||||
|
changes.added.insert(root_id);
|
||||||
|
|
||||||
|
for child in &snapshot.children {
|
||||||
|
reify_subtree(child, &mut tree, root_id, metadata_per_path, instance_metadata_map, changes);
|
||||||
|
}
|
||||||
|
|
||||||
|
tree
|
||||||
|
}
|
||||||
|
|
||||||
|
pub fn reify_subtree(
|
||||||
|
snapshot: &RbxSnapshotInstance,
|
||||||
|
tree: &mut RbxTree,
|
||||||
|
parent_id: RbxId,
|
||||||
|
metadata_per_path: &mut PathMap<MetadataPerPath>,
|
||||||
|
instance_metadata_map: &mut HashMap<RbxId, MetadataPerInstance>,
|
||||||
|
changes: &mut InstanceChanges,
|
||||||
|
) {
|
||||||
|
let instance = reify_core(snapshot);
|
||||||
|
let id = tree.insert_instance(instance, parent_id);
|
||||||
|
|
||||||
|
if let Some(source_path) = &snapshot.metadata.source_path {
|
||||||
|
let path_meta = metadata_per_path.entry(source_path.clone()).or_default();
|
||||||
|
path_meta.instance_id = Some(id);
|
||||||
|
}
|
||||||
|
|
||||||
|
instance_metadata_map.insert(id, snapshot.metadata.clone());
|
||||||
|
|
||||||
|
changes.added.insert(id);
|
||||||
|
|
||||||
|
for child in &snapshot.children {
|
||||||
|
reify_subtree(child, tree, id, metadata_per_path, instance_metadata_map, changes);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
pub fn reconcile_subtree(
|
||||||
|
tree: &mut RbxTree,
|
||||||
|
id: RbxId,
|
||||||
|
snapshot: &RbxSnapshotInstance,
|
||||||
|
metadata_per_path: &mut PathMap<MetadataPerPath>,
|
||||||
|
instance_metadata_map: &mut HashMap<RbxId, MetadataPerInstance>,
|
||||||
|
changes: &mut InstanceChanges,
|
||||||
|
) {
|
||||||
|
if let Some(source_path) = &snapshot.metadata.source_path {
|
||||||
|
let path_meta = metadata_per_path.entry(source_path.to_owned()).or_default();
|
||||||
|
path_meta.instance_id = Some(id);
|
||||||
|
}
|
||||||
|
|
||||||
|
instance_metadata_map.insert(id, snapshot.metadata.clone());
|
||||||
|
|
||||||
|
if reconcile_instance_properties(tree.get_instance_mut(id).unwrap(), snapshot) {
|
||||||
|
changes.updated.insert(id);
|
||||||
|
}
|
||||||
|
|
||||||
|
reconcile_instance_children(tree, id, snapshot, metadata_per_path, instance_metadata_map, changes);
|
||||||
|
}
|
||||||
|
|
||||||
|
fn reify_core(snapshot: &RbxSnapshotInstance) -> RbxInstanceProperties {
|
||||||
|
let mut properties = HashMap::new();
|
||||||
|
|
||||||
|
for (key, value) in &snapshot.properties {
|
||||||
|
properties.insert(key.clone(), value.clone());
|
||||||
|
}
|
||||||
|
|
||||||
|
let instance = RbxInstanceProperties {
|
||||||
|
name: snapshot.name.to_string(),
|
||||||
|
class_name: snapshot.class_name.to_string(),
|
||||||
|
properties,
|
||||||
|
};
|
||||||
|
|
||||||
|
instance
|
||||||
|
}
|
||||||
|
|
||||||
|
fn reconcile_instance_properties(instance: &mut RbxInstanceProperties, snapshot: &RbxSnapshotInstance) -> bool {
|
||||||
|
let mut has_diffs = false;
|
||||||
|
|
||||||
|
if instance.name != snapshot.name {
|
||||||
|
instance.name = snapshot.name.to_string();
|
||||||
|
has_diffs = true;
|
||||||
|
}
|
||||||
|
|
||||||
|
if instance.class_name != snapshot.class_name {
|
||||||
|
instance.class_name = snapshot.class_name.to_string();
|
||||||
|
has_diffs = true;
|
||||||
|
}
|
||||||
|
|
||||||
|
let mut property_updates = HashMap::new();
|
||||||
|
|
||||||
|
for (key, instance_value) in &instance.properties {
|
||||||
|
match snapshot.properties.get(key) {
|
||||||
|
Some(snapshot_value) => {
|
||||||
|
if snapshot_value != instance_value {
|
||||||
|
property_updates.insert(key.clone(), Some(snapshot_value.clone()));
|
||||||
|
}
|
||||||
|
},
|
||||||
|
None => {
|
||||||
|
property_updates.insert(key.clone(), None);
|
||||||
|
},
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
for (key, snapshot_value) in &snapshot.properties {
|
||||||
|
if property_updates.contains_key(key) {
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
match instance.properties.get(key) {
|
||||||
|
Some(instance_value) => {
|
||||||
|
if snapshot_value != instance_value {
|
||||||
|
property_updates.insert(key.clone(), Some(snapshot_value.clone()));
|
||||||
|
}
|
||||||
|
},
|
||||||
|
None => {
|
||||||
|
property_updates.insert(key.clone(), Some(snapshot_value.clone()));
|
||||||
|
},
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
has_diffs = has_diffs || !property_updates.is_empty();
|
||||||
|
|
||||||
|
for (key, change) in property_updates.drain() {
|
||||||
|
match change {
|
||||||
|
Some(value) => instance.properties.insert(key, value),
|
||||||
|
None => instance.properties.remove(&key),
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
has_diffs
|
||||||
|
}
|
||||||
|
|
||||||
|
fn reconcile_instance_children(
|
||||||
|
tree: &mut RbxTree,
|
||||||
|
id: RbxId,
|
||||||
|
snapshot: &RbxSnapshotInstance,
|
||||||
|
metadata_per_path: &mut PathMap<MetadataPerPath>,
|
||||||
|
instance_metadata_map: &mut HashMap<RbxId, MetadataPerInstance>,
|
||||||
|
changes: &mut InstanceChanges,
|
||||||
|
) {
|
||||||
|
let mut visited_snapshot_indices = HashSet::new();
|
||||||
|
|
||||||
|
let mut children_to_update: Vec<(RbxId, &RbxSnapshotInstance)> = Vec::new();
|
||||||
|
let mut children_to_add: Vec<&RbxSnapshotInstance> = Vec::new();
|
||||||
|
let mut children_to_remove: Vec<RbxId> = Vec::new();
|
||||||
|
|
||||||
|
let children_ids = tree.get_instance(id).unwrap().get_children_ids();
|
||||||
|
|
||||||
|
// Find all instances that were removed or updated, which we derive by
|
||||||
|
// trying to pair up existing instances to snapshots.
|
||||||
|
for &child_id in children_ids {
|
||||||
|
let child_instance = tree.get_instance(child_id).unwrap();
|
||||||
|
|
||||||
|
// Locate a matching snapshot for this instance
|
||||||
|
let mut matching_snapshot = None;
|
||||||
|
for (snapshot_index, child_snapshot) in snapshot.children.iter().enumerate() {
|
||||||
|
if visited_snapshot_indices.contains(&snapshot_index) {
|
||||||
|
continue;
|
||||||
|
}
|
||||||
|
|
||||||
|
// We assume that instances with the same name are probably pretty
|
||||||
|
// similar. This heuristic is similar to React's reconciliation
|
||||||
|
// strategy.
|
||||||
|
if child_snapshot.name == child_instance.name {
|
||||||
|
visited_snapshot_indices.insert(snapshot_index);
|
||||||
|
matching_snapshot = Some(child_snapshot);
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
match matching_snapshot {
|
||||||
|
Some(child_snapshot) => {
|
||||||
|
children_to_update.push((child_instance.get_id(), child_snapshot));
|
||||||
|
},
|
||||||
|
None => {
|
||||||
|
children_to_remove.push(child_instance.get_id());
|
||||||
|
},
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Find all instancs that were added, which is just the snapshots we didn't
|
||||||
|
// match up to existing instances above.
|
||||||
|
for (snapshot_index, child_snapshot) in snapshot.children.iter().enumerate() {
|
||||||
|
if !visited_snapshot_indices.contains(&snapshot_index) {
|
||||||
|
children_to_add.push(child_snapshot);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
for child_snapshot in &children_to_add {
|
||||||
|
reify_subtree(child_snapshot, tree, id, metadata_per_path, instance_metadata_map, changes);
|
||||||
|
}
|
||||||
|
|
||||||
|
for child_id in &children_to_remove {
|
||||||
|
if let Some(subtree) = tree.remove_instance(*child_id) {
|
||||||
|
for id in subtree.iter_all_ids() {
|
||||||
|
instance_metadata_map.remove(&id);
|
||||||
|
changes.removed.insert(id);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
for (child_id, child_snapshot) in &children_to_update {
|
||||||
|
reconcile_subtree(tree, *child_id, child_snapshot, metadata_per_path, instance_metadata_map, changes);
|
||||||
|
}
|
||||||
|
}
|
||||||
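The child reconciliation above pairs existing children against snapshot children by name (first unvisited match wins); unmatched existing children are removed, unmatched snapshots are added. A minimal, self-contained sketch of that heuristic, separate from the code above — `diff_children` and the sample names are illustrative, not part of Rojo's API:

```rust
use std::collections::HashSet;

/// Pair existing child names against snapshot child names, first match wins,
/// like the React-style name-keyed reconciliation described above.
/// Returns (updated, removed, added).
fn diff_children<'a>(
    existing: &'a [&'a str],
    snapshot: &'a [&'a str],
) -> (Vec<&'a str>, Vec<&'a str>, Vec<&'a str>) {
    let mut visited: HashSet<usize> = HashSet::new();
    let mut updated = Vec::new();
    let mut removed = Vec::new();

    for child in existing {
        let mut matched = false;
        for (i, snap) in snapshot.iter().enumerate() {
            if visited.contains(&i) {
                continue;
            }
            if snap == child {
                visited.insert(i);
                updated.push(*child);
                matched = true;
                break;
            }
        }
        if !matched {
            // No snapshot claims this child: it was removed.
            removed.push(*child);
        }
    }

    // Snapshots that never matched an existing child are additions.
    let added = snapshot
        .iter()
        .enumerate()
        .filter(|(i, _)| !visited.contains(i))
        .map(|(_, s)| *s)
        .collect();

    (updated, removed, added)
}

fn main() {
    let existing = ["Part", "Script", "OldModel"];
    let snapshot = ["Script", "Part", "NewFolder"];
    let (updated, removed, added) = diff_children(&existing, &snapshot);
    assert_eq!(updated, vec!["Part", "Script"]);
    assert_eq!(removed, vec!["OldModel"]);
    assert_eq!(added, vec!["NewFolder"]);
}
```

Because matching is keyed only on name, two same-named siblings pair up in encounter order, which is exactly the ambiguity the "probably pretty similar" comment in the real code accepts.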
@@ -5,11 +5,13 @@ use std::{
     process::{Command, Stdio},
 };
 
+use log::warn;
 use rbx_tree::RbxId;
 
 use crate::{
     imfs::{Imfs, ImfsItem},
     rbx_session::RbxSession,
+    web::InstanceMetadata,
 };
 
 static GRAPHVIZ_HEADER: &str = r#"
@@ -25,13 +27,22 @@ digraph RojoTree {
     ];
 "#;
 
-pub fn graphviz_to_svg(source: &str) -> String {
-    let mut child = Command::new("dot")
+/// Compiles DOT source to SVG by invoking dot on the command line.
+pub fn graphviz_to_svg(source: &str) -> Option<String> {
+    let command = Command::new("dot")
         .arg("-Tsvg")
         .stdin(Stdio::piped())
         .stdout(Stdio::piped())
-        .spawn()
-        .expect("Failed to spawn GraphViz process -- make sure it's installed in order to use /api/visualize");
+        .spawn();
+
+    let mut child = match command {
+        Ok(child) => child,
+        Err(_) => {
+            warn!("Failed to spawn GraphViz process to visualize current state.");
+            warn!("If you want pretty graphs, install GraphViz and make sure 'dot' is on your PATH!");
+            return None;
+        },
+    };
 
     {
         let stdin = child.stdin.as_mut().expect("Failed to open stdin");
@@ -39,9 +50,10 @@ pub fn graphviz_to_svg(source: &str) -> String {
     }
 
     let output = child.wait_with_output().expect("Failed to read stdout");
-    String::from_utf8(output.stdout).expect("Failed to parse stdout as UTF-8")
+    Some(String::from_utf8(output.stdout).expect("Failed to parse stdout as UTF-8"))
 }
 
+/// A Display wrapper struct to visualize an RbxSession as SVG.
 pub struct VisualizeRbxSession<'a>(pub &'a RbxSession);
 
 impl<'a> fmt::Display for VisualizeRbxSession<'a> {
@@ -61,9 +73,10 @@ fn visualize_rbx_node(session: &RbxSession, id: RbxId, output: &mut fmt::Formatt
 
     let mut node_label = format!("{}|{}|{}", node.name, node.class_name, id);
 
-    if let Some(metadata) = session.get_instance_metadata(id) {
+    if let Some(session_metadata) = session.get_instance_metadata(id) {
+        let metadata = InstanceMetadata::from_session_metadata(session_metadata);
         node_label.push('|');
-        node_label.push_str(&serde_json::to_string(metadata).unwrap());
+        node_label.push_str(&serde_json::to_string(&metadata).unwrap());
     }
 
     node_label = node_label
@@ -81,6 +94,7 @@ fn visualize_rbx_node(session: &RbxSession, id: RbxId, output: &mut fmt::Formatt
     Ok(())
 }
 
+/// A Display wrapper struct to visualize an Imfs as SVG.
 pub struct VisualizeImfs<'a>(pub &'a Imfs);
 
 impl<'a> fmt::Display for VisualizeImfs<'a> {
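The change above makes `graphviz_to_svg` degrade gracefully when the `dot` binary is missing by returning `Option<String>` instead of panicking. A minimal sketch of the same spawn-or-`None` pattern using only the standard library — `render_with` is illustrative, and `cat` stands in for `dot` so the sketch runs without GraphViz installed:

```rust
use std::io::Write;
use std::process::{Command, Stdio};

/// Pipe `source` through an external binary and collect its stdout.
/// Any failure (missing binary, I/O error, non-UTF-8 output) becomes None,
/// mirroring the fallback behavior introduced in the diff above.
fn render_with(binary: &str, source: &str) -> Option<String> {
    let mut child = Command::new(binary)
        .stdin(Stdio::piped())
        .stdout(Stdio::piped())
        .spawn()
        .ok()?; // Missing binary: bail out instead of panicking.

    child.stdin.as_mut()?.write_all(source.as_bytes()).ok()?;

    // wait_with_output drops the child's stdin before waiting, so the
    // subprocess sees EOF and can finish without deadlocking.
    let output = child.wait_with_output().ok()?;
    String::from_utf8(output.stdout).ok()
}

fn main() {
    // `cat` echoes stdin verbatim, standing in for `dot -Tsvg` in this sketch.
    assert_eq!(render_with("cat", "digraph {}").as_deref(), Some("digraph {}"));
    // A missing binary yields None instead of a panic.
    assert_eq!(render_with("definitely-not-a-real-binary", ""), None);
}
```

The caller can then decide what to do with `None`; the server diff below falls back to returning the raw DOT source.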
@@ -4,6 +4,8 @@ use std::{
     sync::{mpsc, Arc},
 };
 
+use serde_derive::{Serialize, Deserialize};
+use log::trace;
 use rouille::{
     self,
     router,
@@ -13,13 +15,30 @@ use rouille::{
 use rbx_tree::{RbxId, RbxInstance};
 
 use crate::{
-    session::Session,
+    live_session::LiveSession,
     session_id::SessionId,
-    project::InstanceProjectNodeMetadata,
-    rbx_snapshot::InstanceChanges,
+    snapshot_reconciler::InstanceChanges,
     visualize::{VisualizeRbxSession, VisualizeImfs, graphviz_to_svg},
+    rbx_session::{MetadataPerInstance},
 };
 
+static HOME_CONTENT: &str = include_str!("../assets/index.html");
+
+/// Contains the instance metadata relevant to Rojo clients.
+#[derive(Debug, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct InstanceMetadata {
+    ignore_unknown_instances: bool,
+}
+
+impl InstanceMetadata {
+    pub fn from_session_metadata(meta: &MetadataPerInstance) -> InstanceMetadata {
+        InstanceMetadata {
+            ignore_unknown_instances: meta.ignore_unknown_instances,
+        }
+    }
+}
+
 /// Used to attach metadata specific to Rojo to instances, which come from the
 /// rbx_tree crate.
 ///
@@ -31,7 +50,7 @@ pub struct InstanceWithMetadata<'a> {
     pub instance: Cow<'a, RbxInstance>,
 
     #[serde(rename = "Metadata")]
-    pub metadata: Option<Cow<'a, InstanceProjectNodeMetadata>>,
+    pub metadata: Option<InstanceMetadata>,
 }
 
 #[derive(Debug, Serialize, Deserialize)]
@@ -61,14 +80,14 @@ pub struct SubscribeResponse<'a> {
 }
 
 pub struct Server {
-    session: Arc<Session>,
+    live_session: Arc<LiveSession>,
     server_version: &'static str,
 }
 
 impl Server {
-    pub fn new(session: Arc<Session>) -> Server {
+    pub fn new(live_session: Arc<LiveSession>) -> Server {
         Server {
-            session,
+            live_session,
             server_version: env!("CARGO_PKG_VERSION"),
         }
     }
@@ -79,136 +98,31 @@ impl Server {
 
         router!(request,
             (GET) (/) => {
-                Response::text("Rojo is up and running!")
+                self.handle_home()
            },
 
             (GET) (/api/rojo) => {
-                // Get a summary of information about the server.
-
-                let rbx_session = self.session.rbx_session.lock().unwrap();
-                let tree = rbx_session.get_tree();
-
-                Response::json(&ServerInfoResponse {
-                    server_version: self.server_version,
-                    protocol_version: 2,
-                    session_id: self.session.session_id,
-                    expected_place_ids: self.session.project.serve_place_ids.clone(),
-                    root_instance_id: tree.get_root_id(),
-                })
+                self.handle_api_rojo()
             },
 
             (GET) (/api/subscribe/{ cursor: u32 }) => {
-                // Retrieve any messages past the given cursor index, and if
-                // there weren't any, subscribe to receive any new messages.
-
-                let message_queue = Arc::clone(&self.session.message_queue);
-
-                // Did the client miss any messages since the last subscribe?
-                {
-                    let (new_cursor, new_messages) = message_queue.get_messages_since(cursor);
-
-                    if !new_messages.is_empty() {
-                        return Response::json(&SubscribeResponse {
-                            session_id: self.session.session_id,
-                            messages: Cow::Borrowed(&new_messages),
-                            message_cursor: new_cursor,
-                        })
-                    }
-                }
-
-                let (tx, rx) = mpsc::channel();
-
-                let sender_id = message_queue.subscribe(tx);
-
-                match rx.recv() {
-                    Ok(_) => (),
-                    Err(_) => return Response::text("error!").with_status_code(500),
-                }
-
-                message_queue.unsubscribe(sender_id);
-
-                {
-                    let (new_cursor, new_messages) = message_queue.get_messages_since(cursor);
-
-                    return Response::json(&SubscribeResponse {
-                        session_id: self.session.session_id,
-                        messages: Cow::Owned(new_messages),
-                        message_cursor: new_cursor,
-                    })
-                }
+                self.handle_api_subscribe(cursor)
             },
 
             (GET) (/api/read/{ id_list: String }) => {
-                let message_queue = Arc::clone(&self.session.message_queue);
-
                 let requested_ids: Option<Vec<RbxId>> = id_list
                     .split(',')
                     .map(RbxId::parse_str)
                     .collect();
 
-                let requested_ids = match requested_ids {
-                    Some(id) => id,
-                    None => return rouille::Response::text("Malformed ID list").with_status_code(400),
-                };
-
-                let rbx_session = self.session.rbx_session.lock().unwrap();
-                let tree = rbx_session.get_tree();
-
-                let message_cursor = message_queue.get_message_cursor();
-
-                let mut instances = HashMap::new();
-
-                for &requested_id in &requested_ids {
-                    if let Some(instance) = tree.get_instance(requested_id) {
-                        let metadata = rbx_session.get_instance_metadata(requested_id)
-                            .map(Cow::Borrowed);
-
-                        instances.insert(instance.get_id(), InstanceWithMetadata {
-                            instance: Cow::Borrowed(instance),
-                            metadata,
-                        });
-
-                        for descendant in tree.descendants(requested_id) {
-                            let descendant_meta = rbx_session.get_instance_metadata(descendant.get_id())
-                                .map(Cow::Borrowed);
-
-                            instances.insert(descendant.get_id(), InstanceWithMetadata {
-                                instance: Cow::Borrowed(descendant),
-                                metadata: descendant_meta,
-                            });
-                        }
-                    }
-                }
-
-                Response::json(&ReadResponse {
-                    session_id: self.session.session_id,
-                    message_cursor,
-                    instances,
-                })
+                self.handle_api_read(requested_ids)
             },
 
             (GET) (/visualize/rbx) => {
-                let rbx_session = self.session.rbx_session.lock().unwrap();
-
-                let dot_source = format!("{}", VisualizeRbxSession(&rbx_session));
-
-                Response::svg(graphviz_to_svg(&dot_source))
+                self.handle_visualize_rbx()
             },
 
             (GET) (/visualize/imfs) => {
-                let imfs = self.session.imfs.lock().unwrap();
-
-                let dot_source = format!("{}", VisualizeImfs(&imfs));
-
-                Response::svg(graphviz_to_svg(&dot_source))
+                self.handle_visualize_imfs()
             },
 
-            (GET) (/visualize/path_map) => {
-                let rbx_session = self.session.rbx_session.lock().unwrap();
-
-                Response::json(&rbx_session.debug_get_path_map())
+            (GET) (/visualize/path_metadata) => {
+                self.handle_visualize_path_metadata()
             },
 
             _ => Response::empty_404()
         )
     }
@@ -218,4 +132,131 @@ impl Server {
 
         rouille::start_server(address, move |request| self.handle_request(request));
     }
+
+    fn handle_home(&self) -> Response {
+        Response::html(HOME_CONTENT)
+    }
+
+    /// Get a summary of information about the server
+    fn handle_api_rojo(&self) -> Response {
+        let rbx_session = self.live_session.rbx_session.lock().unwrap();
+        let tree = rbx_session.get_tree();
+
+        Response::json(&ServerInfoResponse {
+            server_version: self.server_version,
+            protocol_version: 2,
+            session_id: self.live_session.session_id,
+            expected_place_ids: self.live_session.project.serve_place_ids.clone(),
+            root_instance_id: tree.get_root_id(),
+        })
+    }
+
+    /// Retrieve any messages past the given cursor index, and if
+    /// there weren't any, subscribe to receive any new messages.
+    fn handle_api_subscribe(&self, cursor: u32) -> Response {
+        let message_queue = Arc::clone(&self.live_session.message_queue);
+
+        // Did the client miss any messages since the last subscribe?
+        {
+            let (new_cursor, new_messages) = message_queue.get_messages_since(cursor);
+
+            if !new_messages.is_empty() {
+                return Response::json(&SubscribeResponse {
+                    session_id: self.live_session.session_id,
+                    messages: Cow::Borrowed(&new_messages),
+                    message_cursor: new_cursor,
+                })
+            }
+        }
+
+        let (tx, rx) = mpsc::channel();
+
+        let sender_id = message_queue.subscribe(tx);
+
+        match rx.recv() {
+            Ok(_) => (),
+            Err(_) => return Response::text("error!").with_status_code(500),
+        }
+
+        message_queue.unsubscribe(sender_id);
+
+        {
+            let (new_cursor, new_messages) = message_queue.get_messages_since(cursor);
+
+            return Response::json(&SubscribeResponse {
+                session_id: self.live_session.session_id,
+                messages: Cow::Owned(new_messages),
+                message_cursor: new_cursor,
+            })
+        }
+    }
+
+    fn handle_api_read(&self, requested_ids: Option<Vec<RbxId>>) -> Response {
+        let message_queue = Arc::clone(&self.live_session.message_queue);
+
+        let requested_ids = match requested_ids {
+            Some(id) => id,
+            None => return rouille::Response::text("Malformed ID list").with_status_code(400),
+        };
+
+        let rbx_session = self.live_session.rbx_session.lock().unwrap();
+        let tree = rbx_session.get_tree();
+
+        let message_cursor = message_queue.get_message_cursor();
+
+        let mut instances = HashMap::new();
+
+        for &requested_id in &requested_ids {
+            if let Some(instance) = tree.get_instance(requested_id) {
+                let metadata = rbx_session.get_instance_metadata(requested_id)
+                    .map(InstanceMetadata::from_session_metadata);
+
+                instances.insert(instance.get_id(), InstanceWithMetadata {
+                    instance: Cow::Borrowed(instance),
+                    metadata,
+                });
+
+                for descendant in tree.descendants(requested_id) {
+                    let descendant_meta = rbx_session.get_instance_metadata(descendant.get_id())
+                        .map(InstanceMetadata::from_session_metadata);
+
+                    instances.insert(descendant.get_id(), InstanceWithMetadata {
+                        instance: Cow::Borrowed(descendant),
+                        metadata: descendant_meta,
+                    });
+                }
+            }
+        }
+
+        Response::json(&ReadResponse {
+            session_id: self.live_session.session_id,
+            message_cursor,
+            instances,
+        })
+    }
+
+    fn handle_visualize_rbx(&self) -> Response {
+        let rbx_session = self.live_session.rbx_session.lock().unwrap();
+        let dot_source = format!("{}", VisualizeRbxSession(&rbx_session));
+
+        match graphviz_to_svg(&dot_source) {
+            Some(svg) => Response::svg(svg),
+            None => Response::text(dot_source),
+        }
+    }
+
+    fn handle_visualize_imfs(&self) -> Response {
+        let imfs = self.live_session.imfs.lock().unwrap();
+        let dot_source = format!("{}", VisualizeImfs(&imfs));
+
+        match graphviz_to_svg(&dot_source) {
+            Some(svg) => Response::svg(svg),
+            None => Response::text(dot_source),
+        }
+    }
+
+    fn handle_visualize_path_metadata(&self) -> Response {
+        let rbx_session = self.live_session.rbx_session.lock().unwrap();
+        Response::json(&rbx_session.debug_get_metadata_per_path())
+    }
 }
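The web-server diff above moves inline `router!` arm bodies into named handler methods on the server type, leaving the router as a thin dispatch table. A toy sketch of that refactor shape with no rouille dependency — `MiniServer`, its routes, and the string responses are illustrative, not Rojo's real API:

```rust
/// A stand-in server type; the real code holds an Arc<LiveSession>.
struct MiniServer {
    version: &'static str,
}

impl MiniServer {
    /// Thin dispatch: each route delegates to a named handler method,
    /// mirroring the router! rewrite in the diff above.
    fn handle_request(&self, path: &str) -> String {
        match path {
            "/" => self.handle_home(),
            "/api/rojo" => self.handle_api_rojo(),
            _ => String::from("404"),
        }
    }

    fn handle_home(&self) -> String {
        String::from("Rojo is up and running!")
    }

    fn handle_api_rojo(&self) -> String {
        format!("{{\"serverVersion\":\"{}\"}}", self.version)
    }
}

fn main() {
    let server = MiniServer { version: "0.5.0-alpha.3" };
    assert_eq!(server.handle_request("/"), "Rojo is up and running!");
    assert_eq!(
        server.handle_request("/api/rojo"),
        "{\"serverVersion\":\"0.5.0-alpha.3\"}"
    );
    assert_eq!(server.handle_request("/nope"), "404");
}
```

Keeping each handler as a plain method also makes the early `return Response::...` exits in `handle_api_subscribe` legal, which they were not inside the macro arm.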
@@ -1,18 +0,0 @@
-#!/bin/sh
-
-set -e
-
-if [ ! -d "../test-projects/$1" ]
-then
-    echo "Pick a project that exists!"
-    exit 1
-fi
-
-if [ -d "scratch" ]
-then
-    rm -rf scratch
-fi
-
-mkdir -p scratch
-cp -r "../test-projects/$1" scratch
-cargo run -- serve "scratch/$1"
@@ -1,10 +1,10 @@
|
|||||||
use std::{
|
use std::{
|
||||||
collections::{HashMap, HashSet},
|
collections::{HashMap, HashSet},
|
||||||
io,
|
|
||||||
fs,
|
fs,
|
||||||
path::PathBuf,
|
path::PathBuf,
|
||||||
};
|
};
|
||||||
|
|
||||||
|
use failure::Error;
|
||||||
use tempfile::{TempDir, tempdir};
|
use tempfile::{TempDir, tempdir};
|
||||||
|
|
||||||
use librojo::{
|
use librojo::{
|
||||||
@@ -19,7 +19,7 @@ enum FsEvent {
|
|||||||
Moved(PathBuf, PathBuf),
|
Moved(PathBuf, PathBuf),
|
||||||
}
|
}
|
||||||
|
|
||||||
fn send_events(imfs: &mut Imfs, events: &[FsEvent]) -> io::Result<()> {
|
fn send_events(imfs: &mut Imfs, events: &[FsEvent]) -> Result<(), Error> {
|
||||||
for event in events {
|
for event in events {
|
||||||
match event {
|
match event {
|
||||||
FsEvent::Created(path) => imfs.path_created(path)?,
|
FsEvent::Created(path) => imfs.path_created(path)?,
|
||||||
@@ -56,7 +56,7 @@ fn check_expected(real: &Imfs, expected: &ExpectedImfs) {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
fn base_tree() -> io::Result<(TempDir, Imfs, ExpectedImfs, TestResources)> {
|
fn base_tree() -> Result<(TempDir, Imfs, ExpectedImfs, TestResources), Error> {
|
||||||
let root = tempdir()?;
|
let root = tempdir()?;
|
||||||
|
|
||||||
let foo_path = root.path().join("foo");
|
let foo_path = root.path().join("foo");
|
||||||
@@ -125,7 +125,7 @@ fn base_tree() -> io::Result<(TempDir, Imfs, ExpectedImfs, TestResources)> {
|
|||||||
}
|
}
|
||||||
|
|
||||||
#[test]
|
#[test]
|
||||||
fn initial_read() -> io::Result<()> {
|
fn initial_read() -> Result<(), Error> {
|
||||||
let (_root, imfs, expected_imfs, _resources) = base_tree()?;
|
let (_root, imfs, expected_imfs, _resources) = base_tree()?;
|
||||||
|
|
||||||
check_expected(&imfs, &expected_imfs);
|
check_expected(&imfs, &expected_imfs);
|
||||||
@@ -134,7 +134,7 @@ fn initial_read() -> io::Result<()> {
|
|||||||
}
|
}
|
||||||
|
|
||||||
#[test]
|
#[test]
|
||||||
fn adding_files() -> io::Result<()> {
|
fn adding_files() -> Result<(), Error> {
|
||||||
let (root, mut imfs, mut expected_imfs, resources) = base_tree()?;
|
let (root, mut imfs, mut expected_imfs, resources) = base_tree()?;
|
||||||
|
|
||||||
check_expected(&imfs, &expected_imfs);
|
check_expected(&imfs, &expected_imfs);
|
||||||
@@ -178,7 +178,7 @@ fn adding_files() -> io::Result<()> {
|
|||||||
}
|
}
|
||||||
|
|
||||||
#[test]
|
#[test]
|
||||||
fn adding_folder() -> io::Result<()> {
|
fn adding_folder() -> Result<(), Error> {
|
||||||
let (root, imfs, mut expected_imfs, _resources) = base_tree()?;
|
let (root, imfs, mut expected_imfs, _resources) = base_tree()?;
|
||||||
|
|
||||||
check_expected(&imfs, &expected_imfs);
|
check_expected(&imfs, &expected_imfs);
|
||||||
@@ -232,6 +232,16 @@ fn adding_folder() -> io::Result<()> {
|
|||||||
FsEvent::Created(file1_path.clone()),
|
FsEvent::Created(file1_path.clone()),
|
||||||
FsEvent::Created(file2_path.clone()),
|
FsEvent::Created(file2_path.clone()),
|
||||||
],
|
],
|
||||||
|
vec![
|
||||||
|
FsEvent::Created(file1_path.clone()),
|
||||||
|
FsEvent::Created(file2_path.clone()),
|
||||||
|
FsEvent::Created(folder_path.clone()),
|
||||||
|
],
|
||||||
|
vec![
|
||||||
|
FsEvent::Created(file1_path.clone()),
|
||||||
|
FsEvent::Created(folder_path.clone()),
|
||||||
|
FsEvent::Created(file2_path.clone()),
|
||||||
|
],
|
||||||
];
|
];
|
||||||
|
|
||||||
for events in &possible_event_sequences {
|
for events in &possible_event_sequences {
|
||||||
@@ -245,7 +255,36 @@ fn adding_folder() -> io::Result<()> {
|
|||||||
}
|
}
|
||||||
|
|
||||||
#[test]
|
#[test]
|
||||||
fn removing_file() -> io::Result<()> {
|
fn updating_files() -> Result<(), Error> {
|
||||||
|
let (_root, mut imfs, mut expected_imfs, resources) = base_tree()?;
|
||||||
|
|
||||||
|
check_expected(&imfs, &expected_imfs);
|
||||||
|
|
||||||
|
fs::write(&resources.bar_path, b"bar updated")?;
|
||||||
|
fs::write(&resources.baz_path, b"baz updated")?;
|
||||||
|
|
||||||
|
imfs.path_updated(&resources.bar_path)?;
|
||||||
|
imfs.path_updated(&resources.baz_path)?;
|
||||||
|
|
||||||
|
let bar_updated_item = ImfsItem::File(ImfsFile {
|
||||||
|
path: resources.bar_path.clone(),
|
||||||
|
contents: b"bar updated".to_vec(),
|
||||||
|
});
|
||||||
|
let baz_updated_item = ImfsItem::File(ImfsFile {
|
||||||
|
path: resources.baz_path.clone(),
|
||||||
|
contents: b"baz updated".to_vec(),
|
||||||
|
});
|
||||||
|
|
||||||
|
expected_imfs.items.insert(resources.bar_path.clone(), bar_updated_item);
|
||||||
|
expected_imfs.items.insert(resources.baz_path.clone(), baz_updated_item);
|
||||||
|
|
||||||
|
check_expected(&imfs, &expected_imfs);
|
||||||
|
|
||||||
|
Ok(())
|
||||||
|
}
|
||||||
|
|
||||||
|
#[test]
|
||||||
|
fn removing_file() -> Result<(), Error> {
|
||||||
let (root, mut imfs, mut expected_imfs, resources) = base_tree()?;
|
let (root, mut imfs, mut expected_imfs, resources) = base_tree()?;
|
||||||
|
|
||||||
check_expected(&imfs, &expected_imfs);
|
check_expected(&imfs, &expected_imfs);
|
||||||
@@ -269,7 +308,7 @@ fn removing_file() -> io::Result<()> {
 }
 
 #[test]
-fn removing_folder() -> io::Result<()> {
+fn removing_folder() -> Result<(), Error> {
     let (root, imfs, mut expected_imfs, resources) = base_tree()?;
 
     check_expected(&imfs, &expected_imfs);
@@ -294,6 +333,10 @@ fn removing_folder() -> io::Result<()> {
             FsEvent::Removed(resources.baz_path.clone()),
             FsEvent::Removed(resources.foo_path.clone()),
         ],
+        vec![
+            FsEvent::Removed(resources.foo_path.clone()),
+            FsEvent::Removed(resources.baz_path.clone()),
+        ],
     ];
 
     for events in &possible_event_sequences {
@@ -21,7 +21,7 @@ lazy_static! {
 
 #[test]
 fn empty() {
-    let project_file_location = TEST_PROJECTS_ROOT.join("empty/roblox-project.json");
+    let project_file_location = TEST_PROJECTS_ROOT.join("empty/default.project.json");
     let project = Project::load_exact(&project_file_location).unwrap();
 
     assert_eq!(project.name, "empty");
@@ -29,7 +29,7 @@ fn empty() {
 
 #[test]
 fn empty_fuzzy_file() {
-    let project_file_location = TEST_PROJECTS_ROOT.join("empty/roblox-project.json");
+    let project_file_location = TEST_PROJECTS_ROOT.join("empty/default.project.json");
     let project = Project::load_fuzzy(&project_file_location).unwrap();
 
     assert_eq!(project.name, "empty");
@@ -45,7 +45,7 @@ fn empty_fuzzy_folder() {
 
 #[test]
 fn single_sync_point() {
-    let project_file_location = TEST_PROJECTS_ROOT.join("single-sync-point/roblox-project.json");
+    let project_file_location = TEST_PROJECTS_ROOT.join("single-sync-point/default.project.json");
     let project = Project::load_exact(&project_file_location).unwrap();
 
     let expected_project = {
@@ -100,7 +100,7 @@ fn single_sync_point() {
 
 #[test]
 fn test_model() {
-    let project_file_location = TEST_PROJECTS_ROOT.join("test-model/roblox-project.json");
+    let project_file_location = TEST_PROJECTS_ROOT.join("test-model/default.project.json");
     let project = Project::load_exact(&project_file_location).unwrap();
 
     assert_eq!(project.name, "test-model");
14
test-projects/composing-models/src/Remotes.model.json
Normal file
@@ -0,0 +1,14 @@
+{
+    "Name": "All my Remote Events",
+    "ClassName": "Folder",
+    "Children": [
+        {
+            "Name": "SendMoney",
+            "ClassName": "RemoteEvent"
+        },
+        {
+            "Name": "SendItems",
+            "ClassName": "RemoteEvent"
+        }
+    ]
+}
@@ -5,12 +5,10 @@
     <Item class="Script" referent="RBX634A9A9988354E4B9D971B2A4DEBD26E">
         <Properties>
             <bool name="Disabled">false</bool>
-            <Content name="LinkedSource"><null></null></Content>
             <string name="Name">Lone Script</string>
             <string name="ScriptGuid">{C62CD9FB-FF28-4FD9-9712-AD28A1E92C84}</string>
-            <ProtectedString name="Source"><![CDATA[print("Hello world!")
-]]></ProtectedString>
-            <BinaryString name="Tags"></BinaryString>
+            <string name="Source"><![CDATA[print("Hello world!")
+]]></string>
         </Properties>
     </Item>
</roblox>
6
test-projects/missing-files/default.project.json
Normal file
@@ -0,0 +1,6 @@
+{
+    "name": "missing-files",
+    "tree": {
+        "$path": "does-not-exist"
+    }
+}
21
test-scratch-project
Normal file
@@ -0,0 +1,21 @@
+#!/bin/sh
+
+# Copies a project from 'test-projects' into a folder that can be messed with
+# without accidentally checking the results into version control.
+
+set -e
+
+if [ ! -d "test-projects/$1" ]
+then
+    echo "Pick a project that exists!"
+    exit 1
+fi
+
+if [ -d "scratch-project/$1" ]
+then
+    rm -rf "scratch-project/$1"
+fi
+
+mkdir -p scratch-project
+cp -r "test-projects/$1" scratch-project
+cargo run -- serve "scratch-project/$1"
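The `test-scratch-project` script above guards its argument before touching the filesystem. A minimal, self-contained sketch of that guard pattern (the `check_project` helper and the `does-not-exist` name are hypothetical, introduced only for illustration):

```shell
#!/bin/sh
# Sketch of the argument guard used by test-scratch-project:
# refuse to proceed when the requested test project directory is missing.
check_project() {
    if [ ! -d "test-projects/$1" ]
    then
        echo "Pick a project that exists!"
        return 1
    fi
}

# With no test-projects/does-not-exist directory present, this prints
# the error message instead of proceeding.
check_project "does-not-exist" || true
```

Checking for the source directory up front keeps a typo in the project name from creating an empty copy under `scratch-project`.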