Compare commits


57 Commits

Author SHA1 Message Date
Lucien Greathouse
83a0ae673c 0.5.0-alpha.9 2019-04-04 21:20:00 -07:00
Lucien Greathouse
7de646c290 Upgrade dependencies 2019-04-04 18:35:18 -07:00
Lucien Greathouse
5d681a72ac Rewrite CSV conversion to dodge Serde (#152)
* Rewrite CSV conversion to dodge Serde

* Update CHANGELOG
2019-04-04 18:21:55 -07:00
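The serde-free rewrite above boils down to handling every CSV field as text, so an entry like `123` stays the string `"123"` instead of being deserialized into a number. A minimal sketch of that idea — a toy parser for illustration that ignores CSV quoting, not Rojo's actual implementation (which would use a real CSV reader):

```rust
// Every field is kept as a String; nothing tries to guess a numeric type.
fn parse_row(line: &str) -> Vec<String> {
    line.split(',').map(|field| field.to_string()).collect()
}

fn main() {
    let row = parse_row("Key,123,true");
    // All fields keep their original text form.
    println!("{:?}", row);
}
```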
Lucien Greathouse
d725970e6e Fix handling of CSV files with empty columns and rows (#149)
* Fix #147

* Add localization test project, fix empty rows in general

* Fill out 'normal' CSV in localization test project

* Update Changelog
2019-04-04 13:16:10 -07:00
Lucien Greathouse
54b82760cd Switch 'rojo build' to use BufWriter, magic performance increase 2019-04-01 18:02:46 -07:00
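The `BufWriter` change above works because each small unbuffered write to a `File` is a separate syscall; wrapping the writer coalesces them into a few large writes. A sketch of the pattern (writing to an in-memory `Vec<u8>` so it is self-contained — for `rojo build` the inner writer would be a `File`):

```rust
use std::io::{BufWriter, Write};

// Many small writes; with BufWriter they are buffered until flush instead of
// each hitting the underlying writer directly.
fn write_items(mut out: impl Write) -> std::io::Result<()> {
    for i in 0..1000 {
        writeln!(out, "item {}", i)?;
    }
    Ok(())
}

fn main() -> std::io::Result<()> {
    let mut buffer = Vec::new();
    {
        let mut writer = BufWriter::new(&mut buffer);
        write_items(&mut writer)?;
        writer.flush()?; // make sure everything reaches the inner writer
    }
    println!("wrote {} bytes", buffer.len());
    Ok(())
}
```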
Lucien Greathouse
77f79fa913 0.5.0-alpha.8 2019-03-29 17:36:43 -07:00
Lucien Greathouse
6db714a2b1 Special-case Lighting.Technology in setCanonicalProperty, temporary fix 2019-03-29 17:25:57 -07:00
Lucien Greathouse
913ac7c9f5 Update dependencies 2019-03-28 15:44:56 -07:00
Lucien Greathouse
eecbfd29e7 Update dependencies, adding a bunch of new features 2019-03-27 13:31:12 -07:00
Lucien Greathouse
41025225b2 Rewrite message queue with oneshot futures (#139) 2019-03-27 13:27:50 -07:00
Lucien Greathouse
07c7b28c03 Fix plugin unloading 2019-03-21 22:35:30 -07:00
Lucien Greathouse
3faf3d2a56 Update Changelog for #135 2019-03-20 10:42:18 -07:00
Lucien Greathouse
be094d5b7c Make snapshot application commutative (#135)
* Add children sorting to snapshot_reconciler

* Update snapshot tests to include stable children order

* Bump dependencies, which should make this PR work
2019-03-20 10:39:53 -07:00
Lucien Greathouse
459673bd59 0.5.0-alpha.6 2019-03-19 18:24:30 -07:00
Lucien Greathouse
2968b70e6b Listen to Plugin.Unloading.
Closes #127.
2019-03-19 18:17:03 -07:00
Lucien Greathouse
b6989a18fc Add conditionally-enabled typechecking using t 2019-03-19 17:57:19 -07:00
Lucien Greathouse
4d6a504836 Remove Rodux and Roact-Rodux, add t dependency 2019-03-19 16:34:53 -07:00
Lucien Greathouse
6c3737df68 Update Changelog 2019-03-19 16:31:34 -07:00
Lucien Greathouse
9f382ed9bd Iterate on plugin reconciler
- Renamed setProperty to setCanonicalProperty, which is more usefully
  descriptive. Also added a detailed comment.
- Fixed reconciler behavior with regards to removing known instances
  when $ignoreUnknownInstances is set
2019-03-19 16:30:06 -07:00
Lucien Greathouse
f9e86e58d6 Add InstanceMap:destroyInstance for forgetting and destroying in one step 2019-03-19 16:29:56 -07:00
Lucien Greathouse
469f9c927f Improve plugin place project for testing 2019-03-19 16:29:31 -07:00
Lucien Greathouse
312724189b Remove ignore from old doc generator script 2019-03-14 14:20:38 -07:00
Lucien Greathouse
ec0a1f1ce4 New snapshot tests (#134)
* Changes project-related structures to use `BTreeMap` instead of `HashMap` for children to aid determinism
* Changes imfs-related structures to have total ordering and use `BTreeSet` instead of `HashSet`
* Upgrades dependencies to `rbx_dom_weak` 1.2.0 and rbx_xml 0.5.0 to aid in more determinism stuff
* Re-exposes the `RbxSession`'s root project via `root_project()`
* Implements `Default` for a couple things
* Tweaks visualization code to support visualizing trees not attached to an `RbxSession`
* Adds an ID-invariant comparison method for `rbx_tree` relying on previous determinism changes
* Adds a (disabled) test to start finding issues in the reconciler with regards to commutativity of snapshot application
* Adds a snapshot testing system that operates on `RbxTree` and associated metadata, which are committed in this change
2019-03-14 14:20:03 -07:00
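The `HashMap`-to-`BTreeMap` swap in the commit above is what buys determinism: `HashMap` iteration order is unspecified and can vary between runs, while `BTreeMap` always iterates in sorted key order, so equal trees serialize identically. A small illustration:

```rust
use std::collections::BTreeMap;

// Collect child names in iteration order; for a BTreeMap this is always
// sorted key order, regardless of insertion order.
fn child_names(children: &BTreeMap<String, u32>) -> Vec<String> {
    children.keys().cloned().collect()
}

fn main() {
    let mut children = BTreeMap::new();
    // Insertion order is b, a, c...
    children.insert("b".to_string(), 2);
    children.insert("a".to_string(), 1);
    children.insert("c".to_string(), 3);
    // ...but iteration order is always a, b, c.
    println!("{:?}", child_names(&children));
}
```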
Lucien Greathouse
ad93631ef8 Port to futures channel instead of std one.
Fixes #133.
2019-03-12 11:45:39 -07:00
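The message-queue rewrite in #139 uses the "one reply channel per request" pattern: each request carries its own sender, so a response can never be delivered to the wrong caller. A sketch of the pattern using `std::sync::mpsc` as a stand-in (the actual change used the futures crate's oneshot channels):

```rust
use std::sync::mpsc;
use std::thread;

// Each request carries its own dedicated reply sender.
struct Request {
    input: u32,
    reply: mpsc::Sender<u32>,
}

fn main() {
    let (tx, rx) = mpsc::channel::<Request>();

    // Server thread: answer each request on its per-request reply channel.
    let server = thread::spawn(move || {
        for req in rx {
            req.reply.send(req.input * 2).unwrap();
        }
    });

    let (reply_tx, reply_rx) = mpsc::channel();
    tx.send(Request { input: 21, reply: reply_tx }).unwrap();
    println!("{}", reply_rx.recv().unwrap()); // 42

    drop(tx); // closing the request channel lets the server loop end
    server.join().unwrap();
}
```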
Lucien Greathouse
3b6238ff93 Add more types to plugin 2019-03-11 16:55:42 -07:00
Lucien Greathouse
5b9facee00 Fix up variable naming in serialize_unresolved_minimal 2019-03-11 16:35:54 -07:00
Lucien Greathouse
376f2a554a Better default project, including minimal property types 2019-03-11 16:28:40 -07:00
Lucien Greathouse
5fd0bd3db9 Update/prune dependencies with help of cargo-outdated 2019-03-11 14:12:49 -07:00
Lucien Greathouse
2deb3bbf23 Add notable feature from dependency upgrade 2019-03-11 13:48:02 -07:00
Lucien Greathouse
01bef0c2b8 Update dependencies 2019-03-11 13:47:33 -07:00
Lucien Greathouse
b65a8ce680 0.5.0-alpha.5 2019-03-01 15:40:30 -08:00
Lucien Greathouse
5fc4f63238 Upgrade dependencies 2019-03-01 15:34:16 -08:00
Lucien Greathouse
9b0e0c175b Add missing CHANGELOG note 2019-02-27 17:32:36 -08:00
Lucien Greathouse
eb97e925e6 Flip LiveSession::session_id private, add getter 2019-02-27 14:54:05 -08:00
Lucien Greathouse
16f8975b18 Flip project field of LiveSession private to prepare for multi-project future 2019-02-27 14:51:53 -08:00
Lucien Greathouse
5073fce2f7 Implement LiveSession::restart_with_new_project as foundation for reloading 2019-02-27 14:42:41 -08:00
Lucien Greathouse
cf5036eec6 Fix warnings compiling server 2019-02-27 00:49:38 -08:00
Lucien Greathouse
20be37dd8b Improve error messages from bad snapshots 2019-02-27 00:47:02 -08:00
Lucien Greathouse
93349ae2dc Use rbx_reflection to allow type inference on projects (#130)
* Start dependency on rbx_reflection

* Alive and working, all tests pass

* Update CHANGELOG
2019-02-26 22:51:21 -08:00
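What reflection-driven inference (#130) enables: given a class's known property types, an untyped project value like `"Brightness": 1` can be resolved without the user spelling the type out. A hypothetical sketch — the table and names here are illustrative only, not `rbx_reflection`'s actual API:

```rust
use std::collections::HashMap;

#[derive(Debug, PartialEq, Clone, Copy)]
enum PropertyType {
    Float32,
    Bool,
    String,
}

// Look up (class, property) in a reflection-style table to infer the type
// of an otherwise untyped project value.
fn infer_type(
    reflection: &HashMap<(&str, &str), PropertyType>,
    class: &str,
    property: &str,
) -> Option<PropertyType> {
    reflection.get(&(class, property)).copied()
}

fn main() {
    let mut reflection = HashMap::new();
    reflection.insert(("Lighting", "Brightness"), PropertyType::Float32);
    reflection.insert(("Lighting", "GlobalShadows"), PropertyType::Bool);

    println!("{:?}", infer_type(&reflection, "Lighting", "Brightness"));
}
```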
Lucien Greathouse
be81de74cd Disable Lua tests for now, since they need features Lemur doesn't have 2019-02-24 00:58:02 -08:00
Lucien Greathouse
88e739090d WIP: Server plugins via rlua (Lua 5.3) (#125)
* Add 'plugins' field to project and add rlua

* Scaffold out new SnapshotContext type (again) with plugin state

* Almost functional snapshot system with rlua proof-of-concept

* Gate plugin config on 'plugins-enabled' feature, tell Travis to test all features

* Guard remaining plugin setup code behind feature

* Bump minimum version to 1.33, should've caught this before

* Whoops, latest Rust is 1.32, not 1.33
2019-02-24 00:31:58 -08:00
Lucien Greathouse
7f324f1957 Update CHANGELOG 2019-02-22 15:57:46 -08:00
Lucien Greathouse
4f31c9e72f Fix /api/read and /api/subscribe, re-add debug output 2019-02-22 15:56:24 -08:00
Lucien Greathouse
c9a663ed39 Remove Rouille and port everything to Hyper 2019-02-22 15:11:27 -08:00
Lucien Greathouse
105d8aeb6b Start to stub out sub-services 2019-02-22 13:08:07 -08:00
Lucien Greathouse
6ea1211bc5 It's alive! 2019-02-22 10:50:14 -08:00
Lucien Greathouse
c13291a598 Break apart web interface between UI and API 2019-02-19 11:44:24 -08:00
Lucien Greathouse
aaa78c618c Move diagnostics page to use Ritz, show server version 2019-02-19 11:27:22 -08:00
Lucien Greathouse
2890c677d4 Bump dependency from rbx_tree 0.2.0 to rbx_dom_weak 0.3.0 2019-02-14 17:22:44 -08:00
Lucien Greathouse
51a010de00 Update CHANGELOG 2019-02-11 13:48:20 -08:00
Lucien Greathouse
ca0aabd814 Preload plugin assets at start.
Closes #121.
2019-02-11 13:47:49 -08:00
Lucien Greathouse
91d1ba1910 Add test for rojoValueToRobloxValue, fails Lemur because of missing APIs right now 2019-02-11 11:43:17 -08:00
Lucien Greathouse
c7c739dc00 Fix test bootstrap script for testing in Studio 2019-02-11 11:43:06 -08:00
Lucien Greathouse
7a8389bf11 Update CHANGELOG 2019-02-11 11:42:40 -08:00
Lucien Greathouse
5f062b8ea3 Make the plugin support non-primitive types 2019-02-11 10:55:03 -08:00
Lucien Greathouse
b9ee14a0f9 Remove unused Cargo features section 2019-02-11 10:27:09 -08:00
Lucien Greathouse
c3baf73455 Update documentation for alpha 4 2019-02-08 18:29:23 -08:00
73 changed files with 4167 additions and 1442 deletions

.gitignore — 2 lines changed

@@ -2,4 +2,4 @@
 /target
 /scratch-project
 **/*.rs.bk
-/generate-docs.run
+/server/failed-snapshots/

.gitmodules — 9 lines changed

@@ -1,12 +1,6 @@
[submodule "plugin/modules/roact"]
path = plugin/modules/roact
url = https://github.com/Roblox/roact.git
[submodule "plugin/modules/rodux"]
path = plugin/modules/rodux
url = https://github.com/Roblox/rodux.git
[submodule "plugin/modules/roact-rodux"]
path = plugin/modules/roact-rodux
url = https://github.com/Roblox/roact-rodux.git
[submodule "plugin/modules/testez"]
path = plugin/modules/testez
url = https://github.com/Roblox/testez.git
@@ -16,3 +10,6 @@
[submodule "plugin/modules/promise"]
path = plugin/modules/promise
url = https://github.com/LPGhatguy/roblox-lua-promise.git
[submodule "plugin/modules/t"]
path = plugin/modules/t
url = https://github.com/osyrisrblx/t.git


@@ -1,36 +1,41 @@
 matrix:
   include:
-    - language: python
-      env:
-        - LUA="lua=5.1"
-      before_install:
-        - pip install hererocks
-        - hererocks lua_install -r^ --$LUA
-        - export PATH=$PATH:$PWD/lua_install/bin
-      install:
-        - luarocks install luafilesystem
-        - luarocks install busted
-        - luarocks install luacov
-        - luarocks install luacov-coveralls
-        - luarocks install luacheck
-      script:
-        - cd plugin
-        - luacheck src
-        - lua -lluacov spec.lua
-      after_success:
-        - cd plugin
-        - luacov-coveralls -e $TRAVIS_BUILD_DIR/lua_install
+    # Lua tests are currently disabled because of holes in Lemur that are pretty
+    # tedious to fix. It should be fixed by either adding missing features to
+    # Lemur or by migrating to a CI system based on real Roblox instead.
+    # - language: python
+    #   env:
+    #     - LUA="lua=5.1"
+    #   before_install:
+    #     - pip install hererocks
+    #     - hererocks lua_install -r^ --$LUA
+    #     - export PATH=$PATH:$PWD/lua_install/bin
+    #   install:
+    #     - luarocks install luafilesystem
+    #     - luarocks install busted
+    #     - luarocks install luacov
+    #     - luarocks install luacov-coveralls
+    #     - luarocks install luacheck
+    #   script:
+    #     - cd plugin
+    #     - luacheck src
+    #     - lua -lluacov spec.lua
+    #   after_success:
+    #     - cd plugin
+    #     - luacov-coveralls -e $TRAVIS_BUILD_DIR/lua_install
     - language: rust
-      rust: 1.31.1
+      rust: 1.32.0
       cache: cargo
       script:
         - cargo test --verbose
+        - cargo test --verbose --all-features
     - language: rust
       rust: stable
@@ -38,10 +43,12 @@ matrix:
       script:
         - cargo test --verbose
+        - cargo test --verbose --all-features
     - language: rust
       rust: beta
       cache: cargo
       script:
         - cargo test --verbose
+        - cargo test --verbose --all-features


@@ -2,6 +2,50 @@
## [Unreleased]
## [0.5.0 Alpha 9](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.9) (April 4, 2019)
* Changed `rojo build` to use buffered I/O, which can make it up to 2x faster in some cases.
* Building [*Road Not Taken*](https://github.com/LPGhatguy/roads) to an `rbxlx` file dropped from 150ms to 70ms on my machine
* Fixed `LocalizationTable` instances being made from `csv` files incorrectly interpreting empty rows and columns. ([#149](https://github.com/LPGhatguy/rojo/pull/149))
* Fixed CSV files with entries that parse as numbers causing Rojo to panic. ([#152](https://github.com/LPGhatguy/rojo/pull/152))
* Improved error messages when malformed CSV files are found in a Rojo project.
## [0.5.0 Alpha 8](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.8) (March 29, 2019)
* Added support for a bunch of new types when dealing with XML model/place files:
* `ColorSequence`
* `Float64`
* `Int64`
* `NumberRange`
* `NumberSequence`
* `PhysicalProperties`
* `Ray`
* `Rect`
* `Ref`
* Improved server instance ordering behavior when files are added during a live session ([#135](https://github.com/LPGhatguy/rojo/pull/135))
* Fixed error being thrown when trying to unload the Rojo plugin.
* Added partial fix for [issue #141](https://github.com/LPGhatguy/rojo/issues/141) for `Lighting.Technology`, which should restore live sync functionality for the default project file.
## [0.5.0 Alpha 6](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.6) (March 19, 2019)
* Fixed `rojo init` giving unexpected results by upgrading to `rbx_dom_weak` 1.1.0
* Fixed live server not responding when the Rojo plugin is connected ([#133](https://github.com/LPGhatguy/rojo/issues/133))
* Updated default place file:
* Improved default properties to be closer to Studio's built-in 'Baseplate' template
* Added a baseplate to the project file (Thanks, [@AmaranthineCodices](https://github.com/AmaranthineCodices/)!)
* Added more type support to Rojo plugin
* Fixed some cases where the Rojo plugin would leave around objects that it knows should be deleted
* Updated plugin to correctly listen to `Plugin.Unloading` when installing or uninstalling new plugins
## [0.5.0 Alpha 5](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.5) (March 1, 2019)
* Upgraded core dependencies, which improves compatibility for lots of instance types
* Upgraded from `rbx_tree` 0.2.0 to `rbx_dom_weak` 1.0.0
* Upgraded from `rbx_xml` 0.2.0 to `rbx_xml` 0.4.0
* Upgraded from `rbx_binary` 0.2.0 to `rbx_binary` 0.4.0
* Added support for non-primitive types in the Rojo plugin.
* Types like `Color3` and `CFrame` can now be updated live!
* Fixed plugin assets flashing in on first load ([#121](https://github.com/LPGhatguy/rojo/issues/121))
* Changed Rojo's HTTP server from Rouille to Hyper, which reduced the release size by around a megabyte.
* Added property type inference to projects, which makes specifying services a lot easier ([#130](https://github.com/LPGhatguy/rojo/pull/130))
* Made error messages from invalid and missing files more user-friendly
## [0.5.0 Alpha 4](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.4) (February 8, 2019)
* Added support for nested partitions ([#102](https://github.com/LPGhatguy/rojo/issues/102))
* Added support for 'transmuting' partitions ([#112](https://github.com/LPGhatguy/rojo/issues/112))

Cargo.lock (generated) — 1173 lines changed; diff suppressed because it is too large.


@@ -62,7 +62,7 @@ If you use a plugin that _isn't_ Rojo for syncing code, open an issue and let me
## Contributing
Pull requests are welcome!
-Rojo supports Rust 1.31.1 and newer. Any changes to the minimum required compiler version require a _minor_ version bump.
+Rojo supports Rust 1.32 and newer. Any changes to the minimum required compiler version require a _minor_ version bump.
## License
Rojo is available under the terms of the Mozilla Public License, Version 2.0. See [LICENSE.txt](LICENSE.txt) for details.


@@ -25,7 +25,7 @@ If you have Rust installed, the easiest way to get Rojo is with Cargo!
To install the latest 0.5.0 alpha, use:
```sh
-cargo install rojo --version 0.5.0-alpha.3
+cargo install rojo --version 0.5.0-alpha.9
```
## Installing the Plugin


@@ -13,6 +13,7 @@ stds.roblox = {
 		-- Types
 		"Vector2", "Vector3",
 		"Vector2int16", "Vector3int16",
+		"Color3",
 		"UDim", "UDim2",
 		"Rect",


@@ -8,14 +8,11 @@
 		"Roact": {
 			"$path": "modules/roact/lib"
 		},
-		"Rodux": {
-			"$path": "modules/rodux/lib"
-		},
-		"RoactRodux": {
-			"$path": "modules/roact-rodux/lib"
-		},
 		"Promise": {
 			"$path": "modules/promise/lib"
 		},
+		"t": {
+			"$path": "modules/t/lib/t.lua"
+		}
 	}
 }

plugin/modules/t — submodule added at a3a80ebf0a


@@ -15,14 +15,11 @@
 			"Roact": {
 				"$path": "modules/roact/lib"
 			},
-			"Rodux": {
-				"$path": "modules/rodux/lib"
-			},
-			"RoactRodux": {
-				"$path": "modules/roact-rodux/lib"
-			},
 			"Promise": {
 				"$path": "modules/promise/lib"
 			},
+			"t": {
+				"$path": "modules/t/lib/t.lua"
+			}
 		},
 		"TestEZ": {
@@ -40,8 +37,8 @@
 			}
 		},
-		"TestService": {
-			"$className": "TestService",
+		"ServerScriptService": {
+			"$className": "ServerScriptService",
 			"TestBootstrap": {
 				"$path": "testBootstrap.server.lua"

@@ -1,9 +1,7 @@
-local sheetAsset = "rbxassetid://2738712459"
-
 local Assets = {
 	Sprites = {
 		WhiteCross = {
-			asset = sheetAsset,
+			asset = "rbxassetid://2738712459",
 			offset = Vector2.new(190, 318),
 			size = Vector2.new(18, 18),
 		},


@@ -8,6 +8,7 @@ local Config = require(Plugin.Config)
 local Version = require(Plugin.Version)
 local Logging = require(Plugin.Logging)
 local DevSettings = require(Plugin.DevSettings)
+local preloadAssets = require(Plugin.preloadAssets)
 local ConnectPanel = require(Plugin.Components.ConnectPanel)
 local ConnectionActivePanel = require(Plugin.Components.ConnectionActivePanel)
@@ -177,6 +178,15 @@ function App:didMount()
 			})
 		end
 	end)
+
+	preloadAssets()
 end
+
+function App:willUnmount()
+	if self.currentSession ~= nil then
+		self.currentSession:disconnect()
+		self.currentSession = nil
+	end
+end
 
 function App:didUpdate()


@@ -1,6 +1,6 @@
return {
codename = "Epiphany",
version = {0, 5, 0, "-alpha.4"},
version = {0, 5, 0, "-alpha.9"},
expectedServerVersionString = "0.5.0 or newer",
protocolVersion = 2,
defaultHost = "localhost",


@@ -1,10 +1,27 @@
 local Config = require(script.Parent.Config)
 
+local Environment = {
+	User = "User",
+	Dev = "Dev",
+	Test = "Test",
+}
+
 local VALUES = {
 	LogLevel = {
 		type = "IntValue",
-		defaultUserValue = 2,
-		defaultDevValue = 3,
+		values = {
+			[Environment.User] = 2,
+			[Environment.Dev] = 3,
+			[Environment.Test] = 3,
+		},
 	},
+	TypecheckingEnabled = {
+		type = "BoolValue",
+		values = {
+			[Environment.User] = false,
+			[Environment.Dev] = true,
+			[Environment.Test] = true,
+		},
+	},
 }
@@ -42,7 +59,9 @@
 	object.Value = value
 end
 
-local function createAllValues()
+local function createAllValues(environment)
+	assert(Environment[environment] ~= nil, "Invalid environment")
+
 	valueContainer = getValueContainer()
 
 	if valueContainer == nil then
@@ -52,20 +71,57 @@
 	end
 
 	for name, value in pairs(VALUES) do
-		setStoredValue(name, value.type, value.defaultDevValue)
+		setStoredValue(name, value.type, value.values[environment])
 	end
 end
 
-_G[("ROJO_%s_DEV_CREATE"):format(Config.codename:upper())] = createAllValues
+local function getValue(name)
+	assert(VALUES[name] ~= nil, "Invalid DevSettings name")
+
+	local stored = getStoredValue(name)
+
+	if stored ~= nil then
+		return stored
+	end
+
+	return VALUES[name].values[Environment.User]
+end
 
 local DevSettings = {}
 
+function DevSettings:createDevSettings()
+	createAllValues(Environment.Dev)
+end
+
+function DevSettings:createTestSettings()
+	createAllValues(Environment.Test)
+end
+
-function DevSettings:hasChangedValues()
-	return valueContainer ~= nil
-end
-
 function DevSettings:resetValues()
 	if valueContainer then
 		valueContainer:Destroy()
 		valueContainer = nil
 	end
 end
 
+function DevSettings:isEnabled()
+	return valueContainer ~= nil
+end
+
 function DevSettings:getLogLevel()
-	return getStoredValue("LogLevel") or VALUES.LogLevel.defaultUserValue
+	return getValue("LogLevel")
 end
 
+function DevSettings:shouldTypecheck()
+	return getValue("TypecheckingEnabled")
+end
+
+function _G.ROJO_DEV_CREATE()
+	DevSettings:createDevSettings()
+end
+
 return DevSettings


@@ -0,0 +1,81 @@
local Logging = require(script.Parent.Logging)

--[[
	A bidirectional map between instance IDs and Roblox instances. It lets us
	keep track of every instance we know about.

	TODO: Track ancestry to catch when stuff moves?
]]
local InstanceMap = {}
InstanceMap.__index = InstanceMap

function InstanceMap.new()
	local self = {
		fromIds = {},
		fromInstances = {},
	}

	return setmetatable(self, InstanceMap)
end

function InstanceMap:insert(id, instance)
	self.fromIds[id] = instance
	self.fromInstances[instance] = id
end

function InstanceMap:removeId(id)
	local instance = self.fromIds[id]

	if instance ~= nil then
		self.fromIds[id] = nil
		self.fromInstances[instance] = nil
	else
		Logging.warn("Attempted to remove nonexistent ID %s", tostring(id))
	end
end

function InstanceMap:removeInstance(instance)
	local id = self.fromInstances[instance]

	if id ~= nil then
		self.fromInstances[instance] = nil
		self.fromIds[id] = nil
	else
		Logging.warn("Attempted to remove nonexistent instance %s", tostring(instance))
	end
end

function InstanceMap:destroyInstance(instance)
	local id = self.fromInstances[instance]

	if id ~= nil then
		self:destroyId(id)
	else
		Logging.warn("Attempted to destroy untracked instance %s", tostring(instance))
	end
end

function InstanceMap:destroyId(id)
	local instance = self.fromIds[id]
	self:removeId(id)

	if instance ~= nil then
		local descendantsToDestroy = {}

		for otherInstance in pairs(self.fromInstances) do
			if otherInstance:IsDescendantOf(instance) then
				table.insert(descendantsToDestroy, otherInstance)
			end
		end

		for _, otherInstance in ipairs(descendantsToDestroy) do
			self:removeInstance(otherInstance)
		end

		instance:Destroy()
	else
		Logging.warn("Attempted to destroy nonexistent ID %s", tostring(id))
	end
end

return InstanceMap
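The Lua class above keeps two tables in lockstep so lookups work in both directions. A minimal Rust analogue of the same idea, using instance names as stand-ins for Roblox instances (illustrative only, not Rojo's server-side code):

```rust
use std::collections::HashMap;

// Bidirectional id <-> instance map: two HashMaps kept in sync.
struct InstanceMap {
    from_ids: HashMap<String, String>,
    from_instances: HashMap<String, String>,
}

impl InstanceMap {
    fn new() -> Self {
        InstanceMap {
            from_ids: HashMap::new(),
            from_instances: HashMap::new(),
        }
    }

    fn insert(&mut self, id: &str, instance: &str) {
        self.from_ids.insert(id.to_string(), instance.to_string());
        self.from_instances.insert(instance.to_string(), id.to_string());
    }

    // Removing by id must also clear the reverse entry, or the two maps
    // drift out of sync -- the same invariant the Lua code maintains.
    fn remove_id(&mut self, id: &str) {
        if let Some(instance) = self.from_ids.remove(id) {
            self.from_instances.remove(&instance);
        }
    }
}

fn main() {
    let mut map = InstanceMap::new();
    map.insert("id-1", "Workspace.Part");
    map.remove_id("id-1");
    println!("{}", map.from_instances.len()); // both directions cleared
}
```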


@@ -1,108 +1,17 @@
local t = require(script.Parent.Parent.t)
local InstanceMap = require(script.Parent.InstanceMap)
local Logging = require(script.Parent.Logging)
local function makeInstanceMap()
local self = {
fromIds = {},
fromInstances = {},
}
function self:insert(id, instance)
self.fromIds[id] = instance
self.fromInstances[instance] = id
end
function self:removeId(id)
local instance = self.fromIds[id]
if instance ~= nil then
self.fromIds[id] = nil
self.fromInstances[instance] = nil
else
Logging.warn("Attempted to remove nonexistant ID %s", tostring(id))
end
end
function self:removeInstance(instance)
local id = self.fromInstances[instance]
if id ~= nil then
self.fromInstances[instance] = nil
self.fromIds[id] = nil
else
Logging.warn("Attempted to remove nonexistant instance %s", tostring(instance))
end
end
function self:destroyId(id)
local instance = self.fromIds[id]
self:removeId(id)
if instance ~= nil then
local descendantsToDestroy = {}
for otherInstance in pairs(self.fromInstances) do
if otherInstance:IsDescendantOf(instance) then
table.insert(descendantsToDestroy, otherInstance)
end
end
for _, otherInstance in ipairs(descendantsToDestroy) do
self:removeInstance(otherInstance)
end
instance:Destroy()
else
Logging.warn("Attempted to destroy nonexistant ID %s", tostring(id))
end
end
return self
end
local function setProperty(instance, key, value)
-- The 'Contents' property of LocalizationTable isn't directly exposed, but
-- has corresponding (deprecated) getters and setters.
if key == "Contents" and instance.ClassName == "LocalizationTable" then
instance:SetContents(value)
return
end
-- If we don't have permissions to access this value at all, we can skip it.
local readSuccess, existingValue = pcall(function()
return instance[key]
end)
if not readSuccess then
-- An error will be thrown if there was a permission issue or if the
-- property doesn't exist. In the latter case, we should tell the user
-- because it's probably their fault.
if existingValue:find("lacking permission") then
Logging.trace("Permission error reading property %s on class %s", tostring(key), instance.ClassName)
return
else
error(("Invalid property %s on class %s: %s"):format(tostring(key), instance.ClassName, existingValue), 2)
end
end
local writeSuccess, err = pcall(function()
if existingValue ~= value then
instance[key] = value
end
end)
if not writeSuccess then
error(("Cannot set property %s on class %s: %s"):format(tostring(key), instance.ClassName, err), 2)
end
return true
end
local setCanonicalProperty = require(script.Parent.setCanonicalProperty)
local rojoValueToRobloxValue = require(script.Parent.rojoValueToRobloxValue)
local Types = require(script.Parent.Types)
local Reconciler = {}
Reconciler.__index = Reconciler
function Reconciler.new()
local self = {
-		instanceMap = makeInstanceMap(),
+		instanceMap = InstanceMap.new(),
}
return setmetatable(self, Reconciler)
@@ -118,11 +27,18 @@ function Reconciler:applyUpdate(requestedIds, virtualInstancesById)
end
end
local reconcileSchema = Types.ifEnabled(t.tuple(
t.map(t.string, Types.VirtualInstance),
t.string,
t.Instance
))
--[[
Update an existing instance, including its properties and children, to match
the given information.
]]
function Reconciler:reconcile(virtualInstancesById, id, instance)
assert(reconcileSchema(virtualInstancesById, id, instance))
local virtualInstance = virtualInstancesById[id]
-- If an instance changes ClassName, we assume it's very different. That's
@@ -137,10 +53,10 @@ function Reconciler:reconcile(virtualInstancesById, id, instance)
self.instanceMap:insert(id, instance)
-- Some instances don't like being named, even if their name already matches
-	setProperty(instance, "Name", virtualInstance.Name)
+	setCanonicalProperty(instance, "Name", virtualInstance.Name)
for key, value in pairs(virtualInstance.Properties) do
-		setProperty(instance, key, value.Value)
+		setCanonicalProperty(instance, key, rojoValueToRobloxValue(value))
end
local existingChildren = instance:GetChildren()
@@ -175,10 +91,17 @@ function Reconciler:reconcile(virtualInstancesById, id, instance)
end
end
-	if self:__shouldClearUnknownInstances(virtualInstance) then
-		for existingChildInstance in pairs(unvisitedExistingChildren) do
-			self.instanceMap:removeInstance(existingChildInstance)
-			existingChildInstance:Destroy()
+	local shouldClearUnknown = self:__shouldClearUnknownChildren(virtualInstance)
+
+	for existingChildInstance in pairs(unvisitedExistingChildren) do
+		local childId = self.instanceMap.fromInstances[existingChildInstance]
+
+		if childId == nil then
+			if shouldClearUnknown then
+				existingChildInstance:Destroy()
+			end
+		else
+			self.instanceMap:destroyInstance(existingChildInstance)
 		end
end
@@ -194,16 +117,13 @@ function Reconciler:reconcile(virtualInstancesById, id, instance)
-- Some instances, like services, don't like having their Parent
-- property poked, even if we're setting it to the same value.
-		setProperty(instance, "Parent", parent)
-
-		if instance.Parent ~= parent then
-			instance.Parent = parent
-		end
+		setCanonicalProperty(instance, "Parent", parent)
end
return instance
end
-function Reconciler:__shouldClearUnknownInstances(virtualInstance)
+function Reconciler:__shouldClearUnknownChildren(virtualInstance)
if virtualInstance.Metadata ~= nil then
return not virtualInstance.Metadata.ignoreUnknownInstances
else
@@ -211,29 +131,44 @@ function Reconciler:__shouldClearUnknownInstances(virtualInstance)
end
end
local reifySchema = Types.ifEnabled(t.tuple(
t.map(t.string, Types.VirtualInstance),
t.string,
t.Instance
))
function Reconciler:__reify(virtualInstancesById, id, parent)
assert(reifySchema(virtualInstancesById, id, parent))
local virtualInstance = virtualInstancesById[id]
local instance = Instance.new(virtualInstance.ClassName)
for key, value in pairs(virtualInstance.Properties) do
-		-- TODO: Branch on value.Type
-		setProperty(instance, key, value.Value)
+		setCanonicalProperty(instance, key, rojoValueToRobloxValue(value))
end
-	instance.Name = virtualInstance.Name
+	setCanonicalProperty(instance, "Name", virtualInstance.Name)
for _, childId in ipairs(virtualInstance.Children) do
self:__reify(virtualInstancesById, childId, instance)
end
-	setProperty(instance, "Parent", parent)
+	setCanonicalProperty(instance, "Parent", parent)
self.instanceMap:insert(id, instance)
return instance
end
local applyUpdatePieceSchema = Types.ifEnabled(t.tuple(
t.string,
t.map(t.string, t.boolean),
t.map(t.string, Types.VirtualInstance)
))
function Reconciler:__applyUpdatePiece(id, visitedIds, virtualInstancesById)
assert(applyUpdatePieceSchema(id, visitedIds, virtualInstancesById))
if visitedIds[id] then
return
end


@@ -0,0 +1,218 @@
local Reconciler = require(script.Parent.Reconciler)
return function()
it("should leave instances alone if there's nothing specified", function()
local instance = Instance.new("Folder")
instance.Name = "TestFolder"
local instanceId = "test-id"
local virtualInstancesById = {
[instanceId] = {
Name = "TestFolder",
ClassName = "Folder",
Children = {},
Properties = {},
},
}
local reconciler = Reconciler.new()
reconciler:reconcile(virtualInstancesById, instanceId, instance)
end)
it("should assign names from virtual instances", function()
local instance = Instance.new("Folder")
instance.Name = "InitialName"
local instanceId = "test-id"
local virtualInstancesById = {
[instanceId] = {
Name = "NewName",
ClassName = "Folder",
Children = {},
Properties = {},
},
}
local reconciler = Reconciler.new()
reconciler:reconcile(virtualInstancesById, instanceId, instance)
expect(instance.Name).to.equal("NewName")
end)
it("should assign properties from virtual instances", function()
local instance = Instance.new("IntValue")
instance.Name = "TestValue"
instance.Value = 5
local instanceId = "test-id"
local virtualInstancesById = {
[instanceId] = {
Name = "TestValue",
ClassName = "IntValue",
Children = {},
Properties = {
Value = {
Type = "Int32",
Value = 9
}
},
},
}
local reconciler = Reconciler.new()
reconciler:reconcile(virtualInstancesById, instanceId, instance)
expect(instance.Value).to.equal(9)
end)
it("should wipe unknown children by default", function()
local parent = Instance.new("Folder")
parent.Name = "Parent"
local child = Instance.new("Folder")
child.Name = "Child"
local parentId = "test-id"
local virtualInstancesById = {
[parentId] = {
Name = "Parent",
ClassName = "Folder",
Children = {},
Properties = {},
},
}
local reconciler = Reconciler.new()
reconciler:reconcile(virtualInstancesById, parentId, parent)
expect(#parent:GetChildren()).to.equal(0)
end)
it("should preserve unknown children if ignoreUnknownInstances is set", function()
local parent = Instance.new("Folder")
parent.Name = "Parent"
local child = Instance.new("Folder")
child.Parent = parent
child.Name = "Child"
local parentId = "test-id"
local virtualInstancesById = {
[parentId] = {
Name = "Parent",
ClassName = "Folder",
Children = {},
Properties = {},
Metadata = {
ignoreUnknownInstances = true,
},
},
}
local reconciler = Reconciler.new()
reconciler:reconcile(virtualInstancesById, parentId, parent)
expect(child.Parent).to.equal(parent)
expect(#parent:GetChildren()).to.equal(1)
end)
it("should remove known removed children", function()
local parent = Instance.new("Folder")
parent.Name = "Parent"
local child = Instance.new("Folder")
child.Parent = parent
child.Name = "Child"
local parentId = "parent-id"
local childId = "child-id"
local reconciler = Reconciler.new()
local virtualInstancesById = {
[parentId] = {
Name = "Parent",
ClassName = "Folder",
Children = {childId},
Properties = {},
},
[childId] = {
Name = "Child",
ClassName = "Folder",
Children = {},
Properties = {},
},
}
reconciler:reconcile(virtualInstancesById, parentId, parent)
expect(child.Parent).to.equal(parent)
expect(#parent:GetChildren()).to.equal(1)
local newVirtualInstances = {
[parentId] = {
Name = "Parent",
ClassName = "Folder",
Children = {},
Properties = {},
},
[childId] = nil,
}
reconciler:reconcile(newVirtualInstances, parentId, parent)
expect(child.Parent).to.equal(nil)
expect(#parent:GetChildren()).to.equal(0)
end)
it("should remove known removed children if ignoreUnknownInstances is set", function()
local parent = Instance.new("Folder")
parent.Name = "Parent"
local child = Instance.new("Folder")
child.Parent = parent
child.Name = "Child"
local parentId = "parent-id"
local childId = "child-id"
local reconciler = Reconciler.new()
local virtualInstancesById = {
[parentId] = {
Name = "Parent",
ClassName = "Folder",
Children = {childId},
Properties = {},
Metadata = {
ignoreUnknownInstances = true,
},
},
[childId] = {
Name = "Child",
ClassName = "Folder",
Children = {},
Properties = {},
},
}
reconciler:reconcile(virtualInstancesById, parentId, parent)
expect(child.Parent).to.equal(parent)
expect(#parent:GetChildren()).to.equal(1)
local newVirtualInstances = {
[parentId] = {
Name = "Parent",
ClassName = "Folder",
Children = {},
Properties = {},
Metadata = {
ignoreUnknownInstances = true,
},
},
[childId] = nil,
}
reconciler:reconcile(newVirtualInstances, parentId, parent)
expect(child.Parent).to.equal(nil)
expect(#parent:GetChildren()).to.equal(0)
end)
end

plugin/src/Types.lua — new file (36 lines)

@@ -0,0 +1,36 @@
local t = require(script.Parent.Parent.t)

local DevSettings = require(script.Parent.DevSettings)

local VirtualValue = t.interface({
	Type = t.string,
	Value = t.optional(t.any),
})

local VirtualMetadata = t.interface({
	ignoreUnknownInstances = t.optional(t.boolean),
})

local VirtualInstance = t.interface({
	Name = t.string,
	ClassName = t.string,
	Properties = t.map(t.string, VirtualValue),
	Metadata = t.optional(VirtualMetadata)
})

local function ifEnabled(innerCheck)
	return function(...)
		if DevSettings:shouldTypecheck() then
			return innerCheck(...)
		else
			return true
		end
	end
end

return {
	ifEnabled = ifEnabled,
	VirtualInstance = VirtualInstance,
	VirtualMetadata = VirtualMetadata,
	VirtualValue = VirtualValue,
}


@@ -4,16 +4,14 @@ end
local Roact = require(script.Parent.Roact)
Roact.setGlobalConfig({
elementTracing = true,
})
local App = require(script.Components.App)
local app = Roact.createElement(App, {
plugin = plugin,
})
Roact.mount(app, game:GetService("CoreGui"), "Rojo UI")
local tree = Roact.mount(app, game:GetService("CoreGui"), "Rojo UI")
-- TODO: Detect another instance of Rojo coming online and shut down this one.
plugin.Unloading:Connect(function()
Roact.unmount(tree)
end)


@@ -0,0 +1,28 @@
local ContentProvider = game:GetService("ContentProvider")
local Logging = require(script.Parent.Logging)
local Assets = require(script.Parent.Assets)
local function preloadAssets()
local contentUrls = {}
for _, sprite in pairs(Assets.Sprites) do
table.insert(contentUrls, sprite.asset)
end
for _, slice in pairs(Assets.Slices) do
table.insert(contentUrls, slice.asset)
end
for _, url in pairs(Assets.Images) do
table.insert(contentUrls, url)
end
Logging.trace("Preloading assets: %s", table.concat(contentUrls, ", "))
coroutine.wrap(function()
ContentProvider:PreloadAsync(contentUrls)
end)()
end
return preloadAssets


@@ -0,0 +1,38 @@
local primitiveTypes = {
Bool = true,
Enum = true,
Float32 = true,
Float64 = true,
Int32 = true,
Int64 = true,
String = true,
}
local directConstructors = {
CFrame = CFrame.new,
Color3 = Color3.new,
Color3uint8 = Color3.fromRGB,
Rect = Rect.new,
UDim = UDim.new,
UDim2 = UDim2.new,
Vector2 = Vector2.new,
Vector2int16 = Vector2int16.new,
Vector3 = Vector3.new,
Vector3int16 = Vector3int16.new,
}
local function rojoValueToRobloxValue(value)
if primitiveTypes[value.Type] then
return value.Value
end
local constructor = directConstructors[value.Type]
if constructor ~= nil then
return constructor(unpack(value.Value))
end
local errorMessage = ("The Rojo plugin doesn't know how to handle values of type %q yet!"):format(tostring(value.Type))
error(errorMessage)
end
return rojoValueToRobloxValue
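`rojoValueToRobloxValue` dispatches on the `Type` tag: primitives pass through unchanged, while composite types unpack their component list into a looked-up constructor. A hedged Rust sketch of that dispatch (the types and tags here are illustrative, not Rojo's actual wire format):

```rust
// Illustrative model of the Type-tag dispatch in rojoValueToRobloxValue:
// primitives pass through, composite types are rebuilt via a constructor.
#[derive(Debug, PartialEq)]
enum Decoded {
    Float(f64),
    Str(String),
    Vector3(f64, f64, f64),
}

fn decode(type_tag: &str, components: &[f64], text: &str) -> Result<Decoded, String> {
    match type_tag {
        "Float32" | "Float64" => Ok(Decoded::Float(components[0])),
        "String" => Ok(Decoded::Str(text.to_string())),
        // A composite type spreads its components into a constructor,
        // like `constructor(unpack(value.Value))` in the Lua version.
        "Vector3" => Ok(Decoded::Vector3(components[0], components[1], components[2])),
        other => Err(format!("don't know how to handle values of type {:?} yet", other)),
    }
}
```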


@@ -0,0 +1,40 @@
local rojoValueToRobloxValue = require(script.Parent.rojoValueToRobloxValue)
return function()
it("should convert primitives", function()
local inputString = {
Type = "String",
Value = "Hello, world!",
}
local inputFloat32 = {
Type = "Float32",
Value = 12341.512,
}
expect(rojoValueToRobloxValue(inputString)).to.equal(inputString.Value)
expect(rojoValueToRobloxValue(inputFloat32)).to.equal(inputFloat32.Value)
end)
it("should convert properties with direct constructors", function()
local inputColor3 = {
Type = "Color3",
Value = {0, 1, 0.5},
}
local outputColor3 = Color3.new(0, 1, 0.5)
local inputCFrame = {
Type = "CFrame",
Value = {
1, 2, 3,
4, 5, 6,
7, 8, 9,
10, 11, 12,
},
}
local outputCFrame = CFrame.new(1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12)
expect(rojoValueToRobloxValue(inputColor3)).to.equal(outputColor3)
expect(rojoValueToRobloxValue(inputCFrame)).to.equal(outputCFrame)
end)
end


@@ -0,0 +1,57 @@
local Logging = require(script.Parent.Logging)
--[[
Attempts to set a property on the given instance.
This method deals in terms of what Rojo calls 'canonical properties', which
don't necessarily exist either in serialization or in Lua-reflected APIs,
but may be present in the API dump.
Ideally, canonical properties map 1:1 with properties we can assign, but in
some cases like LocalizationTable contents and CollectionService tags, we
have to read/write properties a little differently.
]]
local function setCanonicalProperty(instance, key, value)
-- The 'Contents' property of LocalizationTable isn't directly exposed, but
-- has corresponding (deprecated) getters and setters.
if instance.ClassName == "LocalizationTable" and key == "Contents" then
instance:SetContents(value)
return
end
-- Temporary workaround for fixing issue #141 in this specific case.
if instance.ClassName == "Lighting" and key == "Technology" then
return
end
-- If we don't have permissions to access this value at all, we can skip it.
local readSuccess, existingValue = pcall(function()
return instance[key]
end)
if not readSuccess then
-- An error will be thrown if there was a permission issue or if the
-- property doesn't exist. In the latter case, we should tell the user
-- because it's probably their fault.
if existingValue:find("lacking permission") then
Logging.trace("Permission error reading property %s on class %s", tostring(key), instance.ClassName)
return
else
error(("Invalid property %s on class %s: %s"):format(tostring(key), instance.ClassName, existingValue), 2)
end
end
local writeSuccess, err = pcall(function()
if existingValue ~= value then
instance[key] = value
end
end)
if not writeSuccess then
error(("Cannot set property %s on class %s: %s"):format(tostring(key), instance.ClassName, err), 2)
end
return true
end
return setCanonicalProperty


@@ -1,2 +1,19 @@
local TestEZ = require(game.ReplicatedStorage.TestEZ)
TestEZ.TestBootstrap:run(game.ReplicatedStorage.Rojo.plugin)
local ReplicatedStorage = game:GetService("ReplicatedStorage")
local TestEZ = require(ReplicatedStorage.TestEZ)
local Rojo = ReplicatedStorage.Rojo
local DevSettings = require(Rojo.Plugin.DevSettings)
local setDevSettings = not DevSettings:hasChangedValues()
if setDevSettings then
DevSettings:createTestSettings()
end
TestEZ.TestBootstrap:run({Rojo.Plugin})
if setDevSettings then
DevSettings:resetValues()
end


@@ -1,12 +1,16 @@
[package]
name = "rojo"
version = "0.5.0-alpha.4"
version = "0.5.0-alpha.9"
authors = ["Lucien Greathouse <me@lpghatguy.com>"]
description = "A tool to create robust Roblox projects"
license = "MIT"
repository = "https://github.com/LPGhatguy/rojo"
edition = "2018"
[features]
default = []
server-plugins = []
[lib]
name = "librojo"
path = "src/lib.rs"
@@ -15,25 +19,24 @@ path = "src/lib.rs"
name = "rojo"
path = "src/bin.rs"
[features]
default = []
bundle-plugin = []
[dependencies]
clap = "2.27"
csv = "1.0"
env_logger = "0.6"
failure = "0.1.3"
futures = "0.1"
hyper = "0.12"
log = "0.4"
maplit = "1.0.1"
notify = "4.0"
rand = "0.4"
rbx_binary = "0.2.0"
rbx_tree = "0.2.0"
rbx_xml = "0.2.0"
rbx_binary = "0.4.0"
rbx_dom_weak = "1.3.0"
rbx_xml = "0.6.0"
rbx_reflection = "2.0.374"
regex = "1.0"
reqwest = "0.9.5"
rouille = "2.1"
rlua = "0.16"
ritz = "0.1.0"
serde = "1.0"
serde_derive = "1.0"
serde_json = "1.0"
@@ -43,5 +46,5 @@ uuid = { version = "0.7", features = ["v4", "serde"] }
tempfile = "3.0"
walkdir = "2.1"
lazy_static = "1.2"
pretty_assertions = "0.5.1"
pretty_assertions = "0.6.1"
paste = "0.1"

server/assets/index.css

@@ -0,0 +1,43 @@
* {
margin: 0;
padding: 0;
font: inherit;
}
html {
font-family: sans-serif;
height: 100%;
}
body {
height: 100%;
display: flex;
flex-direction: column;
justify-content: center;
}
.main {
padding: 1rem;
text-align: center;
margin: 0 auto;
width: 100%;
max-width: 60rem;
background-color: #efefef;
border: 1px solid #666;
border-radius: 4px;
}
.title {
font-size: 2rem;
font-weight: bold;
}
.subtitle {
font-size: 1.5rem;
font-weight: bold;
}
.docs {
font-size: 1.3rem;
font-weight: bold;
}


@@ -1,54 +0,0 @@
<!DOCTYPE html>
<html>
<head>
<title>Rojo</title>
<style>
* {
margin: 0;
padding: 0;
font: inherit;
}
html {
font-family: sans-serif;
height: 100%;
}
body {
height: 100%;
display: flex;
flex-direction: column;
justify-content: center;
}
.main {
padding: 1rem;
text-align: center;
margin: 0 auto;
width: 100%;
max-width: 60rem;
background-color: #efefef;
border: 1px solid #666;
border-radius: 4px;
}
.title {
font-size: 2rem;
font-weight: bold;
}
.docs {
font-size: 1.5rem;
font-weight: bold;
}
</style>
</head>
<body>
<div class="main">
<h1 class="title">Rojo Live Sync is up and running!</h1>
<a class="docs" href="https://lpghatguy.github.io/rojo">Rojo Documentation</a>
</div>
</body>
</html>


@@ -0,0 +1,66 @@
{
"name": "[placeholder]",
"tree": {
"$className": "DataModel",
"HttpService": {
"$className": "HttpService",
"$properties": {
"HttpEnabled": true
}
},
"Lighting": {
"$className": "Lighting",
"$properties": {
"Ambient": [
0,
0,
0
],
"Brightness": 2,
"GlobalShadows": true,
"Outlines": false,
"Technology": "Voxel"
}
},
"ReplicatedStorage": {
"$className": "ReplicatedStorage",
"Source": {
"$path": "src"
}
},
"SoundService": {
"$className": "SoundService",
"$properties": {
"RespectFilteringEnabled": true
}
},
"Workspace": {
"$className": "Workspace",
"$properties": {
"FilteringEnabled": true
},
"Baseplate": {
"$className": "Part",
"$properties": {
"Anchored": true,
"Color": [
0.38823,
0.37254,
0.38823
],
"Locked": true,
"Position": [
0,
-10,
0
],
"Size": [
512,
20,
512
]
}
}
}
}
}


@@ -1,16 +1,17 @@
use std::{
path::PathBuf,
fs::File,
io,
io::{self, Write, BufWriter},
};
use log::info;
use failure::Fail;
use crate::{
rbx_session::construct_oneoff_tree,
project::{Project, ProjectLoadFuzzyError},
imfs::{Imfs, FsError},
project::{Project, ProjectLoadFuzzyError},
rbx_session::construct_oneoff_tree,
rbx_snapshot::SnapshotError,
};
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
@@ -59,6 +60,9 @@ pub enum BuildError {
#[fail(display = "{}", _0)]
FsError(#[fail(cause)] FsError),
#[fail(display = "{}", _0)]
SnapshotError(#[fail(cause)] SnapshotError),
}
impl_from!(BuildError {
@@ -67,6 +71,7 @@ impl_from!(BuildError {
rbx_xml::EncodeError => XmlModelEncodeError,
rbx_binary::EncodeError => BinaryModelEncodeError,
FsError => FsError,
SnapshotError => SnapshotError,
});
pub fn build(options: &BuildOptions) -> Result<(), BuildError> {
@@ -86,8 +91,8 @@ pub fn build(options: &BuildOptions) -> Result<(), BuildError> {
let mut imfs = Imfs::new();
imfs.add_roots_from_project(&project)?;
let tree = construct_oneoff_tree(&project, &imfs);
let mut file = File::create(&options.output_file)?;
let tree = construct_oneoff_tree(&project, &imfs)?;
let mut file = BufWriter::new(File::create(&options.output_file)?);
match output_kind {
OutputKind::Rbxmx => {
@@ -116,5 +121,7 @@ pub fn build(options: &BuildOptions) -> Result<(), BuildError> {
},
}
file.flush()?;
Ok(())
}
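The `BufWriter` change above is where the commit's "magic performance increase" comes from: the serializers issue many small writes, and buffering coalesces them into few large syscalls. The explicit `file.flush()?` at the end surfaces any final write error instead of letting it be swallowed when the writer is dropped. The core pattern, as a minimal sketch:

```rust
use std::fs::File;
use std::io::{BufWriter, Write};
use std::path::Path;

// Wrap the output file in BufWriter so many small serializer writes
// are batched; flush() before returning so write errors are reported.
fn write_lines(path: &Path, lines: &[&str]) -> std::io::Result<()> {
    let mut file = BufWriter::new(File::create(path)?);
    for line in lines {
        writeln!(file, "{}", line)?;
    }
    file.flush()?;
    Ok(())
}
```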


@@ -8,9 +8,9 @@ use failure::Fail;
use crate::{
project::{Project, ProjectLoadFuzzyError},
web::Server,
web::LiveServer,
imfs::FsError,
live_session::LiveSession,
live_session::{LiveSession, LiveSessionError},
};
const DEFAULT_PORT: u16 = 34872;
@@ -28,11 +28,15 @@ pub enum ServeError {
#[fail(display = "{}", _0)]
FsError(#[fail(cause)] FsError),
#[fail(display = "{}", _0)]
LiveSessionError(#[fail(cause)] LiveSessionError),
}
impl_from!(ServeError {
ProjectLoadFuzzyError => ProjectLoadError,
FsError => FsError,
LiveSessionError => LiveSessionError,
});
pub fn serve(options: &ServeOptions) -> Result<(), ServeError> {
@@ -45,7 +49,7 @@ pub fn serve(options: &ServeOptions) -> Result<(), ServeError> {
info!("Using project {:#?}", project);
let live_session = Arc::new(LiveSession::new(Arc::clone(&project))?);
let server = Server::new(Arc::clone(&live_session));
let server = LiveServer::new(live_session);
let port = options.port
.or(project.serve_port)
@@ -53,7 +57,7 @@ pub fn serve(options: &ServeOptions) -> Result<(), ServeError> {
println!("Rojo server listening on port {}", port);
server.listen(port);
server.start(port);
Ok(())
}


@@ -9,9 +9,10 @@ use failure::Fail;
use reqwest::header::{ACCEPT, USER_AGENT, CONTENT_TYPE, COOKIE};
use crate::{
rbx_session::construct_oneoff_tree,
project::{Project, ProjectLoadFuzzyError},
imfs::{Imfs, FsError},
project::{Project, ProjectLoadFuzzyError},
rbx_session::construct_oneoff_tree,
rbx_snapshot::SnapshotError,
};
#[derive(Debug, Fail)]
@@ -36,6 +37,9 @@ pub enum UploadError {
#[fail(display = "{}", _0)]
FsError(#[fail(cause)] FsError),
#[fail(display = "{}", _0)]
SnapshotError(#[fail(cause)] SnapshotError),
}
impl_from!(UploadError {
@@ -44,6 +48,7 @@ impl_from!(UploadError {
reqwest::Error => HttpError,
rbx_xml::EncodeError => XmlModelEncodeError,
FsError => FsError,
SnapshotError => SnapshotError,
});
#[derive(Debug)]
@@ -67,7 +72,7 @@ pub fn upload(options: &UploadOptions) -> Result<(), UploadError> {
let mut imfs = Imfs::new();
imfs.add_roots_from_project(&project)?;
let tree = construct_oneoff_tree(&project, &imfs);
let tree = construct_oneoff_tree(&project, &imfs)?;
let root_id = tree.get_root_id();
let mut contents = Vec::new();


@@ -1,9 +1,10 @@
use std::{
collections::{HashMap, HashSet},
path::{self, Path, PathBuf},
cmp::Ordering,
collections::{HashMap, HashSet, BTreeSet},
fmt,
fs,
io,
path::{self, Path, PathBuf},
};
use failure::Fail;
@@ -237,7 +238,7 @@ impl Imfs {
} else if metadata.is_dir() {
let item = ImfsItem::Directory(ImfsDirectory {
path: path.to_path_buf(),
children: HashSet::new(),
children: BTreeSet::new(),
});
self.items.insert(path.to_path_buf(), item);
@@ -285,19 +286,43 @@ impl Imfs {
}
}
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub struct ImfsFile {
pub path: PathBuf,
pub contents: Vec<u8>,
}
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
pub struct ImfsDirectory {
pub path: PathBuf,
pub children: HashSet<PathBuf>,
impl PartialOrd for ImfsFile {
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
Some(self.cmp(other))
}
}
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
impl Ord for ImfsFile {
fn cmp(&self, other: &Self) -> Ordering {
self.path.cmp(&other.path)
}
}
#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
pub struct ImfsDirectory {
pub path: PathBuf,
pub children: BTreeSet<PathBuf>,
}
impl PartialOrd for ImfsDirectory {
fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
Some(self.cmp(other))
}
}
impl Ord for ImfsDirectory {
fn cmp(&self, other: &Self) -> Ordering {
self.path.cmp(&other.path)
}
}
#[derive(Debug, Clone, PartialEq, Eq, PartialOrd, Ord, Serialize, Deserialize)]
pub enum ImfsItem {
File(ImfsFile),
Directory(ImfsDirectory),


@@ -1,3 +1,5 @@
#![recursion_limit="128"]
// Macros
#[macro_use]
pub mod impl_from;
@@ -16,5 +18,4 @@ pub mod rbx_snapshot;
pub mod session_id;
pub mod snapshot_reconciler;
pub mod visualize;
pub mod web;
pub mod web_util;
pub mod web;


@@ -1,21 +1,40 @@
use std::{
collections::HashSet,
mem,
sync::{Arc, Mutex},
};
use failure::Fail;
use crate::{
fs_watcher::FsWatcher,
imfs::{Imfs, FsError},
message_queue::MessageQueue,
project::Project,
rbx_session::RbxSession,
rbx_snapshot::SnapshotError,
session_id::SessionId,
snapshot_reconciler::InstanceChanges,
};
#[derive(Debug, Fail)]
pub enum LiveSessionError {
#[fail(display = "{}", _0)]
Fs(#[fail(cause)] FsError),
#[fail(display = "{}", _0)]
Snapshot(#[fail(cause)] SnapshotError),
}
impl_from!(LiveSessionError {
FsError => Fs,
SnapshotError => Snapshot,
});
/// Contains all of the state for a Rojo live-sync session.
pub struct LiveSession {
pub project: Arc<Project>,
pub session_id: SessionId,
project: Arc<Project>,
session_id: SessionId,
pub message_queue: Arc<MessageQueue<InstanceChanges>>,
pub rbx_session: Arc<Mutex<RbxSession>>,
pub imfs: Arc<Mutex<Imfs>>,
@@ -23,7 +42,7 @@ pub struct LiveSession {
}
impl LiveSession {
pub fn new(project: Arc<Project>) -> Result<LiveSession, FsError> {
pub fn new(project: Arc<Project>) -> Result<LiveSession, LiveSessionError> {
let imfs = {
let mut imfs = Imfs::new();
imfs.add_roots_from_project(&project)?;
@@ -36,7 +55,7 @@ impl LiveSession {
Arc::clone(&project),
Arc::clone(&imfs),
Arc::clone(&message_queue),
)));
)?));
let fs_watcher = FsWatcher::start(
Arc::clone(&imfs),
@@ -46,8 +65,8 @@ impl LiveSession {
let session_id = SessionId::new();
Ok(LiveSession {
project,
session_id,
project,
message_queue,
rbx_session,
imfs,
@@ -55,7 +74,26 @@ impl LiveSession {
})
}
pub fn get_project(&self) -> &Project {
/// Restarts the live session using the given project while preserving the
/// internal session ID.
pub fn restart_with_new_project(&mut self, project: Arc<Project>) -> Result<(), LiveSessionError> {
let mut new_session = LiveSession::new(project)?;
new_session.session_id = self.session_id;
mem::replace(self, new_session);
Ok(())
}
pub fn root_project(&self) -> &Project {
&self.project
}
pub fn session_id(&self) -> SessionId {
self.session_id
}
pub fn serve_place_ids(&self) -> &Option<HashSet<u64>> {
&self.project.serve_place_ids
}
}


@@ -1,67 +1,83 @@
use std::{
collections::HashMap,
mem,
sync::{
mpsc,
atomic::{AtomicUsize, Ordering},
RwLock,
Mutex,
},
};
/// A unique identifier, not guaranteed to be generated in any order.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub struct ListenerId(usize);
use futures::sync::oneshot;
/// Generate a new ID, which has no defined ordering.
pub fn get_listener_id() -> ListenerId {
static LAST_ID: AtomicUsize = AtomicUsize::new(0);
struct Listener<T> {
sender: oneshot::Sender<(u32, Vec<T>)>,
cursor: u32,
}
ListenerId(LAST_ID.fetch_add(1, Ordering::SeqCst))
fn fire_listener_if_ready<T: Clone>(messages: &[T], listener: Listener<T>) -> Result<(), Listener<T>> {
let current_cursor = messages.len() as u32;
if listener.cursor < current_cursor {
let new_messages = messages[(listener.cursor as usize)..].to_vec();
let _ = listener.sender.send((current_cursor, new_messages));
Ok(())
} else {
Err(listener)
}
}
/// A message queue with persistent history that can be subscribed to.
///
/// Definitely non-optimal, but a simple design that works well for the
/// synchronous web server Rojo uses, Rouille.
/// Definitely non-optimal. This would ideally be a lockless mpmc queue.
#[derive(Default)]
pub struct MessageQueue<T> {
messages: RwLock<Vec<T>>,
message_listeners: Mutex<HashMap<ListenerId, mpsc::Sender<()>>>,
message_listeners: Mutex<Vec<Listener<T>>>,
}
impl<T: Clone> MessageQueue<T> {
pub fn new() -> MessageQueue<T> {
MessageQueue {
messages: RwLock::new(Vec::new()),
message_listeners: Mutex::new(HashMap::new()),
message_listeners: Mutex::new(Vec::new()),
}
}
pub fn push_messages(&self, new_messages: &[T]) {
let message_listeners = self.message_listeners.lock().unwrap();
let mut message_listeners = self.message_listeners.lock().unwrap();
let mut messages = self.messages.write().unwrap();
messages.extend_from_slice(new_messages);
{
let mut messages = self.messages.write().unwrap();
messages.extend_from_slice(new_messages);
let mut remaining_listeners = Vec::new();
for listener in message_listeners.drain(..) {
match fire_listener_if_ready(&messages, listener) {
Ok(_) => {}
Err(listener) => remaining_listeners.push(listener)
}
}
for listener in message_listeners.values() {
listener.send(()).unwrap();
}
// Without this annotation, Rust gets confused since the first argument
// is a MutexGuard, but the second is a Vec.
mem::replace::<Vec<_>>(&mut message_listeners, remaining_listeners);
}
pub fn subscribe(&self, sender: mpsc::Sender<()>) -> ListenerId {
let id = get_listener_id();
pub fn subscribe(&self, cursor: u32, sender: oneshot::Sender<(u32, Vec<T>)>) {
let listener = {
let listener = Listener {
sender,
cursor,
};
let messages = self.messages.read().unwrap();
match fire_listener_if_ready(&messages, listener) {
Ok(_) => return,
Err(listener) => listener
}
};
let mut message_listeners = self.message_listeners.lock().unwrap();
message_listeners.insert(id, sender);
id
}
pub fn unsubscribe(&self, id: ListenerId) {
let mut message_listeners = self.message_listeners.lock().unwrap();
message_listeners.remove(&id);
message_listeners.push(listener);
}
pub fn get_message_cursor(&self) -> u32 {
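The rewritten queue gives each subscriber a cursor into the persistent message log: if messages already exist past the cursor, the oneshot fires immediately; otherwise the listener parks until `push_messages` catches it up. A stdlib-only sketch of just the cursor check (simplified; the real code delivers the result through a `futures::sync::oneshot` channel instead of returning it):

```rust
// Simplified model of fire_listener_if_ready: a listener is "ready"
// when the log has grown past its cursor, and it receives only the
// messages it has not yet seen, plus the new cursor position.
fn messages_since<T: Clone>(log: &[T], cursor: u32) -> Option<(u32, Vec<T>)> {
    let current_cursor = log.len() as u32;
    if cursor < current_cursor {
        Some((current_cursor, log[cursor as usize..].to_vec()))
    } else {
        None // nothing new; the listener stays parked
    }
}
```

This is why `subscribe` can both answer immediately and enqueue: it runs the same readiness check once up front, then stores the listener only if the check fails.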


@@ -20,6 +20,12 @@ pub struct PathMap<T> {
nodes: HashMap<PathBuf, PathMapNode<T>>,
}
impl<T> Default for PathMap<T> {
fn default() -> Self {
Self::new()
}
}
impl<T> PathMap<T> {
pub fn new() -> PathMap<T> {
PathMap {


@@ -1,5 +1,5 @@
use std::{
collections::{HashMap, HashSet},
collections::{HashMap, HashSet, BTreeMap},
fmt,
fs::{self, File},
io,
@@ -8,9 +8,11 @@ use std::{
use log::warn;
use failure::Fail;
use maplit::hashmap;
use rbx_tree::RbxValue;
use rbx_dom_weak::{UnresolvedRbxValue, RbxValue};
use serde_derive::{Serialize, Deserialize};
use serde::{Serialize, Serializer};
static DEFAULT_PLACE: &'static str = include_str!("../assets/place.project.json");
pub static PROJECT_FILENAME: &'static str = "default.project.json";
pub static COMPAT_PROJECT_FILENAME: &'static str = "roblox-project.json";
@@ -24,6 +26,10 @@ struct SourceProject {
name: String,
tree: SourceProjectNode,
#[cfg_attr(not(feature = "plugins-enabled"), serde(skip_deserializing))]
#[serde(default = "Vec::new", skip_serializing_if = "Vec::is_empty")]
plugins: Vec<SourcePlugin>,
#[serde(skip_serializing_if = "Option::is_none")]
serve_port: Option<u16>,
@@ -33,12 +39,17 @@ struct SourceProject {
impl SourceProject {
/// Consumes the SourceProject and yields a Project, ready for prime-time.
pub fn into_project(self, project_file_location: &Path) -> Project {
pub fn into_project(mut self, project_file_location: &Path) -> Project {
let tree = self.tree.into_project_node(project_file_location);
let plugins = self.plugins
.drain(..)
.map(|source_plugin| source_plugin.into_plugin(project_file_location))
.collect();
Project {
name: self.name,
tree,
plugins,
serve_port: self.serve_port,
serve_place_ids: self.serve_place_ids,
file_location: PathBuf::from(project_file_location),
@@ -46,16 +57,89 @@ impl SourceProject {
}
}
/// An alternative serializer for `UnresolvedRbxValue` that uses the minimum
/// representation of the value.
///
/// For example, the default Serialize impl might give you:
///
/// ```json
/// {
/// "Type": "Bool",
/// "Value": true
/// }
/// ```
///
/// But in reality, users are expected to write just:
///
/// ```json
/// true
/// ```
///
/// This holds true for other values that might be ambiguous or just have more
/// complicated representations like enums.
fn serialize_unresolved_minimal<S>(unresolved: &UnresolvedRbxValue, serializer: S) -> Result<S::Ok, S::Error>
where S: Serializer
{
match unresolved {
UnresolvedRbxValue::Ambiguous(_) => unresolved.serialize(serializer),
UnresolvedRbxValue::Concrete(concrete) => {
match concrete {
RbxValue::Bool { value } => value.serialize(serializer),
RbxValue::CFrame { value } => value.serialize(serializer),
RbxValue::Color3 { value } => value.serialize(serializer),
RbxValue::Color3uint8 { value } => value.serialize(serializer),
RbxValue::Content { value } => value.serialize(serializer),
RbxValue::Enum { value } => value.serialize(serializer),
RbxValue::Float32 { value } => value.serialize(serializer),
RbxValue::Int32 { value } => value.serialize(serializer),
RbxValue::String { value } => value.serialize(serializer),
RbxValue::UDim { value } => value.serialize(serializer),
RbxValue::UDim2 { value } => value.serialize(serializer),
RbxValue::Vector2 { value } => value.serialize(serializer),
RbxValue::Vector2int16 { value } => value.serialize(serializer),
RbxValue::Vector3 { value } => value.serialize(serializer),
RbxValue::Vector3int16 { value } => value.serialize(serializer),
_ => concrete.serialize(serializer),
}
},
}
}
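As the doc comment explains, the custom serializer collapses concrete values to their bare JSON form (`true`) instead of the tagged `{"Type": ..., "Value": ...}` object, keeping only ambiguous values explicit. A stdlib-only model of that rule (JSON is hand-built here for brevity; the real code goes through serde's `Serializer`):

```rust
// Illustrative model of the "minimal representation" rule: concrete
// values serialize as their bare form, ambiguous ones keep an
// explicit tagged object so the resolver can disambiguate later.
enum Value {
    Bool(bool),
    Ambiguous(String),
}

fn to_minimal_json(value: &Value) -> String {
    match value {
        Value::Bool(b) => b.to_string(), // "true", not {"Type":"Bool","Value":true}
        Value::Ambiguous(raw) => format!(r#"{{"Type":"Ambiguous","Value":{:?}}}"#, raw),
    }
}
```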
/// A wrapper around serialize_unresolved_minimal that handles the HashMap case.
fn serialize_unresolved_map<S>(value: &HashMap<String, UnresolvedRbxValue>, serializer: S) -> Result<S::Ok, S::Error>
where S: Serializer
{
use serde::ser::SerializeMap;
#[derive(Serialize)]
struct Minimal<'a>(
#[serde(serialize_with = "serialize_unresolved_minimal")]
&'a UnresolvedRbxValue
);
let mut map = serializer.serialize_map(Some(value.len()))?;
for (k, v) in value {
map.serialize_key(k)?;
map.serialize_value(&Minimal(v))?;
}
map.end()
}
/// Similar to SourceProject, the structure of nodes in the project tree is
/// slightly different on-disk than how we want to handle them in the rest of
/// Rojo.
#[derive(Debug, Serialize, Deserialize)]
#[derive(Debug, Clone, Serialize, Deserialize)]
struct SourceProjectNode {
#[serde(rename = "$className", skip_serializing_if = "Option::is_none")]
class_name: Option<String>,
#[serde(rename = "$properties", default = "HashMap::new", skip_serializing_if = "HashMap::is_empty")]
properties: HashMap<String, RbxValue>,
#[serde(
rename = "$properties",
default = "HashMap::new",
skip_serializing_if = "HashMap::is_empty",
serialize_with = "serialize_unresolved_map",
)]
properties: HashMap<String, UnresolvedRbxValue>,
#[serde(rename = "$ignoreUnknownInstances", skip_serializing_if = "Option::is_none")]
ignore_unknown_instances: Option<bool>,
@@ -64,14 +148,14 @@ struct SourceProjectNode {
path: Option<String>,
#[serde(flatten)]
children: HashMap<String, SourceProjectNode>,
children: BTreeMap<String, SourceProjectNode>,
}
impl SourceProjectNode {
/// Consumes the SourceProjectNode and turns it into a ProjectNode.
pub fn into_project_node(mut self, project_file_location: &Path) -> ProjectNode {
let children = self.children.drain()
.map(|(key, value)| (key, value.into_project_node(project_file_location)))
pub fn into_project_node(self, project_file_location: &Path) -> ProjectNode {
let children = self.children.iter()
.map(|(key, value)| (key.clone(), value.clone().into_project_node(project_file_location)))
.collect();
// Make sure that paths are absolute, transforming them by adding the
@@ -95,6 +179,26 @@ impl SourceProjectNode {
}
}
#[derive(Debug, Serialize, Deserialize)]
struct SourcePlugin {
path: String,
}
impl SourcePlugin {
pub fn into_plugin(self, project_file_location: &Path) -> Plugin {
let path = if Path::new(&self.path).is_absolute() {
PathBuf::from(self.path)
} else {
let project_folder_location = project_file_location.parent().unwrap();
project_folder_location.join(self.path)
};
Plugin {
path,
}
}
}
/// Error returned by Project::load_exact
#[derive(Debug, Fail)]
pub enum ProjectLoadExactError {
@@ -133,6 +237,7 @@ pub enum ProjectInitError {
AlreadyExists(PathBuf),
IoError(#[fail(cause)] io::Error),
SaveError(#[fail(cause)] ProjectSaveError),
JsonError(#[fail(cause)] serde_json::Error),
}
impl fmt::Display for ProjectInitError {
@@ -141,6 +246,7 @@ impl fmt::Display for ProjectInitError {
ProjectInitError::AlreadyExists(path) => write!(output, "Path {} already exists", path.display()),
ProjectInitError::IoError(inner) => write!(output, "IO error: {}", inner),
ProjectInitError::SaveError(inner) => write!(output, "{}", inner),
ProjectInitError::JsonError(inner) => write!(output, "{}", inner),
}
}
}
@@ -158,8 +264,8 @@ pub enum ProjectSaveError {
#[derive(Debug, Clone, PartialEq, Default, Serialize, Deserialize)]
pub struct ProjectNode {
pub class_name: Option<String>,
pub children: HashMap<String, ProjectNode>,
pub properties: HashMap<String, RbxValue>,
pub children: BTreeMap<String, ProjectNode>,
pub properties: HashMap<String, UnresolvedRbxValue>,
pub ignore_unknown_instances: Option<bool>,
#[serde(serialize_with = "crate::path_serializer::serialize_option")]
@@ -198,10 +304,30 @@ impl ProjectNode {
}
}
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
pub struct Plugin {
pub path: PathBuf,
}
impl Plugin {
fn to_source_plugin(&self, project_file_location: &Path) -> SourcePlugin {
let project_folder_location = project_file_location.parent().unwrap();
let path = match self.path.strip_prefix(project_folder_location) {
Ok(stripped) => stripped.to_str().unwrap().replace("\\", "/"),
Err(_) => format!("{}", self.path.display()),
};
SourcePlugin {
path,
}
}
}
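`to_source_plugin` stores plugin paths relative to the project folder when possible and normalizes backslashes to forward slashes, so a saved project file stays portable between Windows and Unix. The core transform, isolated as a sketch:

```rust
use std::path::Path;

// If the plugin path lives under the project folder, store it relative
// with forward slashes; otherwise fall back to the path as written.
fn portable_plugin_path(plugin_path: &Path, project_folder: &Path) -> String {
    match plugin_path.strip_prefix(project_folder) {
        Ok(stripped) => stripped.to_str().unwrap().replace('\\', "/"),
        Err(_) => format!("{}", plugin_path.display()),
    }
}
```

`Path::strip_prefix` fails cleanly when the path is outside the project folder, which is what makes the absolute-path fallback safe.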
#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
pub struct Project {
pub name: String,
pub tree: ProjectNode,
pub plugins: Vec<Plugin>,
pub serve_port: Option<u16>,
pub serve_place_ids: Option<HashSet<u64>>,
pub file_location: PathBuf,
@@ -210,46 +336,16 @@ pub struct Project {
impl Project {
pub fn init_place(project_fuzzy_path: &Path) -> Result<PathBuf, ProjectInitError> {
let project_path = Project::init_pick_path(project_fuzzy_path)?;
let project_folder_path = project_path.parent().unwrap();
let project_name = if project_fuzzy_path == project_path {
project_fuzzy_path.parent().unwrap().file_name().unwrap().to_str().unwrap()
} else {
project_fuzzy_path.file_name().unwrap().to_str().unwrap()
};
let tree = ProjectNode {
class_name: Some(String::from("DataModel")),
children: hashmap! {
String::from("ReplicatedStorage") => ProjectNode {
class_name: Some(String::from("ReplicatedStorage")),
children: hashmap! {
String::from("Source") => ProjectNode {
path: Some(project_folder_path.join("src")),
..Default::default()
},
},
..Default::default()
},
String::from("HttpService") => ProjectNode {
class_name: Some(String::from("HttpService")),
properties: hashmap! {
String::from("HttpEnabled") => RbxValue::Bool {
value: true,
},
},
..Default::default()
},
},
..Default::default()
};
let mut project = Project::load_from_str(DEFAULT_PLACE, &project_path)
.map_err(ProjectInitError::JsonError)?;
let project = Project {
name: project_name.to_string(),
tree,
serve_port: None,
serve_place_ids: None,
file_location: project_path.clone(),
};
project.name = project_name.to_owned();
project.save()
.map_err(ProjectInitError::SaveError)?;
@@ -274,6 +370,7 @@ impl Project {
let project = Project {
name: project_name.to_string(),
tree,
plugins: Vec::new(),
serve_port: None,
serve_place_ids: None,
file_location: project_path.clone(),
@@ -336,6 +433,12 @@ impl Project {
}
}
fn load_from_str(contents: &str, project_file_location: &Path) -> Result<Project, serde_json::Error> {
let parsed: SourceProject = serde_json::from_str(&contents)?;
Ok(parsed.into_project(project_file_location))
}
pub fn load_fuzzy(fuzzy_project_location: &Path) -> Result<Project, ProjectLoadFuzzyError> {
let project_path = Self::locate(fuzzy_project_location)
.ok_or(ProjectLoadFuzzyError::NotFound)?;
@@ -383,10 +486,20 @@ impl Project {
}
}
pub fn folder_location(&self) -> &Path {
self.file_location.parent().unwrap()
}
fn to_source_project(&self) -> SourceProject {
let plugins = self.plugins
.iter()
.map(|plugin| plugin.to_source_plugin(&self.file_location))
.collect();
SourceProject {
name: self.name.clone(),
tree: self.tree.to_source_node(&self.file_location),
plugins,
serve_port: self.serve_port,
serve_place_ids: self.serve_place_ids.clone(),
}


@@ -6,16 +6,25 @@ use std::{
sync::{Arc, Mutex},
};
use rlua::Lua;
use serde_derive::{Serialize, Deserialize};
use log::{info, trace};
use rbx_tree::{RbxTree, RbxId};
use log::{info, trace, error};
use rbx_dom_weak::{RbxTree, RbxId};
use crate::{
project::{Project, ProjectNode},
message_queue::MessageQueue,
imfs::{Imfs, ImfsItem},
path_map::PathMap,
rbx_snapshot::{snapshot_project_tree, snapshot_project_node, snapshot_imfs_path},
rbx_snapshot::{
SnapshotError,
SnapshotContext,
SnapshotPluginContext,
SnapshotPluginEntry,
snapshot_project_tree,
snapshot_project_node,
snapshot_imfs_path,
},
snapshot_reconciler::{InstanceChanges, reify_root, reconcile_subtree},
};
@@ -58,22 +67,60 @@ impl RbxSession {
project: Arc<Project>,
imfs: Arc<Mutex<Imfs>>,
message_queue: Arc<MessageQueue<InstanceChanges>>,
) -> RbxSession {
) -> Result<RbxSession, SnapshotError> {
let mut instances_per_path = PathMap::new();
let mut metadata_per_instance = HashMap::new();
let tree = {
let temp_imfs = imfs.lock().unwrap();
reify_initial_tree(&project, &temp_imfs, &mut instances_per_path, &mut metadata_per_instance)
let plugin_context = if cfg!(feature = "server-plugins") {
let lua = Lua::new();
let mut callback_key = None;
lua.context(|context| {
let callback = context.load(r#"
return function(snapshot)
print("got my snapshot:", snapshot)
print("name:", snapshot.name, "class name:", snapshot.className)
end"#)
.set_name("a cool plugin").unwrap()
.call::<(), rlua::Function>(()).unwrap();
callback_key = Some(context.create_registry_value(callback).unwrap());
});
let plugins = vec![
SnapshotPluginEntry {
file_name_filter: String::new(),
callback: callback_key.unwrap(),
}
];
Some(SnapshotPluginContext { lua, plugins })
} else {
None
};
RbxSession {
let context = SnapshotContext {
plugin_context,
};
let tree = {
let temp_imfs = imfs.lock().unwrap();
reify_initial_tree(
&project,
&context,
&temp_imfs,
&mut instances_per_path,
&mut metadata_per_instance,
)?
};
Ok(RbxSession {
tree,
instances_per_path,
metadata_per_instance,
message_queue,
imfs,
}
})
}
fn path_created_or_updated(&mut self, path: &Path) {
@@ -104,27 +151,37 @@ impl RbxSession {
.expect("Metadata did not exist for path")
.clone();
let context = SnapshotContext {
plugin_context: None,
};
for instance_id in &instances_at_path {
let instance_metadata = self.metadata_per_instance.get(&instance_id)
.expect("Metadata for instance ID did not exist");
let maybe_snapshot = match &instance_metadata.project_definition {
Some((instance_name, project_node)) => {
snapshot_project_node(&imfs, &project_node, Cow::Owned(instance_name.clone()))
.unwrap_or_else(|_| panic!("Could not generate instance snapshot for path {}", path_to_snapshot.display()))
snapshot_project_node(&context, &imfs, &project_node, Cow::Owned(instance_name.clone()))
},
None => {
snapshot_imfs_path(&imfs, &path_to_snapshot, None)
.unwrap_or_else(|_| panic!("Could not generate instance snapshot for path {}", path_to_snapshot.display()))
snapshot_imfs_path(&context, &imfs, &path_to_snapshot, None)
},
};
let snapshot = match maybe_snapshot {
Some(snapshot) => snapshot,
None => {
Ok(Some(snapshot)) => snapshot,
Ok(None) => {
trace!("Path resulted in no snapshot being generated.");
return;
},
Err(err) => {
error!("Rojo couldn't turn one of the project's files into Roblox instances.");
error!("Any changes to the file have been ignored.");
error!("{}", err);
return;
},
};
trace!("Snapshot: {:#?}", snapshot);
@@ -194,29 +251,39 @@ impl RbxSession {
&self.tree
}
pub fn get_all_instance_metadata(&self) -> &HashMap<RbxId, MetadataPerInstance> {
&self.metadata_per_instance
}
pub fn get_instance_metadata(&self, id: RbxId) -> Option<&MetadataPerInstance> {
self.metadata_per_instance.get(&id)
}
}
pub fn construct_oneoff_tree(project: &Project, imfs: &Imfs) -> RbxTree {
pub fn construct_oneoff_tree(project: &Project, imfs: &Imfs) -> Result<RbxTree, SnapshotError> {
let mut instances_per_path = PathMap::new();
let mut metadata_per_instance = HashMap::new();
reify_initial_tree(project, imfs, &mut instances_per_path, &mut metadata_per_instance)
let context = SnapshotContext {
plugin_context: None,
};
reify_initial_tree(project, &context, imfs, &mut instances_per_path, &mut metadata_per_instance)
}
fn reify_initial_tree(
project: &Project,
context: &SnapshotContext,
imfs: &Imfs,
instances_per_path: &mut PathMap<HashSet<RbxId>>,
metadata_per_instance: &mut HashMap<RbxId, MetadataPerInstance>,
) -> RbxTree {
let snapshot = snapshot_project_tree(imfs, project)
.expect("Could not snapshot project tree")
.expect("Project did not produce any instances");
) -> Result<RbxTree, SnapshotError> {
let snapshot = match snapshot_project_tree(&context, imfs, project)? {
Some(snapshot) => snapshot,
None => panic!("Project did not produce any instances"),
};
let mut changes = InstanceChanges::default();
let tree = reify_root(&snapshot, instances_per_path, metadata_per_instance, &mut changes);
tree
Ok(tree)
}


@@ -9,11 +9,13 @@ use std::{
str,
};
use rlua::Lua;
use failure::Fail;
use log::info;
use maplit::hashmap;
use rbx_tree::{RbxTree, RbxValue, RbxInstanceProperties};
use rbx_dom_weak::{RbxTree, RbxValue, RbxInstanceProperties};
use serde_derive::{Serialize, Deserialize};
use rbx_reflection::{try_resolve_value, ValueResolveError};
use crate::{
imfs::{
@@ -38,6 +40,53 @@ const INIT_MODULE_NAME: &str = "init.lua";
const INIT_SERVER_NAME: &str = "init.server.lua";
const INIT_CLIENT_NAME: &str = "init.client.lua";
pub struct SnapshotContext {
pub plugin_context: Option<SnapshotPluginContext>,
}
/// Context that's only relevant to generating snapshots if there are plugins
/// associated with the project.
///
/// It's possible that this needs some sort of extra nesting/filtering to
/// support nested projects, since their plugins should only apply to
/// themselves.
pub struct SnapshotPluginContext {
pub lua: Lua,
pub plugins: Vec<SnapshotPluginEntry>,
}
pub struct SnapshotPluginEntry {
/// Simple file name suffix filter to avoid running plugins on every file
/// change.
pub file_name_filter: String,
/// A key into the Lua registry created by [`create_registry_value`] that
/// refers to a function that can be called to transform a file/instance
/// pair according to how the plugin needs to operate.
///
/// [`create_registry_value`]: https://docs.rs/rlua/0.16.2/rlua/struct.Context.html#method.create_registry_value
pub callback: rlua::RegistryKey,
}
#[derive(Debug, Clone)]
struct LuaRbxSnapshot(RbxSnapshotInstance<'static>);
impl rlua::UserData for LuaRbxSnapshot {
fn add_methods<'lua, M: rlua::UserDataMethods<'lua, Self>>(methods: &mut M) {
methods.add_meta_method(rlua::MetaMethod::Index, |_context, this, key: String| {
match key.as_str() {
"name" => Ok(this.0.name.clone().into_owned()),
"className" => Ok(this.0.class_name.clone().into_owned()),
_ => Err(rlua::Error::RuntimeError(format!("{} is not a valid member of RbxSnapshotInstance", &key))),
}
});
methods.add_meta_method(rlua::MetaMethod::ToString, |_context, _this, _args: ()| {
Ok("RbxSnapshotInstance")
});
}
}
pub type SnapshotResult<'a> = Result<Option<RbxSnapshotInstance<'a>>, SnapshotError>;
#[derive(Debug, Fail)]
@@ -57,6 +106,7 @@ pub enum SnapshotError {
},
XmlModelDecodeError {
#[fail(cause)]
inner: rbx_xml::DecodeError,
path: PathBuf,
},
@@ -66,11 +116,30 @@ pub enum SnapshotError {
path: PathBuf,
},
CsvDecodeError {
#[fail(cause)]
inner: csv::Error,
path: PathBuf,
},
ProjectNodeUnusable,
ProjectNodeInvalidTransmute {
partition_path: PathBuf,
},
PropertyResolveError {
#[fail(cause)]
inner: ValueResolveError,
},
}
impl From<ValueResolveError> for SnapshotError {
fn from(inner: ValueResolveError) -> SnapshotError {
SnapshotError::PropertyResolveError {
inner,
}
}
}
impl fmt::Display for SnapshotError {
@@ -89,6 +158,9 @@ impl fmt::Display for SnapshotError {
SnapshotError::BinaryModelDecodeError { inner, path } => {
write!(output, "Malformed rbxm model: {:?} in path {}", inner, path.display())
},
SnapshotError::CsvDecodeError { inner, path } => {
write!(output, "Malformed csv file: {} in path {}", inner, path.display())
},
SnapshotError::ProjectNodeUnusable => {
write!(output, "Rojo project nodes must specify either $path or $className.")
},
@@ -99,24 +171,27 @@ impl fmt::Display for SnapshotError {
writeln!(output, "")?;
writeln!(output, "Partition target ($path): {}", partition_path.display())
},
SnapshotError::PropertyResolveError { inner } => write!(output, "{}", inner),
}
}
}
pub fn snapshot_project_tree<'source>(
context: &SnapshotContext,
imfs: &'source Imfs,
project: &'source Project,
) -> SnapshotResult<'source> {
snapshot_project_node(imfs, &project.tree, Cow::Borrowed(&project.name))
snapshot_project_node(context, imfs, &project.tree, Cow::Borrowed(&project.name))
}
pub fn snapshot_project_node<'source>(
context: &SnapshotContext,
imfs: &'source Imfs,
node: &ProjectNode,
instance_name: Cow<'source, str>,
) -> SnapshotResult<'source> {
let maybe_snapshot = match &node.path {
Some(path) => snapshot_imfs_path(imfs, &path, Some(instance_name.clone()))?,
Some(path) => snapshot_imfs_path(context, imfs, &path, Some(instance_name.clone()))?,
None => match &node.class_name {
Some(_class_name) => Some(RbxSnapshotInstance {
name: instance_name.clone(),
@@ -170,13 +245,14 @@ pub fn snapshot_project_node<'source>(
}
for (child_name, child_project_node) in &node.children {
if let Some(child) = snapshot_project_node(imfs, child_project_node, Cow::Owned(child_name.clone()))? {
if let Some(child) = snapshot_project_node(context, imfs, child_project_node, Cow::Owned(child_name.clone()))? {
snapshot.children.push(child);
}
}
for (key, value) in &node.properties {
snapshot.properties.insert(key.clone(), value.clone());
let resolved_value = try_resolve_value(&snapshot.class_name, key, value)?;
snapshot.properties.insert(key.clone(), resolved_value);
}
if let Some(ignore_unknown_instances) = node.ignore_unknown_instances {
@@ -189,6 +265,7 @@ pub fn snapshot_project_node<'source>(
}
pub fn snapshot_imfs_path<'source>(
context: &SnapshotContext,
imfs: &'source Imfs,
path: &Path,
instance_name: Option<Cow<'source, str>>,
@@ -196,23 +273,25 @@ pub fn snapshot_imfs_path<'source>(
// If the given path doesn't exist in the in-memory filesystem, we consider
// that an error.
match imfs.get(path) {
Some(imfs_item) => snapshot_imfs_item(imfs, imfs_item, instance_name),
Some(imfs_item) => snapshot_imfs_item(context, imfs, imfs_item, instance_name),
None => return Err(SnapshotError::DidNotExist(path.to_owned())),
}
}
fn snapshot_imfs_item<'source>(
context: &SnapshotContext,
imfs: &'source Imfs,
item: &'source ImfsItem,
instance_name: Option<Cow<'source, str>>,
) -> SnapshotResult<'source> {
match item {
ImfsItem::File(file) => snapshot_imfs_file(file, instance_name),
ImfsItem::Directory(directory) => snapshot_imfs_directory(imfs, directory, instance_name),
ImfsItem::File(file) => snapshot_imfs_file(context, file, instance_name),
ImfsItem::Directory(directory) => snapshot_imfs_directory(context, imfs, directory, instance_name),
}
}
fn snapshot_imfs_directory<'source>(
context: &SnapshotContext,
imfs: &'source Imfs,
directory: &'source ImfsDirectory,
instance_name: Option<Cow<'source, str>>,
@@ -229,11 +308,11 @@ fn snapshot_imfs_directory<'source>(
});
let mut snapshot = if directory.children.contains(&init_path) {
snapshot_imfs_path(imfs, &init_path, Some(snapshot_name))?.unwrap()
snapshot_imfs_path(context, imfs, &init_path, Some(snapshot_name))?.unwrap()
} else if directory.children.contains(&init_server_path) {
snapshot_imfs_path(imfs, &init_server_path, Some(snapshot_name))?.unwrap()
snapshot_imfs_path(context, imfs, &init_server_path, Some(snapshot_name))?.unwrap()
} else if directory.children.contains(&init_client_path) {
snapshot_imfs_path(imfs, &init_client_path, Some(snapshot_name))?.unwrap()
snapshot_imfs_path(context, imfs, &init_client_path, Some(snapshot_name))?.unwrap()
} else {
RbxSnapshotInstance {
class_name: Cow::Borrowed("Folder"),
@@ -262,7 +341,7 @@ fn snapshot_imfs_directory<'source>(
// them here.
},
_ => {
if let Some(child) = snapshot_imfs_path(imfs, child_path, None)? {
if let Some(child) = snapshot_imfs_path(context, imfs, child_path, None)? {
snapshot.children.push(child);
}
},
@@ -273,6 +352,7 @@ fn snapshot_imfs_directory<'source>(
}
fn snapshot_imfs_file<'source>(
context: &SnapshotContext,
file: &'source ImfsFile,
instance_name: Option<Cow<'source, str>>,
) -> SnapshotResult<'source> {
@@ -308,6 +388,20 @@ fn snapshot_imfs_file<'source>(
info!("File generated no snapshot: {}", file.path.display());
}
if let Some(snapshot) = maybe_snapshot.as_ref() {
if let Some(plugin_context) = &context.plugin_context {
for plugin in &plugin_context.plugins {
let owned_snapshot = snapshot.get_owned();
let registry_key = &plugin.callback;
plugin_context.lua.context(move |context| {
let callback: rlua::Function = context.registry_value(registry_key).unwrap();
callback.call::<_, ()>(LuaRbxSnapshot(owned_snapshot)).unwrap();
});
}
}
}
Ok(maybe_snapshot)
}
@@ -391,16 +485,87 @@ fn snapshot_txt_file<'source>(
fn snapshot_csv_file<'source>(
file: &'source ImfsFile,
) -> SnapshotResult<'source> {
/// Struct that holds any valid row from a Roblox CSV translation table.
///
/// We deserialize rows into this struct manually from CSV, but let Serde
/// handle serializing it to JSON.
#[derive(Debug, Default, Serialize)]
#[serde(rename_all = "camelCase")]
struct LocalizationEntry<'a> {
#[serde(skip_serializing_if = "Option::is_none")]
key: Option<&'a str>,
#[serde(skip_serializing_if = "Option::is_none")]
context: Option<&'a str>,
#[serde(skip_serializing_if = "Option::is_none")]
example: Option<&'a str>,
#[serde(skip_serializing_if = "Option::is_none")]
source: Option<&'a str>,
values: HashMap<&'a str, &'a str>,
}
let instance_name = file.path
.file_stem().expect("Could not extract file stem")
.to_str().expect("Could not convert path to UTF-8");
let entries: Vec<LocalizationEntryJson> = csv::Reader::from_reader(file.contents.as_slice())
.deserialize()
// TODO: Propagate error upward instead of panicking
.map(|result| result.expect("Malformed localization table found!"))
.map(LocalizationEntryCsv::to_json)
.collect();
// Normally, we'd be able to let the csv crate construct our struct for us.
//
// However, because of a limitation with Serde's 'flatten' feature, it's not
// possible presently to losslessly collect extra string values while using
// csv+Serde.
//
// https://github.com/BurntSushi/rust-csv/issues/151
let mut reader = csv::Reader::from_reader(file.contents.as_slice());
let headers = reader.headers()
.map_err(|inner| SnapshotError::CsvDecodeError {
inner,
path: file.path.to_path_buf(),
})?
.clone();
let mut records = Vec::new();
for record in reader.into_records() {
let record = record
.map_err(|inner| SnapshotError::CsvDecodeError {
inner,
path: file.path.to_path_buf(),
})?;
records.push(record);
}
let mut entries = Vec::new();
for record in &records {
let mut entry = LocalizationEntry::default();
for (header, value) in headers.iter().zip(record.into_iter()) {
if header.is_empty() || value.is_empty() {
continue;
}
match header {
"Key" => entry.key = Some(value),
"Source" => entry.source = Some(value),
"Context" => entry.context = Some(value),
"Example" => entry.example = Some(value),
_ => {
entry.values.insert(header, value);
}
}
}
if entry.key.is_none() && entry.source.is_none() {
continue;
}
entries.push(entry);
}
let table_contents = serde_json::to_string(&entries)
.expect("Could not encode JSON for localization table");
@@ -422,39 +587,6 @@ fn snapshot_csv_file<'source>(
}))
}
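The manual header/record zipping above can be sketched in isolation. This is a simplified stand-in that takes pre-split fields (ignoring CSV quoting, which the real `csv` crate handles) to show how known headers are routed to struct fields while unknown headers are collected as extra locale values; the `Entry` type here is a hypothetical stand-in for `LocalizationEntry`:

```rust
use std::collections::HashMap;

#[derive(Debug, Default)]
struct Entry<'a> {
    key: Option<&'a str>,
    source: Option<&'a str>,
    values: HashMap<&'a str, &'a str>,
}

// Zip one record's fields against the header row, routing known
// columns to struct fields and unknown columns into `values`.
fn parse_row<'a>(headers: &[&'a str], record: &[&'a str]) -> Entry<'a> {
    let mut entry = Entry::default();
    for (&header, &value) in headers.iter().zip(record.iter()) {
        if header.is_empty() || value.is_empty() {
            // Skip empty columns and cells, like the real code does.
            continue;
        }
        match header {
            "Key" => entry.key = Some(value),
            "Source" => entry.source = Some(value),
            other => {
                entry.values.insert(other, value);
            }
        }
    }
    entry
}

fn main() {
    let headers = ["Key", "Source", "es"];
    let record = ["greeting", "Hello", "Hola"];
    let entry = parse_row(&headers, &record);
    assert_eq!(entry.key, Some("greeting"));
    assert_eq!(entry.source, Some("Hello"));
    assert_eq!(entry.values.get("es"), Some(&"Hola"));
}
```

Because the extra columns are matched by header position rather than by Serde's `flatten`, no string data is lost for unknown locales.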
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "PascalCase")]
struct LocalizationEntryCsv {
key: String,
context: String,
example: String,
source: String,
#[serde(flatten)]
values: HashMap<String, String>,
}
impl LocalizationEntryCsv {
fn to_json(self) -> LocalizationEntryJson {
LocalizationEntryJson {
key: self.key,
context: self.context,
example: self.example,
source: self.source,
values: self.values,
}
}
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
struct LocalizationEntryJson {
key: String,
context: String,
example: String,
source: String,
values: HashMap<String, String>,
}
fn snapshot_json_model_file<'source>(
file: &'source ImfsFile,
) -> SnapshotResult<'source> {


@@ -10,7 +10,7 @@ use std::{
str,
};
use rbx_tree::{RbxTree, RbxId, RbxInstanceProperties, RbxValue};
use rbx_dom_weak::{RbxTree, RbxId, RbxInstanceProperties, RbxValue};
use serde_derive::{Serialize, Deserialize};
use crate::{
@@ -64,7 +64,7 @@ impl InstanceChanges {
/// A lightweight, hierarchical representation of an instance that can be
/// applied to the tree.
#[derive(Debug, PartialEq, Serialize, Deserialize)]
#[derive(Debug, Clone, Default, PartialEq, Serialize, Deserialize)]
pub struct RbxSnapshotInstance<'a> {
pub name: Cow<'a, str>,
pub class_name: Cow<'a, str>,
@@ -73,6 +73,22 @@ pub struct RbxSnapshotInstance<'a> {
pub metadata: MetadataPerInstance,
}
impl<'a> RbxSnapshotInstance<'a> {
pub fn get_owned(&'a self) -> RbxSnapshotInstance<'static> {
let children: Vec<RbxSnapshotInstance<'static>> = self.children.iter()
.map(RbxSnapshotInstance::get_owned)
.collect();
RbxSnapshotInstance {
name: Cow::Owned(self.name.clone().into_owned()),
class_name: Cow::Owned(self.class_name.clone().into_owned()),
properties: self.properties.clone(),
children,
metadata: self.metadata.clone(),
}
}
}
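The `get_owned` conversion above relies on `Cow::into_owned` to detach a borrowed snapshot from its source lifetime, recursing through the children. A minimal standalone sketch of the same pattern, using a hypothetical `Node` type rather than the real `RbxSnapshotInstance`:

```rust
use std::borrow::Cow;

#[derive(Debug, Clone, PartialEq)]
struct Node<'a> {
    name: Cow<'a, str>,
    children: Vec<Node<'a>>,
}

impl<'a> Node<'a> {
    // Recursively clone every Cow into its owned form, producing a
    // tree with the 'static lifetime that can outlive its source.
    fn get_owned(&self) -> Node<'static> {
        Node {
            name: Cow::Owned(self.name.clone().into_owned()),
            children: self.children.iter().map(Node::get_owned).collect(),
        }
    }
}

fn main() {
    let owned: Node<'static> = {
        let source = String::from("root");
        let borrowed = Node {
            name: Cow::Borrowed(&source),
            children: Vec::new(),
        };
        borrowed.get_owned()
        // `source` and `borrowed` drop here; `owned` survives them.
    };
    assert_eq!(owned.name, "root");
    assert!(owned.children.is_empty());
}
```

The `'static` return type is what lets the snapshot be moved into the Lua plugin callback without tying it to the in-memory filesystem's borrow.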
impl<'a> PartialOrd for RbxSnapshotInstance<'a> {
fn partial_cmp(&self, other: &RbxSnapshotInstance) -> Option<Ordering> {
Some(self.name.cmp(&other.name)
@@ -137,7 +153,7 @@ pub fn reify_subtree(
instance_per_path: &mut PathMap<HashSet<RbxId>>,
metadata_per_instance: &mut HashMap<RbxId, MetadataPerInstance>,
changes: &mut InstanceChanges,
) {
) -> RbxId {
let instance = reify_core(snapshot);
let id = tree.insert_instance(instance, parent_id);
@@ -148,6 +164,8 @@ pub fn reify_subtree(
for child in &snapshot.children {
reify_subtree(child, tree, id, instance_per_path, metadata_per_instance, changes);
}
id
}
fn reify_metadata(
@@ -206,6 +224,9 @@ fn reify_core(snapshot: &RbxSnapshotInstance) -> RbxInstanceProperties {
instance
}
/// Updates the given instance to match the properties defined on the snapshot.
///
/// Returns whether any changes were applied.
fn reconcile_instance_properties(instance: &mut RbxInstanceProperties, snapshot: &RbxSnapshotInstance) -> bool {
let mut has_diffs = false;
@@ -263,6 +284,8 @@ fn reconcile_instance_properties(instance: &mut RbxInstanceProperties, snapshot:
has_diffs
}
/// Updates the children of the instance in the `RbxTree` to match the children
/// of the `RbxSnapshotInstance`. Order will be updated to match.
fn reconcile_instance_children(
tree: &mut RbxTree,
id: RbxId,
@@ -271,12 +294,21 @@ fn reconcile_instance_children(
metadata_per_instance: &mut HashMap<RbxId, MetadataPerInstance>,
changes: &mut InstanceChanges,
) {
let mut visited_snapshot_indices = HashSet::new();
let mut children_to_update: Vec<(RbxId, &RbxSnapshotInstance)> = Vec::new();
let mut children_to_add: Vec<&RbxSnapshotInstance> = Vec::new();
// These lists are kept so that we can apply all the changes we figure out
// in one batch after the diffing below is finished.
let mut children_to_maybe_update: Vec<(RbxId, &RbxSnapshotInstance)> = Vec::new();
let mut children_to_add: Vec<(usize, &RbxSnapshotInstance)> = Vec::new();
let mut children_to_remove: Vec<RbxId> = Vec::new();
// This map is used once we're done mutating children to sort them according
// to the order specified in the snapshot. Without it, a snapshot with a new
// child prepended would cause the RbxTree instance to have out-of-order
// children and make Rojo non-deterministic.
let mut ids_to_snapshot_indices = HashMap::new();
// Since we have to enumerate the children of both the RbxTree instance and
// our snapshot, we keep track of which snapshot children we've already matched.
let mut visited_snapshot_indices = vec![false; snapshot.children.len()];
let children_ids = tree.get_instance(id).unwrap().get_children_ids();
// Find all instances that were removed or updated, which we derive by
@@ -287,7 +319,7 @@ fn reconcile_instance_children(
// Locate a matching snapshot for this instance
let mut matching_snapshot = None;
for (snapshot_index, child_snapshot) in snapshot.children.iter().enumerate() {
if visited_snapshot_indices.contains(&snapshot_index) {
if visited_snapshot_indices[snapshot_index] {
continue;
}
@@ -295,7 +327,8 @@ fn reconcile_instance_children(
// similar. This heuristic is similar to React's reconciliation
// strategy.
if child_snapshot.name == child_instance.name {
visited_snapshot_indices.insert(snapshot_index);
ids_to_snapshot_indices.insert(child_id, snapshot_index);
visited_snapshot_indices[snapshot_index] = true;
matching_snapshot = Some(child_snapshot);
break;
}
@@ -303,26 +336,23 @@ fn reconcile_instance_children(
match matching_snapshot {
Some(child_snapshot) => {
children_to_update.push((child_instance.get_id(), child_snapshot));
},
children_to_maybe_update.push((child_instance.get_id(), child_snapshot));
}
None => {
children_to_remove.push(child_instance.get_id());
},
}
}
}
// Find all instances that were added, which is just the snapshots we didn't
// match up to existing instances above.
for (snapshot_index, child_snapshot) in snapshot.children.iter().enumerate() {
if !visited_snapshot_indices.contains(&snapshot_index) {
children_to_add.push(child_snapshot);
if !visited_snapshot_indices[snapshot_index] {
children_to_add.push((snapshot_index, child_snapshot));
}
}
for child_snapshot in &children_to_add {
reify_subtree(child_snapshot, tree, id, instance_per_path, metadata_per_instance, changes);
}
// Apply all of our removals we gathered from our diff
for child_id in &children_to_remove {
if let Some(subtree) = tree.remove_instance(*child_id) {
for id in subtree.iter_all_ids() {
@@ -332,7 +362,18 @@ fn reconcile_instance_children(
}
}
for (child_id, child_snapshot) in &children_to_update {
// Apply all of our children additions
for (snapshot_index, child_snapshot) in &children_to_add {
let id = reify_subtree(child_snapshot, tree, id, instance_per_path, metadata_per_instance, changes);
ids_to_snapshot_indices.insert(id, *snapshot_index);
}
// Apply any child updates we gathered from the diff
for (child_id, child_snapshot) in &children_to_maybe_update {
reconcile_subtree(tree, *child_id, child_snapshot, instance_per_path, metadata_per_instance, changes);
}
// Apply the sort mapping defined by ids_to_snapshot_indices above
let instance = tree.get_instance_mut(id).unwrap();
instance.sort_children_unstable_by_key(|id| ids_to_snapshot_indices.get(&id).unwrap());
}
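The matching and reordering logic above can be sketched independently: match existing children to snapshot children by name, first-unvisited-wins (the React-like heuristic noted in the comments), treat unmatched snapshot children as additions and unmatched existing children as removals, then sort survivors by snapshot index so the output order is deterministic. This sketch keys by name where the real code keys by instance ID, and uses plain strings instead of tree nodes:

```rust
use std::collections::HashMap;

// Return the reconciled child list: matched and added children,
// ordered by their index in `snapshot`; unmatched existing children
// are dropped, mirroring the removal pass in the real code.
fn reconcile_order(existing: &[&str], snapshot: &[&str]) -> Vec<String> {
    let mut visited = vec![false; snapshot.len()];
    let mut index_of: HashMap<String, usize> = HashMap::new();

    // Match each existing child to the first unvisited snapshot child
    // with the same name.
    for &child in existing {
        for (i, &snap) in snapshot.iter().enumerate() {
            if !visited[i] && child == snap {
                visited[i] = true;
                index_of.insert(child.to_string(), i);
                break;
            }
        }
    }

    // Unmatched snapshot children are additions; they keep their
    // snapshot index too, so sorting interleaves them correctly.
    for (i, &snap) in snapshot.iter().enumerate() {
        if !visited[i] {
            index_of.insert(snap.to_string(), i);
        }
    }

    let mut children: Vec<String> = index_of.keys().cloned().collect();
    children.sort_unstable_by_key(|name| index_of[name]);
    children
}

fn main() {
    let ordered = reconcile_order(&["B", "A", "Gone"], &["A", "B", "New"]);
    assert_eq!(ordered, vec!["A", "B", "New"]);
}
```

Note how a child prepended in the snapshot ("New" at the front would get index 0) ends up first in the result, which is exactly the determinism property the `ids_to_snapshot_indices` map exists to guarantee.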


@@ -1,4 +1,5 @@
use std::{
collections::HashMap,
fmt,
io::Write,
path::Path,
@@ -6,12 +7,13 @@ use std::{
};
use log::warn;
use rbx_tree::RbxId;
use rbx_dom_weak::{RbxTree, RbxId};
use crate::{
imfs::{Imfs, ImfsItem},
rbx_session::RbxSession,
web::PublicInstanceMetadata,
web::api::PublicInstanceMetadata,
rbx_session::MetadataPerInstance,
};
static GRAPHVIZ_HEADER: &str = r#"
@@ -53,42 +55,59 @@ pub fn graphviz_to_svg(source: &str) -> Option<String> {
Some(String::from_utf8(output.stdout).expect("Failed to parse stdout as UTF-8"))
}
pub struct VisualizeRbxTree<'a, 'b> {
pub tree: &'a RbxTree,
pub metadata: &'b HashMap<RbxId, MetadataPerInstance>,
}
impl<'a, 'b> fmt::Display for VisualizeRbxTree<'a, 'b> {
fn fmt(&self, output: &mut fmt::Formatter) -> fmt::Result {
writeln!(output, "{}", GRAPHVIZ_HEADER)?;
visualize_instance(&self.tree, self.tree.get_root_id(), &self.metadata, output)?;
writeln!(output, "}}")
}
}
/// A Display wrapper struct to visualize an RbxSession as SVG.
pub struct VisualizeRbxSession<'a>(pub &'a RbxSession);
impl<'a> fmt::Display for VisualizeRbxSession<'a> {
fn fmt(&self, output: &mut fmt::Formatter) -> fmt::Result {
writeln!(output, "{}", GRAPHVIZ_HEADER)?;
visualize_rbx_node(self.0, self.0.get_tree().get_root_id(), output)?;
writeln!(output, "}}")?;
Ok(())
writeln!(output, "{}", VisualizeRbxTree {
tree: self.0.get_tree(),
metadata: self.0.get_all_instance_metadata(),
})
}
}
fn visualize_rbx_node(session: &RbxSession, id: RbxId, output: &mut fmt::Formatter) -> fmt::Result {
let node = session.get_tree().get_instance(id).unwrap();
fn visualize_instance(
tree: &RbxTree,
id: RbxId,
metadata: &HashMap<RbxId, MetadataPerInstance>,
output: &mut fmt::Formatter,
) -> fmt::Result {
let instance = tree.get_instance(id).unwrap();
let mut node_label = format!("{}|{}|{}", node.name, node.class_name, id);
let mut instance_label = format!("{}|{}|{}", instance.name, instance.class_name, id);
if let Some(session_metadata) = session.get_instance_metadata(id) {
if let Some(session_metadata) = metadata.get(&id) {
let metadata = PublicInstanceMetadata::from_session_metadata(session_metadata);
node_label.push('|');
node_label.push_str(&serde_json::to_string(&metadata).unwrap());
instance_label.push('|');
instance_label.push_str(&serde_json::to_string(&metadata).unwrap());
}
node_label = node_label
instance_label = instance_label
.replace("\"", "&quot;")
.replace("{", "\\{")
.replace("}", "\\}");
writeln!(output, " \"{}\" [label=\"{}\"]", id, node_label)?;
writeln!(output, " \"{}\" [label=\"{}\"]", id, instance_label)?;
for &child_id in node.get_children_ids() {
for &child_id in instance.get_children_ids() {
writeln!(output, " \"{}\" -> \"{}\"", id, child_id)?;
visualize_rbx_node(session, child_id, output)?;
visualize_instance(tree, child_id, metadata, output)?;
}
Ok(())


@@ -1,32 +1,36 @@
//! Defines Rojo's web interface that all clients use to communicate with a
//! running live-sync session.
//! Defines Rojo's HTTP API, all under /api. These endpoints generally return
//! JSON.
use std::{
borrow::Cow,
collections::{HashMap, HashSet},
sync::{mpsc, Arc},
sync::Arc,
};
use serde_derive::{Serialize, Deserialize};
use log::trace;
use rouille::{
self,
router,
use futures::{
future::{self, IntoFuture},
Future,
sync::oneshot,
};
use hyper::{
service::Service,
header,
StatusCode,
Method,
Body,
Request,
Response,
};
use rbx_tree::{RbxId, RbxInstance};
use serde_derive::{Serialize, Deserialize};
use rbx_dom_weak::{RbxId, RbxInstance};
use crate::{
live_session::LiveSession,
session_id::SessionId,
snapshot_reconciler::InstanceChanges,
visualize::{VisualizeRbxSession, VisualizeImfs, graphviz_to_svg},
rbx_session::{MetadataPerInstance},
};
static HOME_CONTENT: &str = include_str!("../assets/index.html");
/// Contains the instance metadata relevant to Rojo clients.
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all = "camelCase")]
@@ -43,7 +47,7 @@ impl PublicInstanceMetadata {
}
/// Used to attach metadata specific to Rojo to instances, which come from the
/// rbx_tree crate.
/// rbx_dom_weak crate.
///
/// Both fields are wrapped in Cow in order to make owned-vs-borrowed simpler
/// for tests.
@@ -82,121 +86,133 @@ pub struct SubscribeResponse<'a> {
pub messages: Cow<'a, [InstanceChanges]>,
}
pub struct Server {
fn response_json<T: serde::Serialize>(value: T) -> Response<Body> {
let serialized = match serde_json::to_string(&value) {
Ok(v) => v,
Err(err) => {
return Response::builder()
.status(StatusCode::BAD_REQUEST)
.header(header::CONTENT_TYPE, "text/plain")
.body(Body::from(err.to_string()))
.unwrap();
},
};
Response::builder()
.header(header::CONTENT_TYPE, "application/json")
.body(Body::from(serialized))
.unwrap()
}
pub struct ApiService {
live_session: Arc<LiveSession>,
server_version: &'static str,
}
impl Server {
pub fn new(live_session: Arc<LiveSession>) -> Server {
Server {
impl Service for ApiService {
type ReqBody = Body;
type ResBody = Body;
type Error = hyper::Error;
type Future = Box<dyn Future<Item = hyper::Response<Self::ResBody>, Error = Self::Error> + Send>;
fn call(&mut self, request: hyper::Request<Self::ReqBody>) -> Self::Future {
let response = match (request.method(), request.uri().path()) {
(&Method::GET, "/api/rojo") => self.handle_api_rojo(),
(&Method::GET, path) if path.starts_with("/api/read/") => self.handle_api_read(request),
(&Method::GET, path) if path.starts_with("/api/subscribe/") => {
return self.handle_api_subscribe(request);
}
_ => {
Response::builder()
.status(StatusCode::NOT_FOUND)
.body(Body::empty())
.unwrap()
}
};
Box::new(future::ok(response))
}
}
impl ApiService {
pub fn new(live_session: Arc<LiveSession>) -> ApiService {
ApiService {
live_session,
server_version: env!("CARGO_PKG_VERSION"),
}
}
#[allow(unreachable_code)]
pub fn handle_request(&self, request: &Request) -> Response {
trace!("Request {} {}", request.method(), request.url());
router!(request,
(GET) (/) => {
self.handle_home()
},
(GET) (/api/rojo) => {
self.handle_api_rojo()
},
(GET) (/api/subscribe/{ cursor: u32 }) => {
self.handle_api_subscribe(cursor)
},
(GET) (/api/read/{ id_list: String }) => {
let requested_ids: Option<Vec<RbxId>> = id_list
.split(',')
.map(RbxId::parse_str)
.collect();
self.handle_api_read(requested_ids)
},
(GET) (/visualize/rbx) => {
self.handle_visualize_rbx()
},
(GET) (/visualize/imfs) => {
self.handle_visualize_imfs()
},
_ => Response::empty_404()
)
}
pub fn listen(self, port: u16) {
let address = format!("0.0.0.0:{}", port);
rouille::start_server(address, move |request| self.handle_request(request));
}
fn handle_home(&self) -> Response {
Response::html(HOME_CONTENT)
}
/// Get a summary of information about the server
fn handle_api_rojo(&self) -> Response {
fn handle_api_rojo(&self) -> Response<Body> {
let rbx_session = self.live_session.rbx_session.lock().unwrap();
let tree = rbx_session.get_tree();
Response::json(&ServerInfoResponse {
response_json(&ServerInfoResponse {
server_version: self.server_version,
protocol_version: 2,
session_id: self.live_session.session_id,
expected_place_ids: self.live_session.project.serve_place_ids.clone(),
session_id: self.live_session.session_id(),
expected_place_ids: self.live_session.serve_place_ids().clone(),
root_instance_id: tree.get_root_id(),
})
}
/// Retrieve any messages past the given cursor index, and if
/// there weren't any, subscribe to receive any new messages.
fn handle_api_subscribe(&self, cursor: u32) -> Response {
fn handle_api_subscribe(&self, request: Request<Body>) -> <ApiService as Service>::Future {
let argument = &request.uri().path()["/api/subscribe/".len()..];
let cursor: u32 = match argument.parse() {
Ok(v) => v,
Err(err) => {
return Box::new(future::ok(Response::builder()
.status(StatusCode::BAD_REQUEST)
.header(header::CONTENT_TYPE, "text/plain")
.body(Body::from(err.to_string()))
.unwrap()));
},
};
let message_queue = Arc::clone(&self.live_session.message_queue);
let session_id = self.live_session.session_id();
// Did the client miss any messages since the last subscribe?
{
let (new_cursor, new_messages) = message_queue.get_messages_since(cursor);
let (tx, rx) = oneshot::channel();
message_queue.subscribe(cursor, tx);
if !new_messages.is_empty() {
return Response::json(&SubscribeResponse {
session_id: self.live_session.session_id,
messages: Cow::Borrowed(&new_messages),
let result = rx.into_future()
.and_then(move |(new_cursor, new_messages)| {
Box::new(future::ok(response_json(SubscribeResponse {
session_id,
messages: Cow::Owned(new_messages),
message_cursor: new_cursor,
})
}
}
let (tx, rx) = mpsc::channel();
let sender_id = message_queue.subscribe(tx);
match rx.recv() {
Ok(_) => (),
Err(_) => return Response::text("error!").with_status_code(500),
}
message_queue.unsubscribe(sender_id);
{
let (new_cursor, new_messages) = message_queue.get_messages_since(cursor);
return Response::json(&SubscribeResponse {
session_id: self.live_session.session_id,
messages: Cow::Owned(new_messages),
message_cursor: new_cursor,
})))
})
}
.or_else(|e| {
Box::new(future::ok(Response::builder()
.status(500)
.body(Body::from(format!("Internal Error: {:?}", e)))
.unwrap()))
});
Box::new(result)
}
fn handle_api_read(&self, requested_ids: Option<Vec<RbxId>>) -> Response {
fn handle_api_read(&self, request: Request<Body>) -> Response<Body> {
let argument = &request.uri().path()["/api/read/".len()..];
let requested_ids: Option<Vec<RbxId>> = argument
.split(',')
.map(RbxId::parse_str)
.collect();
let message_queue = Arc::clone(&self.live_session.message_queue);
let requested_ids = match requested_ids {
Some(id) => id,
None => return rouille::Response::text("Malformed ID list").with_status_code(400),
None => {
return Response::builder()
.status(StatusCode::BAD_REQUEST)
.header(header::CONTENT_TYPE, "text/plain")
.body(Body::from("Malformed ID list"))
.unwrap();
},
};
let rbx_session = self.live_session.rbx_session.lock().unwrap();
@@ -228,30 +244,10 @@ impl Server {
}
}
Response::json(&ReadResponse {
session_id: self.live_session.session_id,
response_json(&ReadResponse {
session_id: self.live_session.session_id(),
message_cursor,
instances,
})
}
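The ID-list parsing above leans on a collect trick: collecting an iterator of `Option<T>` into `Option<Vec<T>>` yields `None` if any element failed to parse, which is what makes the single "Malformed ID list" check sufficient. A minimal stdlib sketch, with plain `u32` parsing standing in for `RbxId::parse_str`:

```rust
// Parse a comma-separated list; any single failure makes the whole
// result None, mirroring the malformed-ID-list branch in the handler.
fn parse_id_list(argument: &str) -> Option<Vec<u32>> {
    argument
        .split(',')
        .map(|part| part.parse::<u32>().ok())
        .collect()
}

fn main() {
    assert_eq!(parse_id_list("1,2,3"), Some(vec![1, 2, 3]));
    assert_eq!(parse_id_list("1,x,3"), None);
}
```

This works because `Option` implements `FromIterator`, short-circuiting on the first `None`.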
fn handle_visualize_rbx(&self) -> Response {
let rbx_session = self.live_session.rbx_session.lock().unwrap();
let dot_source = format!("{}", VisualizeRbxSession(&rbx_session));
match graphviz_to_svg(&dot_source) {
Some(svg) => Response::svg(svg),
None => Response::text(dot_source),
}
}
fn handle_visualize_imfs(&self) -> Response {
let imfs = self.live_session.imfs.lock().unwrap();
let dot_source = format!("{}", VisualizeImfs(&imfs));
match graphviz_to_svg(&dot_source) {
Some(svg) => Response::svg(svg),
None => Response::text(dot_source),
}
}
}

server/src/web/interface.rs Normal file

@@ -0,0 +1,121 @@
//! Defines the HTTP-based UI. These endpoints generally return HTML and SVG.
use std::sync::Arc;
use futures::{future, Future};
use hyper::{
service::Service,
header,
Body,
Method,
StatusCode,
Request,
Response,
};
use ritz::html;
use crate::{
live_session::LiveSession,
visualize::{VisualizeRbxSession, VisualizeImfs, graphviz_to_svg},
};
static HOME_CSS: &str = include_str!("../../assets/index.css");
pub struct InterfaceService {
live_session: Arc<LiveSession>,
server_version: &'static str,
}
impl Service for InterfaceService {
type ReqBody = Body;
type ResBody = Body;
type Error = hyper::Error;
type Future = Box<dyn Future<Item = Response<Self::ReqBody>, Error = Self::Error> + Send>;
fn call(&mut self, request: Request<Self::ReqBody>) -> Self::Future {
let response = match (request.method(), request.uri().path()) {
(&Method::GET, "/") => self.handle_home(),
(&Method::GET, "/visualize/rbx") => self.handle_visualize_rbx(),
(&Method::GET, "/visualize/imfs") => self.handle_visualize_imfs(),
_ => Response::builder()
.status(StatusCode::NOT_FOUND)
.body(Body::empty())
.unwrap(),
};
Box::new(future::ok(response))
}
}
impl InterfaceService {
pub fn new(live_session: Arc<LiveSession>) -> InterfaceService {
InterfaceService {
live_session,
server_version: env!("CARGO_PKG_VERSION"),
}
}
fn handle_home(&self) -> Response<Body> {
let page = html! {
<html>
<head>
<title>"Rojo"</title>
<style>
{ ritz::UnescapedText::new(HOME_CSS) }
</style>
</head>
<body>
<div class="main">
<h1 class="title">
"Rojo Live Sync is up and running!"
</h1>
<h2 class="subtitle">
"Version " { self.server_version }
</h2>
<a class="docs" href="https://lpghatguy.github.io/rojo">
"Rojo Documentation"
</a>
</div>
</body>
</html>
};
Response::builder()
.header(header::CONTENT_TYPE, "text/html")
.body(Body::from(format!("<!DOCTYPE html>{}", page)))
.unwrap()
}
fn handle_visualize_rbx(&self) -> Response<Body> {
let rbx_session = self.live_session.rbx_session.lock().unwrap();
let dot_source = format!("{}", VisualizeRbxSession(&rbx_session));
match graphviz_to_svg(&dot_source) {
Some(svg) => Response::builder()
.header(header::CONTENT_TYPE, "image/svg+xml")
.body(Body::from(svg))
.unwrap(),
None => Response::builder()
.header(header::CONTENT_TYPE, "text/plain")
.body(Body::from(dot_source))
.unwrap(),
}
}
fn handle_visualize_imfs(&self) -> Response<Body> {
let imfs = self.live_session.imfs.lock().unwrap();
let dot_source = format!("{}", VisualizeImfs(&imfs));
match graphviz_to_svg(&dot_source) {
Some(svg) => Response::builder()
.header(header::CONTENT_TYPE, "image/svg+xml")
.body(Body::from(svg))
.unwrap(),
None => Response::builder()
.header(header::CONTENT_TYPE, "text/plain")
.body(Body::from(dot_source))
.unwrap(),
}
}
}

server/src/web/mod.rs

@@ -0,0 +1,85 @@
// TODO: This module needs to be public for visualize. We should move
// PublicInstanceMetadata and make this module private!
pub mod api;
mod interface;
use std::sync::Arc;
use log::trace;
use futures::{
future::{self, FutureResult},
Future,
};
use hyper::{
service::Service,
Body,
Request,
Response,
Server,
};
use crate::{
live_session::LiveSession,
};
use self::{
api::ApiService,
interface::InterfaceService,
};
pub struct RootService {
api: api::ApiService,
interface: interface::InterfaceService,
}
impl Service for RootService {
type ReqBody = Body;
type ResBody = Body;
type Error = hyper::Error;
type Future = Box<dyn Future<Item = Response<Self::ReqBody>, Error = Self::Error> + Send>;
fn call(&mut self, request: Request<Self::ReqBody>) -> Self::Future {
trace!("{} {}", request.method(), request.uri().path());
if request.uri().path().starts_with("/api") {
self.api.call(request)
} else {
self.interface.call(request)
}
}
}
impl RootService {
pub fn new(live_session: Arc<LiveSession>) -> RootService {
RootService {
api: ApiService::new(Arc::clone(&live_session)),
interface: InterfaceService::new(Arc::clone(&live_session)),
}
}
}
pub struct LiveServer {
live_session: Arc<LiveSession>,
}
impl LiveServer {
pub fn new(live_session: Arc<LiveSession>) -> LiveServer {
LiveServer {
live_session,
}
}
pub fn start(self, port: u16) {
let address = ([127, 0, 0, 1], port).into();
let server = Server::bind(&address)
.serve(move || {
let service: FutureResult<_, hyper::Error> =
future::ok(RootService::new(Arc::clone(&self.live_session)));
service
})
.map_err(|e| eprintln!("Server error: {}", e));
hyper::rt::run(server);
}
}


@@ -1,43 +0,0 @@
use std::io::Read;
use rouille;
use serde;
use serde_json;
static MAX_BODY_SIZE: usize = 100 * 1024 * 1024; // 100 MiB
/// Pulls text that may be JSON out of a Rouille Request object.
///
/// Doesn't do any actual parsing -- all this method does is verify the content
/// type of the request and read the request's body.
fn read_json_text(request: &rouille::Request) -> Option<String> {
// Bail out if the request body isn't marked as JSON
let content_type = request.header("Content-Type")?;
if !content_type.starts_with("application/json") {
return None;
}
let body = request.data()?;
// Allocate a buffer and read up to MAX_BODY_SIZE+1 bytes into it.
let mut out = Vec::new();
body.take(MAX_BODY_SIZE.saturating_add(1) as u64).read_to_end(&mut out).ok()?;
// If the body was too big (MAX_BODY_SIZE+1), we abort instead of trying to
// process it.
if out.len() > MAX_BODY_SIZE {
return None;
}
String::from_utf8(out).ok()
}
/// Reads the body out of a Rouille Request and attempts to turn it into JSON.
pub fn read_json<T>(request: &rouille::Request) -> Option<T>
where
T: serde::de::DeserializeOwned,
{
let body = read_json_text(&request)?;
    serde_json::from_str(&body).ok()
}
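The `take(MAX_BODY_SIZE + 1)` read above is a standard way to cap allocation while still detecting oversized bodies: reading one byte past the limit distinguishes a body that is exactly at the cap from one that exceeds it. A minimal std-only sketch of the same pattern (`MAX` and `read_capped` are illustrative names, not part of Rojo):

```rust
use std::io::Read;

const MAX: usize = 8;

/// Read at most MAX bytes; return None if the source holds more than MAX.
fn read_capped<R: Read>(source: R) -> Option<Vec<u8>> {
    let mut out = Vec::new();
    // Read up to MAX + 1 bytes so an oversized body is detectable.
    source.take(MAX as u64 + 1).read_to_end(&mut out).ok()?;
    if out.len() > MAX {
        return None; // body exceeded the cap
    }
    Some(out)
}

fn main() {
    assert_eq!(read_capped(&b"12345678"[..]), Some(b"12345678".to_vec()));
    assert_eq!(read_capped(&b"123456789"[..]), None);
    println!("ok");
}
```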


@@ -1,5 +1,5 @@
use std::{
collections::{HashMap, HashSet},
collections::{HashMap, HashSet, BTreeSet},
fs,
path::PathBuf,
};
@@ -80,7 +80,7 @@ fn base_tree() -> Result<(TempDir, Imfs, ExpectedImfs, TestResources), Error> {
expected_roots.insert(root.path().to_path_buf());
let root_item = {
let mut children = HashSet::new();
let mut children = BTreeSet::new();
children.insert(foo_path.clone());
children.insert(bar_path.clone());
@@ -91,7 +91,7 @@ fn base_tree() -> Result<(TempDir, Imfs, ExpectedImfs, TestResources), Error> {
};
let foo_item = {
let mut children = HashSet::new();
let mut children = BTreeSet::new();
children.insert(baz_path.clone());
ImfsItem::Directory(ImfsDirectory {
@@ -199,7 +199,7 @@ fn adding_folder() -> Result<(), Error> {
}
let folder_item = {
let mut children = HashSet::new();
let mut children = BTreeSet::new();
children.insert(file1_path.clone());
children.insert(file2_path.clone());


@@ -1,12 +1,12 @@
#[macro_use] extern crate lazy_static;
use std::{
collections::HashMap,
collections::{HashMap, BTreeMap},
path::{Path, PathBuf},
};
use pretty_assertions::assert_eq;
use rbx_tree::RbxValue;
use rbx_dom_weak::RbxValue;
use librojo::{
project::{Project, ProjectNode},
@@ -53,7 +53,7 @@ fn single_partition_game() {
..Default::default()
};
let mut replicated_storage_children = HashMap::new();
let mut replicated_storage_children = BTreeMap::new();
replicated_storage_children.insert("Foo".to_string(), foo);
let replicated_storage = ProjectNode {
@@ -65,7 +65,7 @@ fn single_partition_game() {
let mut http_service_properties = HashMap::new();
http_service_properties.insert("HttpEnabled".to_string(), RbxValue::Bool {
value: true,
});
}.into());
let http_service = ProjectNode {
class_name: Some(String::from("HttpService")),
@@ -73,7 +73,7 @@ fn single_partition_game() {
..Default::default()
};
let mut root_children = HashMap::new();
let mut root_children = BTreeMap::new();
root_children.insert("ReplicatedStorage".to_string(), replicated_storage);
root_children.insert("HttpService".to_string(), http_service);
@@ -86,6 +86,7 @@ fn single_partition_game() {
Project {
name: "single-sync-point".to_string(),
tree: root_node,
plugins: Vec::new(),
serve_port: None,
serve_place_ids: None,
file_location: project_location.join("default.project.json"),


@@ -0,0 +1,112 @@
mod test_util;
use std::collections::HashMap;
use pretty_assertions::assert_eq;
use rbx_dom_weak::{RbxTree, RbxInstanceProperties};
use librojo::{
snapshot_reconciler::{RbxSnapshotInstance, reconcile_subtree},
};
use test_util::tree::trees_equal;
#[test]
fn patch_communicativity() {
let base_tree = RbxTree::new(RbxInstanceProperties {
name: "DataModel".into(),
class_name: "DataModel".into(),
properties: HashMap::new(),
});
let patch_a = RbxSnapshotInstance {
name: "DataModel".into(),
class_name: "DataModel".into(),
children: vec![
RbxSnapshotInstance {
name: "Child-A".into(),
class_name: "Folder".into(),
..Default::default()
},
],
..Default::default()
};
let patch_b = RbxSnapshotInstance {
name: "DataModel".into(),
class_name: "DataModel".into(),
children: vec![
RbxSnapshotInstance {
name: "Child-B".into(),
class_name: "Folder".into(),
..Default::default()
},
],
..Default::default()
};
let patch_combined = RbxSnapshotInstance {
name: "DataModel".into(),
class_name: "DataModel".into(),
children: vec![
RbxSnapshotInstance {
name: "Child-A".into(),
class_name: "Folder".into(),
..Default::default()
},
RbxSnapshotInstance {
name: "Child-B".into(),
class_name: "Folder".into(),
..Default::default()
},
],
..Default::default()
};
let root_id = base_tree.get_root_id();
let mut tree_a = base_tree.clone();
reconcile_subtree(
&mut tree_a,
root_id,
&patch_a,
&mut Default::default(),
&mut Default::default(),
&mut Default::default(),
);
reconcile_subtree(
&mut tree_a,
root_id,
&patch_combined,
&mut Default::default(),
&mut Default::default(),
&mut Default::default(),
);
let mut tree_b = base_tree.clone();
reconcile_subtree(
&mut tree_b,
root_id,
&patch_b,
&mut Default::default(),
&mut Default::default(),
&mut Default::default(),
);
reconcile_subtree(
&mut tree_b,
root_id,
&patch_combined,
&mut Default::default(),
&mut Default::default(),
&mut Default::default(),
);
match trees_equal(&tree_a, &tree_b) {
Ok(_) => {}
Err(e) => panic!("{}", e),
}
}


@@ -0,0 +1,69 @@
mod test_util;
use std::path::Path;
use pretty_assertions::assert_eq;
use librojo::{
imfs::Imfs,
project::Project,
rbx_snapshot::{SnapshotContext, snapshot_project_tree},
};
use crate::test_util::{
snapshot::*,
};
macro_rules! generate_snapshot_tests {
($($name: ident),*) => {
$(
paste::item! {
#[test]
fn [<snapshot_ $name>]() {
let _ = env_logger::try_init();
let tests_folder = Path::new(env!("CARGO_MANIFEST_DIR")).join("../test-projects");
let project_folder = tests_folder.join(stringify!($name));
run_snapshot_test(&project_folder);
}
}
)*
};
}
generate_snapshot_tests!(
empty,
localization,
multi_partition_game,
nested_partitions,
single_partition_game,
single_partition_model,
transmute_partition
);
fn run_snapshot_test(path: &Path) {
println!("Running snapshot from project: {}", path.display());
let project = Project::load_fuzzy(path)
.expect("Couldn't load project file for snapshot test");
let mut imfs = Imfs::new();
imfs.add_roots_from_project(&project)
.expect("Could not add IMFS roots to snapshot project");
let context = SnapshotContext {
plugin_context: None,
};
let mut snapshot = snapshot_project_tree(&context, &imfs, &project)
.expect("Could not generate snapshot for snapshot test");
if let Some(snapshot) = snapshot.as_mut() {
anonymize_snapshot(path, snapshot);
}
match read_expected_snapshot(path) {
Some(expected_snapshot) => assert_eq!(snapshot, expected_snapshot),
None => write_expected_snapshot(path, &snapshot),
}
}


@@ -1,124 +0,0 @@
use std::{
fs::{self, File},
path::{Path, PathBuf},
};
use pretty_assertions::assert_eq;
use librojo::{
imfs::Imfs,
project::{Project, ProjectNode},
rbx_snapshot::snapshot_project_tree,
snapshot_reconciler::{RbxSnapshotInstance},
};
macro_rules! generate_snapshot_tests {
($($name: ident),*) => {
$(
paste::item! {
#[test]
fn [<snapshot_ $name>]() {
let tests_folder = Path::new(env!("CARGO_MANIFEST_DIR")).join("../test-projects");
let project_folder = tests_folder.join(stringify!($name));
run_snapshot_test(&project_folder);
}
}
)*
};
}
generate_snapshot_tests!(
empty,
nested_partitions,
single_partition_game,
single_partition_model,
transmute_partition
);
const SNAPSHOT_EXPECTED_NAME: &str = "expected-snapshot.json";
fn run_snapshot_test(path: &Path) {
println!("Running snapshot from project: {}", path.display());
let project = Project::load_fuzzy(path)
.expect("Couldn't load project file for snapshot test");
let mut imfs = Imfs::new();
imfs.add_roots_from_project(&project)
.expect("Could not add IMFS roots to snapshot project");
let mut snapshot = snapshot_project_tree(&imfs, &project)
.expect("Could not generate snapshot for snapshot test");
if let Some(snapshot) = snapshot.as_mut() {
anonymize_snapshot(path, snapshot);
}
match read_expected_snapshot(path) {
Some(expected_snapshot) => assert_eq!(snapshot, expected_snapshot),
None => write_expected_snapshot(path, &snapshot),
}
}
/// Snapshots contain absolute paths, which simplifies much of Rojo.
///
/// For saving snapshots to the disk, we should strip off the project folder
/// path to make them machine-independent. This doesn't work for paths that fall
/// outside of the project folder, but that's okay here.
///
/// We also need to sort children, since Rojo tends to enumerate the filesystem
/// in an unpredictable order.
fn anonymize_snapshot(project_folder_path: &Path, snapshot: &mut RbxSnapshotInstance) {
match snapshot.metadata.source_path.as_mut() {
Some(path) => *path = anonymize_path(project_folder_path, path),
None => {},
}
match snapshot.metadata.project_definition.as_mut() {
Some((_, project_node)) => anonymize_project_node(project_folder_path, project_node),
None => {},
}
snapshot.children.sort_by(|a, b| a.partial_cmp(b).unwrap());
for child in snapshot.children.iter_mut() {
anonymize_snapshot(project_folder_path, child);
}
}
fn anonymize_project_node(project_folder_path: &Path, project_node: &mut ProjectNode) {
match project_node.path.as_mut() {
Some(path) => *path = anonymize_path(project_folder_path, path),
None => {},
}
for child_node in project_node.children.values_mut() {
anonymize_project_node(project_folder_path, child_node);
}
}
fn anonymize_path(project_folder_path: &Path, path: &Path) -> PathBuf {
if path.is_absolute() {
path.strip_prefix(project_folder_path)
.expect("Could not anonymize absolute path")
.to_path_buf()
} else {
path.to_path_buf()
}
}
fn read_expected_snapshot(path: &Path) -> Option<Option<RbxSnapshotInstance<'static>>> {
let contents = fs::read(path.join(SNAPSHOT_EXPECTED_NAME)).ok()?;
let snapshot: Option<RbxSnapshotInstance<'static>> = serde_json::from_slice(&contents)
.expect("Could not deserialize snapshot");
Some(snapshot)
}
fn write_expected_snapshot(path: &Path, snapshot: &Option<RbxSnapshotInstance>) {
let mut file = File::create(path.join(SNAPSHOT_EXPECTED_NAME))
.expect("Could not open file to write snapshot");
serde_json::to_writer_pretty(&mut file, snapshot)
.expect("Could not serialize snapshot to file");
}


@@ -1,31 +1,13 @@
#![allow(dead_code)]
use std::fs::{create_dir, copy};
use std::path::Path;
use std::io;
use rouille::Request;
use walkdir::WalkDir;
use librojo::web::Server;
pub trait HttpTestUtil {
fn get_string(&self, url: &str) -> String;
}
impl HttpTestUtil for Server {
fn get_string(&self, url: &str) -> String {
let info_request = Request::fake_http("GET", url, vec![], vec![]);
let response = self.handle_request(&info_request);
assert_eq!(response.status_code, 200);
let (mut reader, _) = response.data.into_reader_and_size();
let mut body = String::new();
reader.read_to_string(&mut body).unwrap();
body
}
}
pub mod snapshot;
pub mod tree;
pub fn copy_recursive(from: &Path, to: &Path) -> io::Result<()> {
for entry in WalkDir::new(from) {
@@ -51,4 +33,4 @@ pub fn copy_recursive(from: &Path, to: &Path) -> io::Result<()> {
}
Ok(())
}
}


@@ -0,0 +1,79 @@
use std::{
fs::{self, File},
path::{Path, PathBuf},
};
use librojo::{
project::ProjectNode,
snapshot_reconciler::RbxSnapshotInstance,
rbx_session::MetadataPerInstance,
};
const SNAPSHOT_EXPECTED_NAME: &str = "expected-snapshot.json";
/// Snapshots contain absolute paths, which simplifies much of Rojo.
///
/// For saving snapshots to the disk, we should strip off the project folder
/// path to make them machine-independent. This doesn't work for paths that fall
/// outside of the project folder, but that's okay here.
///
/// We also need to sort children, since Rojo tends to enumerate the filesystem
/// in an unpredictable order.
pub fn anonymize_snapshot(project_folder_path: &Path, snapshot: &mut RbxSnapshotInstance) {
anonymize_metadata(project_folder_path, &mut snapshot.metadata);
snapshot.children.sort_by(|a, b| a.partial_cmp(b).unwrap());
for child in snapshot.children.iter_mut() {
anonymize_snapshot(project_folder_path, child);
}
}
pub fn anonymize_metadata(project_folder_path: &Path, metadata: &mut MetadataPerInstance) {
match metadata.source_path.as_mut() {
Some(path) => *path = anonymize_path(project_folder_path, path),
None => {},
}
match metadata.project_definition.as_mut() {
Some((_, project_node)) => anonymize_project_node(project_folder_path, project_node),
None => {},
}
}
pub fn anonymize_project_node(project_folder_path: &Path, project_node: &mut ProjectNode) {
match project_node.path.as_mut() {
Some(path) => *path = anonymize_path(project_folder_path, path),
None => {},
}
for child_node in project_node.children.values_mut() {
anonymize_project_node(project_folder_path, child_node);
}
}
pub fn anonymize_path(project_folder_path: &Path, path: &Path) -> PathBuf {
if path.is_absolute() {
path.strip_prefix(project_folder_path)
.expect("Could not anonymize absolute path")
.to_path_buf()
} else {
path.to_path_buf()
}
}
pub fn read_expected_snapshot(path: &Path) -> Option<Option<RbxSnapshotInstance<'static>>> {
let contents = fs::read(path.join(SNAPSHOT_EXPECTED_NAME)).ok()?;
let snapshot: Option<RbxSnapshotInstance<'static>> = serde_json::from_slice(&contents)
.expect("Could not deserialize snapshot");
Some(snapshot)
}
pub fn write_expected_snapshot(path: &Path, snapshot: &Option<RbxSnapshotInstance>) {
let mut file = File::create(path.join(SNAPSHOT_EXPECTED_NAME))
.expect("Could not open file to write snapshot");
serde_json::to_writer_pretty(&mut file, snapshot)
.expect("Could not serialize snapshot to file");
}


@@ -0,0 +1,351 @@
//! Defines a mechanism to compare two RbxTree objects and generate a useful
//! diff if they aren't the same. These methods ignore IDs, which are randomly
//! generated whenever a tree is constructed anyway. This makes matching up
//! pairs of instances that should be the same potentially difficult.
//!
//! It relies on a couple of different ideas:
//! - Instances with the same name and class name are matched as the same
//! instance. See basic_equal for this logic.
//! - A path of period-delimited names (like Roblox's GetFullName) should be
//! enough to debug most issues. If it isn't, we can do something fun like
//! generate GraphViz graphs.
use std::{
borrow::Cow,
collections::{HashMap, HashSet},
fmt,
fs::{self, File},
hash::Hash,
path::{Path, PathBuf},
};
use log::error;
use serde_derive::{Serialize, Deserialize};
use rbx_dom_weak::{RbxId, RbxTree};
use librojo::{
rbx_session::MetadataPerInstance,
live_session::LiveSession,
visualize::{VisualizeRbxTree, graphviz_to_svg},
};
use super::snapshot::anonymize_metadata;
/// Marks a 'step' in the test, which will snapshot the session's current
/// RbxTree object and compare it against the saved snapshot if it exists.
pub fn tree_step(step: &str, live_session: &LiveSession, source_path: &Path) {
let rbx_session = live_session.rbx_session.lock().unwrap();
let tree = rbx_session.get_tree();
let project_folder = live_session.root_project().folder_location();
let metadata = rbx_session.get_all_instance_metadata()
.iter()
.map(|(key, meta)| {
let mut meta = meta.clone();
anonymize_metadata(project_folder, &mut meta);
(*key, meta)
})
.collect();
let tree_with_metadata = TreeWithMetadata {
tree: Cow::Borrowed(&tree),
metadata: Cow::Owned(metadata),
};
match read_tree_by_name(source_path, step) {
Some(expected) => match trees_and_metadata_equal(&expected, &tree_with_metadata) {
Ok(_) => {}
Err(e) => {
error!("Trees at step '{}' were not equal.\n{}", step, e);
let expected_gv = format!("{}", VisualizeRbxTree {
tree: &expected.tree,
metadata: &expected.metadata,
});
let actual_gv = format!("{}", VisualizeRbxTree {
tree: &tree_with_metadata.tree,
metadata: &tree_with_metadata.metadata,
});
let output_dir = PathBuf::from("failed-snapshots");
fs::create_dir_all(&output_dir)
.expect("Could not create failed-snapshots directory");
let expected_basename = format!("{}-{}-expected", live_session.root_project().name, step);
let actual_basename = format!("{}-{}-actual", live_session.root_project().name, step);
let mut expected_out = output_dir.join(expected_basename);
let mut actual_out = output_dir.join(actual_basename);
match (graphviz_to_svg(&expected_gv), graphviz_to_svg(&actual_gv)) {
(Some(expected_svg), Some(actual_svg)) => {
expected_out.set_extension("svg");
actual_out.set_extension("svg");
fs::write(&expected_out, expected_svg)
.expect("Couldn't write expected SVG");
fs::write(&actual_out, actual_svg)
.expect("Couldn't write actual SVG");
}
_ => {
expected_out.set_extension("gv");
actual_out.set_extension("gv");
fs::write(&expected_out, expected_gv)
.expect("Couldn't write expected GV");
fs::write(&actual_out, actual_gv)
.expect("Couldn't write actual GV");
}
}
error!("Output at {} and {}", expected_out.display(), actual_out.display());
panic!("Tree mismatch at step '{}'", step);
}
}
None => {
write_tree_by_name(source_path, step, &tree_with_metadata);
}
}
}
fn new_cow_map<K: Clone + Eq + Hash, V: Clone>() -> Cow<'static, HashMap<K, V>> {
Cow::Owned(HashMap::new())
}
#[derive(Debug, Serialize, Deserialize)]
struct TreeWithMetadata<'a> {
#[serde(flatten)]
pub tree: Cow<'a, RbxTree>,
#[serde(default = "new_cow_map")]
pub metadata: Cow<'a, HashMap<RbxId, MetadataPerInstance>>,
}
fn read_tree_by_name(path: &Path, identifier: &str) -> Option<TreeWithMetadata<'static>> {
let mut file_path = path.join(identifier);
file_path.set_extension("tree.json");
let contents = fs::read(&file_path).ok()?;
let tree: TreeWithMetadata = serde_json::from_slice(&contents)
.expect("Could not deserialize tree");
Some(tree)
}
fn write_tree_by_name(path: &Path, identifier: &str, tree: &TreeWithMetadata) {
let mut file_path = path.join(identifier);
file_path.set_extension("tree.json");
let mut file = File::create(file_path)
.expect("Could not open file to write tree");
serde_json::to_writer_pretty(&mut file, tree)
.expect("Could not serialize tree to file");
}
#[derive(Debug)]
pub struct TreeMismatch {
pub path: Cow<'static, str>,
pub detail: Cow<'static, str>,
}
impl TreeMismatch {
pub fn new<'a, A: Into<Cow<'a, str>>, B: Into<Cow<'a, str>>>(path: A, detail: B) -> TreeMismatch {
TreeMismatch {
path: Cow::Owned(path.into().into_owned()),
detail: Cow::Owned(detail.into().into_owned()),
}
}
fn add_parent(mut self, name: &str) -> TreeMismatch {
self.path.to_mut().insert(0, '.');
self.path.to_mut().insert_str(0, name);
TreeMismatch {
path: self.path,
detail: self.detail,
}
}
}
impl fmt::Display for TreeMismatch {
fn fmt(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
writeln!(formatter, "Tree mismatch at path {}", self.path)?;
writeln!(formatter, "{}", self.detail)
}
}
pub fn trees_equal(
left_tree: &RbxTree,
right_tree: &RbxTree,
) -> Result<(), TreeMismatch> {
let left = TreeWithMetadata {
tree: Cow::Borrowed(left_tree),
metadata: Cow::Owned(HashMap::new()),
};
let right = TreeWithMetadata {
tree: Cow::Borrowed(right_tree),
metadata: Cow::Owned(HashMap::new()),
};
trees_and_metadata_equal(&left, &right)
}
fn trees_and_metadata_equal(
left_tree: &TreeWithMetadata,
right_tree: &TreeWithMetadata,
) -> Result<(), TreeMismatch> {
let left_id = left_tree.tree.get_root_id();
let right_id = right_tree.tree.get_root_id();
instances_equal(left_tree, left_id, right_tree, right_id)
}
fn instances_equal(
left_tree: &TreeWithMetadata,
left_id: RbxId,
right_tree: &TreeWithMetadata,
right_id: RbxId,
) -> Result<(), TreeMismatch> {
basic_equal(left_tree, left_id, right_tree, right_id)?;
properties_equal(left_tree, left_id, right_tree, right_id)?;
children_equal(left_tree, left_id, right_tree, right_id)?;
metadata_equal(left_tree, left_id, right_tree, right_id)
}
fn basic_equal(
left_tree: &TreeWithMetadata,
left_id: RbxId,
right_tree: &TreeWithMetadata,
right_id: RbxId,
) -> Result<(), TreeMismatch> {
let left_instance = left_tree.tree.get_instance(left_id)
.expect("ID did not exist in left tree");
let right_instance = right_tree.tree.get_instance(right_id)
.expect("ID did not exist in right tree");
if left_instance.name != right_instance.name {
let message = format!("Name did not match ('{}' vs '{}')", left_instance.name, right_instance.name);
return Err(TreeMismatch::new(&left_instance.name, message));
}
if left_instance.class_name != right_instance.class_name {
let message = format!("Class name did not match ('{}' vs '{}')", left_instance.class_name, right_instance.class_name);
return Err(TreeMismatch::new(&left_instance.name, message));
}
Ok(())
}
fn properties_equal(
left_tree: &TreeWithMetadata,
left_id: RbxId,
right_tree: &TreeWithMetadata,
right_id: RbxId,
) -> Result<(), TreeMismatch> {
let left_instance = left_tree.tree.get_instance(left_id)
.expect("ID did not exist in left tree");
let right_instance = right_tree.tree.get_instance(right_id)
.expect("ID did not exist in right tree");
let mut visited = HashSet::new();
for (key, left_value) in &left_instance.properties {
visited.insert(key);
let right_value = right_instance.properties.get(key);
if Some(left_value) != right_value {
let message = format!(
"Property {}:\n\tLeft: {:?}\n\tRight: {:?}",
key,
Some(left_value),
right_value,
);
return Err(TreeMismatch::new(&left_instance.name, message));
}
}
for (key, right_value) in &right_instance.properties {
if visited.contains(key) {
continue;
}
let left_value = left_instance.properties.get(key);
if left_value != Some(right_value) {
let message = format!(
"Property {}:\n\tLeft: {:?}\n\tRight: {:?}",
key,
left_value,
Some(right_value),
);
return Err(TreeMismatch::new(&left_instance.name, message));
}
}
Ok(())
}
fn children_equal(
left_tree: &TreeWithMetadata,
left_id: RbxId,
right_tree: &TreeWithMetadata,
right_id: RbxId,
) -> Result<(), TreeMismatch> {
let left_instance = left_tree.tree.get_instance(left_id)
.expect("ID did not exist in left tree");
let right_instance = right_tree.tree.get_instance(right_id)
.expect("ID did not exist in right tree");
let left_children = left_instance.get_children_ids();
let right_children = right_instance.get_children_ids();
if left_children.len() != right_children.len() {
return Err(TreeMismatch::new(&left_instance.name, "Instances had different numbers of children"));
}
for (left_child_id, right_child_id) in left_children.iter().zip(right_children) {
instances_equal(left_tree, *left_child_id, right_tree, *right_child_id)
.map_err(|e| e.add_parent(&left_instance.name))?;
}
Ok(())
}
fn metadata_equal(
left_tree: &TreeWithMetadata,
left_id: RbxId,
right_tree: &TreeWithMetadata,
right_id: RbxId,
) -> Result<(), TreeMismatch> {
let left_meta = left_tree.metadata.get(&left_id);
let right_meta = right_tree.metadata.get(&right_id);
if left_meta != right_meta {
let left_instance = left_tree.tree.get_instance(left_id)
.expect("Left instance didn't exist in tree");
let message = format!(
"Metadata mismatch:\n\tLeft: {:?}\n\tRight: {:?}",
left_meta,
right_meta,
);
return Err(TreeMismatch::new(&left_instance.name, message));
}
Ok(())
}
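The period-delimited error paths described in the module doc are built bottom-up by `TreeMismatch::add_parent` as a mismatch propagates out of the recursive comparison. A std-only sketch of that mechanism (`prepend_ancestor` is an illustrative helper, not the crate's API):

```rust
/// Mirrors how TreeMismatch::add_parent builds a GetFullName-style,
/// period-delimited path while a mismatch unwinds out of recursion.
fn prepend_ancestor(mut path: String, name: &str) -> String {
    path.insert(0, '.');
    path.insert_str(0, name);
    path
}

fn main() {
    let mut path = String::from("Part");
    // Unwinding upward from the mismatch site: nearest ancestor first.
    for ancestor in ["Workspace", "DataModel"] {
        path = prepend_ancestor(path, ancestor);
    }
    assert_eq!(path, "DataModel.Workspace.Part");
    println!("{}", path);
}
```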


@@ -0,0 +1,68 @@
mod test_util;
use std::{
fs,
path::{Path, PathBuf},
sync::Arc,
thread,
time::Duration,
};
use tempfile::{tempdir, TempDir};
use librojo::{
live_session::LiveSession,
project::Project,
};
use crate::test_util::{
copy_recursive,
tree::tree_step,
};
#[test]
fn multi_partition_game() {
let _ = env_logger::try_init();
let source_path = project_path("multi_partition_game");
let (dir, live_session) = start_session(&source_path);
tree_step("initial", &live_session, &source_path);
let added_path = dir.path().join("a/added");
fs::create_dir_all(&added_path)
.expect("Couldn't create directory");
thread::sleep(Duration::from_millis(250));
tree_step("with_dir", &live_session, &source_path);
let moved_path = dir.path().join("b/added");
fs::rename(&added_path, &moved_path)
.expect("Couldn't rename directory");
thread::sleep(Duration::from_millis(250));
tree_step("with_moved_dir", &live_session, &source_path);
}
/// Find the path to the given test project relative to the manifest.
fn project_path(name: &str) -> PathBuf {
let mut path = Path::new(env!("CARGO_MANIFEST_DIR")).join("../test-projects");
path.push(name);
path
}
/// Starts a new LiveSession for the project located at the given file path.
fn start_session(source_path: &Path) -> (TempDir, LiveSession) {
let dir = tempdir()
.expect("Couldn't create temporary directory");
copy_recursive(&source_path, dir.path())
.expect("Couldn't copy project to temporary directory");
let project = Arc::new(Project::load_fuzzy(dir.path())
.expect("Couldn't load project from temp directory"));
let live_session = LiveSession::new(Arc::clone(&project))
.expect("Couldn't start live session");
(dir, live_session)
}


@@ -0,0 +1,6 @@
{
"name": "localization",
"tree": {
"$path": "src"
}
}


@@ -0,0 +1,69 @@
{
"name": "localization",
"class_name": "Folder",
"properties": {},
"children": [
{
"name": "empty-column-bug-147",
"class_name": "LocalizationTable",
"properties": {
"Contents": {
"Type": "String",
"Value": "[{\"key\":\"Language.Name\",\"source\":\"English\",\"values\":{}},{\"key\":\"Language.Region\",\"source\":\"United States\",\"values\":{}},{\"key\":\"Label.Thickness\",\"source\":\"Thickness\",\"values\":{}},{\"key\":\"Label.Opacity\",\"source\":\"Opacity\",\"values\":{}},{\"key\":\"Toolbar.Undo\",\"source\":\"Undo\",\"values\":{}},{\"key\":\"Toolbar.Redo\",\"source\":\"Redo\",\"values\":{}},{\"key\":\"Toolbar.Camera\",\"source\":\"Top-down camera\",\"values\":{}},{\"key\":\"Toolbar.Saves\",\"source\":\"Saved drawings\",\"values\":{}},{\"key\":\"Toolbar.Preferences\",\"source\":\"Settings\",\"values\":{}},{\"key\":\"Toolbar.Mode.Vector\",\"source\":\"Vector mode\",\"values\":{}},{\"key\":\"Toolbar.Mode.Pixel\",\"source\":\"Pixel mode\",\"values\":{}}]"
}
},
"children": [],
"metadata": {
"ignore_unknown_instances": false,
"source_path": "src/empty-column-bug-147.csv",
"project_definition": null
}
},
{
"name": "integers-bug-145",
"class_name": "LocalizationTable",
"properties": {
"Contents": {
"Type": "String",
"Value": "[{\"key\":\"Count\",\"example\":\"A number demonstrating issue 145\",\"source\":\"3\",\"values\":{\"es\":\"7\"}}]"
}
},
"children": [],
"metadata": {
"ignore_unknown_instances": false,
"source_path": "src/integers-bug-145.csv",
"project_definition": null
}
},
{
"name": "normal",
"class_name": "LocalizationTable",
"properties": {
"Contents": {
"Type": "String",
"Value": "[{\"key\":\"Ack\",\"example\":\"An exclamation of despair\",\"source\":\"Ack!\",\"values\":{\"es\":\"¡Ay!\"}}]"
}
},
"children": [],
"metadata": {
"ignore_unknown_instances": false,
"source_path": "src/normal.csv",
"project_definition": null
}
}
],
"metadata": {
"ignore_unknown_instances": false,
"source_path": "src",
"project_definition": [
"localization",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "src"
}
]
}
}


@@ -0,0 +1,17 @@
,Key,Source,Context,Example
,,,,
Metadata,Language.Name,English,,
,Language.Region,United States,,
,,,,
Options,Label.Thickness,Thickness,,
,Label.Opacity,Opacity,,
,,,,
Toolbar,Toolbar.Undo,Undo,,
,Toolbar.Redo,Redo,,
,,,,
,Toolbar.Camera,Top-down camera,,
,Toolbar.Saves,Saved drawings,,
,Toolbar.Preferences,Settings,,
,,,,
,Toolbar.Mode.Vector,Vector mode,,
,Toolbar.Mode.Pixel,Pixel mode,,
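The all-empty separator rows (`,,,,`) in this fixture are what issue #147 exercises: the converter must skip them rather than emit empty localization entries. A std-only sketch of that filtering step (the real implementation parses records with the csv crate; names here are illustrative):

```rust
/// True when every cell in a CSV record is empty, as in the fixture's
/// ",,,," separator rows.
fn is_empty_row(cells: &[&str]) -> bool {
    cells.iter().all(|cell| cell.is_empty())
}

fn main() {
    let rows = [
        vec!["", "Key", "Source", "Context", "Example"],
        vec!["", "", "", "", ""],
        vec!["Metadata", "Language.Name", "English", "", ""],
    ];
    // Keep only rows that contain at least one non-empty cell.
    let kept: Vec<&Vec<&str>> = rows.iter().filter(|row| !is_empty_row(row)).collect();
    assert_eq!(kept.len(), 2);
    println!("kept {} of {} rows", kept.len(), rows.len());
}
```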


@@ -0,0 +1,2 @@
Key,Source,Context,Example,es
Count,3,,A number demonstrating issue 145,7


@@ -0,0 +1,2 @@
Key,Source,Context,Example,es
Ack,Ack!,,An exclamation of despair,¡Ay!


@@ -0,0 +1,6 @@
{
"name": "malformed-stuff",
"tree": {
"$path": "src"
}
}


@@ -0,0 +1,2 @@
ahhh this isn't a JSON model
bamboozled again


@@ -0,0 +1 @@
Hello world, from a/foo.txt


@@ -0,0 +1 @@
-- hello, from a/main.lua


@@ -0,0 +1 @@
-- b/something.lua


@@ -0,0 +1,21 @@
{
"name": "multi_partition_game",
"tree": {
"$className": "DataModel",
"ReplicatedStorage": {
"$className": "ReplicatedStorage",
"Ack": {
"$path": "a"
},
"Bar": {
"$path": "b"
}
},
"HttpService": {
"$className": "HttpService",
"$properties": {
"HttpEnabled": true
}
}
}
}


@@ -0,0 +1,212 @@
{
"name": "multi_partition_game",
"class_name": "DataModel",
"properties": {},
"children": [
{
"name": "HttpService",
"class_name": "HttpService",
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"children": [],
"metadata": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"HttpService",
{
"class_name": "HttpService",
"children": {},
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"ignore_unknown_instances": null,
"path": null
}
]
}
},
{
"name": "ReplicatedStorage",
"class_name": "ReplicatedStorage",
"properties": {},
"children": [
{
"name": "Ack",
"class_name": "Folder",
"properties": {},
"children": [
{
"name": "foo",
"class_name": "StringValue",
"properties": {
"Value": {
"Type": "String",
"Value": "Hello world, from a/foo.txt"
}
},
"children": [],
"metadata": {
"ignore_unknown_instances": false,
"source_path": "a/foo.txt",
"project_definition": null
}
},
{
"name": "main",
"class_name": "ModuleScript",
"properties": {
"Source": {
"Type": "String",
"Value": "-- hello, from a/main.lua"
}
},
"children": [],
"metadata": {
"ignore_unknown_instances": false,
"source_path": "a/main.lua",
"project_definition": null
}
}
],
"metadata": {
"ignore_unknown_instances": false,
"source_path": "a",
"project_definition": [
"Ack",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
}
]
}
},
{
"name": "Bar",
"class_name": "Folder",
"properties": {},
"children": [
{
"name": "something",
"class_name": "ModuleScript",
"properties": {
"Source": {
"Type": "String",
"Value": "-- b/something.lua"
}
},
"children": [],
"metadata": {
"ignore_unknown_instances": false,
"source_path": "b/something.lua",
"project_definition": null
}
}
],
"metadata": {
"ignore_unknown_instances": false,
"source_path": "b",
"project_definition": [
"Bar",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
]
}
}
],
"metadata": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"ReplicatedStorage",
{
"class_name": "ReplicatedStorage",
"children": {
"Bar": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
},
"Ack": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
]
}
}
],
"metadata": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"multi_partition_game",
{
"class_name": "DataModel",
"children": {
"ReplicatedStorage": {
"class_name": "ReplicatedStorage",
"children": {
"Bar": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
},
"Ack": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
},
"HttpService": {
"class_name": "HttpService",
"children": {},
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"ignore_unknown_instances": null,
"path": null
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
]
}
}


@@ -0,0 +1,242 @@
{
"instances": {
"00f207b1-fc18-4088-a45e-caf8cd98f5dd": {
"Name": "main",
"ClassName": "ModuleScript",
"Properties": {
"Source": {
"Type": "String",
"Value": "-- hello, from a/main.lua"
}
},
"Id": "00f207b1-fc18-4088-a45e-caf8cd98f5dd",
"Children": [],
"Parent": "14fed1a3-ba97-46a6-ae93-ac26bd9471df"
},
"14fed1a3-ba97-46a6-ae93-ac26bd9471df": {
"Name": "Ack",
"ClassName": "Folder",
"Properties": {},
"Id": "14fed1a3-ba97-46a6-ae93-ac26bd9471df",
"Children": [
"c55fd55c-258e-4a93-a63a-ea243038c9b9",
"00f207b1-fc18-4088-a45e-caf8cd98f5dd"
],
"Parent": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
},
"c910510c-37a8-4fd8-ae41-01169ccb739c": {
"Name": "Bar",
"ClassName": "Folder",
"Properties": {},
"Id": "c910510c-37a8-4fd8-ae41-01169ccb739c",
"Children": [
"71a95983-c856-4cf2-aee6-bd8a523e80e4"
],
"Parent": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
},
"c55fd55c-258e-4a93-a63a-ea243038c9b9": {
"Name": "foo",
"ClassName": "StringValue",
"Properties": {
"Value": {
"Type": "String",
"Value": "Hello world, from a/foo.txt"
}
},
"Id": "c55fd55c-258e-4a93-a63a-ea243038c9b9",
"Children": [],
"Parent": "14fed1a3-ba97-46a6-ae93-ac26bd9471df"
},
"71a95983-c856-4cf2-aee6-bd8a523e80e4": {
"Name": "something",
"ClassName": "ModuleScript",
"Properties": {
"Source": {
"Type": "String",
"Value": "-- b/something.lua"
}
},
"Id": "71a95983-c856-4cf2-aee6-bd8a523e80e4",
"Children": [],
"Parent": "c910510c-37a8-4fd8-ae41-01169ccb739c"
},
"3b5af13f-c997-4009-915c-0810b0e83032": {
"Name": "multi_partition_game",
"ClassName": "DataModel",
"Properties": {},
"Id": "3b5af13f-c997-4009-915c-0810b0e83032",
"Children": [
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c",
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
],
"Parent": null
},
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c": {
"Name": "HttpService",
"ClassName": "HttpService",
"Properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"Id": "bf8e2d4f-33a0-42a0-8168-1b62d6ac050c",
"Children": [],
"Parent": "3b5af13f-c997-4009-915c-0810b0e83032"
},
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b": {
"Name": "ReplicatedStorage",
"ClassName": "ReplicatedStorage",
"Properties": {},
"Id": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b",
"Children": [
"14fed1a3-ba97-46a6-ae93-ac26bd9471df",
"c910510c-37a8-4fd8-ae41-01169ccb739c"
],
"Parent": "3b5af13f-c997-4009-915c-0810b0e83032"
}
},
"root_id": "3b5af13f-c997-4009-915c-0810b0e83032",
"metadata": {
"00f207b1-fc18-4088-a45e-caf8cd98f5dd": {
"ignore_unknown_instances": false,
"source_path": "a/main.lua",
"project_definition": null
},
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"HttpService",
{
"class_name": "HttpService",
"children": {},
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"ignore_unknown_instances": null,
"path": null
}
]
},
"14fed1a3-ba97-46a6-ae93-ac26bd9471df": {
"ignore_unknown_instances": false,
"source_path": "a",
"project_definition": [
"Ack",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
}
]
},
"c55fd55c-258e-4a93-a63a-ea243038c9b9": {
"ignore_unknown_instances": false,
"source_path": "a/foo.txt",
"project_definition": null
},
"71a95983-c856-4cf2-aee6-bd8a523e80e4": {
"ignore_unknown_instances": false,
"source_path": "b/something.lua",
"project_definition": null
},
"c910510c-37a8-4fd8-ae41-01169ccb739c": {
"ignore_unknown_instances": false,
"source_path": "b",
"project_definition": [
"Bar",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
]
},
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"ReplicatedStorage",
{
"class_name": "ReplicatedStorage",
"children": {
"Ack": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
},
"Bar": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
]
},
"3b5af13f-c997-4009-915c-0810b0e83032": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"multi_partition_game",
{
"class_name": "DataModel",
"children": {
"HttpService": {
"class_name": "HttpService",
"children": {},
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"ignore_unknown_instances": null,
"path": null
},
"ReplicatedStorage": {
"class_name": "ReplicatedStorage",
"children": {
"Ack": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
},
"Bar": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
]
}
}
}


@@ -0,0 +1,256 @@
{
"instances": {
"b48b369f-5706-4029-9fa6-90651a4910ea": {
"Name": "added",
"ClassName": "Folder",
"Properties": {},
"Id": "b48b369f-5706-4029-9fa6-90651a4910ea",
"Children": [],
"Parent": "14fed1a3-ba97-46a6-ae93-ac26bd9471df"
},
"00f207b1-fc18-4088-a45e-caf8cd98f5dd": {
"Name": "main",
"ClassName": "ModuleScript",
"Properties": {
"Source": {
"Type": "String",
"Value": "-- hello, from a/main.lua"
}
},
"Id": "00f207b1-fc18-4088-a45e-caf8cd98f5dd",
"Children": [],
"Parent": "14fed1a3-ba97-46a6-ae93-ac26bd9471df"
},
"14fed1a3-ba97-46a6-ae93-ac26bd9471df": {
"Name": "Ack",
"ClassName": "Folder",
"Properties": {},
"Id": "14fed1a3-ba97-46a6-ae93-ac26bd9471df",
"Children": [
"b48b369f-5706-4029-9fa6-90651a4910ea",
"c55fd55c-258e-4a93-a63a-ea243038c9b9",
"00f207b1-fc18-4088-a45e-caf8cd98f5dd"
],
"Parent": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
},
"c910510c-37a8-4fd8-ae41-01169ccb739c": {
"Name": "Bar",
"ClassName": "Folder",
"Properties": {},
"Id": "c910510c-37a8-4fd8-ae41-01169ccb739c",
"Children": [
"71a95983-c856-4cf2-aee6-bd8a523e80e4"
],
"Parent": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
},
"c55fd55c-258e-4a93-a63a-ea243038c9b9": {
"Name": "foo",
"ClassName": "StringValue",
"Properties": {
"Value": {
"Type": "String",
"Value": "Hello world, from a/foo.txt"
}
},
"Id": "c55fd55c-258e-4a93-a63a-ea243038c9b9",
"Children": [],
"Parent": "14fed1a3-ba97-46a6-ae93-ac26bd9471df"
},
"71a95983-c856-4cf2-aee6-bd8a523e80e4": {
"Name": "something",
"ClassName": "ModuleScript",
"Properties": {
"Source": {
"Type": "String",
"Value": "-- b/something.lua"
}
},
"Id": "71a95983-c856-4cf2-aee6-bd8a523e80e4",
"Children": [],
"Parent": "c910510c-37a8-4fd8-ae41-01169ccb739c"
},
"3b5af13f-c997-4009-915c-0810b0e83032": {
"Name": "multi_partition_game",
"ClassName": "DataModel",
"Properties": {},
"Id": "3b5af13f-c997-4009-915c-0810b0e83032",
"Children": [
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c",
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
],
"Parent": null
},
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c": {
"Name": "HttpService",
"ClassName": "HttpService",
"Properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"Id": "bf8e2d4f-33a0-42a0-8168-1b62d6ac050c",
"Children": [],
"Parent": "3b5af13f-c997-4009-915c-0810b0e83032"
},
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b": {
"Name": "ReplicatedStorage",
"ClassName": "ReplicatedStorage",
"Properties": {},
"Id": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b",
"Children": [
"14fed1a3-ba97-46a6-ae93-ac26bd9471df",
"c910510c-37a8-4fd8-ae41-01169ccb739c"
],
"Parent": "3b5af13f-c997-4009-915c-0810b0e83032"
}
},
"root_id": "3b5af13f-c997-4009-915c-0810b0e83032",
"metadata": {
"c55fd55c-258e-4a93-a63a-ea243038c9b9": {
"ignore_unknown_instances": false,
"source_path": "a/foo.txt",
"project_definition": null
},
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"ReplicatedStorage",
{
"class_name": "ReplicatedStorage",
"children": {
"Ack": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
},
"Bar": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
]
},
"71a95983-c856-4cf2-aee6-bd8a523e80e4": {
"ignore_unknown_instances": false,
"source_path": "b/something.lua",
"project_definition": null
},
"00f207b1-fc18-4088-a45e-caf8cd98f5dd": {
"ignore_unknown_instances": false,
"source_path": "a/main.lua",
"project_definition": null
},
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"HttpService",
{
"class_name": "HttpService",
"children": {},
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"ignore_unknown_instances": null,
"path": null
}
]
},
"b48b369f-5706-4029-9fa6-90651a4910ea": {
"ignore_unknown_instances": false,
"source_path": "a/added",
"project_definition": null
},
"3b5af13f-c997-4009-915c-0810b0e83032": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"multi_partition_game",
{
"class_name": "DataModel",
"children": {
"HttpService": {
"class_name": "HttpService",
"children": {},
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"ignore_unknown_instances": null,
"path": null
},
"ReplicatedStorage": {
"class_name": "ReplicatedStorage",
"children": {
"Ack": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
},
"Bar": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
]
},
"c910510c-37a8-4fd8-ae41-01169ccb739c": {
"ignore_unknown_instances": false,
"source_path": "b",
"project_definition": [
"Bar",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
]
},
"14fed1a3-ba97-46a6-ae93-ac26bd9471df": {
"ignore_unknown_instances": false,
"source_path": "a",
"project_definition": [
"Ack",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
}
]
}
}
}


@@ -0,0 +1,256 @@
{
"instances": {
"866071d6-465a-4b88-8c63-07489d950916": {
"Name": "added",
"ClassName": "Folder",
"Properties": {},
"Id": "866071d6-465a-4b88-8c63-07489d950916",
"Children": [],
"Parent": "c910510c-37a8-4fd8-ae41-01169ccb739c"
},
"00f207b1-fc18-4088-a45e-caf8cd98f5dd": {
"Name": "main",
"ClassName": "ModuleScript",
"Properties": {
"Source": {
"Type": "String",
"Value": "-- hello, from a/main.lua"
}
},
"Id": "00f207b1-fc18-4088-a45e-caf8cd98f5dd",
"Children": [],
"Parent": "14fed1a3-ba97-46a6-ae93-ac26bd9471df"
},
"14fed1a3-ba97-46a6-ae93-ac26bd9471df": {
"Name": "Ack",
"ClassName": "Folder",
"Properties": {},
"Id": "14fed1a3-ba97-46a6-ae93-ac26bd9471df",
"Children": [
"c55fd55c-258e-4a93-a63a-ea243038c9b9",
"00f207b1-fc18-4088-a45e-caf8cd98f5dd"
],
"Parent": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
},
"c910510c-37a8-4fd8-ae41-01169ccb739c": {
"Name": "Bar",
"ClassName": "Folder",
"Properties": {},
"Id": "c910510c-37a8-4fd8-ae41-01169ccb739c",
"Children": [
"866071d6-465a-4b88-8c63-07489d950916",
"71a95983-c856-4cf2-aee6-bd8a523e80e4"
],
"Parent": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
},
"c55fd55c-258e-4a93-a63a-ea243038c9b9": {
"Name": "foo",
"ClassName": "StringValue",
"Properties": {
"Value": {
"Type": "String",
"Value": "Hello world, from a/foo.txt"
}
},
"Id": "c55fd55c-258e-4a93-a63a-ea243038c9b9",
"Children": [],
"Parent": "14fed1a3-ba97-46a6-ae93-ac26bd9471df"
},
"71a95983-c856-4cf2-aee6-bd8a523e80e4": {
"Name": "something",
"ClassName": "ModuleScript",
"Properties": {
"Source": {
"Type": "String",
"Value": "-- b/something.lua"
}
},
"Id": "71a95983-c856-4cf2-aee6-bd8a523e80e4",
"Children": [],
"Parent": "c910510c-37a8-4fd8-ae41-01169ccb739c"
},
"3b5af13f-c997-4009-915c-0810b0e83032": {
"Name": "multi_partition_game",
"ClassName": "DataModel",
"Properties": {},
"Id": "3b5af13f-c997-4009-915c-0810b0e83032",
"Children": [
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c",
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
],
"Parent": null
},
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c": {
"Name": "HttpService",
"ClassName": "HttpService",
"Properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"Id": "bf8e2d4f-33a0-42a0-8168-1b62d6ac050c",
"Children": [],
"Parent": "3b5af13f-c997-4009-915c-0810b0e83032"
},
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b": {
"Name": "ReplicatedStorage",
"ClassName": "ReplicatedStorage",
"Properties": {},
"Id": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b",
"Children": [
"14fed1a3-ba97-46a6-ae93-ac26bd9471df",
"c910510c-37a8-4fd8-ae41-01169ccb739c"
],
"Parent": "3b5af13f-c997-4009-915c-0810b0e83032"
}
},
"root_id": "3b5af13f-c997-4009-915c-0810b0e83032",
"metadata": {
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"HttpService",
{
"class_name": "HttpService",
"children": {},
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"ignore_unknown_instances": null,
"path": null
}
]
},
"c910510c-37a8-4fd8-ae41-01169ccb739c": {
"ignore_unknown_instances": false,
"source_path": "b",
"project_definition": [
"Bar",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
]
},
"866071d6-465a-4b88-8c63-07489d950916": {
"ignore_unknown_instances": false,
"source_path": "b/added",
"project_definition": null
},
"14fed1a3-ba97-46a6-ae93-ac26bd9471df": {
"ignore_unknown_instances": false,
"source_path": "a",
"project_definition": [
"Ack",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
}
]
},
"00f207b1-fc18-4088-a45e-caf8cd98f5dd": {
"ignore_unknown_instances": false,
"source_path": "a/main.lua",
"project_definition": null
},
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"ReplicatedStorage",
{
"class_name": "ReplicatedStorage",
"children": {
"Ack": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
},
"Bar": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
]
},
"71a95983-c856-4cf2-aee6-bd8a523e80e4": {
"ignore_unknown_instances": false,
"source_path": "b/something.lua",
"project_definition": null
},
"c55fd55c-258e-4a93-a63a-ea243038c9b9": {
"ignore_unknown_instances": false,
"source_path": "a/foo.txt",
"project_definition": null
},
"3b5af13f-c997-4009-915c-0810b0e83032": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"multi_partition_game",
{
"class_name": "DataModel",
"children": {
"HttpService": {
"class_name": "HttpService",
"children": {},
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"ignore_unknown_instances": null,
"path": null
},
"ReplicatedStorage": {
"class_name": "ReplicatedStorage",
"children": {
"Ack": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
},
"Bar": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
]
}
}
}


@@ -11,10 +11,7 @@
"HttpService": {
"$className": "HttpService",
"$properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
"HttpEnabled": true
}
}
}