Compare commits

...

25 Commits

Author SHA1 Message Date
Lucien Greathouse
77f79fa913 0.5.0-alpha.8 2019-03-29 17:36:43 -07:00
Lucien Greathouse
6db714a2b1 Special-case Lighting.Technology in setCanonicalProperty, temporary fix 2019-03-29 17:25:57 -07:00
Lucien Greathouse
913ac7c9f5 Update dependencies 2019-03-28 15:44:56 -07:00
Lucien Greathouse
eecbfd29e7 Update dependencies, adding a bunch of new features 2019-03-27 13:31:12 -07:00
Lucien Greathouse
41025225b2 Rewrite message queue with oneshot futures (#139) 2019-03-27 13:27:50 -07:00
Lucien Greathouse
07c7b28c03 Fix plugin unloading 2019-03-21 22:35:30 -07:00
Lucien Greathouse
3faf3d2a56 Update Changelog for #135 2019-03-20 10:42:18 -07:00
Lucien Greathouse
be094d5b7c Make snapshot application commutative (#135)
* Add children sorting to snapshot_reconciler

* Update snapshot tests to include stable children order

* Bump dependencies, which should make this PR work
2019-03-20 10:39:53 -07:00
Lucien Greathouse
459673bd59 0.5.0-alpha.6 2019-03-19 18:24:30 -07:00
Lucien Greathouse
2968b70e6b Listen to Plugin.Unloading.
Closes #127.
2019-03-19 18:17:03 -07:00
Lucien Greathouse
b6989a18fc Add conditionally-enabled typechecking using t 2019-03-19 17:57:19 -07:00
Lucien Greathouse
4d6a504836 Remove Rodux and Roact-Rodux, add t dependency 2019-03-19 16:34:53 -07:00
Lucien Greathouse
6c3737df68 Update Changelog 2019-03-19 16:31:34 -07:00
Lucien Greathouse
9f382ed9bd Iterate on plugin reconciler
- Renamed setProperty to setCanonicalProperty, which is more usefully
  descriptive. Also added a detailed comment.
- Fixed reconciler behavior with regards to removing known instances
  when $ignoreUnknownInstances is set
2019-03-19 16:30:06 -07:00
Lucien Greathouse
f9e86e58d6 Add InstanceMap:destroyInstance for forgetting and destroying in one step 2019-03-19 16:29:56 -07:00
Lucien Greathouse
469f9c927f Improve plugin place project for testing 2019-03-19 16:29:31 -07:00
Lucien Greathouse
312724189b Remove ignore from old doc generator script 2019-03-14 14:20:38 -07:00
Lucien Greathouse
ec0a1f1ce4 New snapshot tests (#134)
* Changes project-related structures to use `BTreeMap` instead of `HashMap` for children to aid determinism
* Changes imfs-related structures to have total ordering and use `BTreeSet` instead of `HashSet`
* Upgrades dependencies to `rbx_dom_weak` 1.2.0 and `rbx_xml` 0.5.0 to aid in more determinism stuff
* Re-exposes the `RbxSession`'s root project via `root_project()`
* Implements `Default` for a couple things
* Tweaks visualization code to support visualizing trees not attached to an `RbxSession`
* Adds an ID-invariant comparison method for `rbx_tree` relying on previous determinism changes
* Adds a (disabled) test to start finding issues in the reconciler with regards to commutativity of snapshot application
* Adds a snapshot testing system that operates on `RbxTree` and associated metadata, which are committed in this change
2019-03-14 14:20:03 -07:00
Lucien Greathouse
ad93631ef8 Port to futures channel instead of std one.
Fixes #133.
2019-03-12 11:45:39 -07:00
Lucien Greathouse
3b6238ff93 Add more types to plugin 2019-03-11 16:55:42 -07:00
Lucien Greathouse
5b9facee00 Fix up variable naming in serialize_unresolved_minimal 2019-03-11 16:35:54 -07:00
Lucien Greathouse
376f2a554a Better default project, including minimal property types 2019-03-11 16:28:40 -07:00
Lucien Greathouse
5fd0bd3db9 Update/prune dependencies with help of cargo-outdated 2019-03-11 14:12:49 -07:00
Lucien Greathouse
2deb3bbf23 Add notable feature from dependency upgrade 2019-03-11 13:48:02 -07:00
Lucien Greathouse
01bef0c2b8 Update dependencies 2019-03-11 13:47:33 -07:00
49 changed files with 2789 additions and 589 deletions

2
.gitignore vendored

@@ -2,4 +2,4 @@
 /target
 /scratch-project
 **/*.rs.bk
-/generate-docs.run
+/server/failed-snapshots/

9
.gitmodules vendored

@@ -1,12 +1,6 @@
 [submodule "plugin/modules/roact"]
     path = plugin/modules/roact
     url = https://github.com/Roblox/roact.git
-[submodule "plugin/modules/rodux"]
-    path = plugin/modules/rodux
-    url = https://github.com/Roblox/rodux.git
-[submodule "plugin/modules/roact-rodux"]
-    path = plugin/modules/roact-rodux
-    url = https://github.com/Roblox/roact-rodux.git
 [submodule "plugin/modules/testez"]
     path = plugin/modules/testez
     url = https://github.com/Roblox/testez.git
@@ -16,3 +10,6 @@
 [submodule "plugin/modules/promise"]
     path = plugin/modules/promise
     url = https://github.com/LPGhatguy/roblox-lua-promise.git
+[submodule "plugin/modules/t"]
+    path = plugin/modules/t
+    url = https://github.com/osyrisrblx/t.git


@@ -2,6 +2,31 @@
 ## [Unreleased]
 
+## [0.5.0 Alpha 8](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.8) (March 29, 2019)
+* Added support for a bunch of new types when dealing with XML model/place files:
+    * `ColorSequence`
+    * `Float64`
+    * `Int64`
+    * `NumberRange`
+    * `NumberSequence`
+    * `PhysicalProperties`
+    * `Ray`
+    * `Rect`
+    * `Ref`
+* Improved server instance ordering behavior when files are added during a live session ([#135](https://github.com/LPGhatguy/rojo/pull/135))
+* Fixed error being thrown when trying to unload the Rojo plugin.
+* Added partial fix for [issue #141](https://github.com/LPGhatguy/rojo/issues/141) for `Lighting.Technology`, which should restore live sync functionality for the default project file.
+
+## [0.5.0 Alpha 6](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.6) (March 19, 2019)
+* Fixed `rojo init` giving unexpected results by upgrading to `rbx_dom_weak` 1.1.0
+* Fixed live server not responding when the Rojo plugin is connected ([#133](https://github.com/LPGhatguy/rojo/issues/133))
+* Updated default place file:
+    * Improved default properties to be closer to Studio's built-in 'Baseplate' template
+    * Added a baseplate to the project file (Thanks, [@AmaranthineCodices](https://github.com/AmaranthineCodices/)!)
+* Added more type support to Rojo plugin
+* Fixed some cases where the Rojo plugin would leave around objects that it knows should be deleted
+* Updated plugin to correctly listen to `Plugin.Unloading` when installing or uninstalling new plugins
+
 ## [0.5.0 Alpha 5](https://github.com/LPGhatguy/rojo/releases/tag/v0.5.0-alpha.5) (March 1, 2019)
 * Upgraded core dependencies, which improves compatibility for lots of instance types
     * Upgraded from `rbx_tree` 0.2.0 to `rbx_dom_weak` 1.0.0

478
Cargo.lock generated

File diff suppressed because it is too large


@@ -25,7 +25,7 @@ If you have Rust installed, the easiest way to get Rojo is with Cargo!
 To install the latest 0.5.0 alpha, use:
 
 ```sh
-cargo install rojo --version 0.5.0-alpha.5
+cargo install rojo --version 0.5.0-alpha.8
 ```
 
 ## Installing the Plugin


@@ -8,14 +8,11 @@
         "Roact": {
             "$path": "modules/roact/lib"
         },
-        "Rodux": {
-            "$path": "modules/rodux/lib"
-        },
-        "RoactRodux": {
-            "$path": "modules/roact-rodux/lib"
-        },
         "Promise": {
             "$path": "modules/promise/lib"
+        },
+        "t": {
+            "$path": "modules/t/lib/t.lua"
         }
     }
 }

1
plugin/modules/t Submodule

Submodule plugin/modules/t added at a3a80ebf0a


@@ -15,14 +15,11 @@
         "Roact": {
             "$path": "modules/roact/lib"
         },
-        "Rodux": {
-            "$path": "modules/rodux/lib"
-        },
-        "RoactRodux": {
-            "$path": "modules/roact-rodux/lib"
-        },
         "Promise": {
             "$path": "modules/promise/lib"
+        },
+        "t": {
+            "$path": "modules/t/lib/t.lua"
         }
     },
     "TestEZ": {
@@ -40,8 +37,8 @@
         }
     },
-    "TestService": {
-        "$className": "TestService",
+    "ServerScriptService": {
+        "$className": "ServerScriptService",
         "TestBootstrap": {
             "$path": "testBootstrap.server.lua"


@@ -182,6 +182,13 @@ function App:didMount()
     preloadAssets()
 end
 
+function App:willUnmount()
+    if self.currentSession ~= nil then
+        self.currentSession:disconnect()
+        self.currentSession = nil
+    end
+end
+
 function App:didUpdate()
     local connectActive = self.state.sessionStatus == SessionStatus.ConfiguringSession
         or self.state.sessionStatus == SessionStatus.Connected


@@ -1,6 +1,6 @@
 return {
     codename = "Epiphany",
-    version = {0, 5, 0, "-alpha.5"},
+    version = {0, 5, 0, "-alpha.8"},
     expectedServerVersionString = "0.5.0 or newer",
     protocolVersion = 2,
     defaultHost = "localhost",


@@ -1,10 +1,27 @@
 local Config = require(script.Parent.Config)
 
+local Environment = {
+    User = "User",
+    Dev = "Dev",
+    Test = "Test",
+}
+
 local VALUES = {
     LogLevel = {
         type = "IntValue",
-        defaultUserValue = 2,
-        defaultDevValue = 3,
+        values = {
+            [Environment.User] = 2,
+            [Environment.Dev] = 3,
+            [Environment.Test] = 3,
+        },
+    },
+    TypecheckingEnabled = {
+        type = "BoolValue",
+        values = {
+            [Environment.User] = false,
+            [Environment.Dev] = true,
+            [Environment.Test] = true,
+        },
     },
 }
@@ -42,7 +59,9 @@ local function setStoredValue(name, kind, value)
     object.Value = value
 end
 
-local function createAllValues()
+local function createAllValues(environment)
+    assert(Environment[environment] ~= nil, "Invalid environment")
+
     valueContainer = getValueContainer()
 
     if valueContainer == nil then
@@ -52,20 +71,57 @@
     end
 
     for name, value in pairs(VALUES) do
-        setStoredValue(name, value.type, value.defaultDevValue)
+        setStoredValue(name, value.type, value.values[environment])
     end
 end
 
-_G[("ROJO_%s_DEV_CREATE"):format(Config.codename:upper())] = createAllValues
+local function getValue(name)
+    assert(VALUES[name] ~= nil, "Invalid DevSettings name")
+
+    local stored = getStoredValue(name)
+    if stored ~= nil then
+        return stored
+    end
+
+    return VALUES[name].values[Environment.User]
+end
 
 local DevSettings = {}
 
+function DevSettings:createDevSettings()
+    createAllValues(Environment.Dev)
+end
+
+function DevSettings:createTestSettings()
+    createAllValues(Environment.Test)
+end
+
+function DevSettings:hasChangedValues()
+    return valueContainer ~= nil
+end
+
+function DevSettings:resetValues()
+    if valueContainer then
+        valueContainer:Destroy()
+        valueContainer = nil
+    end
+end
+
 function DevSettings:isEnabled()
     return valueContainer ~= nil
 end
 
 function DevSettings:getLogLevel()
-    return getStoredValue("LogLevel") or VALUES.LogLevel.defaultUserValue
+    return getValue("LogLevel")
+end
+
+function DevSettings:shouldTypecheck()
+    return getValue("TypecheckingEnabled")
+end
+
+function _G.ROJO_DEV_CREATE()
+    DevSettings:createDevSettings()
 end
 
 return DevSettings
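The rewritten DevSettings module keys every setting by environment and falls back to the User default when no override is stored. A minimal Rust sketch of that lookup order (names and the single integer value type are simplifications, not the plugin's actual API):

```rust
use std::collections::HashMap;

#[derive(Clone, Copy, PartialEq, Eq, Hash)]
enum Environment {
    User,
    Dev,
    Test,
}

struct Setting {
    // Per-environment defaults, mirroring the `values` table in DevSettings.
    values: HashMap<Environment, i32>,
}

struct DevSettings {
    settings: HashMap<&'static str, Setting>,
    // Overrides written by create_all_values; empty until settings are created.
    stored: HashMap<&'static str, i32>,
}

impl DevSettings {
    fn new() -> Self {
        let mut settings = HashMap::new();
        settings.insert(
            "LogLevel",
            Setting {
                values: HashMap::from([
                    (Environment::User, 2),
                    (Environment::Dev, 3),
                    (Environment::Test, 3),
                ]),
            },
        );
        DevSettings { settings, stored: HashMap::new() }
    }

    // Mirrors createAllValues: copy each setting's value for the chosen
    // environment into the stored overrides.
    fn create_all_values(&mut self, env: Environment) {
        let pairs: Vec<_> = self
            .settings
            .iter()
            .map(|(name, setting)| (*name, setting.values[&env]))
            .collect();
        self.stored.extend(pairs);
    }

    // Mirrors getValue: a stored override wins, otherwise the User default.
    fn get_value(&self, name: &str) -> i32 {
        self.stored
            .get(name)
            .copied()
            .unwrap_or_else(|| self.settings[name].values[&Environment::User])
    }
}
```

With no stored values, `get_value("LogLevel")` returns the User default of 2; after `create_all_values(Environment::Dev)`, it returns the Dev override of 3, matching the Lua module's behavior.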


@@ -45,6 +45,16 @@ function InstanceMap:removeInstance(instance)
     end
 end
 
+function InstanceMap:destroyInstance(instance)
+    local id = self.fromInstances[instance]
+
+    if id ~= nil then
+        self:destroyId(id)
+    else
+        Logging.warn("Attempted to destroy untracked instance %s", tostring(instance))
+    end
+end
+
 function InstanceMap:destroyId(id)
     local instance = self.fromIds[id]
     self:removeId(id)


@@ -1,7 +1,10 @@
+local t = require(script.Parent.Parent.t)
+
 local InstanceMap = require(script.Parent.InstanceMap)
 local Logging = require(script.Parent.Logging)
-local setProperty = require(script.Parent.setProperty)
+local setCanonicalProperty = require(script.Parent.setCanonicalProperty)
 local rojoValueToRobloxValue = require(script.Parent.rojoValueToRobloxValue)
+local Types = require(script.Parent.Types)
 
 local Reconciler = {}
 Reconciler.__index = Reconciler
@@ -24,11 +27,18 @@ function Reconciler:applyUpdate(requestedIds, virtualInstancesById)
     end
 end
 
+local reconcileSchema = Types.ifEnabled(t.tuple(
+    t.map(t.string, Types.VirtualInstance),
+    t.string,
+    t.Instance
+))
+
 --[[
     Update an existing instance, including its properties and children, to match
     the given information.
 ]]
 function Reconciler:reconcile(virtualInstancesById, id, instance)
+    assert(reconcileSchema(virtualInstancesById, id, instance))
+
     local virtualInstance = virtualInstancesById[id]
 
     -- If an instance changes ClassName, we assume it's very different. That's
@@ -43,10 +53,10 @@ function Reconciler:reconcile(virtualInstancesById, id, instance)
     self.instanceMap:insert(id, instance)
 
     -- Some instances don't like being named, even if their name already matches
-    setProperty(instance, "Name", virtualInstance.Name)
+    setCanonicalProperty(instance, "Name", virtualInstance.Name)
 
     for key, value in pairs(virtualInstance.Properties) do
-        setProperty(instance, key, rojoValueToRobloxValue(value))
+        setCanonicalProperty(instance, key, rojoValueToRobloxValue(value))
     end
 
     local existingChildren = instance:GetChildren()
@@ -81,10 +91,17 @@ function Reconciler:reconcile(virtualInstancesById, id, instance)
         end
     end
 
-    if self:__shouldClearUnknownInstances(virtualInstance) then
-        for existingChildInstance in pairs(unvisitedExistingChildren) do
-            self.instanceMap:removeInstance(existingChildInstance)
-            existingChildInstance:Destroy()
+    local shouldClearUnknown = self:__shouldClearUnknownChildren(virtualInstance)
+
+    for existingChildInstance in pairs(unvisitedExistingChildren) do
+        local childId = self.instanceMap.fromInstances[existingChildInstance]
+
+        if childId == nil then
+            if shouldClearUnknown then
+                existingChildInstance:Destroy()
+            end
+        else
+            self.instanceMap:destroyInstance(existingChildInstance)
         end
     end
 
@@ -100,13 +117,13 @@ function Reconciler:reconcile(virtualInstancesById, id, instance)
         -- Some instances, like services, don't like having their Parent
         -- property poked, even if we're setting it to the same value.
-        setProperty(instance, "Parent", parent)
+        setCanonicalProperty(instance, "Parent", parent)
     end
 
     return instance
 end
 
-function Reconciler:__shouldClearUnknownInstances(virtualInstance)
+function Reconciler:__shouldClearUnknownChildren(virtualInstance)
     if virtualInstance.Metadata ~= nil then
         return not virtualInstance.Metadata.ignoreUnknownInstances
     else
@@ -114,28 +131,44 @@ function Reconciler:__shouldClearUnknownChildren(virtualInstance)
     end
 end
 
+local reifySchema = Types.ifEnabled(t.tuple(
+    t.map(t.string, Types.VirtualInstance),
+    t.string,
+    t.Instance
+))
+
 function Reconciler:__reify(virtualInstancesById, id, parent)
+    assert(reifySchema(virtualInstancesById, id, parent))
+
     local virtualInstance = virtualInstancesById[id]
     local instance = Instance.new(virtualInstance.ClassName)
 
     for key, value in pairs(virtualInstance.Properties) do
-        setProperty(instance, key, rojoValueToRobloxValue(value))
+        setCanonicalProperty(instance, key, rojoValueToRobloxValue(value))
     end
 
-    instance.Name = virtualInstance.Name
+    setCanonicalProperty(instance, "Name", virtualInstance.Name)
 
     for _, childId in ipairs(virtualInstance.Children) do
         self:__reify(virtualInstancesById, childId, instance)
     end
 
-    setProperty(instance, "Parent", parent)
+    setCanonicalProperty(instance, "Parent", parent)
 
     self.instanceMap:insert(id, instance)
 
     return instance
 end
 
+local applyUpdatePieceSchema = Types.ifEnabled(t.tuple(
+    t.string,
+    t.map(t.string, t.boolean),
+    t.map(t.string, Types.VirtualInstance)
+))
+
 function Reconciler:__applyUpdatePiece(id, visitedIds, virtualInstancesById)
+    assert(applyUpdatePieceSchema(id, visitedIds, virtualInstancesById))
+
     if visitedIds[id] then
         return
     end
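The reconciler change above splits child cleanup into two cases: children Rojo is tracking are always destroyed through the InstanceMap when they disappear from the snapshot, while unknown children are destroyed only when `ignoreUnknownInstances` is unset. A hedged Rust sketch of just that branch (instances reduced to names, and the map simplified; not the plugin's actual API):

```rust
use std::collections::{HashMap, HashSet};

struct InstanceMap {
    // instance name -> tracked id (stands in for fromInstances)
    from_instances: HashMap<String, String>,
}

impl InstanceMap {
    // Forget and destroy a tracked instance in one step, mirroring
    // InstanceMap:destroyInstance.
    fn destroy_instance(&mut self, instance: &str, destroyed: &mut HashSet<String>) {
        self.from_instances.remove(instance);
        destroyed.insert(instance.to_string());
    }
}

// Mirrors the rewritten cleanup loop: known children are always destroyed,
// unknown children only when shouldClearUnknown is true.
fn clear_unvisited_children(
    map: &mut InstanceMap,
    unvisited: &[&str],
    should_clear_unknown: bool,
) -> HashSet<String> {
    let mut destroyed = HashSet::new();

    for child in unvisited {
        let known = map.from_instances.contains_key(*child);

        if known {
            // A tracked instance missing from the snapshot is always removed.
            map.destroy_instance(child, &mut destroyed);
        } else if should_clear_unknown {
            // An untracked instance is wiped only without ignoreUnknownInstances.
            destroyed.insert(child.to_string());
        }
    }

    destroyed
}
```

This is the behavior the new `reconciler.spec.lua` tests pin down: `ignoreUnknownInstances` preserves stray children but no longer protects known, removed ones.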


@@ -0,0 +1,218 @@
local Reconciler = require(script.Parent.Reconciler)
return function()
it("should leave instances alone if there's nothing specified", function()
local instance = Instance.new("Folder")
instance.Name = "TestFolder"
local instanceId = "test-id"
local virtualInstancesById = {
[instanceId] = {
Name = "TestFolder",
ClassName = "Folder",
Children = {},
Properties = {},
},
}
local reconciler = Reconciler.new()
reconciler:reconcile(virtualInstancesById, instanceId, instance)
end)
it("should assign names from virtual instances", function()
local instance = Instance.new("Folder")
instance.Name = "InitialName"
local instanceId = "test-id"
local virtualInstancesById = {
[instanceId] = {
Name = "NewName",
ClassName = "Folder",
Children = {},
Properties = {},
},
}
local reconciler = Reconciler.new()
reconciler:reconcile(virtualInstancesById, instanceId, instance)
expect(instance.Name).to.equal("NewName")
end)
it("should assign properties from virtual instances", function()
local instance = Instance.new("IntValue")
instance.Name = "TestValue"
instance.Value = 5
local instanceId = "test-id"
local virtualInstancesById = {
[instanceId] = {
Name = "TestValue",
ClassName = "IntValue",
Children = {},
Properties = {
Value = {
Type = "Int32",
Value = 9
}
},
},
}
local reconciler = Reconciler.new()
reconciler:reconcile(virtualInstancesById, instanceId, instance)
expect(instance.Value).to.equal(9)
end)
it("should wipe unknown children by default", function()
local parent = Instance.new("Folder")
parent.Name = "Parent"
local child = Instance.new("Folder")
child.Name = "Child"
local parentId = "test-id"
local virtualInstancesById = {
[parentId] = {
Name = "Parent",
ClassName = "Folder",
Children = {},
Properties = {},
},
}
local reconciler = Reconciler.new()
reconciler:reconcile(virtualInstancesById, parentId, parent)
expect(#parent:GetChildren()).to.equal(0)
end)
it("should preserve unknown children if ignoreUnknownInstances is set", function()
local parent = Instance.new("Folder")
parent.Name = "Parent"
local child = Instance.new("Folder")
child.Parent = parent
child.Name = "Child"
local parentId = "test-id"
local virtualInstancesById = {
[parentId] = {
Name = "Parent",
ClassName = "Folder",
Children = {},
Properties = {},
Metadata = {
ignoreUnknownInstances = true,
},
},
}
local reconciler = Reconciler.new()
reconciler:reconcile(virtualInstancesById, parentId, parent)
expect(child.Parent).to.equal(parent)
expect(#parent:GetChildren()).to.equal(1)
end)
it("should remove known removed children", function()
local parent = Instance.new("Folder")
parent.Name = "Parent"
local child = Instance.new("Folder")
child.Parent = parent
child.Name = "Child"
local parentId = "parent-id"
local childId = "child-id"
local reconciler = Reconciler.new()
local virtualInstancesById = {
[parentId] = {
Name = "Parent",
ClassName = "Folder",
Children = {childId},
Properties = {},
},
[childId] = {
Name = "Child",
ClassName = "Folder",
Children = {},
Properties = {},
},
}
reconciler:reconcile(virtualInstancesById, parentId, parent)
expect(child.Parent).to.equal(parent)
expect(#parent:GetChildren()).to.equal(1)
local newVirtualInstances = {
[parentId] = {
Name = "Parent",
ClassName = "Folder",
Children = {},
Properties = {},
},
[childId] = nil,
}
reconciler:reconcile(newVirtualInstances, parentId, parent)
expect(child.Parent).to.equal(nil)
expect(#parent:GetChildren()).to.equal(0)
end)
it("should remove known removed children if ignoreUnknownInstances is set", function()
local parent = Instance.new("Folder")
parent.Name = "Parent"
local child = Instance.new("Folder")
child.Parent = parent
child.Name = "Child"
local parentId = "parent-id"
local childId = "child-id"
local reconciler = Reconciler.new()
local virtualInstancesById = {
[parentId] = {
Name = "Parent",
ClassName = "Folder",
Children = {childId},
Properties = {},
Metadata = {
ignoreUnknownInstances = true,
},
},
[childId] = {
Name = "Child",
ClassName = "Folder",
Children = {},
Properties = {},
},
}
reconciler:reconcile(virtualInstancesById, parentId, parent)
expect(child.Parent).to.equal(parent)
expect(#parent:GetChildren()).to.equal(1)
local newVirtualInstances = {
[parentId] = {
Name = "Parent",
ClassName = "Folder",
Children = {},
Properties = {},
Metadata = {
ignoreUnknownInstances = true,
},
},
[childId] = nil,
}
reconciler:reconcile(newVirtualInstances, parentId, parent)
expect(child.Parent).to.equal(nil)
expect(#parent:GetChildren()).to.equal(0)
end)
end

36
plugin/src/Types.lua Normal file

@@ -0,0 +1,36 @@
local t = require(script.Parent.Parent.t)
local DevSettings = require(script.Parent.DevSettings)
local VirtualValue = t.interface({
Type = t.string,
Value = t.optional(t.any),
})
local VirtualMetadata = t.interface({
ignoreUnknownInstances = t.optional(t.boolean),
})
local VirtualInstance = t.interface({
Name = t.string,
ClassName = t.string,
Properties = t.map(t.string, VirtualValue),
Metadata = t.optional(VirtualMetadata)
})
local function ifEnabled(innerCheck)
return function(...)
if DevSettings:shouldTypecheck() then
return innerCheck(...)
else
return true
end
end
end
return {
ifEnabled = ifEnabled,
VirtualInstance = VirtualInstance,
VirtualMetadata = VirtualMetadata,
VirtualValue = VirtualValue,
}
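Types.lua wraps each schema in `ifEnabled` so validation only runs when the `TypecheckingEnabled` dev setting is on, keeping the checks free for end users. The same guard pattern expressed in Rust (a sketch; the flag source and validator shape are assumptions):

```rust
// Wrap a validator so it becomes a no-op when typechecking is disabled,
// mirroring Types.ifEnabled from the plugin.
fn if_enabled<T>(
    enabled: fn() -> bool,
    inner: fn(&T) -> Result<(), String>,
) -> impl Fn(&T) -> Result<(), String> {
    move |value| {
        if enabled() {
            inner(value)
        } else {
            // Checks are skipped entirely when the dev setting is off.
            Ok(())
        }
    }
}

// Example validator standing in for a `t` schema.
fn non_empty(s: &String) -> Result<(), String> {
    if s.is_empty() {
        Err("expected non-empty string".to_string())
    } else {
        Ok(())
    }
}
```

Because the enabled check happens at call time rather than wrap time, flipping the dev setting takes effect without rebuilding the schemas, which matches how the Lua version consults DevSettings on every assertion.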


@@ -4,16 +4,14 @@ end
 
 local Roact = require(script.Parent.Roact)
 
-Roact.setGlobalConfig({
-    elementTracing = true,
-})
-
 local App = require(script.Components.App)
 
 local app = Roact.createElement(App, {
     plugin = plugin,
 })
 
-Roact.mount(app, game:GetService("CoreGui"), "Rojo UI")
+local tree = Roact.mount(app, game:GetService("CoreGui"), "Rojo UI")
 
--- TODO: Detect another instance of Rojo coming online and shut down this one.
+plugin.Unloading:Connect(function()
+    Roact.unmount(tree)
+end)


@@ -1,14 +1,20 @@
 local primitiveTypes = {
-    String = true,
     Bool = true,
-    Int32 = true,
-    Float32 = true,
     Enum = true,
+    Float32 = true,
+    Float64 = true,
+    Int32 = true,
+    Int64 = true,
+    String = true,
 }
 
 local directConstructors = {
     CFrame = CFrame.new,
     Color3 = Color3.new,
+    Color3uint8 = Color3.fromRGB,
+    Rect = Rect.new,
+    UDim = UDim.new,
+    UDim2 = UDim2.new,
     Vector2 = Vector2.new,
     Vector2int16 = Vector2int16.new,
     Vector3 = Vector3.new,


@@ -1,10 +1,17 @@
 local Logging = require(script.Parent.Logging)
 
 --[[
-    Attempts to set a property on the given instance, correctly handling
-    'virtual properties', which aren't reflected directly to Lua.
+    Attempts to set a property on the given instance.
+
+    This method deals in terms of what Rojo calls 'canonical properties', which
+    don't necessarily exist either in serialization or in Lua-reflected APIs,
+    but may be present in the API dump.
+
+    Ideally, canonical properties map 1:1 with properties we can assign, but in
+    some cases like LocalizationTable contents and CollectionService tags, we
+    have to read/write properties a little differently.
 ]]
-local function setProperty(instance, key, value)
+local function setCanonicalProperty(instance, key, value)
     -- The 'Contents' property of LocalizationTable isn't directly exposed, but
     -- has corresponding (deprecated) getters and setters.
     if instance.ClassName == "LocalizationTable" and key == "Contents" then
@@ -12,6 +19,11 @@ local function setCanonicalProperty(instance, key, value)
         return
     end
 
+    -- Temporary workaround for fixing issue #141 in this specific case.
+    if instance.ClassName == "Lighting" and key == "Technology" then
+        return
+    end
+
     -- If we don't have permissions to access this value at all, we can skip it.
     local readSuccess, existingValue = pcall(function()
         return instance[key]
@@ -42,4 +54,4 @@ local function setCanonicalProperty(instance, key, value)
     return true
 end
 
-return setProperty
+return setCanonicalProperty
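`setCanonicalProperty`'s control flow is a chain of early returns: special-cased properties are handled first (LocalizationTable contents, and now the temporary `Lighting.Technology` skip for issue #141), then the property is probed for readability before any write happens. A condensed Rust sketch of that ordering, with properties modeled as a plain map and a boolean standing in for the permission errors that the Lua `pcall` catches:

```rust
use std::collections::HashMap;

// Simplified instance: a class name plus a bag of string properties.
struct Instance {
    class_name: String,
    properties: HashMap<String, String>,
    // Stand-in for "can we read this instance's properties at all?"
    readable: bool,
}

// Mirrors the early-return order of setCanonicalProperty.
// Returns true only when the property was actually assigned.
fn set_canonical_property(instance: &mut Instance, key: &str, value: &str) -> bool {
    // Temporary workaround for issue #141: skip Lighting.Technology entirely.
    if instance.class_name == "Lighting" && key == "Technology" {
        return false;
    }

    // If we can't read the property (no permission), skip it quietly,
    // like the pcall-guarded read in the Lua implementation.
    if !instance.readable {
        return false;
    }

    instance.properties.insert(key.to_string(), value.to_string());
    true
}
```

The important design point the sketch preserves is that skips are silent successes rather than errors, so one unsettable property never aborts a whole reconcile pass.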


@@ -1,2 +1,19 @@
-local TestEZ = require(game.ReplicatedStorage.TestEZ)
-TestEZ.TestBootstrap:run({game.ReplicatedStorage.Rojo.Plugin})
+local ReplicatedStorage = game:GetService("ReplicatedStorage")
+
+local TestEZ = require(ReplicatedStorage.TestEZ)
+
+local Rojo = ReplicatedStorage.Rojo
+
+local DevSettings = require(Rojo.Plugin.DevSettings)
+
+local setDevSettings = not DevSettings:hasChangedValues()
+
+if setDevSettings then
+    DevSettings:createTestSettings()
+end
+
+TestEZ.TestBootstrap:run({Rojo.Plugin})
+
+if setDevSettings then
+    DevSettings:resetValues()
+end


@@ -1,6 +1,6 @@
 [package]
 name = "rojo"
-version = "0.5.0-alpha.5"
+version = "0.5.0-alpha.8"
 authors = ["Lucien Greathouse <me@lpghatguy.com>"]
 description = "A tool to create robust Roblox projects"
 license = "MIT"
@@ -29,10 +29,9 @@ hyper = "0.12"
 log = "0.4"
 maplit = "1.0.1"
 notify = "4.0"
-rand = "0.4"
 rbx_binary = "0.4.0"
-rbx_dom_weak = "1.0.0"
-rbx_xml = "0.4.0"
+rbx_dom_weak = "1.3.0"
+rbx_xml = "0.6.0"
 rbx_reflection = "2.0.374"
 regex = "1.0"
 reqwest = "0.9.5"
@@ -47,5 +46,5 @@ uuid = { version = "0.7", features = ["v4", "serde"] }
 tempfile = "3.0"
 walkdir = "2.1"
 lazy_static = "1.2"
-pretty_assertions = "0.5.1"
+pretty_assertions = "0.6.1"
 paste = "0.1"


@@ -0,0 +1,66 @@
{
"name": "[placeholder]",
"tree": {
"$className": "DataModel",
"HttpService": {
"$className": "HttpService",
"$properties": {
"HttpEnabled": true
}
},
"Lighting": {
"$className": "Lighting",
"$properties": {
"Ambient": [
0,
0,
0
],
"Brightness": 2,
"GlobalShadows": true,
"Outlines": false,
"Technology": "Voxel"
}
},
"ReplicatedStorage": {
"$className": "ReplicatedStorage",
"Source": {
"$path": "src"
}
},
"SoundService": {
"$className": "SoundService",
"$properties": {
"RespectFilteringEnabled": true
}
},
"Workspace": {
"$className": "Workspace",
"$properties": {
"FilteringEnabled": true
},
"Baseplate": {
"$className": "Part",
"$properties": {
"Anchored": true,
"Color": [
0.38823,
0.37254,
0.38823
],
"Locked": true,
"Position": [
0,
-10,
0
],
"Size": [
512,
20,
512
]
}
}
}
}
}


@@ -1,9 +1,10 @@
 use std::{
-    collections::{HashMap, HashSet},
-    path::{self, Path, PathBuf},
+    cmp::Ordering,
+    collections::{HashMap, HashSet, BTreeSet},
     fmt,
     fs,
     io,
+    path::{self, Path, PathBuf},
 };
 
 use failure::Fail;
@@ -237,7 +238,7 @@ impl Imfs {
         } else if metadata.is_dir() {
             let item = ImfsItem::Directory(ImfsDirectory {
                 path: path.to_path_buf(),
-                children: HashSet::new(),
+                children: BTreeSet::new(),
             });
 
             self.items.insert(path.to_path_buf(), item);
@@ -285,19 +286,43 @@ impl Imfs {
     }
 }
 
-#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
+#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
 pub struct ImfsFile {
     pub path: PathBuf,
     pub contents: Vec<u8>,
 }
 
-#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
-pub struct ImfsDirectory {
-    pub path: PathBuf,
-    pub children: HashSet<PathBuf>,
-}
+impl PartialOrd for ImfsFile {
+    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
+        Some(self.cmp(other))
+    }
+}
+
+impl Ord for ImfsFile {
+    fn cmp(&self, other: &Self) -> Ordering {
+        self.path.cmp(&other.path)
+    }
+}
+
+#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
+pub struct ImfsDirectory {
+    pub path: PathBuf,
+    pub children: BTreeSet<PathBuf>,
+}
+
+impl PartialOrd for ImfsDirectory {
+    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
+        Some(self.cmp(other))
+    }
+}
+
+impl Ord for ImfsDirectory {
+    fn cmp(&self, other: &Self) -> Ordering {
+        self.path.cmp(&other.path)
+    }
+}
 
-#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
+#[derive(Debug, Clone, PartialEq, Eq, PartialOrd, Ord, Serialize, Deserialize)]
 pub enum ImfsItem {
     File(ImfsFile),
     Directory(ImfsDirectory),
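The Imfs change above swaps `HashSet` for `BTreeSet` and derives total ordering so directory children always iterate in the same order, which the new ID-invariant tree comparison and snapshot tests depend on. A small self-contained demonstration of why the swap buys determinism:

```rust
use std::collections::BTreeSet;
use std::path::PathBuf;

// Collect child paths the way the patched ImfsDirectory does: a BTreeSet
// keyed by path, so iteration order is sorted and independent of
// insertion order. A HashSet would yield an arbitrary order instead.
fn sorted_children(names: &[&str]) -> Vec<String> {
    let mut children: BTreeSet<PathBuf> = BTreeSet::new();
    for name in names {
        children.insert(PathBuf::from(name));
    }
    children.iter().map(|p| p.display().to_string()).collect()
}
```

Because two trees built from the same files now list children identically, equality checks and committed snapshots stop flaking on hash-iteration order.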


@@ -85,6 +85,10 @@ impl LiveSession {
         Ok(())
     }
 
+    pub fn root_project(&self) -> &Project {
+        &self.project
+    }
+
     pub fn session_id(&self) -> SessionId {
         self.session_id
     }


@@ -1,67 +1,83 @@
 use std::{
-    collections::HashMap,
+    mem,
     sync::{
-        mpsc,
-        atomic::{AtomicUsize, Ordering},
         RwLock,
         Mutex,
     },
 };

-/// A unique identifier, not guaranteed to be generated in any order.
-#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
-pub struct ListenerId(usize);
+use futures::sync::oneshot;

-/// Generate a new ID, which has no defined ordering.
-pub fn get_listener_id() -> ListenerId {
-    static LAST_ID: AtomicUsize = AtomicUsize::new(0);
+struct Listener<T> {
+    sender: oneshot::Sender<(u32, Vec<T>)>,
+    cursor: u32,
+}

-    ListenerId(LAST_ID.fetch_add(1, Ordering::SeqCst))
+fn fire_listener_if_ready<T: Clone>(messages: &[T], listener: Listener<T>) -> Result<(), Listener<T>> {
+    let current_cursor = messages.len() as u32;
+
+    if listener.cursor < current_cursor {
+        let new_messages = messages[(listener.cursor as usize)..].to_vec();
+        let _ = listener.sender.send((current_cursor, new_messages));
+        Ok(())
+    } else {
+        Err(listener)
+    }
 }

 /// A message queue with persistent history that can be subscribed to.
 ///
-/// Definitely non-optimal, but a simple design that works well for the
-/// synchronous web server Rojo uses, Rouille.
+/// Definitely non-optimal. This would ideally be a lockless mpmc queue.
 #[derive(Default)]
 pub struct MessageQueue<T> {
     messages: RwLock<Vec<T>>,
-    message_listeners: Mutex<HashMap<ListenerId, mpsc::Sender<()>>>,
+    message_listeners: Mutex<Vec<Listener<T>>>,
 }

 impl<T: Clone> MessageQueue<T> {
     pub fn new() -> MessageQueue<T> {
         MessageQueue {
             messages: RwLock::new(Vec::new()),
-            message_listeners: Mutex::new(HashMap::new()),
+            message_listeners: Mutex::new(Vec::new()),
         }
     }

     pub fn push_messages(&self, new_messages: &[T]) {
-        let message_listeners = self.message_listeners.lock().unwrap();
+        let mut message_listeners = self.message_listeners.lock().unwrap();
+        let mut messages = self.messages.write().unwrap();
+        messages.extend_from_slice(new_messages);

-        {
-            let mut messages = self.messages.write().unwrap();
-            messages.extend_from_slice(new_messages);
+        let mut remaining_listeners = Vec::new();
+
+        for listener in message_listeners.drain(..) {
+            match fire_listener_if_ready(&messages, listener) {
+                Ok(_) => {}
+                Err(listener) => remaining_listeners.push(listener)
+            }
         }

-        for listener in message_listeners.values() {
-            listener.send(()).unwrap();
-        }
+        // Without this annotation, Rust gets confused since the first argument
+        // is a MutexGuard, but the second is a Vec.
+        mem::replace::<Vec<_>>(&mut message_listeners, remaining_listeners);
     }

-    pub fn subscribe(&self, sender: mpsc::Sender<()>) -> ListenerId {
-        let id = get_listener_id();
+    pub fn subscribe(&self, cursor: u32, sender: oneshot::Sender<(u32, Vec<T>)>) {
+        let listener = {
+            let listener = Listener {
+                sender,
+                cursor,
+            };
+
+            let messages = self.messages.read().unwrap();
+            match fire_listener_if_ready(&messages, listener) {
+                Ok(_) => return,
+                Err(listener) => listener
+            }
+        };
+
         let mut message_listeners = self.message_listeners.lock().unwrap();
-        message_listeners.insert(id, sender);
-        id
-    }
-
-    pub fn unsubscribe(&self, id: ListenerId) {
-        let mut message_listeners = self.message_listeners.lock().unwrap();
-        message_listeners.remove(&id);
+        message_listeners.push(listener);
     }

     pub fn get_message_cursor(&self) -> u32 {
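The core idea of the rewritten queue is cursor-based delivery: a subscriber either fires immediately (messages already exist past its cursor) or is parked until the next push. The following is a minimal stdlib-only sketch of that pattern, not Rojo's actual code: it substitutes `std::sync::mpsc` for the `futures` oneshot channel, and the names `Queue`, `Listener`, and `fire_if_ready` are simplified stand-ins.

```rust
use std::sync::{mpsc, Mutex, RwLock};

// Simplified analogue of Rojo's Listener: a parked subscriber waiting for
// messages past its cursor.
struct Listener {
    sender: mpsc::Sender<(u32, Vec<String>)>,
    cursor: u32,
}

// Mirrors fire_listener_if_ready: Ok(()) if the listener fired, Err to keep it.
fn fire_if_ready(messages: &[String], listener: Listener) -> Result<(), Listener> {
    let current_cursor = messages.len() as u32;
    if listener.cursor < current_cursor {
        let new = messages[listener.cursor as usize..].to_vec();
        let _ = listener.sender.send((current_cursor, new));
        Ok(())
    } else {
        Err(listener)
    }
}

#[derive(Default)]
struct Queue {
    messages: RwLock<Vec<String>>,
    listeners: Mutex<Vec<Listener>>,
}

impl Queue {
    fn push(&self, new: &[String]) {
        let mut listeners = self.listeners.lock().unwrap();
        let mut messages = self.messages.write().unwrap();
        messages.extend_from_slice(new);
        // Keep only the listeners that still have nothing to read.
        let kept = listeners
            .drain(..)
            .filter_map(|l| fire_if_ready(&messages, l).err())
            .collect();
        *listeners = kept;
    }

    fn subscribe(&self, cursor: u32, sender: mpsc::Sender<(u32, Vec<String>)>) {
        let listener = Listener { sender, cursor };
        let messages = self.messages.read().unwrap();
        if let Err(listener) = fire_if_ready(&messages, listener) {
            drop(messages);
            self.listeners.lock().unwrap().push(listener);
        }
    }
}

fn main() {
    let queue = Queue::default();
    queue.push(&["a".to_string()]);

    // Cursor 0 is already behind the log, so this fires immediately.
    let (tx, rx) = mpsc::channel();
    queue.subscribe(0, tx);
    assert_eq!(rx.recv().unwrap(), (1, vec!["a".to_string()]));

    // Cursor 1 is current, so this parks until the next push.
    let (tx, rx) = mpsc::channel();
    queue.subscribe(1, tx);
    queue.push(&["b".to_string()]);
    assert_eq!(rx.recv().unwrap(), (2, vec!["b".to_string()]));
}
```

Because each listener is consumed when it fires, the real code no longer needs `ListenerId` or `unsubscribe`; a oneshot sender that is dropped simply fails silently on `send`.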


@@ -20,6 +20,12 @@ pub struct PathMap<T> {
     nodes: HashMap<PathBuf, PathMapNode<T>>,
 }

+impl<T> Default for PathMap<T> {
+    fn default() -> Self {
+        Self::new()
+    }
+}
+
 impl<T> PathMap<T> {
     pub fn new() -> PathMap<T> {
         PathMap {


@@ -1,5 +1,5 @@
 use std::{
-    collections::{HashMap, HashSet},
+    collections::{HashMap, HashSet, BTreeMap},
     fmt,
     fs::{self, File},
     io,
@@ -8,9 +8,11 @@ use std::{
 use log::warn;
 use failure::Fail;
-use maplit::hashmap;
 use rbx_dom_weak::{UnresolvedRbxValue, RbxValue};
 use serde_derive::{Serialize, Deserialize};
+use serde::{Serialize, Serializer};
+
+static DEFAULT_PLACE: &'static str = include_str!("../assets/place.project.json");

 pub static PROJECT_FILENAME: &'static str = "default.project.json";
 pub static COMPAT_PROJECT_FILENAME: &'static str = "roblox-project.json";
@@ -55,15 +57,88 @@ impl SourceProject {
     }
 }

+/// An alternative serializer for `UnresolvedRbxValue` that uses the minimum
+/// representation of the value.
+///
+/// For example, the default Serialize impl might give you:
+///
+/// ```json
+/// {
+///     "Type": "Bool",
+///     "Value": true
+/// }
+/// ```
+///
+/// But in reality, users are expected to write just:
+///
+/// ```json
+/// true
+/// ```
+///
+/// This holds true for other values that might be ambiguous or just have more
+/// complicated representations like enums.
+fn serialize_unresolved_minimal<S>(unresolved: &UnresolvedRbxValue, serializer: S) -> Result<S::Ok, S::Error>
+    where S: Serializer
+{
+    match unresolved {
+        UnresolvedRbxValue::Ambiguous(_) => unresolved.serialize(serializer),
+        UnresolvedRbxValue::Concrete(concrete) => {
+            match concrete {
+                RbxValue::Bool { value } => value.serialize(serializer),
+                RbxValue::CFrame { value } => value.serialize(serializer),
+                RbxValue::Color3 { value } => value.serialize(serializer),
+                RbxValue::Color3uint8 { value } => value.serialize(serializer),
+                RbxValue::Content { value } => value.serialize(serializer),
+                RbxValue::Enum { value } => value.serialize(serializer),
+                RbxValue::Float32 { value } => value.serialize(serializer),
+                RbxValue::Int32 { value } => value.serialize(serializer),
+                RbxValue::String { value } => value.serialize(serializer),
+                RbxValue::UDim { value } => value.serialize(serializer),
+                RbxValue::UDim2 { value } => value.serialize(serializer),
+                RbxValue::Vector2 { value } => value.serialize(serializer),
+                RbxValue::Vector2int16 { value } => value.serialize(serializer),
+                RbxValue::Vector3 { value } => value.serialize(serializer),
+                RbxValue::Vector3int16 { value } => value.serialize(serializer),
+                _ => concrete.serialize(serializer),
+            }
+        },
+    }
+}
+
+/// A wrapper around serialize_unresolved_minimal that handles the HashMap case.
+fn serialize_unresolved_map<S>(value: &HashMap<String, UnresolvedRbxValue>, serializer: S) -> Result<S::Ok, S::Error>
+    where S: Serializer
+{
+    use serde::ser::SerializeMap;
+
+    #[derive(Serialize)]
+    struct Minimal<'a>(
+        #[serde(serialize_with = "serialize_unresolved_minimal")]
+        &'a UnresolvedRbxValue
+    );
+
+    let mut map = serializer.serialize_map(Some(value.len()))?;
+
+    for (k, v) in value {
+        map.serialize_key(k)?;
+        map.serialize_value(&Minimal(v))?;
+    }
+
+    map.end()
+}
+
 /// Similar to SourceProject, the structure of nodes in the project tree is
 /// slightly different on-disk than how we want to handle them in the rest of
 /// Rojo.
-#[derive(Debug, Serialize, Deserialize)]
+#[derive(Debug, Clone, Serialize, Deserialize)]
 struct SourceProjectNode {
     #[serde(rename = "$className", skip_serializing_if = "Option::is_none")]
     class_name: Option<String>,

-    #[serde(rename = "$properties", default = "HashMap::new", skip_serializing_if = "HashMap::is_empty")]
+    #[serde(
+        rename = "$properties",
+        default = "HashMap::new",
+        skip_serializing_if = "HashMap::is_empty",
+        serialize_with = "serialize_unresolved_map",
+    )]
     properties: HashMap<String, UnresolvedRbxValue>,

     #[serde(rename = "$ignoreUnknownInstances", skip_serializing_if = "Option::is_none")]
@@ -73,14 +148,14 @@ struct SourceProjectNode {
     path: Option<String>,

     #[serde(flatten)]
-    children: HashMap<String, SourceProjectNode>,
+    children: BTreeMap<String, SourceProjectNode>,
 }

 impl SourceProjectNode {
     /// Consumes the SourceProjectNode and turns it into a ProjectNode.
-    pub fn into_project_node(mut self, project_file_location: &Path) -> ProjectNode {
-        let children = self.children.drain()
-            .map(|(key, value)| (key, value.into_project_node(project_file_location)))
+    pub fn into_project_node(self, project_file_location: &Path) -> ProjectNode {
+        let children = self.children.iter()
+            .map(|(key, value)| (key.clone(), value.clone().into_project_node(project_file_location)))
             .collect();

         // Make sure that paths are absolute, transforming them by adding the
@@ -162,6 +237,7 @@ pub enum ProjectInitError {
     AlreadyExists(PathBuf),
     IoError(#[fail(cause)] io::Error),
     SaveError(#[fail(cause)] ProjectSaveError),
+    JsonError(#[fail(cause)] serde_json::Error),
 }

 impl fmt::Display for ProjectInitError {
@@ -170,6 +246,7 @@ impl fmt::Display for ProjectInitError {
             ProjectInitError::AlreadyExists(path) => write!(output, "Path {} already exists", path.display()),
             ProjectInitError::IoError(inner) => write!(output, "IO error: {}", inner),
             ProjectInitError::SaveError(inner) => write!(output, "{}", inner),
+            ProjectInitError::JsonError(inner) => write!(output, "{}", inner),
         }
     }
 }
@@ -187,7 +264,7 @@ pub enum ProjectSaveError {
 #[derive(Debug, Clone, PartialEq, Default, Serialize, Deserialize)]
 pub struct ProjectNode {
     pub class_name: Option<String>,
-    pub children: HashMap<String, ProjectNode>,
+    pub children: BTreeMap<String, ProjectNode>,
     pub properties: HashMap<String, UnresolvedRbxValue>,
     pub ignore_unknown_instances: Option<bool>,
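Swapping `HashMap` for `BTreeMap` here is what makes re-serialized project files and snapshots deterministic: a `BTreeMap` always iterates in sorted key order, regardless of insertion order or hash seed. A small illustrative sketch (the names and values here are made up for the example):

```rust
use std::collections::BTreeMap;

fn main() {
    // Insert children out of order; a HashMap could yield them in any order,
    // but a BTreeMap always iterates sorted by key.
    let mut children = BTreeMap::new();
    children.insert("ReplicatedStorage".to_string(), "Folder");
    children.insert("HttpService".to_string(), "HttpService");

    let names: Vec<&str> = children.keys().map(|k| k.as_str()).collect();
    assert_eq!(names, vec!["HttpService", "ReplicatedStorage"]);
}
```

Since serde serializes maps in iteration order, this alone stabilizes the JSON output across runs.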
@@ -259,47 +336,16 @@ pub struct Project {
 impl Project {
     pub fn init_place(project_fuzzy_path: &Path) -> Result<PathBuf, ProjectInitError> {
         let project_path = Project::init_pick_path(project_fuzzy_path)?;
-        let project_folder_path = project_path.parent().unwrap();

         let project_name = if project_fuzzy_path == project_path {
             project_fuzzy_path.parent().unwrap().file_name().unwrap().to_str().unwrap()
         } else {
             project_fuzzy_path.file_name().unwrap().to_str().unwrap()
         };

-        let tree = ProjectNode {
-            class_name: Some(String::from("DataModel")),
-            children: hashmap! {
-                String::from("ReplicatedStorage") => ProjectNode {
-                    class_name: Some(String::from("ReplicatedStorage")),
-                    children: hashmap! {
-                        String::from("Source") => ProjectNode {
-                            path: Some(project_folder_path.join("src")),
-                            ..Default::default()
-                        },
-                    },
-                    ..Default::default()
-                },
-                String::from("HttpService") => ProjectNode {
-                    class_name: Some(String::from("HttpService")),
-                    properties: hashmap! {
-                        String::from("HttpEnabled") => RbxValue::Bool {
-                            value: true,
-                        }.into(),
-                    },
-                    ..Default::default()
-                },
-            },
-            ..Default::default()
-        };
-
-        let project = Project {
-            name: project_name.to_string(),
-            tree,
-            plugins: Vec::new(),
-            serve_port: None,
-            serve_place_ids: None,
-            file_location: project_path.clone(),
-        };
+        let mut project = Project::load_from_str(DEFAULT_PLACE, &project_path)
+            .map_err(ProjectInitError::JsonError)?;
+
+        project.name = project_name.to_owned();

         project.save()
             .map_err(ProjectInitError::SaveError)?;
@@ -387,6 +433,12 @@
         }
     }

+    fn load_from_str(contents: &str, project_file_location: &Path) -> Result<Project, serde_json::Error> {
+        let parsed: SourceProject = serde_json::from_str(&contents)?;
+
+        Ok(parsed.into_project(project_file_location))
+    }
+
     pub fn load_fuzzy(fuzzy_project_location: &Path) -> Result<Project, ProjectLoadFuzzyError> {
         let project_path = Self::locate(fuzzy_project_location)
             .ok_or(ProjectLoadFuzzyError::NotFound)?;
@@ -434,6 +486,10 @@
         }
     }

+    pub fn folder_location(&self) -> &Path {
+        self.file_location.parent().unwrap()
+    }
+
     fn to_source_project(&self) -> SourceProject {
         let plugins = self.plugins
             .iter()


@@ -251,6 +251,10 @@ impl RbxSession {
         &self.tree
     }

+    pub fn get_all_instance_metadata(&self) -> &HashMap<RbxId, MetadataPerInstance> {
+        &self.metadata_per_instance
+    }
+
     pub fn get_instance_metadata(&self, id: RbxId) -> Option<&MetadataPerInstance> {
         self.metadata_per_instance.get(&id)
     }


@@ -64,7 +64,7 @@ impl InstanceChanges {
 /// A lightweight, hierarchical representation of an instance that can be
 /// applied to the tree.
-#[derive(Debug, Clone, PartialEq, Serialize, Deserialize)]
+#[derive(Debug, Clone, Default, PartialEq, Serialize, Deserialize)]
 pub struct RbxSnapshotInstance<'a> {
     pub name: Cow<'a, str>,
     pub class_name: Cow<'a, str>,
@@ -153,7 +153,7 @@ pub fn reify_subtree(
     instance_per_path: &mut PathMap<HashSet<RbxId>>,
     metadata_per_instance: &mut HashMap<RbxId, MetadataPerInstance>,
     changes: &mut InstanceChanges,
-) {
+) -> RbxId {
     let instance = reify_core(snapshot);
     let id = tree.insert_instance(instance, parent_id);
@@ -164,6 +164,8 @@
     for child in &snapshot.children {
         reify_subtree(child, tree, id, instance_per_path, metadata_per_instance, changes);
     }
+
+    id
 }

 fn reify_metadata(
@@ -222,6 +224,9 @@ fn reify_core(snapshot: &RbxSnapshotInstance) -> RbxInstanceProperties {
     instance
 }

+/// Updates the given instance to match the properties defined on the snapshot.
+///
+/// Returns whether any changes were applied.
 fn reconcile_instance_properties(instance: &mut RbxInstanceProperties, snapshot: &RbxSnapshotInstance) -> bool {
     let mut has_diffs = false;
@@ -279,6 +284,8 @@ fn reconcile_instance_properties(instance: &mut RbxInstanceProperties, snapshot:
     has_diffs
 }

+/// Updates the children of the instance in the `RbxTree` to match the children
+/// of the `RbxSnapshotInstance`. Order will be updated to match.
 fn reconcile_instance_children(
     tree: &mut RbxTree,
     id: RbxId,
@@ -287,12 +294,21 @@ fn reconcile_instance_children(
     metadata_per_instance: &mut HashMap<RbxId, MetadataPerInstance>,
     changes: &mut InstanceChanges,
 ) {
-    let mut visited_snapshot_indices = HashSet::new();
-
-    let mut children_to_update: Vec<(RbxId, &RbxSnapshotInstance)> = Vec::new();
-    let mut children_to_add: Vec<&RbxSnapshotInstance> = Vec::new();
+    // These lists are kept so that we can apply all the changes we figure out
+    let mut children_to_maybe_update: Vec<(RbxId, &RbxSnapshotInstance)> = Vec::new();
+    let mut children_to_add: Vec<(usize, &RbxSnapshotInstance)> = Vec::new();
     let mut children_to_remove: Vec<RbxId> = Vec::new();

+    // This map is used once we're done mutating children to sort them according
+    // to the order specified in the snapshot. Without it, a snapshot with a new
+    // child prepended will cause the RbxTree instance to have out-of-order
+    // children and would make Rojo non-deterministic.
+    let mut ids_to_snapshot_indices = HashMap::new();
+
+    // Since we have to enumerate the children of both the RbxTree instance and
+    // our snapshot, we keep a set of the snapshot children we've seen.
+    let mut visited_snapshot_indices = vec![false; snapshot.children.len()];
+
     let children_ids = tree.get_instance(id).unwrap().get_children_ids();

     // Find all instances that were removed or updated, which we derive by
@@ -303,7 +319,7 @@
         // Locate a matching snapshot for this instance
         let mut matching_snapshot = None;
         for (snapshot_index, child_snapshot) in snapshot.children.iter().enumerate() {
-            if visited_snapshot_indices.contains(&snapshot_index) {
+            if visited_snapshot_indices[snapshot_index] {
                 continue;
             }
@@ -311,7 +327,8 @@
             // similar. This heuristic is similar to React's reconciliation
             // strategy.
             if child_snapshot.name == child_instance.name {
-                visited_snapshot_indices.insert(snapshot_index);
+                ids_to_snapshot_indices.insert(child_id, snapshot_index);
+                visited_snapshot_indices[snapshot_index] = true;
                 matching_snapshot = Some(child_snapshot);
                 break;
             }
@@ -319,26 +336,23 @@
         match matching_snapshot {
             Some(child_snapshot) => {
-                children_to_update.push((child_instance.get_id(), child_snapshot));
-            },
+                children_to_maybe_update.push((child_instance.get_id(), child_snapshot));
+            }
             None => {
                 children_to_remove.push(child_instance.get_id());
-            },
+            }
         }
     }

     // Find all instancs that were added, which is just the snapshots we didn't
     // match up to existing instances above.
     for (snapshot_index, child_snapshot) in snapshot.children.iter().enumerate() {
-        if !visited_snapshot_indices.contains(&snapshot_index) {
-            children_to_add.push(child_snapshot);
+        if !visited_snapshot_indices[snapshot_index] {
+            children_to_add.push((snapshot_index, child_snapshot));
         }
     }

-    for child_snapshot in &children_to_add {
-        reify_subtree(child_snapshot, tree, id, instance_per_path, metadata_per_instance, changes);
-    }
-
+    // Apply all of our removals we gathered from our diff
     for child_id in &children_to_remove {
         if let Some(subtree) = tree.remove_instance(*child_id) {
             for id in subtree.iter_all_ids() {
@@ -348,7 +362,18 @@
         }
     }

-    for (child_id, child_snapshot) in &children_to_update {
+    // Apply all of our children additions
+    for (snapshot_index, child_snapshot) in &children_to_add {
+        let id = reify_subtree(child_snapshot, tree, id, instance_per_path, metadata_per_instance, changes);
+        ids_to_snapshot_indices.insert(id, *snapshot_index);
+    }
+
+    // Apply any updates that might have updates
+    for (child_id, child_snapshot) in &children_to_maybe_update {
         reconcile_subtree(tree, *child_id, child_snapshot, instance_per_path, metadata_per_instance, changes);
     }
+
+    // Apply the sort mapping defined by ids_to_snapshot_indices above
+    let instance = tree.get_instance_mut(id).unwrap();
+    instance.sort_children_unstable_by_key(|id| ids_to_snapshot_indices.get(&id).unwrap());
 }


@@ -1,4 +1,5 @@
 use std::{
+    collections::HashMap,
     fmt,
     io::Write,
     path::Path,
@@ -6,12 +7,13 @@ use std::{
 };

 use log::warn;
-use rbx_dom_weak::RbxId;
+use rbx_dom_weak::{RbxTree, RbxId};

 use crate::{
     imfs::{Imfs, ImfsItem},
     rbx_session::RbxSession,
     web::api::PublicInstanceMetadata,
+    rbx_session::MetadataPerInstance,
 };

 static GRAPHVIZ_HEADER: &str = r#"
@@ -53,42 +55,59 @@
     Some(String::from_utf8(output.stdout).expect("Failed to parse stdout as UTF-8"))
 }

+pub struct VisualizeRbxTree<'a, 'b> {
+    pub tree: &'a RbxTree,
+    pub metadata: &'b HashMap<RbxId, MetadataPerInstance>,
+}
+
+impl<'a, 'b> fmt::Display for VisualizeRbxTree<'a, 'b> {
+    fn fmt(&self, output: &mut fmt::Formatter) -> fmt::Result {
+        writeln!(output, "{}", GRAPHVIZ_HEADER)?;
+
+        visualize_instance(&self.tree, self.tree.get_root_id(), &self.metadata, output)?;
+
+        writeln!(output, "}}")
+    }
+}
+
 /// A Display wrapper struct to visualize an RbxSession as SVG.
 pub struct VisualizeRbxSession<'a>(pub &'a RbxSession);

 impl<'a> fmt::Display for VisualizeRbxSession<'a> {
     fn fmt(&self, output: &mut fmt::Formatter) -> fmt::Result {
-        writeln!(output, "{}", GRAPHVIZ_HEADER)?;
-
-        visualize_rbx_node(self.0, self.0.get_tree().get_root_id(), output)?;
-
-        writeln!(output, "}}")?;
-
-        Ok(())
+        writeln!(output, "{}", VisualizeRbxTree {
+            tree: self.0.get_tree(),
+            metadata: self.0.get_all_instance_metadata(),
+        })
     }
 }

-fn visualize_rbx_node(session: &RbxSession, id: RbxId, output: &mut fmt::Formatter) -> fmt::Result {
-    let node = session.get_tree().get_instance(id).unwrap();
+fn visualize_instance(
+    tree: &RbxTree,
+    id: RbxId,
+    metadata: &HashMap<RbxId, MetadataPerInstance>,
+    output: &mut fmt::Formatter,
+) -> fmt::Result {
+    let instance = tree.get_instance(id).unwrap();

-    let mut node_label = format!("{}|{}|{}", node.name, node.class_name, id);
+    let mut instance_label = format!("{}|{}|{}", instance.name, instance.class_name, id);

-    if let Some(session_metadata) = session.get_instance_metadata(id) {
+    if let Some(session_metadata) = metadata.get(&id) {
         let metadata = PublicInstanceMetadata::from_session_metadata(session_metadata);
-        node_label.push('|');
-        node_label.push_str(&serde_json::to_string(&metadata).unwrap());
+        instance_label.push('|');
+        instance_label.push_str(&serde_json::to_string(&metadata).unwrap());
     }

-    node_label = node_label
+    instance_label = instance_label
         .replace("\"", "&quot;")
         .replace("{", "\\{")
         .replace("}", "\\}");

-    writeln!(output, "    \"{}\" [label=\"{}\"]", id, node_label)?;
+    writeln!(output, "    \"{}\" [label=\"{}\"]", id, instance_label)?;

-    for &child_id in node.get_children_ids() {
+    for &child_id in instance.get_children_ids() {
         writeln!(output, "    \"{}\" -> \"{}\"", id, child_id)?;
-        visualize_rbx_node(session, child_id, output)?;
+        visualize_instance(tree, child_id, metadata, output)?;
     }

     Ok(())


@@ -4,10 +4,14 @@
 use std::{
     borrow::Cow,
     collections::{HashMap, HashSet},
-    sync::{mpsc, Arc},
+    sync::Arc,
 };

-use futures::{future, Future};
+use futures::{
+    future::{self, IntoFuture},
+    Future,
+    sync::oneshot,
+};

 use hyper::{
     service::Service,
     header,
@@ -114,14 +118,16 @@ impl Service for ApiService {
     fn call(&mut self, request: hyper::Request<Self::ReqBody>) -> Self::Future {
         let response = match (request.method(), request.uri().path()) {
             (&Method::GET, "/api/rojo") => self.handle_api_rojo(),
-            (&Method::GET, path) if path.starts_with("/api/subscribe/") => self.handle_api_subscribe(request),
             (&Method::GET, path) if path.starts_with("/api/read/") => self.handle_api_read(request),
+            (&Method::GET, path) if path.starts_with("/api/subscribe/") => {
+                return self.handle_api_subscribe(request);
+            }
             _ => {
                 Response::builder()
                     .status(StatusCode::NOT_FOUND)
                     .body(Body::empty())
                     .unwrap()
-            },
+            }
         };

         Box::new(future::ok(response))
@@ -152,57 +158,41 @@
     /// Retrieve any messages past the given cursor index, and if
     /// there weren't any, subscribe to receive any new messages.
-    fn handle_api_subscribe(&self, request: Request<Body>) -> Response<Body> {
+    fn handle_api_subscribe(&self, request: Request<Body>) -> <ApiService as Service>::Future {
         let argument = &request.uri().path()["/api/subscribe/".len()..];
         let cursor: u32 = match argument.parse() {
             Ok(v) => v,
             Err(err) => {
-                return Response::builder()
+                return Box::new(future::ok(Response::builder()
                     .status(StatusCode::BAD_REQUEST)
                     .header(header::CONTENT_TYPE, "text/plain")
                     .body(Body::from(err.to_string()))
-                    .unwrap();
+                    .unwrap()));
             },
         };

         let message_queue = Arc::clone(&self.live_session.message_queue);
+        let session_id = self.live_session.session_id();

-        // Did the client miss any messages since the last subscribe?
-        {
-            let (new_cursor, new_messages) = message_queue.get_messages_since(cursor);
-
-            if !new_messages.is_empty() {
-                return response_json(&SubscribeResponse {
-                    session_id: self.live_session.session_id(),
-                    messages: Cow::Borrowed(&new_messages),
-                    message_cursor: new_cursor,
-                })
-            }
-        }
-
-        // TOOD: Switch to futures mpsc instead to not block this task
-        let (tx, rx) = mpsc::channel();
-        let sender_id = message_queue.subscribe(tx);
-
-        match rx.recv() {
-            Ok(_) => (),
-            Err(_) => return Response::builder()
-                .status(500)
-                .body(Body::from("error!"))
-                .unwrap(),
-        }
-
-        message_queue.unsubscribe(sender_id);
-
-        {
-            let (new_cursor, new_messages) = message_queue.get_messages_since(cursor);
-
-            return response_json(&SubscribeResponse {
-                session_id: self.live_session.session_id(),
-                messages: Cow::Owned(new_messages),
-                message_cursor: new_cursor,
-            })
-        }
+        let (tx, rx) = oneshot::channel();
+        message_queue.subscribe(cursor, tx);
+
+        let result = rx.into_future()
+            .and_then(move |(new_cursor, new_messages)| {
+                Box::new(future::ok(response_json(SubscribeResponse {
+                    session_id: session_id,
+                    messages: Cow::Owned(new_messages),
+                    message_cursor: new_cursor,
+                })))
+            })
+            .or_else(|e| {
+                Box::new(future::ok(Response::builder()
+                    .status(500)
+                    .body(Body::from(format!("Internal Error: {:?}", e)))
+                    .unwrap()))
+            });

+        Box::new(result)
     }

     fn handle_api_read(&self, request: Request<Body>) -> Response<Body> {


@@ -1,5 +1,5 @@
 use std::{
-    collections::{HashMap, HashSet},
+    collections::{HashMap, HashSet, BTreeSet},
     fs,
     path::PathBuf,
 };
@@ -80,7 +80,7 @@ fn base_tree() -> Result<(TempDir, Imfs, ExpectedImfs, TestResources), Error> {
     expected_roots.insert(root.path().to_path_buf());

     let root_item = {
-        let mut children = HashSet::new();
+        let mut children = BTreeSet::new();
         children.insert(foo_path.clone());
         children.insert(bar_path.clone());
@@ -91,7 +91,7 @@ fn base_tree() -> Result<(TempDir, Imfs, ExpectedImfs, TestResources), Error> {
     };

     let foo_item = {
-        let mut children = HashSet::new();
+        let mut children = BTreeSet::new();
         children.insert(baz_path.clone());

         ImfsItem::Directory(ImfsDirectory {
@@ -199,7 +199,7 @@ fn adding_folder() -> Result<(), Error> {
     }

     let folder_item = {
-        let mut children = HashSet::new();
+        let mut children = BTreeSet::new();
         children.insert(file1_path.clone());
         children.insert(file2_path.clone());


@@ -1,7 +1,7 @@
 #[macro_use] extern crate lazy_static;

 use std::{
-    collections::HashMap,
+    collections::{HashMap, BTreeMap},
     path::{Path, PathBuf},
 };
@@ -53,7 +53,7 @@ fn single_partition_game() {
         ..Default::default()
     };

-    let mut replicated_storage_children = HashMap::new();
+    let mut replicated_storage_children = BTreeMap::new();
     replicated_storage_children.insert("Foo".to_string(), foo);

     let replicated_storage = ProjectNode {
@@ -73,7 +73,7 @@ fn single_partition_game() {
         ..Default::default()
     };

-    let mut root_children = HashMap::new();
+    let mut root_children = BTreeMap::new();
     root_children.insert("ReplicatedStorage".to_string(), replicated_storage);
     root_children.insert("HttpService".to_string(), http_service);


@@ -0,0 +1,112 @@
mod test_util;
use std::collections::HashMap;
use pretty_assertions::assert_eq;
use rbx_dom_weak::{RbxTree, RbxInstanceProperties};
use librojo::{
snapshot_reconciler::{RbxSnapshotInstance, reconcile_subtree},
};
use test_util::tree::trees_equal;
#[test]
fn patch_communicativity() {
let base_tree = RbxTree::new(RbxInstanceProperties {
name: "DataModel".into(),
class_name: "DataModel".into(),
properties: HashMap::new(),
});
let patch_a = RbxSnapshotInstance {
name: "DataModel".into(),
class_name: "DataModel".into(),
children: vec![
RbxSnapshotInstance {
name: "Child-A".into(),
class_name: "Folder".into(),
..Default::default()
},
],
..Default::default()
};
let patch_b = RbxSnapshotInstance {
name: "DataModel".into(),
class_name: "DataModel".into(),
children: vec![
RbxSnapshotInstance {
name: "Child-B".into(),
class_name: "Folder".into(),
..Default::default()
},
],
..Default::default()
};
let patch_combined = RbxSnapshotInstance {
name: "DataModel".into(),
class_name: "DataModel".into(),
children: vec![
RbxSnapshotInstance {
name: "Child-A".into(),
class_name: "Folder".into(),
..Default::default()
},
RbxSnapshotInstance {
name: "Child-B".into(),
class_name: "Folder".into(),
..Default::default()
},
],
..Default::default()
};
let root_id = base_tree.get_root_id();
let mut tree_a = base_tree.clone();
reconcile_subtree(
&mut tree_a,
root_id,
&patch_a,
&mut Default::default(),
&mut Default::default(),
&mut Default::default(),
);
reconcile_subtree(
&mut tree_a,
root_id,
&patch_combined,
&mut Default::default(),
&mut Default::default(),
&mut Default::default(),
);
let mut tree_b = base_tree.clone();
reconcile_subtree(
&mut tree_b,
root_id,
&patch_b,
&mut Default::default(),
&mut Default::default(),
&mut Default::default(),
);
reconcile_subtree(
&mut tree_b,
root_id,
&patch_combined,
&mut Default::default(),
&mut Default::default(),
&mut Default::default(),
);
match trees_equal(&tree_a, &tree_b) {
Ok(_) => {}
Err(e) => panic!("{}", e),
}
}
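The test above checks that snapshot patches commute: applying `patch_a` then `patch_combined` must produce the same tree as applying `patch_b` then `patch_combined`. A minimal sketch of why sorted child storage makes this hold, using a hypothetical toy reconciler (plain `BTreeMap`, not Rojo's actual API):

```rust
use std::collections::BTreeMap;

// Toy stand-in for a tree: children keyed by name. BTreeMap keeps keys
// sorted, so insertion order can't leak into the final state.
type Tree = BTreeMap<String, String>;

// Apply a patch by upserting every child it mentions.
fn apply(tree: &mut Tree, patch: &[(&str, &str)]) {
    for (name, class) in patch {
        tree.insert(name.to_string(), class.to_string());
    }
}

fn main() {
    let patch_a = [("Child-A", "Folder")];
    let patch_b = [("Child-B", "Folder")];
    let combined = [("Child-A", "Folder"), ("Child-B", "Folder")];

    let mut tree_a = Tree::new();
    apply(&mut tree_a, &patch_a);
    apply(&mut tree_a, &combined);

    let mut tree_b = Tree::new();
    apply(&mut tree_b, &patch_b);
    apply(&mut tree_b, &combined);

    // Both orders converge to the same sorted state.
    assert_eq!(tree_a, tree_b);
}
```

With unsorted children (e.g. a `Vec` in filesystem-enumeration order), the two orders could produce differently ordered trees, which is exactly what the children sorting in the snapshot reconciler guards against.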

View File

@@ -0,0 +1,68 @@
mod test_util;
use std::path::Path;
use pretty_assertions::assert_eq;
use librojo::{
imfs::Imfs,
project::Project,
rbx_snapshot::{SnapshotContext, snapshot_project_tree},
};
use crate::test_util::{
snapshot::*,
};
macro_rules! generate_snapshot_tests {
($($name: ident),*) => {
$(
paste::item! {
#[test]
fn [<snapshot_ $name>]() {
let _ = env_logger::try_init();
let tests_folder = Path::new(env!("CARGO_MANIFEST_DIR")).join("../test-projects");
let project_folder = tests_folder.join(stringify!($name));
run_snapshot_test(&project_folder);
}
}
)*
};
}
generate_snapshot_tests!(
empty,
multi_partition_game,
nested_partitions,
single_partition_game,
single_partition_model,
transmute_partition
);
fn run_snapshot_test(path: &Path) {
println!("Running snapshot from project: {}", path.display());
let project = Project::load_fuzzy(path)
.expect("Couldn't load project file for snapshot test");
let mut imfs = Imfs::new();
imfs.add_roots_from_project(&project)
.expect("Could not add IMFS roots to snapshot project");
let context = SnapshotContext {
plugin_context: None,
};
let mut snapshot = snapshot_project_tree(&context, &imfs, &project)
.expect("Could not generate snapshot for snapshot test");
if let Some(snapshot) = snapshot.as_mut() {
anonymize_snapshot(path, snapshot);
}
match read_expected_snapshot(path) {
Some(expected_snapshot) => assert_eq!(snapshot, expected_snapshot),
None => write_expected_snapshot(path, &snapshot),
}
}
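The harness above follows a golden-file pattern: the first run writes the expected snapshot to disk, and later runs deserialize and compare against it. A std-only sketch of that read-or-write flow (the file name and directory here are hypothetical, and plain strings stand in for serialized snapshots):

```rust
use std::fs;
use std::path::Path;

// Compare `actual` against a stored golden file; create it on first run.
// Returns true when the stored value matched (or was just recorded).
fn check_golden(dir: &Path, actual: &str) -> bool {
    let golden = dir.join("expected.txt");
    match fs::read_to_string(&golden) {
        Ok(expected) => expected == actual,
        Err(_) => {
            // First run: record the current output as the expectation.
            fs::write(&golden, actual).expect("could not write golden file");
            true
        }
    }
}

fn main() {
    let dir = std::env::temp_dir().join("golden-demo");
    fs::create_dir_all(&dir).unwrap();
    let _ = fs::remove_file(dir.join("expected.txt"));
    assert!(check_golden(&dir, "snapshot-v1")); // first run writes the file
    assert!(check_golden(&dir, "snapshot-v1")); // second run matches
    assert!(!check_golden(&dir, "snapshot-v2")); // drift is detected
}
```

The trade-off of this pattern is that a missing golden file silently passes; deleting `expected-snapshot.json` is how you intentionally re-baseline a test.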

View File

@@ -1,128 +0,0 @@
use std::{
fs::{self, File},
path::{Path, PathBuf},
};
use pretty_assertions::assert_eq;
use librojo::{
imfs::Imfs,
project::{Project, ProjectNode},
rbx_snapshot::{SnapshotContext, snapshot_project_tree},
snapshot_reconciler::{RbxSnapshotInstance},
};
macro_rules! generate_snapshot_tests {
($($name: ident),*) => {
$(
paste::item! {
#[test]
fn [<snapshot_ $name>]() {
let tests_folder = Path::new(env!("CARGO_MANIFEST_DIR")).join("../test-projects");
let project_folder = tests_folder.join(stringify!($name));
run_snapshot_test(&project_folder);
}
}
)*
};
}
generate_snapshot_tests!(
empty,
nested_partitions,
single_partition_game,
single_partition_model,
transmute_partition
);
const SNAPSHOT_EXPECTED_NAME: &str = "expected-snapshot.json";
fn run_snapshot_test(path: &Path) {
println!("Running snapshot from project: {}", path.display());
let project = Project::load_fuzzy(path)
.expect("Couldn't load project file for snapshot test");
let mut imfs = Imfs::new();
imfs.add_roots_from_project(&project)
.expect("Could not add IMFS roots to snapshot project");
let context = SnapshotContext {
plugin_context: None,
};
let mut snapshot = snapshot_project_tree(&context, &imfs, &project)
.expect("Could not generate snapshot for snapshot test");
if let Some(snapshot) = snapshot.as_mut() {
anonymize_snapshot(path, snapshot);
}
match read_expected_snapshot(path) {
Some(expected_snapshot) => assert_eq!(snapshot, expected_snapshot),
None => write_expected_snapshot(path, &snapshot),
}
}
/// Snapshots contain absolute paths, which simplifies much of Rojo.
///
/// For saving snapshots to the disk, we should strip off the project folder
/// path to make them machine-independent. This doesn't work for paths that fall
/// outside of the project folder, but that's okay here.
///
/// We also need to sort children, since Rojo tends to enumerate the filesystem
/// in an unpredictable order.
fn anonymize_snapshot(project_folder_path: &Path, snapshot: &mut RbxSnapshotInstance) {
match snapshot.metadata.source_path.as_mut() {
Some(path) => *path = anonymize_path(project_folder_path, path),
None => {},
}
match snapshot.metadata.project_definition.as_mut() {
Some((_, project_node)) => anonymize_project_node(project_folder_path, project_node),
None => {},
}
snapshot.children.sort_by(|a, b| a.partial_cmp(b).unwrap());
for child in snapshot.children.iter_mut() {
anonymize_snapshot(project_folder_path, child);
}
}
fn anonymize_project_node(project_folder_path: &Path, project_node: &mut ProjectNode) {
match project_node.path.as_mut() {
Some(path) => *path = anonymize_path(project_folder_path, path),
None => {},
}
for child_node in project_node.children.values_mut() {
anonymize_project_node(project_folder_path, child_node);
}
}
fn anonymize_path(project_folder_path: &Path, path: &Path) -> PathBuf {
if path.is_absolute() {
path.strip_prefix(project_folder_path)
.expect("Could not anonymize absolute path")
.to_path_buf()
} else {
path.to_path_buf()
}
}
fn read_expected_snapshot(path: &Path) -> Option<Option<RbxSnapshotInstance<'static>>> {
let contents = fs::read(path.join(SNAPSHOT_EXPECTED_NAME)).ok()?;
let snapshot: Option<RbxSnapshotInstance<'static>> = serde_json::from_slice(&contents)
.expect("Could not deserialize snapshot");
Some(snapshot)
}
fn write_expected_snapshot(path: &Path, snapshot: &Option<RbxSnapshotInstance>) {
let mut file = File::create(path.join(SNAPSHOT_EXPECTED_NAME))
.expect("Could not open file to write snapshot");
serde_json::to_writer_pretty(&mut file, snapshot)
.expect("Could not serialize snapshot to file");
}

View File

@@ -1,31 +1,13 @@
#![allow(dead_code)]
use std::fs::{create_dir, copy};
use std::path::Path;
use std::io;
use rouille::Request;
use walkdir::WalkDir; use walkdir::WalkDir;
use librojo::web::Server;
pub mod snapshot;
pub mod tree;
pub trait HttpTestUtil {
fn get_string(&self, url: &str) -> String;
}
impl HttpTestUtil for Server {
fn get_string(&self, url: &str) -> String {
let info_request = Request::fake_http("GET", url, vec![], vec![]);
let response = self.handle_request(&info_request);
assert_eq!(response.status_code, 200);
let (mut reader, _) = response.data.into_reader_and_size();
let mut body = String::new();
reader.read_to_string(&mut body).unwrap();
body
}
}
pub fn copy_recursive(from: &Path, to: &Path) -> io::Result<()> {
for entry in WalkDir::new(from) {
@@ -51,4 +33,4 @@ pub fn copy_recursive(from: &Path, to: &Path) -> io::Result<()> {
}
Ok(())
}

View File

@@ -0,0 +1,79 @@
use std::{
fs::{self, File},
path::{Path, PathBuf},
};
use librojo::{
project::ProjectNode,
snapshot_reconciler::RbxSnapshotInstance,
rbx_session::MetadataPerInstance,
};
const SNAPSHOT_EXPECTED_NAME: &str = "expected-snapshot.json";
/// Snapshots contain absolute paths, which simplifies much of Rojo.
///
/// For saving snapshots to the disk, we should strip off the project folder
/// path to make them machine-independent. This doesn't work for paths that fall
/// outside of the project folder, but that's okay here.
///
/// We also need to sort children, since Rojo tends to enumerate the filesystem
/// in an unpredictable order.
pub fn anonymize_snapshot(project_folder_path: &Path, snapshot: &mut RbxSnapshotInstance) {
anonymize_metadata(project_folder_path, &mut snapshot.metadata);
snapshot.children.sort_by(|a, b| a.partial_cmp(b).unwrap());
for child in snapshot.children.iter_mut() {
anonymize_snapshot(project_folder_path, child);
}
}
pub fn anonymize_metadata(project_folder_path: &Path, metadata: &mut MetadataPerInstance) {
match metadata.source_path.as_mut() {
Some(path) => *path = anonymize_path(project_folder_path, path),
None => {},
}
match metadata.project_definition.as_mut() {
Some((_, project_node)) => anonymize_project_node(project_folder_path, project_node),
None => {},
}
}
pub fn anonymize_project_node(project_folder_path: &Path, project_node: &mut ProjectNode) {
match project_node.path.as_mut() {
Some(path) => *path = anonymize_path(project_folder_path, path),
None => {},
}
for child_node in project_node.children.values_mut() {
anonymize_project_node(project_folder_path, child_node);
}
}
pub fn anonymize_path(project_folder_path: &Path, path: &Path) -> PathBuf {
if path.is_absolute() {
path.strip_prefix(project_folder_path)
.expect("Could not anonymize absolute path")
.to_path_buf()
} else {
path.to_path_buf()
}
}
pub fn read_expected_snapshot(path: &Path) -> Option<Option<RbxSnapshotInstance<'static>>> {
let contents = fs::read(path.join(SNAPSHOT_EXPECTED_NAME)).ok()?;
let snapshot: Option<RbxSnapshotInstance<'static>> = serde_json::from_slice(&contents)
.expect("Could not deserialize snapshot");
Some(snapshot)
}
pub fn write_expected_snapshot(path: &Path, snapshot: &Option<RbxSnapshotInstance>) {
let mut file = File::create(path.join(SNAPSHOT_EXPECTED_NAME))
.expect("Could not open file to write snapshot");
serde_json::to_writer_pretty(&mut file, snapshot)
.expect("Could not serialize snapshot to file");
}
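`anonymize_path` above relies on `Path::strip_prefix` to make absolute paths machine-independent, while relative paths pass through untouched. A small standalone demonstration of that rule (assuming Unix-style paths):

```rust
use std::path::{Path, PathBuf};

// Mirror of the anonymization rule: strip the project folder prefix from
// absolute paths, leave relative paths alone.
fn anonymize(project: &Path, path: &Path) -> PathBuf {
    if path.is_absolute() {
        path.strip_prefix(project)
            .expect("path was not under the project folder")
            .to_path_buf()
    } else {
        path.to_path_buf()
    }
}

fn main() {
    let project = Path::new("/home/user/project");
    assert_eq!(
        anonymize(project, Path::new("/home/user/project/a/foo.txt")),
        PathBuf::from("a/foo.txt")
    );
    // Relative paths are already machine-independent.
    assert_eq!(
        anonymize(project, Path::new("b/bar.lua")),
        PathBuf::from("b/bar.lua")
    );
}
```

Note that `strip_prefix` fails for paths outside the project folder, which is why the real helper panics in that case rather than guessing.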

View File

@@ -0,0 +1,351 @@
//! Defines a mechanism to compare two RbxTree objects and generate a useful
//! diff if they aren't the same. These methods ignore IDs, which are randomly
//! generated whenever a tree is constructed anyway; that makes matching up
//! pairs of instances that should be the same potentially difficult.
//!
//! It relies on a couple of different ideas:
//! - Instances with the same name and class name are matched as the same
//! instance. See basic_equal for this logic
//! - A path of period-delimited names (like Roblox's GetFullName) should be
//! enough to debug most issues. If it isn't, we can do something fun like
//! generate GraphViz graphs.
use std::{
borrow::Cow,
collections::{HashMap, HashSet},
fmt,
fs::{self, File},
hash::Hash,
path::{Path, PathBuf},
};
use log::error;
use serde_derive::{Serialize, Deserialize};
use rbx_dom_weak::{RbxId, RbxTree};
use librojo::{
rbx_session::MetadataPerInstance,
live_session::LiveSession,
visualize::{VisualizeRbxTree, graphviz_to_svg},
};
use super::snapshot::anonymize_metadata;
/// Marks a 'step' in the test, which will snapshot the session's current
/// RbxTree object and compare it against the saved snapshot if it exists.
pub fn tree_step(step: &str, live_session: &LiveSession, source_path: &Path) {
let rbx_session = live_session.rbx_session.lock().unwrap();
let tree = rbx_session.get_tree();
let project_folder = live_session.root_project().folder_location();
let metadata = rbx_session.get_all_instance_metadata()
.iter()
.map(|(key, meta)| {
let mut meta = meta.clone();
anonymize_metadata(project_folder, &mut meta);
(*key, meta)
})
.collect();
let tree_with_metadata = TreeWithMetadata {
tree: Cow::Borrowed(&tree),
metadata: Cow::Owned(metadata),
};
match read_tree_by_name(source_path, step) {
Some(expected) => match trees_and_metadata_equal(&expected, &tree_with_metadata) {
Ok(_) => {}
Err(e) => {
error!("Trees at step '{}' were not equal.\n{}", step, e);
let expected_gv = format!("{}", VisualizeRbxTree {
tree: &expected.tree,
metadata: &expected.metadata,
});
let actual_gv = format!("{}", VisualizeRbxTree {
tree: &tree_with_metadata.tree,
metadata: &tree_with_metadata.metadata,
});
let output_dir = PathBuf::from("failed-snapshots");
fs::create_dir_all(&output_dir)
.expect("Could not create failed-snapshots directory");
let expected_basename = format!("{}-{}-expected", live_session.root_project().name, step);
let actual_basename = format!("{}-{}-actual", live_session.root_project().name, step);
let mut expected_out = output_dir.join(expected_basename);
let mut actual_out = output_dir.join(actual_basename);
match (graphviz_to_svg(&expected_gv), graphviz_to_svg(&actual_gv)) {
(Some(expected_svg), Some(actual_svg)) => {
expected_out.set_extension("svg");
actual_out.set_extension("svg");
fs::write(&expected_out, expected_svg)
.expect("Couldn't write expected SVG");
fs::write(&actual_out, actual_svg)
.expect("Couldn't write actual SVG");
}
_ => {
expected_out.set_extension("gv");
actual_out.set_extension("gv");
fs::write(&expected_out, expected_gv)
.expect("Couldn't write expected GV");
fs::write(&actual_out, actual_gv)
.expect("Couldn't write actual GV");
}
}
error!("Output at {} and {}", expected_out.display(), actual_out.display());
panic!("Tree mismatch at step '{}'", step);
}
}
None => {
write_tree_by_name(source_path, step, &tree_with_metadata);
}
}
}
fn new_cow_map<K: Clone + Eq + Hash, V: Clone>() -> Cow<'static, HashMap<K, V>> {
Cow::Owned(HashMap::new())
}
#[derive(Debug, Serialize, Deserialize)]
struct TreeWithMetadata<'a> {
#[serde(flatten)]
pub tree: Cow<'a, RbxTree>,
#[serde(default = "new_cow_map")]
pub metadata: Cow<'a, HashMap<RbxId, MetadataPerInstance>>,
}
fn read_tree_by_name(path: &Path, identifier: &str) -> Option<TreeWithMetadata<'static>> {
let mut file_path = path.join(identifier);
file_path.set_extension("tree.json");
let contents = fs::read(&file_path).ok()?;
let tree: TreeWithMetadata = serde_json::from_slice(&contents)
.expect("Could not deserialize tree");
Some(tree)
}
fn write_tree_by_name(path: &Path, identifier: &str, tree: &TreeWithMetadata) {
let mut file_path = path.join(identifier);
file_path.set_extension("tree.json");
let mut file = File::create(file_path)
.expect("Could not open file to write tree");
serde_json::to_writer_pretty(&mut file, tree)
.expect("Could not serialize tree to file");
}
#[derive(Debug)]
pub struct TreeMismatch {
pub path: Cow<'static, str>,
pub detail: Cow<'static, str>,
}
impl TreeMismatch {
pub fn new<'a, A: Into<Cow<'a, str>>, B: Into<Cow<'a, str>>>(path: A, detail: B) -> TreeMismatch {
TreeMismatch {
path: Cow::Owned(path.into().into_owned()),
detail: Cow::Owned(detail.into().into_owned()),
}
}
fn add_parent(mut self, name: &str) -> TreeMismatch {
self.path.to_mut().insert(0, '.');
self.path.to_mut().insert_str(0, name);
TreeMismatch {
path: self.path,
detail: self.detail,
}
}
}
impl fmt::Display for TreeMismatch {
fn fmt(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
writeln!(formatter, "Tree mismatch at path {}", self.path)?;
writeln!(formatter, "{}", self.detail)
}
}
pub fn trees_equal(
left_tree: &RbxTree,
right_tree: &RbxTree,
) -> Result<(), TreeMismatch> {
let left = TreeWithMetadata {
tree: Cow::Borrowed(left_tree),
metadata: Cow::Owned(HashMap::new()),
};
let right = TreeWithMetadata {
tree: Cow::Borrowed(right_tree),
metadata: Cow::Owned(HashMap::new()),
};
trees_and_metadata_equal(&left, &right)
}
fn trees_and_metadata_equal(
left_tree: &TreeWithMetadata,
right_tree: &TreeWithMetadata,
) -> Result<(), TreeMismatch> {
let left_id = left_tree.tree.get_root_id();
let right_id = right_tree.tree.get_root_id();
instances_equal(left_tree, left_id, right_tree, right_id)
}
fn instances_equal(
left_tree: &TreeWithMetadata,
left_id: RbxId,
right_tree: &TreeWithMetadata,
right_id: RbxId,
) -> Result<(), TreeMismatch> {
basic_equal(left_tree, left_id, right_tree, right_id)?;
properties_equal(left_tree, left_id, right_tree, right_id)?;
children_equal(left_tree, left_id, right_tree, right_id)?;
metadata_equal(left_tree, left_id, right_tree, right_id)
}
fn basic_equal(
left_tree: &TreeWithMetadata,
left_id: RbxId,
right_tree: &TreeWithMetadata,
right_id: RbxId,
) -> Result<(), TreeMismatch> {
let left_instance = left_tree.tree.get_instance(left_id)
.expect("ID did not exist in left tree");
let right_instance = right_tree.tree.get_instance(right_id)
.expect("ID did not exist in right tree");
if left_instance.name != right_instance.name {
let message = format!("Name did not match ('{}' vs '{}')", left_instance.name, right_instance.name);
return Err(TreeMismatch::new(&left_instance.name, message));
}
if left_instance.class_name != right_instance.class_name {
let message = format!("Class name did not match ('{}' vs '{}')", left_instance.class_name, right_instance.class_name);
return Err(TreeMismatch::new(&left_instance.name, message));
}
Ok(())
}
fn properties_equal(
left_tree: &TreeWithMetadata,
left_id: RbxId,
right_tree: &TreeWithMetadata,
right_id: RbxId,
) -> Result<(), TreeMismatch> {
let left_instance = left_tree.tree.get_instance(left_id)
.expect("ID did not exist in left tree");
let right_instance = right_tree.tree.get_instance(right_id)
.expect("ID did not exist in right tree");
let mut visited = HashSet::new();
for (key, left_value) in &left_instance.properties {
visited.insert(key);
let right_value = right_instance.properties.get(key);
if Some(left_value) != right_value {
let message = format!(
"Property {}:\n\tLeft: {:?}\n\tRight: {:?}",
key,
Some(left_value),
right_value,
);
return Err(TreeMismatch::new(&left_instance.name, message));
}
}
for (key, right_value) in &right_instance.properties {
if visited.contains(key) {
continue;
}
let left_value = left_instance.properties.get(key);
if left_value != Some(right_value) {
let message = format!(
"Property {}:\n\tLeft: {:?}\n\tRight: {:?}",
key,
left_value,
Some(right_value),
);
return Err(TreeMismatch::new(&left_instance.name, message));
}
}
Ok(())
}
fn children_equal(
left_tree: &TreeWithMetadata,
left_id: RbxId,
right_tree: &TreeWithMetadata,
right_id: RbxId,
) -> Result<(), TreeMismatch> {
let left_instance = left_tree.tree.get_instance(left_id)
.expect("ID did not exist in left tree");
let right_instance = right_tree.tree.get_instance(right_id)
.expect("ID did not exist in right tree");
let left_children = left_instance.get_children_ids();
let right_children = right_instance.get_children_ids();
if left_children.len() != right_children.len() {
return Err(TreeMismatch::new(&left_instance.name, "Instances had different numbers of children"));
}
for (left_child_id, right_child_id) in left_children.iter().zip(right_children) {
instances_equal(left_tree, *left_child_id, right_tree, *right_child_id)
.map_err(|e| e.add_parent(&left_instance.name))?;
}
Ok(())
}
fn metadata_equal(
left_tree: &TreeWithMetadata,
left_id: RbxId,
right_tree: &TreeWithMetadata,
right_id: RbxId,
) -> Result<(), TreeMismatch> {
let left_meta = left_tree.metadata.get(&left_id);
let right_meta = right_tree.metadata.get(&right_id);
if left_meta != right_meta {
let left_instance = left_tree.tree.get_instance(left_id)
.expect("Left instance didn't exist in tree");
let message = format!(
"Metadata mismatch:\n\tLeft: {:?}\n\tRight: {:?}",
left_meta,
right_meta,
);
return Err(TreeMismatch::new(&left_instance.name, message));
}
Ok(())
}
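`TreeMismatch::add_parent` builds the dotted, `GetFullName`-style path bottom-up as errors propagate out of the recursion: each level prepends its own name before re-raising. A toy version of that pattern on a plain recursive structure (hypothetical types, not Rojo's):

```rust
// Toy node: a name plus children, enough to show path-building on mismatch.
struct Node {
    name: String,
    children: Vec<Node>,
}

fn node(name: &str, children: Vec<Node>) -> Node {
    Node { name: name.to_string(), children }
}

// Compare two nodes; on failure, return the dotted path to the mismatch.
fn nodes_equal(left: &Node, right: &Node) -> Result<(), String> {
    if left.name != right.name {
        return Err(format!("{} (name mismatch vs {})", left.name, right.name));
    }
    if left.children.len() != right.children.len() {
        return Err(format!("{} (child count mismatch)", left.name));
    }
    for (l, r) in left.children.iter().zip(&right.children) {
        // Prepend the parent's name as the error bubbles up, like add_parent.
        nodes_equal(l, r).map_err(|e| format!("{}.{}", left.name, e))?;
    }
    Ok(())
}

fn main() {
    let a = node("DataModel", vec![node("RS", vec![node("Child-A", vec![])])]);
    let b = node("DataModel", vec![node("RS", vec![node("Child-B", vec![])])]);
    let err = nodes_equal(&a, &b).unwrap_err();
    // The error carries the full dotted path down to the mismatch.
    assert!(err.starts_with("DataModel.RS.Child-A"));
}
```

Positional `zip` over children is why stable child ordering matters: comparing by index only makes sense once both trees enumerate children in the same order.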

View File

@@ -0,0 +1,68 @@
mod test_util;
use std::{
fs,
path::{Path, PathBuf},
sync::Arc,
thread,
time::Duration,
};
use tempfile::{tempdir, TempDir};
use librojo::{
live_session::LiveSession,
project::Project,
};
use crate::test_util::{
copy_recursive,
tree::tree_step,
};
#[test]
fn multi_partition_game() {
let _ = env_logger::try_init();
let source_path = project_path("multi_partition_game");
let (dir, live_session) = start_session(&source_path);
tree_step("initial", &live_session, &source_path);
let added_path = dir.path().join("a/added");
fs::create_dir_all(&added_path)
.expect("Couldn't create directory");
thread::sleep(Duration::from_millis(250));
tree_step("with_dir", &live_session, &source_path);
let moved_path = dir.path().join("b/added");
fs::rename(&added_path, &moved_path)
.expect("Couldn't rename directory");
thread::sleep(Duration::from_millis(250));
tree_step("with_moved_dir", &live_session, &source_path);
}
/// Find the path to the given test project relative to the manifest.
fn project_path(name: &str) -> PathBuf {
let mut path = Path::new(env!("CARGO_MANIFEST_DIR")).join("../test-projects");
path.push(name);
path
}
/// Starts a new LiveSession for the project located at the given file path.
fn start_session(source_path: &Path) -> (TempDir, LiveSession) {
let dir = tempdir()
.expect("Couldn't create temporary directory");
copy_recursive(&source_path, dir.path())
.expect("Couldn't copy project to temporary directory");
let project = Arc::new(Project::load_fuzzy(dir.path())
.expect("Couldn't load project from temp directory"));
let live_session = LiveSession::new(Arc::clone(&project))
.expect("Couldn't start live session");
(dir, live_session)
}
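The test above sleeps a fixed 250 ms after each filesystem change so the watcher has time to apply it before the next `tree_step`. A common alternative is to poll a condition under a deadline, which is less flaky on slow machines and faster on quick ones; a sketch of such a helper (hypothetical, not part of Rojo):

```rust
use std::thread;
use std::time::{Duration, Instant};

// Poll a condition until it holds or the deadline passes.
// Returns whether the condition eventually held.
fn wait_until(timeout: Duration, mut cond: impl FnMut() -> bool) -> bool {
    let deadline = Instant::now() + timeout;
    while Instant::now() < deadline {
        if cond() {
            return true;
        }
        thread::sleep(Duration::from_millis(10));
    }
    cond()
}

fn main() {
    let start = Instant::now();
    // Condition that becomes true after roughly 50 ms.
    let ok = wait_until(Duration::from_millis(500), || {
        start.elapsed() >= Duration::from_millis(50)
    });
    assert!(ok);
}
```

For the session tests this would mean polling the session's tree for the expected child instead of sleeping, at the cost of needing a cheap way to probe the tree between steps.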

View File

@@ -0,0 +1 @@
Hello world, from a/foo.txt

View File

@@ -0,0 +1 @@
-- hello, from a/main.lua

View File

@@ -0,0 +1 @@
-- b/something.lua

View File

@@ -0,0 +1,21 @@
{
"name": "multi_partition_game",
"tree": {
"$className": "DataModel",
"ReplicatedStorage": {
"$className": "ReplicatedStorage",
"Ack": {
"$path": "a"
},
"Bar": {
"$path": "b"
}
},
"HttpService": {
"$className": "HttpService",
"$properties": {
"HttpEnabled": true
}
}
}
}

View File

@@ -0,0 +1,212 @@
{
"name": "multi_partition_game",
"class_name": "DataModel",
"properties": {},
"children": [
{
"name": "HttpService",
"class_name": "HttpService",
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"children": [],
"metadata": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"HttpService",
{
"class_name": "HttpService",
"children": {},
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"ignore_unknown_instances": null,
"path": null
}
]
}
},
{
"name": "ReplicatedStorage",
"class_name": "ReplicatedStorage",
"properties": {},
"children": [
{
"name": "Ack",
"class_name": "Folder",
"properties": {},
"children": [
{
"name": "foo",
"class_name": "StringValue",
"properties": {
"Value": {
"Type": "String",
"Value": "Hello world, from a/foo.txt"
}
},
"children": [],
"metadata": {
"ignore_unknown_instances": false,
"source_path": "a/foo.txt",
"project_definition": null
}
},
{
"name": "main",
"class_name": "ModuleScript",
"properties": {
"Source": {
"Type": "String",
"Value": "-- hello, from a/main.lua"
}
},
"children": [],
"metadata": {
"ignore_unknown_instances": false,
"source_path": "a/main.lua",
"project_definition": null
}
}
],
"metadata": {
"ignore_unknown_instances": false,
"source_path": "a",
"project_definition": [
"Ack",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
}
]
}
},
{
"name": "Bar",
"class_name": "Folder",
"properties": {},
"children": [
{
"name": "something",
"class_name": "ModuleScript",
"properties": {
"Source": {
"Type": "String",
"Value": "-- b/something.lua"
}
},
"children": [],
"metadata": {
"ignore_unknown_instances": false,
"source_path": "b/something.lua",
"project_definition": null
}
}
],
"metadata": {
"ignore_unknown_instances": false,
"source_path": "b",
"project_definition": [
"Bar",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
]
}
}
],
"metadata": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"ReplicatedStorage",
{
"class_name": "ReplicatedStorage",
"children": {
"Bar": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
},
"Ack": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
]
}
}
],
"metadata": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"multi_partition_game",
{
"class_name": "DataModel",
"children": {
"ReplicatedStorage": {
"class_name": "ReplicatedStorage",
"children": {
"Bar": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
},
"Ack": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
},
"HttpService": {
"class_name": "HttpService",
"children": {},
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"ignore_unknown_instances": null,
"path": null
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
]
}
}

View File

@@ -0,0 +1,242 @@
{
"instances": {
"00f207b1-fc18-4088-a45e-caf8cd98f5dd": {
"Name": "main",
"ClassName": "ModuleScript",
"Properties": {
"Source": {
"Type": "String",
"Value": "-- hello, from a/main.lua"
}
},
"Id": "00f207b1-fc18-4088-a45e-caf8cd98f5dd",
"Children": [],
"Parent": "14fed1a3-ba97-46a6-ae93-ac26bd9471df"
},
"14fed1a3-ba97-46a6-ae93-ac26bd9471df": {
"Name": "Ack",
"ClassName": "Folder",
"Properties": {},
"Id": "14fed1a3-ba97-46a6-ae93-ac26bd9471df",
"Children": [
"c55fd55c-258e-4a93-a63a-ea243038c9b9",
"00f207b1-fc18-4088-a45e-caf8cd98f5dd"
],
"Parent": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
},
"c910510c-37a8-4fd8-ae41-01169ccb739c": {
"Name": "Bar",
"ClassName": "Folder",
"Properties": {},
"Id": "c910510c-37a8-4fd8-ae41-01169ccb739c",
"Children": [
"71a95983-c856-4cf2-aee6-bd8a523e80e4"
],
"Parent": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
},
"c55fd55c-258e-4a93-a63a-ea243038c9b9": {
"Name": "foo",
"ClassName": "StringValue",
"Properties": {
"Value": {
"Type": "String",
"Value": "Hello world, from a/foo.txt"
}
},
"Id": "c55fd55c-258e-4a93-a63a-ea243038c9b9",
"Children": [],
"Parent": "14fed1a3-ba97-46a6-ae93-ac26bd9471df"
},
"71a95983-c856-4cf2-aee6-bd8a523e80e4": {
"Name": "something",
"ClassName": "ModuleScript",
"Properties": {
"Source": {
"Type": "String",
"Value": "-- b/something.lua"
}
},
"Id": "71a95983-c856-4cf2-aee6-bd8a523e80e4",
"Children": [],
"Parent": "c910510c-37a8-4fd8-ae41-01169ccb739c"
},
"3b5af13f-c997-4009-915c-0810b0e83032": {
"Name": "multi_partition_game",
"ClassName": "DataModel",
"Properties": {},
"Id": "3b5af13f-c997-4009-915c-0810b0e83032",
"Children": [
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c",
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
],
"Parent": null
},
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c": {
"Name": "HttpService",
"ClassName": "HttpService",
"Properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"Id": "bf8e2d4f-33a0-42a0-8168-1b62d6ac050c",
"Children": [],
"Parent": "3b5af13f-c997-4009-915c-0810b0e83032"
},
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b": {
"Name": "ReplicatedStorage",
"ClassName": "ReplicatedStorage",
"Properties": {},
"Id": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b",
"Children": [
"14fed1a3-ba97-46a6-ae93-ac26bd9471df",
"c910510c-37a8-4fd8-ae41-01169ccb739c"
],
"Parent": "3b5af13f-c997-4009-915c-0810b0e83032"
}
},
"root_id": "3b5af13f-c997-4009-915c-0810b0e83032",
"metadata": {
"00f207b1-fc18-4088-a45e-caf8cd98f5dd": {
"ignore_unknown_instances": false,
"source_path": "a/main.lua",
"project_definition": null
},
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"HttpService",
{
"class_name": "HttpService",
"children": {},
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"ignore_unknown_instances": null,
"path": null
}
]
},
"14fed1a3-ba97-46a6-ae93-ac26bd9471df": {
"ignore_unknown_instances": false,
"source_path": "a",
"project_definition": [
"Ack",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
}
]
},
"c55fd55c-258e-4a93-a63a-ea243038c9b9": {
"ignore_unknown_instances": false,
"source_path": "a/foo.txt",
"project_definition": null
},
"71a95983-c856-4cf2-aee6-bd8a523e80e4": {
"ignore_unknown_instances": false,
"source_path": "b/something.lua",
"project_definition": null
},
"c910510c-37a8-4fd8-ae41-01169ccb739c": {
"ignore_unknown_instances": false,
"source_path": "b",
"project_definition": [
"Bar",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
]
},
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"ReplicatedStorage",
{
"class_name": "ReplicatedStorage",
"children": {
"Ack": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
},
"Bar": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
]
},
"3b5af13f-c997-4009-915c-0810b0e83032": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"multi_partition_game",
{
"class_name": "DataModel",
"children": {
"HttpService": {
"class_name": "HttpService",
"children": {},
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"ignore_unknown_instances": null,
"path": null
},
"ReplicatedStorage": {
"class_name": "ReplicatedStorage",
"children": {
"Ack": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
},
"Bar": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
]
}
}
}

View File

@@ -0,0 +1,256 @@
{
"instances": {
"b48b369f-5706-4029-9fa6-90651a4910ea": {
"Name": "added",
"ClassName": "Folder",
"Properties": {},
"Id": "b48b369f-5706-4029-9fa6-90651a4910ea",
"Children": [],
"Parent": "14fed1a3-ba97-46a6-ae93-ac26bd9471df"
},
"00f207b1-fc18-4088-a45e-caf8cd98f5dd": {
"Name": "main",
"ClassName": "ModuleScript",
"Properties": {
"Source": {
"Type": "String",
"Value": "-- hello, from a/main.lua"
}
},
"Id": "00f207b1-fc18-4088-a45e-caf8cd98f5dd",
"Children": [],
"Parent": "14fed1a3-ba97-46a6-ae93-ac26bd9471df"
},
"14fed1a3-ba97-46a6-ae93-ac26bd9471df": {
"Name": "Ack",
"ClassName": "Folder",
"Properties": {},
"Id": "14fed1a3-ba97-46a6-ae93-ac26bd9471df",
"Children": [
"b48b369f-5706-4029-9fa6-90651a4910ea",
"c55fd55c-258e-4a93-a63a-ea243038c9b9",
"00f207b1-fc18-4088-a45e-caf8cd98f5dd"
],
"Parent": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
},
"c910510c-37a8-4fd8-ae41-01169ccb739c": {
"Name": "Bar",
"ClassName": "Folder",
"Properties": {},
"Id": "c910510c-37a8-4fd8-ae41-01169ccb739c",
"Children": [
"71a95983-c856-4cf2-aee6-bd8a523e80e4"
],
"Parent": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
},
"c55fd55c-258e-4a93-a63a-ea243038c9b9": {
"Name": "foo",
"ClassName": "StringValue",
"Properties": {
"Value": {
"Type": "String",
"Value": "Hello world, from a/foo.txt"
}
},
"Id": "c55fd55c-258e-4a93-a63a-ea243038c9b9",
"Children": [],
"Parent": "14fed1a3-ba97-46a6-ae93-ac26bd9471df"
},
"71a95983-c856-4cf2-aee6-bd8a523e80e4": {
"Name": "something",
"ClassName": "ModuleScript",
"Properties": {
"Source": {
"Type": "String",
"Value": "-- b/something.lua"
}
},
"Id": "71a95983-c856-4cf2-aee6-bd8a523e80e4",
"Children": [],
"Parent": "c910510c-37a8-4fd8-ae41-01169ccb739c"
},
"3b5af13f-c997-4009-915c-0810b0e83032": {
"Name": "multi_partition_game",
"ClassName": "DataModel",
"Properties": {},
"Id": "3b5af13f-c997-4009-915c-0810b0e83032",
"Children": [
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c",
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
],
"Parent": null
},
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c": {
"Name": "HttpService",
"ClassName": "HttpService",
"Properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"Id": "bf8e2d4f-33a0-42a0-8168-1b62d6ac050c",
"Children": [],
"Parent": "3b5af13f-c997-4009-915c-0810b0e83032"
},
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b": {
"Name": "ReplicatedStorage",
"ClassName": "ReplicatedStorage",
"Properties": {},
"Id": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b",
"Children": [
"14fed1a3-ba97-46a6-ae93-ac26bd9471df",
"c910510c-37a8-4fd8-ae41-01169ccb739c"
],
"Parent": "3b5af13f-c997-4009-915c-0810b0e83032"
}
},
"root_id": "3b5af13f-c997-4009-915c-0810b0e83032",
"metadata": {
"c55fd55c-258e-4a93-a63a-ea243038c9b9": {
"ignore_unknown_instances": false,
"source_path": "a/foo.txt",
"project_definition": null
},
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"ReplicatedStorage",
{
"class_name": "ReplicatedStorage",
"children": {
"Ack": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
},
"Bar": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
]
},
"71a95983-c856-4cf2-aee6-bd8a523e80e4": {
"ignore_unknown_instances": false,
"source_path": "b/something.lua",
"project_definition": null
},
"00f207b1-fc18-4088-a45e-caf8cd98f5dd": {
"ignore_unknown_instances": false,
"source_path": "a/main.lua",
"project_definition": null
},
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"HttpService",
{
"class_name": "HttpService",
"children": {},
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"ignore_unknown_instances": null,
"path": null
}
]
},
"b48b369f-5706-4029-9fa6-90651a4910ea": {
"ignore_unknown_instances": false,
"source_path": "a/added",
"project_definition": null
},
"3b5af13f-c997-4009-915c-0810b0e83032": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"multi_partition_game",
{
"class_name": "DataModel",
"children": {
"HttpService": {
"class_name": "HttpService",
"children": {},
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"ignore_unknown_instances": null,
"path": null
},
"ReplicatedStorage": {
"class_name": "ReplicatedStorage",
"children": {
"Ack": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
},
"Bar": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
]
},
"c910510c-37a8-4fd8-ae41-01169ccb739c": {
"ignore_unknown_instances": false,
"source_path": "b",
"project_definition": [
"Bar",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
]
},
"14fed1a3-ba97-46a6-ae93-ac26bd9471df": {
"ignore_unknown_instances": false,
"source_path": "a",
"project_definition": [
"Ack",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
}
]
}
}
}


@@ -0,0 +1,256 @@
{
"instances": {
"866071d6-465a-4b88-8c63-07489d950916": {
"Name": "added",
"ClassName": "Folder",
"Properties": {},
"Id": "866071d6-465a-4b88-8c63-07489d950916",
"Children": [],
"Parent": "c910510c-37a8-4fd8-ae41-01169ccb739c"
},
"00f207b1-fc18-4088-a45e-caf8cd98f5dd": {
"Name": "main",
"ClassName": "ModuleScript",
"Properties": {
"Source": {
"Type": "String",
"Value": "-- hello, from a/main.lua"
}
},
"Id": "00f207b1-fc18-4088-a45e-caf8cd98f5dd",
"Children": [],
"Parent": "14fed1a3-ba97-46a6-ae93-ac26bd9471df"
},
"14fed1a3-ba97-46a6-ae93-ac26bd9471df": {
"Name": "Ack",
"ClassName": "Folder",
"Properties": {},
"Id": "14fed1a3-ba97-46a6-ae93-ac26bd9471df",
"Children": [
"c55fd55c-258e-4a93-a63a-ea243038c9b9",
"00f207b1-fc18-4088-a45e-caf8cd98f5dd"
],
"Parent": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
},
"c910510c-37a8-4fd8-ae41-01169ccb739c": {
"Name": "Bar",
"ClassName": "Folder",
"Properties": {},
"Id": "c910510c-37a8-4fd8-ae41-01169ccb739c",
"Children": [
"866071d6-465a-4b88-8c63-07489d950916",
"71a95983-c856-4cf2-aee6-bd8a523e80e4"
],
"Parent": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
},
"c55fd55c-258e-4a93-a63a-ea243038c9b9": {
"Name": "foo",
"ClassName": "StringValue",
"Properties": {
"Value": {
"Type": "String",
"Value": "Hello world, from a/foo.txt"
}
},
"Id": "c55fd55c-258e-4a93-a63a-ea243038c9b9",
"Children": [],
"Parent": "14fed1a3-ba97-46a6-ae93-ac26bd9471df"
},
"71a95983-c856-4cf2-aee6-bd8a523e80e4": {
"Name": "something",
"ClassName": "ModuleScript",
"Properties": {
"Source": {
"Type": "String",
"Value": "-- b/something.lua"
}
},
"Id": "71a95983-c856-4cf2-aee6-bd8a523e80e4",
"Children": [],
"Parent": "c910510c-37a8-4fd8-ae41-01169ccb739c"
},
"3b5af13f-c997-4009-915c-0810b0e83032": {
"Name": "multi_partition_game",
"ClassName": "DataModel",
"Properties": {},
"Id": "3b5af13f-c997-4009-915c-0810b0e83032",
"Children": [
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c",
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b"
],
"Parent": null
},
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c": {
"Name": "HttpService",
"ClassName": "HttpService",
"Properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"Id": "bf8e2d4f-33a0-42a0-8168-1b62d6ac050c",
"Children": [],
"Parent": "3b5af13f-c997-4009-915c-0810b0e83032"
},
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b": {
"Name": "ReplicatedStorage",
"ClassName": "ReplicatedStorage",
"Properties": {},
"Id": "99eefe5f-ef74-49e6-8a8b-c833e00ca56b",
"Children": [
"14fed1a3-ba97-46a6-ae93-ac26bd9471df",
"c910510c-37a8-4fd8-ae41-01169ccb739c"
],
"Parent": "3b5af13f-c997-4009-915c-0810b0e83032"
}
},
"root_id": "3b5af13f-c997-4009-915c-0810b0e83032",
"metadata": {
"bf8e2d4f-33a0-42a0-8168-1b62d6ac050c": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"HttpService",
{
"class_name": "HttpService",
"children": {},
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"ignore_unknown_instances": null,
"path": null
}
]
},
"c910510c-37a8-4fd8-ae41-01169ccb739c": {
"ignore_unknown_instances": false,
"source_path": "b",
"project_definition": [
"Bar",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
]
},
"866071d6-465a-4b88-8c63-07489d950916": {
"ignore_unknown_instances": false,
"source_path": "b/added",
"project_definition": null
},
"14fed1a3-ba97-46a6-ae93-ac26bd9471df": {
"ignore_unknown_instances": false,
"source_path": "a",
"project_definition": [
"Ack",
{
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
}
]
},
"00f207b1-fc18-4088-a45e-caf8cd98f5dd": {
"ignore_unknown_instances": false,
"source_path": "a/main.lua",
"project_definition": null
},
"99eefe5f-ef74-49e6-8a8b-c833e00ca56b": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"ReplicatedStorage",
{
"class_name": "ReplicatedStorage",
"children": {
"Ack": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
},
"Bar": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
]
},
"71a95983-c856-4cf2-aee6-bd8a523e80e4": {
"ignore_unknown_instances": false,
"source_path": "b/something.lua",
"project_definition": null
},
"c55fd55c-258e-4a93-a63a-ea243038c9b9": {
"ignore_unknown_instances": false,
"source_path": "a/foo.txt",
"project_definition": null
},
"3b5af13f-c997-4009-915c-0810b0e83032": {
"ignore_unknown_instances": true,
"source_path": null,
"project_definition": [
"multi_partition_game",
{
"class_name": "DataModel",
"children": {
"HttpService": {
"class_name": "HttpService",
"children": {},
"properties": {
"HttpEnabled": {
"Type": "Bool",
"Value": true
}
},
"ignore_unknown_instances": null,
"path": null
},
"ReplicatedStorage": {
"class_name": "ReplicatedStorage",
"children": {
"Ack": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "a"
},
"Bar": {
"class_name": null,
"children": {},
"properties": {},
"ignore_unknown_instances": null,
"path": "b"
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
},
"properties": {},
"ignore_unknown_instances": null,
"path": null
}
]
}
}
}
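The snapshot files above share one shape: a flat `instances` map keyed by instance id, a `root_id` entry point, and `Children`/`Parent` fields that link ids into a tree. The sketch below is not Rojo code; it is a minimal, hypothetical reader (with shortened ids and a made-up `walk` helper) showing how the tree can be reconstructed by following `Children` ids from `root_id`:

```python
import json

# A tiny subset of the snapshot format above (ids shortened for brevity;
# the real fixtures key instances by full UUIDs).
SNAPSHOT = json.loads("""
{
  "instances": {
    "root": {"Name": "multi_partition_game", "ClassName": "DataModel",
             "Properties": {}, "Id": "root", "Children": ["rs"], "Parent": null},
    "rs":   {"Name": "ReplicatedStorage", "ClassName": "ReplicatedStorage",
             "Properties": {}, "Id": "rs", "Children": ["mod"], "Parent": "root"},
    "mod":  {"Name": "something", "ClassName": "ModuleScript",
             "Properties": {"Source": {"Type": "String", "Value": "-- hi"}},
             "Id": "mod", "Children": [], "Parent": "rs"}
  },
  "root_id": "root"
}
""")

def walk(instances, instance_id, depth=0):
    """Yield (depth, Name, ClassName) in tree order by following Children ids."""
    inst = instances[instance_id]
    yield depth, inst["Name"], inst["ClassName"]
    for child_id in inst["Children"]:
        yield from walk(instances, child_id, depth + 1)

tree = list(walk(SNAPSHOT["instances"], SNAPSHOT["root_id"]))
for depth, name, class_name in tree:
    print("  " * depth + f"{name} ({class_name})")
```

Note that `metadata` is keyed by the same ids, so per-instance fields such as `source_path` and `ignore_unknown_instances` can be looked up during the same traversal.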