forked from glen/dyna3

Compare commits: outline-cl...main (9 commits)

Commits: 46324fecc6, 25017176fd, 817a446fad, 22870342f3, b490c8707f, a8e13b8110, e917272c60, 65cee1ecc2, 707618cdd3
21 changed files with 2183 additions and 339 deletions
README.md (48 lines changed)
@@ -17,3 +17,51 @@ Note that currently this is just the barest beginnings of the project, more of a

* Able to run in browser (so implemented in WASM-compatible language)
* Produce scalable graphics of 3D diagrams, and maybe STL files (or other fabricatable file format) as well.

## Prototype

The latest prototype is in the folder `app-proto`. It includes both a user interface and a numerical constraint-solving engine.

### Install the prerequisites

1. Install [`rustup`](https://rust-lang.github.io/rustup/): the officially recommended Rust toolchain manager
   * It's available on Ubuntu as a [Snap](https://snapcraft.io/rustup)
2. Call `rustup default stable` to "download the latest stable release of Rust and set it as your default toolchain"
   * If you forget, the `rustup` [help system](https://github.com/rust-lang/rustup/blob/d9b3601c3feb2e88cf3f8ca4f7ab4fdad71441fd/src/errors.rs#L109-L112) will remind you
3. Call `rustup target add wasm32-unknown-unknown` to add the [most generic 32-bit WebAssembly target](https://doc.rust-lang.org/nightly/rustc/platform-support/wasm32-unknown-unknown.html)
4. Call `cargo install wasm-pack` to install the [WebAssembly toolchain](https://rustwasm.github.io/docs/wasm-pack/)
5. Call `cargo install trunk` to install the [Trunk](https://trunkrs.dev/) web-build tool
6. Add the `.cargo/bin` folder in your home directory to your executable search path
   * This lets you call Trunk, and other tools installed by Cargo, without specifying their paths
   * On POSIX systems, the search path is stored in the `PATH` environment variable
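For convenience, steps 2–6 boil down to the following command sequence (a minimal sketch, assuming `rustup` from step 1 is already installed and you're using a POSIX shell):

```sh
rustup default stable
rustup target add wasm32-unknown-unknown
cargo install wasm-pack
cargo install trunk
# step 6: put Cargo-installed tools on the search path
# (add this line to your shell profile to make it permanent)
export PATH="$HOME/.cargo/bin:$PATH"
```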

### Play with the prototype

1. Go into the `app-proto` folder
2. Call `trunk serve --release` to build and serve the prototype
   * *The crates the prototype depends on will be downloaded and served automatically*
   * *For a faster build, at the expense of a much slower prototype, you can call `trunk serve` without the `--release` flag*
3. In a web browser, visit one of the URLs listed under the message `INFO 📡 server listening at:`
   * *Touching any file in the `app-proto` folder will make Trunk rebuild and live-reload the prototype*
4. Press *ctrl+C* in the shell where Trunk is running to stop serving the prototype
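In shell terms, steps 1–2 are just (a minimal sketch, run from the repository root):

```sh
cd app-proto
trunk serve --release   # or plain `trunk serve` for a faster build but a much slower prototype
```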

### Run the engine on some example problems

1. Go into the `app-proto` folder
2. Call `./run-examples`
   * *For each example problem, the engine will print the value of the loss function at each optimization step*
   * *The first example that prints is the same as the Irisawa hexlet example from the Julia version of the engine prototype. If you go into `engine-proto/gram-test`, launch Julia, and then*

     ```julia
     include("irisawa-hexlet.jl")
     for (step, scaled_loss) in enumerate(history_alt.scaled_loss)
         println(rpad(step-1, 4), " | ", scaled_loss)
     end
     ```

     *you should see that it prints basically the same loss history until the last few steps, when the lower default precision of the Rust engine really starts to show*

### Run the automated tests

1. Go into the `app-proto` folder
2. Call `cargo test`
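The last two workflows combine into a quick check you can run from the repository root (a minimal sketch, assuming a POSIX shell):

```sh
cd app-proto
./run-examples   # prints each example's loss history
cargo test       # runs the automated test suite
```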
@ -1,15 +1,17 @@
|
|||
[package]
|
||||
name = "sketch-outline"
|
||||
name = "dyna3"
|
||||
version = "0.1.0"
|
||||
authors = ["Aaron"]
|
||||
authors = ["Aaron Fenyes", "Glen Whitney"]
|
||||
edition = "2021"
|
||||
|
||||
[features]
|
||||
default = ["console_error_panic_hook"]
|
||||
dev = []
|
||||
|
||||
[dependencies]
|
||||
itertools = "0.13.0"
|
||||
js-sys = "0.3.70"
|
||||
lazy_static = "1.5.0"
|
||||
nalgebra = "0.33.0"
|
||||
rustc-hash = "2.0.0"
|
||||
slab = "0.4.9"
|
||||
|
@ -24,7 +26,9 @@ console_error_panic_hook = { version = "0.1.7", optional = true }
|
|||
[dependencies.web-sys]
|
||||
version = "0.3.69"
|
||||
features = [
|
||||
'DomRect',
|
||||
'HtmlCanvasElement',
|
||||
'HtmlInputElement',
|
||||
'Performance',
|
||||
'WebGl2RenderingContext',
|
||||
'WebGlBuffer',
|
||||
|
@ -34,7 +38,12 @@ features = [
|
|||
'WebGlVertexArrayObject'
|
||||
]
|
||||
|
||||
# the self-dependency specifies features to use for tests and examples
|
||||
#
|
||||
# https://github.com/rust-lang/cargo/issues/2911#issuecomment-1483256987
|
||||
#
|
||||
[dev-dependencies]
|
||||
dyna3 = { path = ".", default-features = false, features = ["dev"] }
|
||||
wasm-bindgen-test = "0.3.34"
|
||||
|
||||
[profile.release]
|
||||
|
|
app-proto/examples/irisawa-hexlet.rs (new file, 25 lines)

@@ -0,0 +1,25 @@
use dyna3::engine::{Q, irisawa::realize_irisawa_hexlet};

fn main() {
    const SCALED_TOL: f64 = 1.0e-12;
    let (config, _, success, history) = realize_irisawa_hexlet(SCALED_TOL);
    print!("\nCompleted Gram matrix:{}", config.tr_mul(&*Q) * &config);
    if success {
        println!("Target accuracy achieved!");
    } else {
        println!("Failed to reach target accuracy");
    }
    println!("Steps: {}", history.scaled_loss.len() - 1);
    println!("Loss: {}", history.scaled_loss.last().unwrap());
    if success {
        println!("\nChain diameters:");
        println!(" {} sun (given)", 1.0 / config[(3, 3)]);
        for k in 4..9 {
            println!(" {} sun", 1.0 / config[(3, k)]);
        }
    }
    println!("\nStep │ Loss\n─────┼────────────────────────────────");
    for (step, scaled_loss) in history.scaled_loss.into_iter().enumerate() {
        println!("{:<4} │ {}", step, scaled_loss);
    }
}
app-proto/examples/kaleidocycle.rs (new file, 72 lines)

@@ -0,0 +1,72 @@
use nalgebra::{DMatrix, DVector};
use std::{array, f64::consts::PI};

use dyna3::engine::{Q, point, realize_gram, PartialMatrix};

fn main() {
    // set up a kaleidocycle, made of points with fixed distances between them,
    // and find its tangent space
    const N_POINTS: usize = 12;
    let gram = {
        let mut gram_to_be = PartialMatrix::new();
        for block in (0..N_POINTS).step_by(2) {
            let block_next = (block + 2) % N_POINTS;
            for j in 0..2 {
                // diagonal and hinge edges
                for k in j..2 {
                    gram_to_be.push_sym(block + j, block + k, if j == k { 0.0 } else { -0.5 });
                }

                // non-hinge edges
                for k in 0..2 {
                    gram_to_be.push_sym(block + j, block_next + k, -0.625);
                }
            }
        }
        gram_to_be
    };
    let guess = {
        const N_HINGES: usize = 6;
        let guess_elts = (0..N_HINGES).step_by(2).flat_map(
            |n| {
                let ang_hor = (n as f64) * PI/3.0;
                let ang_vert = ((n + 1) as f64) * PI/3.0;
                let x_vert = ang_vert.cos();
                let y_vert = ang_vert.sin();
                [
                    point(0.0, 0.0, 0.0),
                    point(ang_hor.cos(), ang_hor.sin(), 0.0),
                    point(x_vert, y_vert, -0.5),
                    point(x_vert, y_vert, 0.5)
                ]
            }
        ).collect::<Vec<_>>();
        DMatrix::from_columns(&guess_elts)
    };
    let frozen: [_; N_POINTS] = array::from_fn(|k| (3, k));
    let (config, tangent, success, history) = realize_gram(
        &gram, guess, &frozen,
        1.0e-12, 0.5, 0.9, 1.1, 200, 110
    );
    print!("Completed Gram matrix:{}", config.tr_mul(&*Q) * &config);
    print!("Configuration:{}", config);
    if success {
        println!("Target accuracy achieved!");
    } else {
        println!("Failed to reach target accuracy");
    }
    println!("Steps: {}", history.scaled_loss.len() - 1);
    println!("Loss: {}\n", history.scaled_loss.last().unwrap());

    // find the kaleidocycle's twist motion
    let up = DVector::from_column_slice(&[0.0, 0.0, 1.0, 0.0]);
    let down = -&up;
    let twist_motion: DMatrix<_> = (0..N_POINTS).step_by(4).flat_map(
        |n| [
            tangent.proj(&up.as_view(), n),
            tangent.proj(&down.as_view(), n+1)
        ]
    ).sum();
    let normalization = 5.0 / twist_motion[(2, 0)];
    print!("Twist motion:{}", normalization * twist_motion);
}
app-proto/examples/point-on-sphere.rs (new file, 38 lines)

@@ -0,0 +1,38 @@
use nalgebra::DMatrix;

use dyna3::engine::{Q, point, realize_gram, sphere, PartialMatrix};

fn main() {
    let gram = {
        let mut gram_to_be = PartialMatrix::new();
        for j in 0..2 {
            for k in j..2 {
                gram_to_be.push_sym(j, k, if (j, k) == (1, 1) { 1.0 } else { 0.0 });
            }
        }
        gram_to_be
    };
    let guess = DMatrix::from_columns(&[
        point(0.0, 0.0, 2.0),
        sphere(0.0, 0.0, 0.0, 1.0)
    ]);
    let frozen = [(3, 0)];
    println!();
    let (config, _, success, history) = realize_gram(
        &gram, guess, &frozen,
        1.0e-12, 0.5, 0.9, 1.1, 200, 110
    );
    print!("\nCompleted Gram matrix:{}", config.tr_mul(&*Q) * &config);
    print!("Configuration:{}", config);
    if success {
        println!("Target accuracy achieved!");
    } else {
        println!("Failed to reach target accuracy");
    }
    println!("Steps: {}", history.scaled_loss.len() - 1);
    println!("Loss: {}", history.scaled_loss.last().unwrap());
    println!("\nStep │ Loss\n─────┼────────────────────────────────");
    for (step, scaled_loss) in history.scaled_loss.into_iter().enumerate() {
        println!("{:<4} │ {}", step, scaled_loss);
    }
}
app-proto/examples/three-spheres.rs (new file, 40 lines)

@@ -0,0 +1,40 @@
use nalgebra::DMatrix;

use dyna3::engine::{Q, realize_gram, sphere, PartialMatrix};

fn main() {
    let gram = {
        let mut gram_to_be = PartialMatrix::new();
        for j in 0..3 {
            for k in j..3 {
                gram_to_be.push_sym(j, k, if j == k { 1.0 } else { -1.0 });
            }
        }
        gram_to_be
    };
    let guess = {
        let a: f64 = 0.75_f64.sqrt();
        DMatrix::from_columns(&[
            sphere(1.0, 0.0, 0.0, 1.0),
            sphere(-0.5, a, 0.0, 1.0),
            sphere(-0.5, -a, 0.0, 1.0)
        ])
    };
    println!();
    let (config, _, success, history) = realize_gram(
        &gram, guess, &[],
        1.0e-12, 0.5, 0.9, 1.1, 200, 110
    );
    print!("\nCompleted Gram matrix:{}", config.tr_mul(&*Q) * &config);
    if success {
        println!("Target accuracy achieved!");
    } else {
        println!("Failed to reach target accuracy");
    }
    println!("Steps: {}", history.scaled_loss.len() - 1);
    println!("Loss: {}", history.scaled_loss.last().unwrap());
    println!("\nStep │ Loss\n─────┼────────────────────────────────");
    for (step, scaled_loss) in history.scaled_loss.into_iter().enumerate() {
        println!("{:<4} │ {}", step, scaled_loss);
    }
}
@ -2,8 +2,10 @@
|
|||
<html>
|
||||
<head>
|
||||
<meta charset="utf-8"/>
|
||||
<title>Sketch outline</title>
|
||||
<title>dyna3</title>
|
||||
<link data-trunk rel="css" href="main.css"/>
|
||||
<link href="https://fonts.bunny.net/css?family=fira-sans:ital,wght@0,400;1,400&display=swap" rel="stylesheet">
|
||||
<link href="https://fonts.bunny.net/css?family=noto-emoji:wght@400&text=%f0%9f%94%97%e2%9a%a0&display=swap" rel="stylesheet">
|
||||
</head>
|
||||
<body></body>
|
||||
</html>
|
||||
|
|
|
@ -1,7 +1,20 @@
|
|||
:root {
|
||||
--text: #fcfcfc; /* almost white */
|
||||
--text-bright: white;
|
||||
--text-invalid: #f58fc2; /* bright pink */
|
||||
--border: #555; /* light gray */
|
||||
--border-focus: #aaa; /* bright gray */
|
||||
--border-invalid: #70495c; /* dusky pink */
|
||||
--selection-highlight: #444; /* medium gray */
|
||||
--page-background: #222; /* dark gray */
|
||||
--display-background: #020202; /* almost black */
|
||||
}
|
||||
|
||||
body {
|
||||
margin: 0px;
|
||||
color: #fcfcfc;
|
||||
background-color: #222;
|
||||
color: var(--text);
|
||||
background-color: var(--page-background);
|
||||
font-family: 'Fira Sans', sans-serif;
|
||||
}
|
||||
|
||||
/* sidebar */
|
||||
|
@ -16,7 +29,7 @@ body {
|
|||
padding: 0px;
|
||||
border-width: 0px 1px 0px 0px;
|
||||
border-style: solid;
|
||||
border-color: #555;
|
||||
border-color: var(--border);
|
||||
}
|
||||
|
||||
/* add-remove */
|
||||
|
@ -33,6 +46,15 @@ body {
|
|||
font-size: large;
|
||||
}
|
||||
|
||||
/* KLUDGE */
|
||||
/*
|
||||
for convenience, we're using emoji as temporary icons for some buttons. these
|
||||
buttons need to be displayed in an emoji font
|
||||
*/
|
||||
#add-remove > button.emoji {
|
||||
font-family: 'Noto Emoji', sans-serif;
|
||||
}
|
||||
|
||||
/* outline */
|
||||
|
||||
#outline {
|
||||
|
@ -51,74 +73,103 @@ summary {
|
|||
}
|
||||
|
||||
summary.selected {
|
||||
color: #fff;
|
||||
background-color: #444;
|
||||
color: var(--text-bright);
|
||||
background-color: var(--selection-highlight);
|
||||
}
|
||||
|
||||
summary > div, .cst {
|
||||
summary > div, .constraint {
|
||||
padding-top: 4px;
|
||||
padding-bottom: 4px;
|
||||
}
|
||||
|
||||
.elt, .cst {
|
||||
.element, .constraint {
|
||||
display: flex;
|
||||
flex-grow: 1;
|
||||
padding-left: 8px;
|
||||
padding-right: 8px;
|
||||
}
|
||||
|
||||
.elt-switch {
|
||||
.element-switch {
|
||||
width: 18px;
|
||||
padding-left: 2px;
|
||||
text-align: center;
|
||||
}
|
||||
|
||||
details:has(li) .elt-switch::after {
|
||||
details:has(li) .element-switch::after {
|
||||
content: '▸';
|
||||
}
|
||||
|
||||
details[open]:has(li) .elt-switch::after {
|
||||
details[open]:has(li) .element-switch::after {
|
||||
content: '▾';
|
||||
}
|
||||
|
||||
.elt-label {
|
||||
.element-label {
|
||||
flex-grow: 1;
|
||||
}
|
||||
|
||||
.cst-label {
|
||||
.constraint-label {
|
||||
flex-grow: 1;
|
||||
}
|
||||
|
||||
.elt-rep {
|
||||
.element-representation {
|
||||
display: flex;
|
||||
}
|
||||
|
||||
.elt-rep > div, .cst-rep {
|
||||
.element-representation > div {
|
||||
padding: 2px 0px 0px 0px;
|
||||
font-size: 10pt;
|
||||
text-align: center;
|
||||
font-variant-numeric: tabular-nums;
|
||||
text-align: right;
|
||||
width: 56px;
|
||||
}
|
||||
|
||||
.cst {
|
||||
.constraint {
|
||||
font-style: italic;
|
||||
}
|
||||
|
||||
.cst > input {
|
||||
.constraint.invalid {
|
||||
color: var(--text-invalid);
|
||||
}
|
||||
|
||||
.constraint > input[type=checkbox] {
|
||||
margin: 0px 8px 0px 0px;
|
||||
}
|
||||
|
||||
.constraint > input[type=text] {
|
||||
color: inherit;
|
||||
background-color: inherit;
|
||||
border: 1px solid var(--border);
|
||||
border-radius: 2px;
|
||||
}
|
||||
|
||||
.constraint.invalid > input[type=text] {
|
||||
border-color: var(--border-invalid);
|
||||
}
|
||||
|
||||
.status {
|
||||
width: 20px;
|
||||
padding-left: 4px;
|
||||
text-align: center;
|
||||
font-family: 'Noto Emoji';
|
||||
font-style: normal;
|
||||
}
|
||||
|
||||
.invalid > .status::after, details:has(.invalid):not([open]) .status::after {
|
||||
content: '⚠';
|
||||
color: var(--text-invalid);
|
||||
}
|
||||
|
||||
/* display */
|
||||
|
||||
canvas {
|
||||
float: left;
|
||||
margin-left: 20px;
|
||||
margin-top: 20px;
|
||||
background-color: #020202;
|
||||
border: 1px solid #555;
|
||||
background-color: var(--display-background);
|
||||
border: 1px solid var(--border);
|
||||
border-radius: 16px;
|
||||
}
|
||||
|
||||
canvas:focus {
|
||||
border-color: #aaa;
|
||||
border-color: var(--border-focus);
|
||||
}
|
app-proto/run-examples (new executable file, 12 lines)

@@ -0,0 +1,12 @@
#!/bin/sh

# run all Cargo examples, as described here:
#
#   Karol Kuczmarski. "Add examples to your Rust libraries"
#   http://xion.io/post/code/rust-examples.html
#

cargo run --example irisawa-hexlet
cargo run --example three-spheres
cargo run --example point-on-sphere
cargo run --example kaleidocycle
@ -1,151 +1,130 @@
|
|||
use std::collections::BTreeSet; /* DEBUG */
|
||||
use sycamore::prelude::*;
|
||||
use web_sys::{console, wasm_bindgen::JsValue};
|
||||
|
||||
use crate::{engine, AppState, assembly::{Assembly, Constraint, Element}};
|
||||
|
||||
/* DEBUG */
|
||||
// load an example assembly for testing. this code will be removed once we've
|
||||
// built a more formal test assembly system
|
||||
fn load_gen_assemb(assembly: &Assembly) {
|
||||
let _ = assembly.try_insert_element(
|
||||
Element {
|
||||
id: String::from("gemini_a"),
|
||||
label: String::from("Castor"),
|
||||
color: [1.00_f32, 0.25_f32, 0.00_f32],
|
||||
rep: engine::sphere(0.5, 0.5, 0.0, 1.0),
|
||||
constraints: BTreeSet::default()
|
||||
}
|
||||
Element::new(
|
||||
String::from("gemini_a"),
|
||||
String::from("Castor"),
|
||||
[1.00_f32, 0.25_f32, 0.00_f32],
|
||||
engine::sphere(0.5, 0.5, 0.0, 1.0)
|
||||
)
|
||||
);
|
||||
let _ = assembly.try_insert_element(
|
||||
Element {
|
||||
id: String::from("gemini_b"),
|
||||
label: String::from("Pollux"),
|
||||
color: [0.00_f32, 0.25_f32, 1.00_f32],
|
||||
rep: engine::sphere(-0.5, -0.5, 0.0, 1.0),
|
||||
constraints: BTreeSet::default()
|
||||
}
|
||||
Element::new(
|
||||
String::from("gemini_b"),
|
||||
String::from("Pollux"),
|
||||
[0.00_f32, 0.25_f32, 1.00_f32],
|
||||
engine::sphere(-0.5, -0.5, 0.0, 1.0)
|
||||
)
|
||||
);
|
||||
let _ = assembly.try_insert_element(
|
||||
Element {
|
||||
id: String::from("ursa_major"),
|
||||
label: String::from("Ursa major"),
|
||||
color: [0.25_f32, 0.00_f32, 1.00_f32],
|
||||
rep: engine::sphere(-0.5, 0.5, 0.0, 0.75),
|
||||
constraints: BTreeSet::default()
|
||||
}
|
||||
Element::new(
|
||||
String::from("ursa_major"),
|
||||
String::from("Ursa major"),
|
||||
[0.25_f32, 0.00_f32, 1.00_f32],
|
||||
engine::sphere(-0.5, 0.5, 0.0, 0.75)
|
||||
)
|
||||
);
|
||||
let _ = assembly.try_insert_element(
|
||||
Element {
|
||||
id: String::from("ursa_minor"),
|
||||
label: String::from("Ursa minor"),
|
||||
color: [0.25_f32, 1.00_f32, 0.00_f32],
|
||||
rep: engine::sphere(0.5, -0.5, 0.0, 0.5),
|
||||
constraints: BTreeSet::default()
|
||||
}
|
||||
Element::new(
|
||||
String::from("ursa_minor"),
|
||||
String::from("Ursa minor"),
|
||||
[0.25_f32, 1.00_f32, 0.00_f32],
|
||||
engine::sphere(0.5, -0.5, 0.0, 0.5)
|
||||
)
|
||||
);
|
||||
let _ = assembly.try_insert_element(
|
||||
Element {
|
||||
id: String::from("moon_deimos"),
|
||||
label: String::from("Deimos"),
|
||||
color: [0.75_f32, 0.75_f32, 0.00_f32],
|
||||
rep: engine::sphere(0.0, 0.15, 1.0, 0.25),
|
||||
constraints: BTreeSet::default()
|
||||
}
|
||||
Element::new(
|
||||
String::from("moon_deimos"),
|
||||
String::from("Deimos"),
|
||||
[0.75_f32, 0.75_f32, 0.00_f32],
|
||||
engine::sphere(0.0, 0.15, 1.0, 0.25)
|
||||
)
|
||||
);
|
||||
let _ = assembly.try_insert_element(
|
||||
Element {
|
||||
id: String::from("moon_phobos"),
|
||||
label: String::from("Phobos"),
|
||||
color: [0.00_f32, 0.75_f32, 0.50_f32],
|
||||
rep: engine::sphere(0.0, -0.15, -1.0, 0.25),
|
||||
constraints: BTreeSet::default()
|
||||
}
|
||||
);
|
||||
assembly.insert_constraint(
|
||||
Constraint {
|
||||
args: (
|
||||
assembly.elements_by_id.with_untracked(|elts_by_id| elts_by_id["gemini_a"]),
|
||||
assembly.elements_by_id.with_untracked(|elts_by_id| elts_by_id["gemini_b"])
|
||||
),
|
||||
rep: 0.5,
|
||||
active: create_signal(true)
|
||||
}
|
||||
Element::new(
|
||||
String::from("moon_phobos"),
|
||||
String::from("Phobos"),
|
||||
[0.00_f32, 0.75_f32, 0.50_f32],
|
||||
engine::sphere(0.0, -0.15, -1.0, 0.25)
|
||||
)
|
||||
);
|
||||
}
|
||||
|
||||
/* DEBUG */
|
||||
// load an example assembly for testing. this code will be removed once we've
|
||||
// built a more formal test assembly system
|
||||
fn load_low_curv_assemb(assembly: &Assembly) {
|
||||
let a = 0.75_f64.sqrt();
|
||||
let _ = assembly.try_insert_element(
|
||||
Element {
|
||||
id: "central".to_string(),
|
||||
label: "Central".to_string(),
|
||||
color: [0.75_f32, 0.75_f32, 0.75_f32],
|
||||
rep: engine::sphere(0.0, 0.0, 0.0, 1.0),
|
||||
constraints: BTreeSet::default()
|
||||
}
|
||||
Element::new(
|
||||
"central".to_string(),
|
||||
"Central".to_string(),
|
||||
[0.75_f32, 0.75_f32, 0.75_f32],
|
||||
engine::sphere(0.0, 0.0, 0.0, 1.0)
|
||||
)
|
||||
);
|
||||
let _ = assembly.try_insert_element(
|
||||
Element {
|
||||
id: "assemb_plane".to_string(),
|
||||
label: "Assembly plane".to_string(),
|
||||
color: [0.75_f32, 0.75_f32, 0.75_f32],
|
||||
rep: engine::sphere_with_offset(0.0, 0.0, 1.0, 0.0, 0.0),
|
||||
constraints: BTreeSet::default()
|
||||
}
|
||||
Element::new(
|
||||
"assemb_plane".to_string(),
|
||||
"Assembly plane".to_string(),
|
||||
[0.75_f32, 0.75_f32, 0.75_f32],
|
||||
engine::sphere_with_offset(0.0, 0.0, 1.0, 0.0, 0.0)
|
||||
)
|
||||
);
|
||||
let _ = assembly.try_insert_element(
|
||||
Element {
|
||||
id: "side1".to_string(),
|
||||
label: "Side 1".to_string(),
|
||||
color: [1.00_f32, 0.00_f32, 0.25_f32],
|
||||
rep: engine::sphere_with_offset(1.0, 0.0, 0.0, 1.0, 0.0),
|
||||
constraints: BTreeSet::default()
|
||||
}
|
||||
Element::new(
|
||||
"side1".to_string(),
|
||||
"Side 1".to_string(),
|
||||
[1.00_f32, 0.00_f32, 0.25_f32],
|
||||
engine::sphere_with_offset(1.0, 0.0, 0.0, 1.0, 0.0)
|
||||
)
|
||||
);
|
||||
let _ = assembly.try_insert_element(
|
||||
Element {
|
||||
id: "side2".to_string(),
|
||||
label: "Side 2".to_string(),
|
||||
color: [0.25_f32, 1.00_f32, 0.00_f32],
|
||||
rep: engine::sphere_with_offset(-0.5, a, 0.0, 1.0, 0.0),
|
||||
constraints: BTreeSet::default()
|
||||
}
|
||||
Element::new(
|
||||
"side2".to_string(),
|
||||
"Side 2".to_string(),
|
||||
[0.25_f32, 1.00_f32, 0.00_f32],
|
||||
engine::sphere_with_offset(-0.5, a, 0.0, 1.0, 0.0)
|
||||
)
|
||||
);
|
||||
let _ = assembly.try_insert_element(
|
||||
Element {
|
||||
id: "side3".to_string(),
|
||||
label: "Side 3".to_string(),
|
||||
color: [0.00_f32, 0.25_f32, 1.00_f32],
|
||||
rep: engine::sphere_with_offset(-0.5, -a, 0.0, 1.0, 0.0),
|
||||
constraints: BTreeSet::default()
|
||||
}
|
||||
Element::new(
|
||||
"side3".to_string(),
|
||||
"Side 3".to_string(),
|
||||
[0.00_f32, 0.25_f32, 1.00_f32],
|
||||
engine::sphere_with_offset(-0.5, -a, 0.0, 1.0, 0.0)
|
||||
)
|
||||
);
|
||||
let _ = assembly.try_insert_element(
|
||||
Element {
|
||||
id: "corner1".to_string(),
|
||||
label: "Corner 1".to_string(),
|
||||
color: [0.75_f32, 0.75_f32, 0.75_f32],
|
||||
rep: engine::sphere(-4.0/3.0, 0.0, 0.0, 1.0/3.0),
|
||||
constraints: BTreeSet::default()
|
||||
}
|
||||
Element::new(
|
||||
"corner1".to_string(),
|
||||
"Corner 1".to_string(),
|
||||
[0.75_f32, 0.75_f32, 0.75_f32],
|
||||
engine::sphere(-4.0/3.0, 0.0, 0.0, 1.0/3.0)
|
||||
)
|
||||
);
|
||||
let _ = assembly.try_insert_element(
|
||||
Element {
|
||||
id: "corner2".to_string(),
|
||||
label: "Corner 2".to_string(),
|
||||
color: [0.75_f32, 0.75_f32, 0.75_f32],
|
||||
rep: engine::sphere(2.0/3.0, -4.0/3.0 * a, 0.0, 1.0/3.0),
|
||||
constraints: BTreeSet::default()
|
||||
}
|
||||
Element::new(
|
||||
"corner2".to_string(),
|
||||
"Corner 2".to_string(),
|
||||
[0.75_f32, 0.75_f32, 0.75_f32],
|
||||
engine::sphere(2.0/3.0, -4.0/3.0 * a, 0.0, 1.0/3.0)
|
||||
)
|
||||
);
|
||||
let _ = assembly.try_insert_element(
|
||||
Element {
|
||||
id: String::from("corner3"),
|
||||
label: String::from("Corner 3"),
|
||||
color: [0.75_f32, 0.75_f32, 0.75_f32],
|
||||
rep: engine::sphere(2.0/3.0, 4.0/3.0 * a, 0.0, 1.0/3.0),
|
||||
constraints: BTreeSet::default()
|
||||
}
|
||||
Element::new(
|
||||
String::from("corner3"),
|
||||
String::from("Corner 3"),
|
||||
[0.75_f32, 0.75_f32, 0.75_f32],
|
||||
engine::sphere(2.0/3.0, 4.0/3.0 * a, 0.0, 1.0/3.0)
|
||||
)
|
||||
);
|
||||
}
|
||||
|
||||
|
@ -198,44 +177,63 @@ pub fn AddRemove() -> View {
|
|||
}
|
||||
) { "+" }
|
||||
button(
|
||||
class="emoji", /* KLUDGE */ // for convenience, we're using an emoji as a temporary icon for this button
|
||||
disabled={
|
||||
let state = use_context::<AppState>();
|
||||
state.selection.with(|sel| sel.len() != 2)
|
||||
},
|
||||
on:click=|_| {
|
||||
let state = use_context::<AppState>();
|
||||
let args = state.selection.with(
|
||||
let subjects = state.selection.with(
|
||||
|sel| {
|
||||
let arg_vec: Vec<_> = sel.into_iter().collect();
|
||||
(arg_vec[0].clone(), arg_vec[1].clone())
|
||||
let subject_vec: Vec<_> = sel.into_iter().collect();
|
||||
(subject_vec[0].clone(), subject_vec[1].clone())
|
||||
}
|
||||
);
|
||||
let lorentz_prod = create_signal(0.0);
|
||||
let lorentz_prod_valid = create_signal(false);
|
||||
let active = create_signal(true);
|
||||
state.assembly.insert_constraint(Constraint {
|
||||
args: args,
|
||||
rep: 0.0,
|
||||
active: create_signal(true)
|
||||
subjects: subjects,
|
||||
lorentz_prod: lorentz_prod,
|
||||
lorentz_prod_text: create_signal(String::new()),
|
||||
lorentz_prod_valid: lorentz_prod_valid,
|
||||
active: active,
|
||||
});
|
||||
state.selection.update(|sel| sel.clear());
|
||||
|
||||
/* DEBUG */
|
||||
// print updated constraint list
|
||||
console::log_1(&JsValue::from("constraints:"));
|
||||
console::log_1(&JsValue::from("Constraints:"));
|
||||
state.assembly.constraints.with(|csts| {
|
||||
for (_, cst) in csts.into_iter() {
|
||||
console::log_5(
|
||||
&JsValue::from(" "),
|
||||
&JsValue::from(cst.args.0),
|
||||
&JsValue::from(cst.args.1),
|
||||
&JsValue::from(cst.subjects.0),
|
||||
&JsValue::from(cst.subjects.1),
|
||||
&JsValue::from(":"),
|
||||
&JsValue::from(cst.rep)
|
||||
&JsValue::from(cst.lorentz_prod.get_untracked())
|
||||
);
|
||||
}
|
||||
});
|
||||
|
||||
// update the realization when the constraint becomes active
|
||||
// and valid, or is edited while active and valid
|
||||
create_effect(move || {
|
||||
console::log_1(&JsValue::from(
|
||||
format!("Constraint ({}, {}) updated", subjects.0, subjects.1)
|
||||
));
|
||||
lorentz_prod.track();
|
||||
if active.get() && lorentz_prod_valid.get() {
|
||||
state.assembly.realize();
|
||||
}
|
||||
});
|
||||
}
|
||||
) { "🔗" }
|
||||
select(bind:value=assembly_name) { /* DEBUG */
|
||||
select(bind:value=assembly_name) { /* DEBUG */ // example assembly chooser
|
||||
option(value="general") { "General" }
|
||||
option(value="low-curv") { "Low-curvature" }
|
||||
option(value="empty") { "Empty" }
|
||||
}
|
||||
}
|
||||
}
|
||||
|
|
|
@ -1,25 +1,133 @@
|
|||
use nalgebra::DVector;
|
||||
use nalgebra::{DMatrix, DVector, DVectorView, Vector3};
|
||||
use rustc_hash::FxHashMap;
|
||||
use slab::Slab;
|
||||
use std::collections::BTreeSet;
|
||||
use std::{collections::BTreeSet, sync::atomic::{AtomicU64, Ordering}};
|
||||
use sycamore::prelude::*;
|
||||
use web_sys::{console, wasm_bindgen::JsValue}; /* DEBUG */
|
||||
|
||||
use crate::engine::{realize_gram, local_unif_to_std, ConfigSubspace, PartialMatrix};
|
||||
|
||||
// the types of the keys we use to access an assembly's elements and constraints
|
||||
pub type ElementKey = usize;
|
||||
pub type ConstraintKey = usize;
|
||||
|
||||
pub type ElementColor = [f32; 3];
|
||||
|
||||
/* KLUDGE */
|
||||
// we should reconsider this design when we build a system for switching between
|
||||
// assemblies. at that point, we might want to switch to hierarchical keys,
|
||||
// where each element has a key that identifies it within its assembly and
|
||||
// each assembly has a key that identifies it within the session
|
||||
static NEXT_ELEMENT_SERIAL: AtomicU64 = AtomicU64::new(0);
|
||||
|
||||
#[derive(Clone, PartialEq)]
|
||||
pub struct Element {
|
||||
pub id: String,
|
||||
pub label: String,
|
||||
pub color: [f32; 3],
|
||||
pub rep: DVector<f64>,
|
||||
pub constraints: BTreeSet<usize>
|
||||
pub color: ElementColor,
|
||||
pub representation: Signal<DVector<f64>>,
|
||||
pub constraints: Signal<BTreeSet<ConstraintKey>>,
|
||||
|
||||
// a serial number, assigned by `Element::new`, that uniquely identifies
|
||||
// each element
|
||||
pub serial: u64,
|
||||
|
||||
// the configuration matrix column index that was assigned to this element
|
||||
// last time the assembly was realized, or `None` if the element has never
|
||||
// been through a realization
|
||||
column_index: Option<usize>
|
||||
}
|
||||
|
||||
impl Element {
|
||||
pub fn new(
|
||||
id: String,
|
||||
label: String,
|
||||
color: ElementColor,
|
||||
representation: DVector<f64>
|
||||
) -> Element {
|
||||
// take the next serial number, panicking if that was the last number we
|
||||
// had left. the technique we use to panic on overflow is taken from
|
||||
// _Rust Atomics and Locks_, by Mara Bos
|
||||
//
|
||||
// https://marabos.nl/atomics/atomics.html#example-handle-overflow
|
||||
//
|
||||
let serial = NEXT_ELEMENT_SERIAL.fetch_update(
|
||||
Ordering::SeqCst, Ordering::SeqCst,
|
||||
|serial| serial.checked_add(1)
|
||||
).expect("Out of serial numbers for elements");
|
||||
|
||||
Element {
|
||||
id: id,
|
||||
label: label,
|
||||
color: color,
|
||||
representation: create_signal(representation),
|
||||
constraints: create_signal(BTreeSet::default()),
|
||||
serial: serial,
|
||||
column_index: None
|
||||
}
|
||||
}
|
||||
|
||||
// the smallest positive depth, represented as a multiple of `dir`, where
|
||||
// the line generated by `dir` hits the element (which is assumed to be a
|
||||
// sphere). returns `None` if the line misses the sphere. this function
|
||||
// should be kept synchronized with `sphere_cast` in `inversive.frag`, which
|
||||
// does essentially the same thing on the GPU side
|
||||
pub fn cast(&self, dir: Vector3<f64>, assembly_to_world: &DMatrix<f64>) -> Option<f64> {
|
||||
// if `a/b` is less than this threshold, we approximate
|
||||
// `a*u^2 + b*u + c` by the linear function `b*u + c`
|
||||
const DEG_THRESHOLD: f64 = 1e-9;
|
||||
|
||||
let rep = self.representation.with_untracked(|rep| assembly_to_world * rep);
|
||||
let a = -rep[3] * dir.norm_squared();
|
||||
let b = rep.rows_range(..3).dot(&dir);
|
||||
let c = -rep[4];
|
||||
|
||||
let adjust = 4.0*a*c/(b*b);
|
||||
if adjust < 1.0 {
|
||||
// as long as `b` is non-zero, the linear approximation of
|
||||
//
|
||||
// a*u^2 + b*u + c
|
||||
//
|
||||
// at `u = 0` will reach zero at a finite depth `u_lin`. the root of
|
||||
// the quadratic adjacent to `u_lin` is stored in `lin_root`. if
|
||||
// both roots have the same sign, `lin_root` will be the one closer
|
||||
// to `u = 0`
|
||||
let square_rect_ratio = 1.0 + (1.0 - adjust).sqrt();
|
||||
let lin_root = -(2.0*c)/b / square_rect_ratio;
|
||||
if a.abs() > DEG_THRESHOLD * b.abs() {
|
||||
if lin_root > 0.0 {
|
||||
Some(lin_root)
|
||||
} else {
|
||||
let other_root = -b/(2.*a) * square_rect_ratio;
|
||||
(other_root > 0.0).then_some(other_root)
|
||||
}
|
||||
} else {
|
||||
(lin_root > 0.0).then_some(lin_root)
|
||||
}
|
||||
} else {
|
||||
// the line through `dir` misses the sphere completely
|
||||
None
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct Constraint {
|
||||
pub args: (usize, usize),
|
||||
pub rep: f64,
|
||||
pub subjects: (ElementKey, ElementKey),
|
||||
pub lorentz_prod: Signal<f64>,
|
||||
pub lorentz_prod_text: Signal<String>,
|
||||
pub lorentz_prod_valid: Signal<bool>,
|
||||
pub active: Signal<bool>
|
||||
}
|
||||
|
||||
// the velocity is expressed in uniform coordinates
|
||||
pub struct ElementMotion<'a> {
|
||||
pub key: ElementKey,
|
||||
pub velocity: DVectorView<'a, f64>
|
||||
}
|
||||
|
||||
type AssemblyMotion<'a> = Vec<ElementMotion<'a>>;
|
||||
|
||||
// a complete, view-independent description of an assembly
|
||||
#[derive(Clone)]
|
||||
pub struct Assembly {
|
||||
|
@ -27,8 +135,20 @@ pub struct Assembly {
|
|||
pub elements: Signal<Slab<Element>>,
|
||||
pub constraints: Signal<Slab<Constraint>>,
|
||||
|
||||
// solution variety tangent space. the basis vectors are stored in
|
||||
// configuration matrix format, ordered according to the elements' column
|
||||
// indices. when you realize the assembly, every element that's present
|
||||
// during realization gets a column index and is reflected in the tangent
|
||||
// space. since the methods in this module never assign column indices
|
||||
// without later realizing the assembly, we get the following invariant:
|
||||
//
|
||||
// (1) if an element has a column index, its tangent motions can be found
|
||||
// in that column of the tangent space basis matrices
|
||||
//
|
||||
pub tangent: Signal<ConfigSubspace>,
|
||||
|
||||
// indexing
|
||||
pub elements_by_id: Signal<FxHashMap<String, usize>>
|
||||
pub elements_by_id: Signal<FxHashMap<String, ElementKey>>
|
||||
}
|
||||
|
||||
impl Assembly {
|
||||
|
@ -36,10 +156,13 @@ impl Assembly {
|
|||
Assembly {
|
||||
elements: create_signal(Slab::new()),
|
||||
constraints: create_signal(Slab::new()),
|
||||
tangent: create_signal(ConfigSubspace::zero(0)),
|
||||
elements_by_id: create_signal(FxHashMap::default())
|
||||
}
|
||||
}
|
||||
|
||||
// --- inserting elements and constraints ---
|
||||
|
||||
// insert an element into the assembly without checking whether we already
|
||||
// have an element with the same identifier. any element that does have the
|
||||
// same identifier will get kicked out of the `elements_by_id` index
|
||||
|
@ -72,22 +195,214 @@ impl Assembly {
|
|||
|
||||
// create and insert a new element
|
||||
self.insert_element_unchecked(
|
||||
Element {
|
||||
id: id,
|
||||
label: format!("Sphere {}", id_num),
|
||||
color: [0.75_f32, 0.75_f32, 0.75_f32],
|
||||
rep: DVector::<f64>::from_column_slice(&[0.0, 0.0, 0.0, 0.5, -0.5]),
|
||||
constraints: BTreeSet::default()
|
||||
}
|
||||
Element::new(
|
||||
id,
|
||||
format!("Sphere {}", id_num),
|
||||
[0.75_f32, 0.75_f32, 0.75_f32],
|
||||
DVector::<f64>::from_column_slice(&[0.0, 0.0, 0.0, 0.5, -0.5])
|
||||
)
|
||||
);
|
||||
}
|
||||
|
||||
pub fn insert_constraint(&self, constraint: Constraint) {
|
||||
let args = constraint.args;
|
||||
let subjects = constraint.subjects;
|
||||
let key = self.constraints.update(|csts| csts.insert(constraint));
|
||||
self.elements.update(|elts| {
|
||||
elts[args.0].constraints.insert(key);
|
||||
elts[args.1].constraints.insert(key);
|
||||
})
|
||||
let subject_constraints = self.elements.with(
|
||||
|elts| (elts[subjects.0].constraints, elts[subjects.1].constraints)
|
||||
);
|
||||
subject_constraints.0.update(|csts| csts.insert(key));
|
||||
subject_constraints.1.update(|csts| csts.insert(key));
|
||||
}
|
||||
|
||||
// --- realization ---
|
||||
|
||||
pub fn realize(&self) {
|
||||
// index the elements
|
||||
self.elements.update_silent(|elts| {
|
||||
for (index, (_, elt)) in elts.into_iter().enumerate() {
|
||||
elt.column_index = Some(index);
|
||||
}
|
||||
});
|
||||
|
||||
// set up the Gram matrix and the initial configuration matrix
|
||||
let (gram, guess) = self.elements.with_untracked(|elts| {
|
||||
// set up the off-diagonal part of the Gram matrix
|
||||
let mut gram_to_be = PartialMatrix::new();
|
||||
self.constraints.with_untracked(|csts| {
|
||||
for (_, cst) in csts {
|
||||
if cst.active.get_untracked() && cst.lorentz_prod_valid.get_untracked() {
|
||||
let subjects = cst.subjects;
|
||||
let row = elts[subjects.0].column_index.unwrap();
|
||||
let col = elts[subjects.1].column_index.unwrap();
|
||||
gram_to_be.push_sym(row, col, cst.lorentz_prod.get_untracked());
|
||||
}
|
||||
}
|
||||
});
|
||||
|
||||
// set up the initial configuration matrix and the diagonal of the
|
||||
// Gram matrix
|
||||
let mut guess_to_be = DMatrix::<f64>::zeros(5, elts.len());
|
||||
for (_, elt) in elts {
|
||||
let index = elt.column_index.unwrap();
|
||||
gram_to_be.push_sym(index, index, 1.0);
|
||||
guess_to_be.set_column(index, &elt.representation.get_clone_untracked());
|
||||
}
|
||||
|
||||
(gram_to_be, guess_to_be)
|
||||
});
|
||||
|
||||
/* DEBUG */
|
||||
// log the Gram matrix
|
||||
console::log_1(&JsValue::from("Gram matrix:"));
|
||||
gram.log_to_console();
|
||||
|
||||
/* DEBUG */
|
||||
// log the initial configuration matrix
|
||||
console::log_1(&JsValue::from("Old configuration:"));
|
||||
for j in 0..guess.nrows() {
|
||||
let mut row_str = String::new();
|
||||
for k in 0..guess.ncols() {
|
||||
row_str.push_str(format!(" {:>8.3}", guess[(j, k)]).as_str());
|
||||
}
|
||||
console::log_1(&JsValue::from(row_str));
|
||||
}
|
||||
|
||||
// look for a configuration with the given Gram matrix
|
||||
let (config, tangent, success, history) = realize_gram(
|
||||
&gram, guess, &[],
|
||||
1.0e-12, 0.5, 0.9, 1.1, 200, 110
|
||||
);
|
||||
|
||||
/* DEBUG */
|
||||
// report the outcome of the search
|
||||
console::log_1(&JsValue::from(
|
||||
if success {
|
||||
"Target accuracy achieved!"
|
||||
} else {
|
||||
"Failed to reach target accuracy"
|
||||
}
|
||||
));
|
||||
console::log_2(&JsValue::from("Steps:"), &JsValue::from(history.scaled_loss.len() - 1));
|
||||
console::log_2(&JsValue::from("Loss:"), &JsValue::from(*history.scaled_loss.last().unwrap()));
|
||||
console::log_2(&JsValue::from("Tangent dimension:"), &JsValue::from(tangent.dim()));
|
||||
|
||||
if success {
|
||||
// read out the solution
|
||||
for (_, elt) in self.elements.get_clone_untracked() {
|
||||
elt.representation.update(
|
||||
|rep| rep.set_column(0, &config.column(elt.column_index.unwrap()))
|
||||
);
|
||||
}
|
||||
|
||||
// save the tangent space
|
||||
self.tangent.set_silent(tangent);
|
||||
}
|
||||
}
|
||||
|
||||
// --- deformation ---
|
||||
|
||||
// project the given motion to the tangent space of the solution variety and
|
||||
// move the assembly along it. the implementation is based on invariant (1)
|
||||
// from above and the following additional invariant:
|
||||
//
|
||||
// (2) if an element is affected by a constraint, it has a column index
|
||||
//
|
||||
// we have this invariant because the assembly gets realized each time you
|
||||
// add a constraint
|
||||
pub fn deform(&self, motion: AssemblyMotion) {
|
||||
/* KLUDGE */
|
||||
// when the tangent space is zero, deformation won't do anything, but
|
||||
// the attempt to deform should be registered in the UI. this console
|
||||
// message will do for now
|
||||
if self.tangent.with(|tan| tan.dim() <= 0 && tan.assembly_dim() > 0) {
|
||||
console::log_1(&JsValue::from("The assembly is rigid"));
|
||||
}
|
||||
|
||||
// give a column index to each moving element that doesn't have one yet.
|
||||
// this temporarily breaks invariant (1), but the invariant will be
|
||||
// restored when we realize the assembly at the end of the deformation.
|
||||
// in the process, we find out how many matrix columns we'll need to
|
||||
// hold the deformation
|
||||
let realized_dim = self.tangent.with(|tan| tan.assembly_dim());
|
||||
let motion_dim = self.elements.update_silent(|elts| {
|
||||
let mut next_column_index = realized_dim;
|
||||
for elt_motion in motion.iter() {
|
||||
let moving_elt = &mut elts[elt_motion.key];
|
||||
if moving_elt.column_index.is_none() {
|
||||
moving_elt.column_index = Some(next_column_index);
|
||||
next_column_index += 1;
|
||||
}
|
||||
}
|
||||
next_column_index
|
||||
});
|
||||
|
||||
// project the element motions onto the tangent space of the solution
|
||||
// variety and sum them to get a deformation of the whole assembly. the
|
||||
// matrix `motion_proj` that holds the deformation has extra columns for
|
||||
// any moving elements that aren't reflected in the saved tangent space
|
||||
const ELEMENT_DIM: usize = 5;
|
||||
let mut motion_proj = DMatrix::zeros(ELEMENT_DIM, motion_dim);
|
||||
for elt_motion in motion {
|
||||
// we can unwrap the column index because we know that every moving
|
||||
// element has one at this point
|
||||
let column_index = self.elements.with_untracked(
|
||||
|elts| elts[elt_motion.key].column_index.unwrap()
|
||||
);
|
||||
|
||||
if column_index < realized_dim {
|
||||
// this element had a column index when we started, so by
|
||||
// invariant (1), it's reflected in the tangent space
|
||||
let mut target_columns = motion_proj.columns_mut(0, realized_dim);
|
||||
target_columns += self.tangent.with(
|
||||
|tan| tan.proj(&elt_motion.velocity, column_index)
|
||||
);
|
||||
} else {
|
||||
// this element didn't have a column index when we started, so
|
||||
// by invariant (2), it's unconstrained
|
||||
let mut target_column = motion_proj.column_mut(column_index);
|
||||
let unif_to_std = self.elements.with_untracked(
|
||||
|elts| {
|
||||
elts[elt_motion.key].representation.with_untracked(
|
||||
|rep| local_unif_to_std(rep.as_view())
|
||||
)
|
||||
}
|
||||
);
|
||||
target_column += unif_to_std * elt_motion.velocity;
|
||||
}
|
||||
}
|
||||
|
||||
// step the assembly along the deformation. this changes the elements'
|
||||
// normalizations, so we restore those afterward
|
||||
/* KLUDGE */
|
||||
// since our test assemblies only include spheres, we assume that every
|
||||
// element is on the 1 mass shell
|
||||
for (_, elt) in self.elements.get_clone_untracked() {
|
||||
elt.representation.update_silent(|rep| {
|
||||
match elt.column_index {
|
||||
Some(column_index) => {
|
||||
// step the assembly along the deformation
|
||||
*rep += motion_proj.column(column_index);
|
||||
|
||||
// restore normalization by contracting toward the last
|
||||
// coordinate axis
|
||||
let q_sp = rep.fixed_rows::<3>(0).norm_squared();
|
||||
let half_q_lt = -2.0 * rep[3] * rep[4];
|
||||
let half_q_lt_sq = half_q_lt * half_q_lt;
|
||||
let scaling = half_q_lt + (q_sp + half_q_lt_sq).sqrt();
|
||||
rep.fixed_rows_mut::<4>(0).scale_mut(1.0 / scaling);
|
||||
},
|
||||
None => {
|
||||
console::log_1(&JsValue::from(
|
||||
format!("No velocity to unpack for fresh element \"{}\"", elt.id)
|
||||
))
|
||||
}
|
||||
};
|
||||
});
|
||||
}
|
||||
|
||||
// bring the configuration back onto the solution variety. this also
|
||||
// gets the elements' column indices and the saved tangent space back in
|
||||
// sync
|
||||
self.realize();
|
||||
}
|
||||
}
|
|
@ -1,10 +1,12 @@
|
|||
use core::array;
|
||||
use nalgebra::{DMatrix, Rotation3, Vector3};
|
||||
use nalgebra::{DMatrix, DVector, Rotation3, Vector3};
|
||||
use sycamore::{prelude::*, motion::create_raf};
|
||||
use web_sys::{
|
||||
console,
|
||||
window,
|
||||
Element,
|
||||
KeyboardEvent,
|
||||
MouseEvent,
|
||||
WebGl2RenderingContext,
|
||||
WebGlProgram,
|
||||
WebGlShader,
|
||||
|
@ -12,7 +14,7 @@ use web_sys::{
|
|||
wasm_bindgen::{JsCast, JsValue}
|
||||
};
|
||||
|
||||
use crate::AppState;
|
||||
use crate::{AppState, assembly::{ElementKey, ElementMotion}};
|
||||
|
||||
fn compile_shader(
|
||||
context: &WebGl2RenderingContext,
|
||||
|
@ -82,6 +84,24 @@ fn bind_vertex_attrib(
|
|||
);
|
||||
}
|
||||
|
||||
// the direction in camera space that a mouse event is pointing along
|
||||
fn event_dir(event: &MouseEvent) -> Vector3<f64> {
|
||||
let target: Element = event.target().unwrap().unchecked_into();
|
||||
let rect = target.get_bounding_client_rect();
|
||||
let width = rect.width();
|
||||
let height = rect.height();
|
||||
let shortdim = width.min(height);
|
||||
|
||||
// this constant should be kept synchronized with `inversive.frag`
|
||||
const FOCAL_SLOPE: f64 = 0.3;
|
||||
|
||||
Vector3::new(
|
||||
FOCAL_SLOPE * (2.0*(f64::from(event.client_x()) - rect.left()) - width) / shortdim,
|
||||
FOCAL_SLOPE * (2.0*(rect.bottom() - f64::from(event.client_y())) - height) / shortdim,
|
||||
-1.0
|
||||
)
|
||||
}
|
||||
|
||||
#[component]
|
||||
pub fn Display() -> View {
|
||||
let state = use_context::<AppState>();
|
||||
|
@ -89,6 +109,9 @@ pub fn Display() -> View {
|
|||
// canvas
|
||||
let display = create_node_ref();
|
||||
|
||||
// viewpoint
|
||||
let assembly_to_world = create_signal(DMatrix::<f64>::identity(5, 5));
|
||||
|
||||
// navigation
|
||||
let pitch_up = create_signal(0.0);
|
||||
let pitch_down = create_signal(0.0);
|
||||
|
@ -100,10 +123,24 @@ pub fn Display() -> View {
|
|||
let zoom_out = create_signal(0.0);
|
||||
let turntable = create_signal(false); /* BENCHMARKING */
|
||||
|
||||
// manipulation
|
||||
let translate_neg_x = create_signal(0.0);
|
||||
let translate_pos_x = create_signal(0.0);
|
||||
let translate_neg_y = create_signal(0.0);
|
||||
let translate_pos_y = create_signal(0.0);
|
||||
let translate_neg_z = create_signal(0.0);
|
||||
let translate_pos_z = create_signal(0.0);
|
||||
let shrink_neg = create_signal(0.0);
|
||||
let shrink_pos = create_signal(0.0);
|
||||
|
||||
// change listener
|
||||
let scene_changed = create_signal(true);
|
||||
create_effect(move || {
|
||||
state.assembly.elements.track();
|
||||
state.assembly.elements.with(|elts| {
|
||||
for (_, elt) in elts {
|
||||
elt.representation.track();
|
||||
}
|
||||
});
|
||||
state.selection.track();
|
||||
scene_changed.set(true);
|
||||
});
|
||||
|
@ -114,6 +151,7 @@ pub fn Display() -> View {
|
|||
let mut frames_since_last_sample = 0;
|
||||
let mean_frame_interval = create_signal(0.0);
|
||||
|
||||
let assembly_for_raf = state.assembly.clone();
|
||||
on_mount(move || {
|
||||
// timing
|
||||
let mut last_time = 0.0;
|
||||
|
@ -126,6 +164,10 @@ pub fn Display() -> View {
|
|||
let mut rotation = DMatrix::<f64>::identity(5, 5);
|
||||
let mut location_z: f64 = 5.0;
|
||||
|
||||
// manipulation
|
||||
const TRANSLATION_SPEED: f64 = 0.15; // in length units per second
|
||||
const SHRINKING_SPEED: f64 = 0.15; // in length units per second
|
||||
|
||||
// display parameters
|
||||
const OPACITY: f32 = 0.5; /* SCAFFOLDING */
|
||||
const HIGHLIGHT: f32 = 0.2; /* SCAFFOLDING */
|
||||
|
@ -246,6 +288,16 @@ pub fn Display() -> View {
|
|||
let zoom_out_val = zoom_out.get();
|
||||
let turntable_val = turntable.get(); /* BENCHMARKING */
|
||||
|
||||
// get the manipulation state
|
||||
let translate_neg_x_val = translate_neg_x.get();
|
||||
let translate_pos_x_val = translate_pos_x.get();
|
||||
let translate_neg_y_val = translate_neg_y.get();
|
||||
let translate_pos_y_val = translate_pos_y.get();
|
||||
let translate_neg_z_val = translate_neg_z.get();
|
||||
let translate_pos_z_val = translate_pos_z.get();
|
||||
let shrink_neg_val = shrink_neg.get();
|
||||
let shrink_pos_val = shrink_pos.get();
|
||||
|
||||
// update the assembly's orientation
|
||||
let ang_vel = {
|
||||
let pitch = pitch_up_val - pitch_down_val;
|
||||
|
@ -271,6 +323,44 @@ pub fn Display() -> View {
|
|||
let zoom = zoom_out_val - zoom_in_val;
|
||||
location_z *= (time_step * ZOOM_SPEED * zoom).exp();
|
||||
|
||||
// manipulate the assembly
|
||||
if state.selection.with(|sel| sel.len() == 1) {
|
||||
let sel = state.selection.with(
|
||||
|sel| *sel.into_iter().next().unwrap()
|
||||
);
|
||||
let translate_x = translate_pos_x_val - translate_neg_x_val;
|
||||
let translate_y = translate_pos_y_val - translate_neg_y_val;
|
||||
let translate_z = translate_pos_z_val - translate_neg_z_val;
|
||||
let shrink = shrink_pos_val - shrink_neg_val;
|
||||
let translating =
|
||||
translate_x != 0.0
|
||||
|| translate_y != 0.0
|
||||
|| translate_z != 0.0;
|
||||
if translating || shrink != 0.0 {
|
||||
let elt_motion = {
|
||||
let u = if translating {
|
||||
TRANSLATION_SPEED * Vector3::new(
|
||||
translate_x, translate_y, translate_z
|
||||
).normalize()
|
||||
} else {
|
||||
Vector3::zeros()
|
||||
};
|
||||
time_step * DVector::from_column_slice(
|
||||
&[u[0], u[1], u[2], SHRINKING_SPEED * shrink]
|
||||
)
|
||||
};
|
||||
assembly_for_raf.deform(
|
||||
vec![
|
||||
ElementMotion {
|
||||
key: sel,
|
||||
velocity: elt_motion.as_view()
|
||||
}
|
||||
]
|
||||
);
|
||||
scene_changed.set(true);
|
||||
}
|
||||
}
|
||||
|
||||
if scene_changed.get() {
|
||||
/* INSTRUMENTS */
|
||||
// measure mean frame interval
|
||||
|
@ -292,26 +382,43 @@ pub fn Display() -> View {
|
|||
0.0, 0.0, 0.0, 0.0, 1.0
|
||||
])
|
||||
};
|
||||
let assembly_to_world = &location * &orientation;
|
||||
let asm_to_world = &location * &orientation;
|
||||
|
||||
// get the assembly
|
||||
let elements = state.assembly.elements.get_clone();
|
||||
let element_iter = (&elements).into_iter();
|
||||
let reps_world: Vec<_> = element_iter.clone().map(|(_, elt)| &assembly_to_world * &elt.rep).collect();
|
||||
let colors: Vec<_> = element_iter.clone().map(|(key, elt)|
|
||||
if state.selection.with(|sel| sel.contains(&key)) {
|
||||
elt.color.map(|ch| 0.2 + 0.8*ch)
|
||||
} else {
|
||||
elt.color
|
||||
}
|
||||
).collect();
|
||||
let highlights: Vec<_> = element_iter.map(|(key, _)|
|
||||
if state.selection.with(|sel| sel.contains(&key)) {
|
||||
1.0_f32
|
||||
} else {
|
||||
HIGHLIGHT
|
||||
}
|
||||
).collect();
|
||||
let (
|
||||
elt_cnt,
|
||||
reps_world,
|
||||
colors,
|
||||
highlights
|
||||
) = state.assembly.elements.with(|elts| {
|
||||
(
|
||||
// number of elements
|
||||
elts.len() as i32,
|
||||
|
||||
// representation vectors in world coordinates
|
||||
elts.iter().map(
|
||||
|(_, elt)| elt.representation.with(|rep| &asm_to_world * rep)
|
||||
).collect::<Vec<_>>(),
|
||||
|
||||
// colors
|
||||
elts.iter().map(|(key, elt)| {
|
||||
if state.selection.with(|sel| sel.contains(&key)) {
|
||||
elt.color.map(|ch| 0.2 + 0.8*ch)
|
||||
} else {
|
||||
elt.color
|
||||
}
|
||||
}).collect::<Vec<_>>(),
|
||||
|
||||
// highlight levels
|
||||
elts.iter().map(|(key, _)| {
|
||||
if state.selection.with(|sel| sel.contains(&key)) {
|
||||
1.0_f32
|
||||
} else {
|
||||
HIGHLIGHT
|
||||
}
|
||||
}).collect::<Vec<_>>()
|
||||
)
|
||||
});
|
||||
|
||||
// set the resolution
|
||||
let width = canvas.width() as f32;
|
||||
|
@ -320,7 +427,7 @@ pub fn Display() -> View {
|
|||
ctx.uniform1f(shortdim_loc.as_ref(), width.min(height));
|
||||
|
||||
// pass the assembly
|
||||
ctx.uniform1i(sphere_cnt_loc.as_ref(), elements.len() as i32);
|
||||
ctx.uniform1i(sphere_cnt_loc.as_ref(), elt_cnt);
|
||||
for n in 0..reps_world.len() {
|
||||
let v = &reps_world[n];
|
||||
ctx.uniform3f(
|
||||
|
@ -349,6 +456,9 @@ pub fn Display() -> View {
|
|||
// draw the scene
|
||||
ctx.draw_arrays(WebGl2RenderingContext::TRIANGLES, 0, VERTEX_CNT as i32);
|
||||
|
||||
// update the viewpoint
|
||||
assembly_to_world.set(asm_to_world);
|
||||
|
||||
// clear the scene change flag
|
||||
scene_changed.set(
|
||||
pitch_up_val != 0.0
|
||||
|
@ -369,7 +479,7 @@ pub fn Display() -> View {
|
|||
start_animation_loop();
|
||||
});
|
||||
|
||||
let set_nav_signal = move |event: KeyboardEvent, value: f64| {
|
||||
let set_nav_signal = move |event: &KeyboardEvent, value: f64| {
|
||||
let mut navigating = true;
|
||||
let shift = event.shift_key();
|
||||
match event.key().as_str() {
|
||||
|
@ -389,6 +499,25 @@ pub fn Display() -> View {
|
|||
}
|
||||
};
|
||||
|
||||
let set_manip_signal = move |event: &KeyboardEvent, value: f64| {
|
||||
let mut manipulating = true;
|
||||
let shift = event.shift_key();
|
||||
match event.key().as_str() {
|
||||
"d" | "D" => translate_pos_x.set(value),
|
||||
"a" | "A" => translate_neg_x.set(value),
|
||||
"w" | "W" if shift => translate_neg_z.set(value),
|
||||
"s" | "S" if shift => translate_pos_z.set(value),
|
||||
"w" | "W" => translate_pos_y.set(value),
|
||||
"s" | "S" => translate_neg_y.set(value),
|
||||
"]" | "}" => shrink_neg.set(value),
|
||||
"[" | "{" => shrink_pos.set(value),
|
||||
_ => manipulating = false
|
||||
};
|
||||
if manipulating {
|
||||
event.prevent_default();
|
||||
}
|
||||
};
|
||||
|
||||
view! {
|
||||
/* TO DO */
|
||||
// switch back to integer-valued parameters when that becomes possible
|
||||
|
@ -400,6 +529,7 @@ pub fn Display() -> View {
|
|||
tabindex="0",
|
||||
on:keydown=move |event: KeyboardEvent| {
|
||||
if event.key() == "Shift" {
|
||||
// swap navigation inputs
|
||||
roll_cw.set(yaw_right.get());
|
||||
roll_ccw.set(yaw_left.get());
|
||||
zoom_in.set(pitch_up.get());
|
||||
|
@ -408,16 +538,24 @@ pub fn Display() -> View {
|
|||
yaw_left.set(0.0);
|
||||
pitch_up.set(0.0);
|
||||
pitch_down.set(0.0);
|
||||
|
||||
// swap manipulation inputs
|
||||
translate_pos_z.set(translate_neg_y.get());
|
||||
translate_neg_z.set(translate_pos_y.get());
|
||||
translate_pos_y.set(0.0);
|
||||
translate_neg_y.set(0.0);
|
||||
} else {
|
||||
if event.key() == "Enter" { /* BENCHMARKING */
|
||||
turntable.set_fn(|turn| !turn);
|
||||
scene_changed.set(true);
|
||||
}
|
||||
set_nav_signal(event, 1.0);
|
||||
set_nav_signal(&event, 1.0);
|
||||
set_manip_signal(&event, 1.0);
|
||||
}
|
||||
},
|
||||
on:keyup=move |event: KeyboardEvent| {
|
||||
if event.key() == "Shift" {
|
||||
// swap navigation inputs
|
||||
yaw_right.set(roll_cw.get());
|
||||
yaw_left.set(roll_ccw.get());
|
||||
pitch_up.set(zoom_in.get());
|
||||
|
@ -426,8 +564,15 @@ pub fn Display() -> View {
|
|||
roll_ccw.set(0.0);
|
||||
zoom_in.set(0.0);
|
||||
zoom_out.set(0.0);
|
||||
|
||||
// swap manipulation inputs
|
||||
translate_pos_y.set(translate_neg_z.get());
|
||||
translate_neg_y.set(translate_pos_z.get());
|
||||
translate_pos_z.set(0.0);
|
||||
translate_neg_z.set(0.0);
|
||||
} else {
|
||||
set_nav_signal(event, 0.0);
|
||||
set_nav_signal(&event, 0.0);
|
||||
set_manip_signal(&event, 0.0);
|
||||
}
|
||||
},
|
||||
on:blur=move |_| {
|
||||
|
@ -437,6 +582,31 @@ pub fn Display() -> View {
|
|||
yaw_left.set(0.0);
|
||||
roll_ccw.set(0.0);
|
||||
roll_cw.set(0.0);
|
||||
},
|
||||
on:click=move |event: MouseEvent| {
|
||||
// find the nearest element along the pointer direction
|
||||
let dir = event_dir(&event);
|
||||
console::log_1(&JsValue::from(dir.to_string()));
|
||||
let mut clicked: Option<(ElementKey, f64)> = None;
|
||||
for (key, elt) in state.assembly.elements.get_clone_untracked() {
|
||||
match assembly_to_world.with(|asm_to_world| elt.cast(dir, asm_to_world)) {
|
||||
Some(depth) => match clicked {
|
||||
Some((_, best_depth)) => {
|
||||
if depth < best_depth {
|
||||
clicked = Some((key, depth))
|
||||
}
|
||||
},
|
||||
None => clicked = Some((key, depth))
|
||||
}
|
||||
None => ()
|
||||
};
|
||||
}
|
||||
|
||||
// if we clicked something, select it
|
||||
match clicked {
|
||||
Some((key, _)) => state.select(key, event.shift_key()),
|
||||
None => state.selection.update(|sel| sel.clear())
|
||||
};
|
||||
}
|
||||
)
|
||||
}
|
||||
|
|
|
@ -1,4 +1,13 @@
|
|||
use nalgebra::DVector;
|
||||
use lazy_static::lazy_static;
|
||||
use nalgebra::{Const, DMatrix, DVector, DVectorView, Dyn, SymmetricEigen};
|
||||
use web_sys::{console, wasm_bindgen::JsValue}; /* DEBUG */
|
||||
|
||||
// --- elements ---
|
||||
|
||||
#[cfg(feature = "dev")]
|
||||
pub fn point(x: f64, y: f64, z: f64) -> DVector<f64> {
|
||||
DVector::from_column_slice(&[x, y, z, 0.5, 0.5*(x*x + y*y + z*z)])
|
||||
}
|
||||
|
||||
// the sphere with the given center and radius, with inward-pointing normals
|
||||
pub fn sphere(center_x: f64, center_y: f64, center_z: f64, radius: f64) -> DVector<f64> {
|
||||
|
@ -24,4 +33,831 @@ pub fn sphere_with_offset(dir_x: f64, dir_y: f64, dir_z: f64, off: f64, curv: f6
|
|||
0.5 * curv,
|
||||
off * (1.0 + 0.5 * off * curv)
|
||||
])
|
||||
}
|
||||
|
||||
// --- partial matrices ---
|
||||
|
||||
struct MatrixEntry {
|
||||
index: (usize, usize),
|
||||
value: f64
|
||||
}
|
||||
|
||||
pub struct PartialMatrix(Vec<MatrixEntry>);
|
||||
|
||||
impl PartialMatrix {
|
||||
pub fn new() -> PartialMatrix {
|
||||
PartialMatrix(Vec::<MatrixEntry>::new())
|
||||
}
|
||||
|
||||
pub fn push_sym(&mut self, row: usize, col: usize, value: f64) {
|
||||
let PartialMatrix(entries) = self;
|
||||
entries.push(MatrixEntry { index: (row, col), value: value });
|
||||
if row != col {
|
||||
entries.push(MatrixEntry { index: (col, row), value: value });
|
||||
}
|
||||
}
|
||||
|
||||
/* DEBUG */
|
||||
pub fn log_to_console(&self) {
|
||||
let PartialMatrix(entries) = self;
|
||||
for ent in entries {
|
||||
let ent_str = format!(" {} {} {}", ent.index.0, ent.index.1, ent.value);
|
||||
console::log_1(&JsValue::from(ent_str.as_str()));
|
||||
}
|
||||
}
|
||||
|
||||
fn proj(&self, a: &DMatrix<f64>) -> DMatrix<f64> {
|
||||
let mut result = DMatrix::<f64>::zeros(a.nrows(), a.ncols());
|
||||
let PartialMatrix(entries) = self;
|
||||
for ent in entries {
|
||||
result[ent.index] = a[ent.index];
|
||||
}
|
||||
result
|
||||
}
|
||||
|
||||
fn sub_proj(&self, rhs: &DMatrix<f64>) -> DMatrix<f64> {
|
||||
let mut result = DMatrix::<f64>::zeros(rhs.nrows(), rhs.ncols());
|
||||
let PartialMatrix(entries) = self;
|
||||
for ent in entries {
|
||||
result[ent.index] = ent.value - rhs[ent.index];
|
||||
}
|
||||
result
|
||||
}
|
||||
}
|
||||
|
||||
// --- configuration subspaces ---
|
||||
|
||||
#[derive(Clone)]
|
||||
pub struct ConfigSubspace {
|
||||
assembly_dim: usize,
|
||||
basis_std: Vec<DMatrix<f64>>,
|
||||
basis_proj: Vec<DMatrix<f64>>
|
||||
}
|
||||
|
||||
impl ConfigSubspace {
|
||||
pub fn zero(assembly_dim: usize) -> ConfigSubspace {
|
||||
ConfigSubspace {
|
||||
assembly_dim: assembly_dim,
|
||||
basis_proj: Vec::new(),
|
||||
basis_std: Vec::new()
|
||||
}
|
||||
}
|
||||
|
||||
// approximate the kernel of a symmetric endomorphism of the configuration
|
||||
// space for `assembly_dim` elements. we consider an eigenvector to be part
|
||||
// of the kernel if the absolute value of its eigenvalue is smaller than the
// constant `THRESHOLD`
|
||||
fn symmetric_kernel(a: DMatrix<f64>, proj_to_std: DMatrix<f64>, assembly_dim: usize) -> ConfigSubspace {
|
||||
// find a basis for the kernel. the basis is expressed in the projection
|
||||
// coordinates, and it's orthonormal with respect to the projection
|
||||
// inner product
|
||||
const THRESHOLD: f64 = 0.1;
|
||||
let eig = SymmetricEigen::new(proj_to_std.tr_mul(&a) * &proj_to_std);
|
||||
let eig_vecs = eig.eigenvectors.column_iter();
|
||||
let eig_pairs = eig.eigenvalues.iter().zip(eig_vecs);
|
||||
let basis_proj = DMatrix::from_columns(
|
||||
eig_pairs.filter_map(
|
||||
|(λ, v)| (λ.abs() < THRESHOLD).then_some(v)
|
||||
).collect::<Vec<_>>().as_slice()
|
||||
);
|
||||
|
||||
/* DEBUG */
|
||||
// print the eigenvalues
|
||||
#[cfg(all(target_family = "wasm", target_os = "unknown"))]
|
||||
console::log_1(&JsValue::from(
|
||||
format!("Eigenvalues used to find kernel:{}", eig.eigenvalues)
|
||||
));
|
||||
|
||||
// express the basis in the standard coordinates
|
||||
let basis_std = proj_to_std * &basis_proj;
|
||||
|
||||
const ELEMENT_DIM: usize = 5;
|
||||
const UNIFORM_DIM: usize = 4;
|
||||
ConfigSubspace {
|
||||
assembly_dim: assembly_dim,
|
||||
basis_std: basis_std.column_iter().map(
|
||||
|v| Into::<DMatrix<f64>>::into(
|
||||
v.reshape_generic(Dyn(ELEMENT_DIM), Dyn(assembly_dim))
|
||||
)
|
||||
).collect(),
|
||||
basis_proj: basis_proj.column_iter().map(
|
||||
|v| Into::<DMatrix<f64>>::into(
|
||||
v.reshape_generic(Dyn(UNIFORM_DIM), Dyn(assembly_dim))
|
||||
)
|
||||
).collect()
|
||||
}
|
||||
}
|
||||
|
||||
pub fn dim(&self) -> usize {
|
||||
self.basis_std.len()
|
||||
}
|
||||
|
||||
pub fn assembly_dim(&self) -> usize {
|
||||
self.assembly_dim
|
||||
}
|
||||
|
||||
// find the projection onto this subspace of the motion where the element
|
||||
// with the given column index has velocity `v`. the velocity is given in
|
||||
// projection coordinates, and the projection is done with respect to the
|
||||
// projection inner product
|
||||
pub fn proj(&self, v: &DVectorView<f64>, column_index: usize) -> DMatrix<f64> {
|
||||
if self.dim() == 0 {
|
||||
const ELEMENT_DIM: usize = 5;
|
||||
DMatrix::zeros(ELEMENT_DIM, self.assembly_dim)
|
||||
} else {
|
||||
self.basis_proj.iter().zip(self.basis_std.iter()).map(
|
||||
|(b_proj, b_std)| b_proj.column(column_index).dot(&v) * b_std
|
||||
).sum()
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// --- descent history ---
|
||||
|
||||
pub struct DescentHistory {
|
||||
pub config: Vec<DMatrix<f64>>,
|
||||
pub scaled_loss: Vec<f64>,
|
||||
pub neg_grad: Vec<DMatrix<f64>>,
|
||||
pub min_eigval: Vec<f64>,
|
||||
pub base_step: Vec<DMatrix<f64>>,
|
||||
pub backoff_steps: Vec<i32>
|
||||
}
|
||||
|
||||
impl DescentHistory {
|
||||
fn new() -> DescentHistory {
|
||||
DescentHistory {
|
||||
config: Vec::<DMatrix<f64>>::new(),
|
||||
scaled_loss: Vec::<f64>::new(),
|
||||
neg_grad: Vec::<DMatrix<f64>>::new(),
|
||||
min_eigval: Vec::<f64>::new(),
|
||||
base_step: Vec::<DMatrix<f64>>::new(),
|
||||
backoff_steps: Vec::<i32>::new(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// --- gram matrix realization ---
|
||||
|
||||
// the Lorentz form
|
||||
lazy_static! {
|
||||
pub static ref Q: DMatrix<f64> = DMatrix::from_row_slice(5, 5, &[
|
||||
1.0, 0.0, 0.0, 0.0, 0.0,
|
||||
0.0, 1.0, 0.0, 0.0, 0.0,
|
||||
0.0, 0.0, 1.0, 0.0, 0.0,
|
||||
0.0, 0.0, 0.0, 0.0, -2.0,
|
||||
0.0, 0.0, 0.0, -2.0, 0.0
|
||||
]);
|
||||
}
|
||||
|
||||
struct SearchState {
|
||||
config: DMatrix<f64>,
|
||||
err_proj: DMatrix<f64>,
|
||||
loss: f64
|
||||
}
|
||||
|
||||
impl SearchState {
|
||||
fn from_config(gram: &PartialMatrix, config: DMatrix<f64>) -> SearchState {
|
||||
let err_proj = gram.sub_proj(&(config.tr_mul(&*Q) * &config));
|
||||
let loss = err_proj.norm_squared();
|
||||
SearchState {
|
||||
config: config,
|
||||
err_proj: err_proj,
|
||||
loss: loss
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
fn basis_matrix(index: (usize, usize), nrows: usize, ncols: usize) -> DMatrix<f64> {
|
||||
let mut result = DMatrix::<f64>::zeros(nrows, ncols);
|
||||
result[index] = 1.0;
|
||||
result
|
||||
}
|
||||
|
||||
// given a normalized vector `v` representing an element, build a basis for the
|
||||
// element's linear configuration space consisting of:
|
||||
// - the unit translation motions of the element
|
||||
// - the unit shrinking motion of the element, if it's a sphere
|
||||
// - one or two vectors whose coefficients vanish on the tangent space of the
|
||||
// normalization variety
|
||||
pub fn local_unif_to_std(v: DVectorView<f64>) -> DMatrix<f64> {
|
||||
const ELEMENT_DIM: usize = 5;
|
||||
const UNIFORM_DIM: usize = 4;
|
||||
let curv = 2.0*v[3];
|
||||
if v.dot(&(&*Q * v)) < 0.5 {
|
||||
// `v` represents a point. the normalization condition says that the
|
||||
// curvature component of `v` is 1/2
|
||||
DMatrix::from_column_slice(ELEMENT_DIM, UNIFORM_DIM, &[
|
||||
curv, 0.0, 0.0, 0.0, v[0],
|
||||
0.0, curv, 0.0, 0.0, v[1],
|
||||
0.0, 0.0, curv, 0.0, v[2],
|
||||
0.0, 0.0, 0.0, 0.0, 1.0
|
||||
])
|
||||
} else {
|
||||
// `v` represents a sphere. the normalization condition says that the
|
||||
// Lorentz product of `v` with itself is 1
|
||||
DMatrix::from_column_slice(ELEMENT_DIM, UNIFORM_DIM, &[
|
||||
curv, 0.0, 0.0, 0.0, v[0],
|
||||
0.0, curv, 0.0, 0.0, v[1],
|
||||
0.0, 0.0, curv, 0.0, v[2],
|
||||
curv*v[0], curv*v[1], curv*v[2], curv*v[3], curv*v[4] + 1.0
|
||||
])
|
||||
}
|
||||
}
|
||||
|
||||
// use backtracking line search to find a better configuration
|
||||
fn seek_better_config(
|
||||
gram: &PartialMatrix,
|
||||
state: &SearchState,
|
||||
base_step: &DMatrix<f64>,
|
||||
base_target_improvement: f64,
|
||||
min_efficiency: f64,
|
||||
backoff: f64,
|
||||
max_backoff_steps: i32
|
||||
) -> Option<(SearchState, i32)> {
|
||||
let mut rate = 1.0;
|
||||
for backoff_steps in 0..max_backoff_steps {
|
||||
let trial_config = &state.config + rate * base_step;
|
||||
let trial_state = SearchState::from_config(gram, trial_config);
|
||||
let improvement = state.loss - trial_state.loss;
|
||||
if improvement >= min_efficiency * rate * base_target_improvement {
|
||||
return Some((trial_state, backoff_steps));
|
||||
}
|
||||
rate *= backoff;
|
||||
}
|
||||
None
|
||||
}
|
||||
|
||||
// seek a matrix `config` for which `config' * Q * config` matches the partial
|
||||
// matrix `gram`. use gradient descent starting from `guess`
|
||||
pub fn realize_gram(
|
||||
gram: &PartialMatrix,
|
||||
guess: DMatrix<f64>,
|
||||
frozen: &[(usize, usize)],
|
||||
scaled_tol: f64,
|
||||
min_efficiency: f64,
|
||||
backoff: f64,
|
||||
reg_scale: f64,
|
||||
max_descent_steps: i32,
|
||||
max_backoff_steps: i32
|
||||
) -> (DMatrix<f64>, ConfigSubspace, bool, DescentHistory) {
|
||||
// start the descent history
|
||||
let mut history = DescentHistory::new();
|
||||
|
||||
// find the dimension of the search space
|
||||
let element_dim = guess.nrows();
|
||||
let assembly_dim = guess.ncols();
|
||||
let total_dim = element_dim * assembly_dim;
|
||||
|
||||
// scale the tolerance
|
||||
let scale_adjustment = (gram.0.len() as f64).sqrt();
|
||||
let tol = scale_adjustment * scaled_tol;
|
||||
|
||||
// convert the frozen indices to stacked format
|
||||
let frozen_stacked: Vec<usize> = frozen.into_iter().map(
|
||||
|index| index.1*element_dim + index.0
|
||||
).collect();
|
||||
|
||||
// use Newton's method with backtracking and gradient descent backup
|
||||
let mut state = SearchState::from_config(gram, guess);
|
||||
let mut hess = DMatrix::zeros(element_dim, assembly_dim);
|
||||
for _ in 0..max_descent_steps {
|
||||
// find the negative gradient of the loss function
|
||||
let neg_grad = 4.0 * &*Q * &state.config * &state.err_proj;
|
||||
let mut neg_grad_stacked = neg_grad.clone().reshape_generic(Dyn(total_dim), Const::<1>);
|
||||
history.neg_grad.push(neg_grad.clone());
|
||||
|
||||
// find the negative Hessian of the loss function
|
||||
let mut hess_cols = Vec::<DVector<f64>>::with_capacity(total_dim);
|
||||
for col in 0..assembly_dim {
|
||||
for row in 0..element_dim {
|
||||
let index = (row, col);
|
||||
let basis_mat = basis_matrix(index, element_dim, assembly_dim);
|
||||
let neg_d_err =
|
||||
basis_mat.tr_mul(&*Q) * &state.config
|
||||
+ state.config.tr_mul(&*Q) * &basis_mat;
|
||||
let neg_d_err_proj = gram.proj(&neg_d_err);
|
||||
let deriv_grad = 4.0 * &*Q * (
|
||||
-&basis_mat * &state.err_proj
|
||||
+ &state.config * &neg_d_err_proj
|
||||
);
|
||||
hess_cols.push(deriv_grad.reshape_generic(Dyn(total_dim), Const::<1>));
|
||||
}
|
||||
}
|
||||
hess = DMatrix::from_columns(hess_cols.as_slice());
|
||||
|
||||
// regularize the Hessian
|
||||
let min_eigval = hess.symmetric_eigenvalues().min();
|
||||
if min_eigval <= 0.0 {
|
||||
hess -= reg_scale * min_eigval * DMatrix::identity(total_dim, total_dim);
|
||||
}
|
||||
history.min_eigval.push(min_eigval);
|
||||
|
||||
// project the negative gradient and negative Hessian onto the
|
||||
// orthogonal complement of the frozen subspace
|
||||
let zero_col = DVector::zeros(total_dim);
|
||||
let zero_row = zero_col.transpose();
|
||||
for &k in &frozen_stacked {
|
||||
neg_grad_stacked[k] = 0.0;
|
||||
hess.set_row(k, &zero_row);
|
||||
hess.set_column(k, &zero_col);
|
||||
hess[(k, k)] = 1.0;
|
||||
}
|
||||
|
||||
// stop if the loss is tolerably low
|
||||
history.config.push(state.config.clone());
|
||||
history.scaled_loss.push(state.loss / scale_adjustment);
|
||||
if state.loss < tol { break; }
|
||||
|
||||
// compute the Newton step
|
||||
/*
|
||||
we need to either handle or eliminate the case where the minimum
|
||||
eigenvalue of the Hessian is zero, so the regularized Hessian is
|
||||
singular. right now, this causes the Cholesky decomposition to return
|
||||
`None`, leading to a panic when we unwrap
|
||||
*/
|
||||
let base_step_stacked = hess.clone().cholesky().unwrap().solve(&neg_grad_stacked);
|
||||
let base_step = base_step_stacked.reshape_generic(Dyn(element_dim), Dyn(assembly_dim));
|
||||
history.base_step.push(base_step.clone());
|
||||
|
||||
// use backtracking line search to find a better configuration
|
||||
match seek_better_config(
|
||||
gram, &state, &base_step, neg_grad.dot(&base_step),
|
||||
min_efficiency, backoff, max_backoff_steps
|
||||
) {
|
||||
Some((better_state, backoff_steps)) => {
|
||||
state = better_state;
|
||||
history.backoff_steps.push(backoff_steps);
|
||||
},
|
||||
None => return (state.config, ConfigSubspace::zero(assembly_dim), false, history)
|
||||
};
|
||||
}
|
||||
let success = state.loss < tol;
|
||||
let tangent = if success {
|
||||
// express the uniform basis in the standard basis
|
||||
const UNIFORM_DIM: usize = 4;
|
||||
let total_dim_unif = UNIFORM_DIM * assembly_dim;
|
||||
let mut unif_to_std = DMatrix::<f64>::zeros(total_dim, total_dim_unif);
|
||||
for n in 0..assembly_dim {
|
||||
let block_start = (element_dim * n, UNIFORM_DIM * n);
|
||||
unif_to_std
|
||||
.view_mut(block_start, (element_dim, UNIFORM_DIM))
|
||||
.copy_from(&local_unif_to_std(state.config.column(n)));
|
||||
}
|
||||
|
||||
// find the kernel of the Hessian. give it the uniform inner product
|
||||
ConfigSubspace::symmetric_kernel(hess, unif_to_std, assembly_dim)
|
||||
} else {
|
||||
ConfigSubspace::zero(assembly_dim)
|
||||
};
|
||||
(state.config, tangent, success, history)
|
||||
}
|
||||
|
||||
// --- tests ---
|
||||
|
||||
// this problem is from a sangaku by Irisawa Shintarō Hiroatsu. the article
|
||||
// below includes a nice translation of the problem statement, which was
|
||||
// recorded in Uchida Itsumi's book _Kokon sankan_ (_Mathematics, Past and
|
||||
// Present_)
|
||||
//
|
||||
// "Japan's 'Wasan' Mathematical Tradition", by Abe Haruki
|
||||
// https://www.nippon.com/en/japan-topics/c12801/
|
||||
//
|
||||
#[cfg(feature = "dev")]
|
||||
pub mod irisawa {
|
||||
use std::{array, f64::consts::PI};
|
||||
|
||||
use super::*;
|
||||
|
||||
pub fn realize_irisawa_hexlet(scaled_tol: f64) -> (DMatrix<f64>, ConfigSubspace, bool, DescentHistory) {
|
||||
let gram = {
|
||||
let mut gram_to_be = PartialMatrix::new();
|
||||
for s in 0..9 {
|
||||
// each sphere is represented by a spacelike vector
|
||||
gram_to_be.push_sym(s, s, 1.0);
|
||||
|
||||
// the circumscribing sphere is tangent to all of the other
|
||||
// spheres, with matching orientation
|
||||
if s > 0 {
|
||||
gram_to_be.push_sym(0, s, 1.0);
|
||||
}
|
||||
|
||||
if s > 2 {
|
||||
// each chain sphere is tangent to the "sun" and "moon"
|
||||
// spheres, with opposing orientation
|
||||
for n in 1..3 {
|
||||
gram_to_be.push_sym(s, n, -1.0);
|
||||
}
|
||||
|
||||
// each chain sphere is tangent to the next chain sphere,
|
||||
// with opposing orientation
|
||||
let s_next = 3 + (s-2) % 6;
|
||||
gram_to_be.push_sym(s, s_next, -1.0);
|
||||
}
|
||||
}
|
||||
gram_to_be
|
||||
};
|
||||
|
||||
let guess = DMatrix::from_columns(
|
||||
[
|
||||
sphere(0.0, 0.0, 0.0, 15.0),
|
||||
sphere(0.0, 0.0, -9.0, 5.0),
|
||||
sphere(0.0, 0.0, 11.0, 3.0)
|
||||
].into_iter().chain(
|
||||
(1..=6).map(
|
||||
|k| {
|
||||
let ang = (k as f64) * PI/3.0;
|
||||
sphere(9.0 * ang.cos(), 9.0 * ang.sin(), 0.0, 2.5)
|
||||
}
|
||||
)
|
||||
).collect::<Vec<_>>().as_slice()
|
||||
);
|
||||
|
||||
// the frozen entries fix the radii of the circumscribing sphere, the
|
||||
// "sun" and "moon" spheres, and one of the chain spheres
|
||||
let frozen: [(usize, usize); 4] = array::from_fn(|k| (3, k));
|
||||
|
||||
realize_gram(
|
||||
&gram, guess, &frozen,
|
||||
scaled_tol, 0.5, 0.9, 1.1, 200, 110
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(test)]
|
||||
mod tests {
|
||||
use nalgebra::Vector3;
|
||||
use std::{array, f64::consts::{FRAC_1_SQRT_2, PI}, iter};
|
||||
|
||||
use super::{*, irisawa::realize_irisawa_hexlet};
|
||||
|
||||
#[test]
|
||||
fn sub_proj_test() {
|
||||
let target = PartialMatrix(vec![
|
||||
MatrixEntry { index: (0, 0), value: 19.0 },
|
||||
MatrixEntry { index: (0, 2), value: 39.0 },
|
||||
MatrixEntry { index: (1, 1), value: 59.0 },
|
||||
MatrixEntry { index: (1, 2), value: 69.0 }
|
||||
]);
|
||||
let attempt = DMatrix::<f64>::from_row_slice(2, 3, &[
|
||||
1.0, 2.0, 3.0,
|
||||
4.0, 5.0, 6.0
|
||||
]);
|
||||
let expected_result = DMatrix::<f64>::from_row_slice(2, 3, &[
|
||||
18.0, 0.0, 36.0,
|
||||
0.0, 54.0, 63.0
|
||||
]);
|
||||
assert_eq!(target.sub_proj(&attempt), expected_result);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn zero_loss_test() {
|
||||
let gram = PartialMatrix({
|
||||
let mut entries = Vec::<MatrixEntry>::new();
|
||||
for j in 0..3 {
|
||||
for k in 0..3 {
|
||||
entries.push(MatrixEntry {
|
||||
index: (j, k),
|
||||
value: if j == k { 1.0 } else { -1.0 }
|
||||
});
|
||||
}
|
||||
}
|
||||
entries
|
||||
});
|
||||
let config = {
|
||||
let a: f64 = 0.75_f64.sqrt();
|
||||
DMatrix::from_columns(&[
|
||||
sphere(1.0, 0.0, 0.0, a),
|
||||
sphere(-0.5, a, 0.0, a),
|
||||
sphere(-0.5, -a, 0.0, a)
|
||||
])
|
||||
};
|
||||
let state = SearchState::from_config(&gram, config);
|
||||
assert!(state.loss.abs() < f64::EPSILON);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn irisawa_hexlet_test() {
|
||||
// solve Irisawa's problem
|
||||
const SCALED_TOL: f64 = 1.0e-12;
|
||||
let (config, _, _, _) = realize_irisawa_hexlet(SCALED_TOL);
|
||||
|
||||
// check against Irisawa's solution
|
||||
let entry_tol = SCALED_TOL.sqrt();
|
||||
let solution_diams = [30.0, 10.0, 6.0, 5.0, 15.0, 10.0, 3.75, 2.5, 2.0 + 8.0/11.0];
|
||||
for (k, diam) in solution_diams.into_iter().enumerate() {
|
||||
assert!((config[(3, k)] - 1.0 / diam).abs() < entry_tol);
|
||||
}
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn tangent_test_three_spheres() {
|
||||
const SCALED_TOL: f64 = 1.0e-12;
|
||||
let gram = {
|
||||
let mut gram_to_be = PartialMatrix::new();
|
||||
for j in 0..3 {
|
||||
for k in j..3 {
|
||||
gram_to_be.push_sym(j, k, if j == k { 1.0 } else { -1.0 });
|
||||
}
|
||||
}
|
||||
gram_to_be
|
||||
};
|
||||
let guess = DMatrix::from_columns(&[
|
||||
sphere(0.0, 0.0, 0.0, -2.0),
|
||||
sphere(0.0, 0.0, 1.0, 1.0),
|
||||
sphere(0.0, 0.0, -1.0, 1.0)
|
||||
]);
|
||||
let frozen: [_; 5] = std::array::from_fn(|k| (k, 0));
|
||||
let (config, tangent, success, history) = realize_gram(
|
||||
&gram, guess.clone(), &frozen,
|
||||
SCALED_TOL, 0.5, 0.9, 1.1, 200, 110
|
||||
);
|
||||
assert_eq!(config, guess);
|
||||
assert_eq!(success, true);
|
||||
assert_eq!(history.scaled_loss.len(), 1);
|
||||
|
||||
// confirm that the tangent space has dimension five or less
|
||||
assert_eq!(tangent.basis_std.len(), 5);
|
||||
|
||||
// confirm that the tangent space contains all the motions we expect it
|
||||
// to. since we've already bounded the dimension of the tangent space,
|
||||
// this confirms that the tangent space is what we expect it to be
|
||||
const UNIFORM_DIM: usize = 4;
|
||||
let element_dim = guess.nrows();
|
||||
let assembly_dim = guess.ncols();
|
||||
let tangent_motions_unif = vec![
|
||||
basis_matrix((0, 1), UNIFORM_DIM, assembly_dim),
|
||||
basis_matrix((1, 1), UNIFORM_DIM, assembly_dim),
|
||||
basis_matrix((0, 2), UNIFORM_DIM, assembly_dim),
|
||||
basis_matrix((1, 2), UNIFORM_DIM, assembly_dim),
|
||||
DMatrix::<f64>::from_column_slice(UNIFORM_DIM, assembly_dim, &[
|
||||
0.0, 0.0, 0.0, 0.0,
|
||||
0.0, 0.0, -0.5, -0.5,
|
||||
0.0, 0.0, -0.5, 0.5
|
||||
])
|
||||
];
|
||||
let tangent_motions_std = vec![
|
||||
basis_matrix((0, 1), element_dim, assembly_dim),
|
||||
basis_matrix((1, 1), element_dim, assembly_dim),
|
||||
basis_matrix((0, 2), element_dim, assembly_dim),
|
||||
basis_matrix((1, 2), element_dim, assembly_dim),
|
||||
DMatrix::<f64>::from_column_slice(element_dim, assembly_dim, &[
|
||||
0.0, 0.0, 0.0, 0.00, 0.0,
|
||||
0.0, 0.0, -1.0, -0.25, -1.0,
|
||||
0.0, 0.0, -1.0, 0.25, 1.0
|
||||
])
|
||||
];
|
||||
let tol_sq = ((element_dim * assembly_dim) as f64) * SCALED_TOL * SCALED_TOL;
|
||||
for (motion_unif, motion_std) in tangent_motions_unif.into_iter().zip(tangent_motions_std) {
|
||||
let motion_proj: DMatrix<_> = motion_unif.column_iter().enumerate().map(
|
||||
|(k, v)| tangent.proj(&v, k)
|
||||
).sum();
|
||||
assert!((motion_std - motion_proj).norm_squared() < tol_sq);
|
||||
}
|
||||
}
|
||||
|
||||
fn translation_motion_unif(vel: &Vector3<f64>, assembly_dim: usize) -> Vec<DVector<f64>> {
|
||||
let mut elt_motion = DVector::zeros(4);
|
||||
elt_motion.fixed_rows_mut::<3>(0).copy_from(vel);
|
||||
iter::repeat(elt_motion).take(assembly_dim).collect()
|
||||
}
|
||||
|
||||
fn rotation_motion_unif(ang_vel: &Vector3<f64>, points: Vec<DVectorView<f64>>) -> Vec<DVector<f64>> {
|
||||
points.into_iter().map(
|
||||
|pt| {
|
||||
let vel = ang_vel.cross(&pt.fixed_rows::<3>(0));
|
||||
let mut elt_motion = DVector::zeros(4);
|
||||
elt_motion.fixed_rows_mut::<3>(0).copy_from(&vel);
|
||||
elt_motion
|
||||
}
|
||||
).collect()
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn tangent_test_kaleidocycle() {
|
||||
// set up a kaleidocycle, made of points with fixed distances between
|
||||
// them, and find its tangent space
|
||||
const N_POINTS: usize = 12;
|
||||
const N_HINGES: usize = 6;
|
||||
const SCALED_TOL: f64 = 1.0e-12;
|
||||
let gram = {
|
||||
let mut gram_to_be = PartialMatrix::new();
|
||||
for block in (0..N_POINTS).step_by(2) {
|
||||
let block_next = (block + 2) % N_POINTS;
|
||||
for j in 0..2 {
|
||||
// diagonal and hinge edges
|
||||
for k in j..2 {
|
||||
gram_to_be.push_sym(block + j, block + k, if j == k { 0.0 } else { -0.5 });
|
||||
}
|
||||
|
||||
// non-hinge edges
|
||||
for k in 0..2 {
|
||||
gram_to_be.push_sym(block + j, block_next + k, -0.625);
|
||||
}
|
||||
}
|
||||
}
|
||||
gram_to_be
|
||||
};
|
||||
let guess = {
|
||||
let guess_elts = (0..N_HINGES).step_by(2).flat_map(
|
||||
|n| {
|
||||
let ang_hor = (n as f64) * PI/3.0;
|
||||
let ang_vert = ((n + 1) as f64) * PI/3.0;
|
||||
let x_vert = ang_vert.cos();
|
||||
let y_vert = ang_vert.sin();
|
||||
[
|
||||
point(0.0, 0.0, 0.0),
|
||||
point(ang_hor.cos(), ang_hor.sin(), 0.0),
|
||||
point(x_vert, y_vert, -0.5),
|
||||
point(x_vert, y_vert, 0.5)
|
||||
]
|
||||
}
|
||||
).collect::<Vec<_>>();
|
||||
DMatrix::from_columns(&guess_elts)
|
||||
};
|
||||
let frozen: [_; N_POINTS] = array::from_fn(|k| (3, k));
|
||||
let (config, tangent, success, history) = realize_gram(
|
||||
&gram, guess.clone(), &frozen,
|
||||
SCALED_TOL, 0.5, 0.9, 1.1, 200, 110
|
||||
);
|
||||
assert_eq!(config, guess);
|
||||
assert_eq!(success, true);
|
||||
assert_eq!(history.scaled_loss.len(), 1);
|
||||
|
||||
// list some motions that should form a basis for the tangent space of
|
||||
// the solution variety
|
||||
let element_dim = guess.nrows();
|
||||
let assembly_dim = guess.ncols();
|
||||
let tangent_motions_unif = vec![
|
||||
// the translations along the coordinate axes
|
||||
translation_motion_unif(&Vector3::new(1.0, 0.0, 0.0), assembly_dim),
|
||||
translation_motion_unif(&Vector3::new(0.0, 1.0, 0.0), assembly_dim),
|
||||
translation_motion_unif(&Vector3::new(0.0, 0.0, 1.0), assembly_dim),
|
||||
|
||||
// the rotations about the coordinate axes
|
||||
rotation_motion_unif(&Vector3::new(1.0, 0.0, 0.0), guess.column_iter().collect()),
|
||||
rotation_motion_unif(&Vector3::new(0.0, 1.0, 0.0), guess.column_iter().collect()),
|
||||
rotation_motion_unif(&Vector3::new(0.0, 0.0, 1.0), guess.column_iter().collect()),
|
||||
|
||||
// the twist motion. more precisely: a motion that keeps the center
|
||||
// of mass stationary and preserves the distances between the
|
||||
// vertices to first order. this has to be the twist as long as:
|
||||
// - twisting is the kaleidocycle's only internal degree of
|
||||
// freedom
|
||||
// - every first-order motion of the kaleidocycle comes from an
|
||||
// actual motion
|
||||
(0..N_HINGES).step_by(2).flat_map(
|
||||
|n| {
|
||||
let ang_vert = ((n + 1) as f64) * PI/3.0;
|
||||
let vel_vert_x = 4.0 * ang_vert.cos();
|
||||
let vel_vert_y = 4.0 * ang_vert.sin();
|
||||
[
|
||||
DVector::from_column_slice(&[0.0, 0.0, 5.0, 0.0]),
|
||||
DVector::from_column_slice(&[0.0, 0.0, 1.0, 0.0]),
|
||||
DVector::from_column_slice(&[-vel_vert_x, -vel_vert_y, -3.0, 0.0]),
|
||||
DVector::from_column_slice(&[vel_vert_x, vel_vert_y, -3.0, 0.0])
|
||||
]
|
||||
}
|
||||
).collect::<Vec<_>>()
|
||||
];
|
||||
let tangent_motions_std = tangent_motions_unif.iter().map(
|
||||
|motion| DMatrix::from_columns(
|
||||
&guess.column_iter().zip(motion).map(
|
||||
|(v, elt_motion)| local_unif_to_std(v) * elt_motion
|
||||
).collect::<Vec<_>>()
|
||||
)
|
||||
).collect::<Vec<_>>();
|
||||
|
||||
// confirm that the dimension of the tangent space is no greater than
|
||||
// expected
|
||||
assert_eq!(tangent.basis_std.len(), tangent_motions_unif.len());
|
||||
|
||||
// confirm that the tangent space contains all the motions we expect it
|
||||
// to. since we've already bounded the dimension of the tangent space,
|
||||
// this confirms that the tangent space is what we expect it to be
|
||||
let tol_sq = ((element_dim * assembly_dim) as f64) * SCALED_TOL * SCALED_TOL;
|
||||
for (motion_unif, motion_std) in tangent_motions_unif.into_iter().zip(tangent_motions_std) {
|
||||
let motion_proj: DMatrix<_> = motion_unif.into_iter().enumerate().map(
|
||||
|(k, v)| tangent.proj(&v.as_view(), k)
|
||||
).sum();
|
||||
assert!((motion_std - motion_proj).norm_squared() < tol_sq);
|
||||
}
|
||||
}
|
||||
|
||||
fn translation(dis: Vector3<f64>) -> DMatrix<f64> {
|
||||
const ELEMENT_DIM: usize = 5;
|
||||
DMatrix::from_column_slice(ELEMENT_DIM, ELEMENT_DIM, &[
|
||||
1.0, 0.0, 0.0, 0.0, dis[0],
|
||||
0.0, 1.0, 0.0, 0.0, dis[1],
|
||||
0.0, 0.0, 1.0, 0.0, dis[2],
|
||||
2.0*dis[0], 2.0*dis[1], 2.0*dis[2], 1.0, dis.norm_squared(),
|
||||
0.0, 0.0, 0.0, 0.0, 1.0
|
||||
])
|
||||
}
|
||||
|
||||
// confirm that projection onto a configuration subspace is equivariant with
|
||||
// respect to Euclidean motions
|
||||
#[test]
|
||||
fn proj_equivar_test() {
|
||||
// find a pair of spheres that meet at 120°
|
||||
const SCALED_TOL: f64 = 1.0e-12;
|
||||
let gram = {
|
||||
let mut gram_to_be = PartialMatrix::new();
|
||||
gram_to_be.push_sym(0, 0, 1.0);
|
||||
gram_to_be.push_sym(1, 1, 1.0);
|
||||
gram_to_be.push_sym(0, 1, 0.5);
|
||||
gram_to_be
|
||||
};
|
||||
let guess_orig = DMatrix::from_columns(&[
|
||||
sphere(0.0, 0.0, 0.5, 1.0),
|
||||
sphere(0.0, 0.0, -0.5, 1.0)
|
||||
]);
|
||||
let (config_orig, tangent_orig, success_orig, history_orig) = realize_gram(
|
||||
&gram, guess_orig.clone(), &[],
|
||||
SCALED_TOL, 0.5, 0.9, 1.1, 200, 110
|
||||
);
|
||||
assert_eq!(config_orig, guess_orig);
|
||||
assert_eq!(success_orig, true);
|
||||
assert_eq!(history_orig.scaled_loss.len(), 1);
|
||||
|
||||
// find another pair of spheres that meet at 120°. we'll think of this
|
||||
// solution as a transformed version of the original one
|
||||
let guess_tfm = {
|
||||
let a = 0.5 * FRAC_1_SQRT_2;
|
||||
DMatrix::from_columns(&[
|
||||
sphere(a, 0.0, 7.0 + a, 1.0),
|
||||
sphere(-a, 0.0, 7.0 - a, 1.0)
|
||||
])
|
||||
};
|
||||
let (config_tfm, tangent_tfm, success_tfm, history_tfm) = realize_gram(
|
||||
&gram, guess_tfm.clone(), &[],
|
||||
SCALED_TOL, 0.5, 0.9, 1.1, 200, 110
|
||||
);
|
||||
assert_eq!(config_tfm, guess_tfm);
|
||||
assert_eq!(success_tfm, true);
|
||||
assert_eq!(history_tfm.scaled_loss.len(), 1);
|
||||
|
||||
// project a nudge to the tangent space of the solution variety at the
|
||||
// original solution
|
||||
let motion_orig = DVector::from_column_slice(&[0.0, 0.0, 1.0, 0.0]);
|
||||
let motion_orig_proj = tangent_orig.proj(&motion_orig.as_view(), 0);
|
||||
|
||||
// project the equivalent nudge to the tangent space of the solution
|
||||
// variety at the transformed solution
|
||||
let motion_tfm = DVector::from_column_slice(&[FRAC_1_SQRT_2, 0.0, FRAC_1_SQRT_2, 0.0]);
|
||||
let motion_tfm_proj = tangent_tfm.proj(&motion_tfm.as_view(), 0);
|
||||
|
||||
// take the transformation that sends the original solution to the
|
||||
// transformed solution and apply it to the motion that the original
|
||||
// solution makes in response to the nudge
|
||||
const ELEMENT_DIM: usize = 5;
|
||||
let rot = DMatrix::from_column_slice(ELEMENT_DIM, ELEMENT_DIM, &[
|
||||
FRAC_1_SQRT_2, 0.0, -FRAC_1_SQRT_2, 0.0, 0.0,
|
||||
0.0, 1.0, 0.0, 0.0, 0.0,
|
||||
FRAC_1_SQRT_2, 0.0, FRAC_1_SQRT_2, 0.0, 0.0,
|
||||
0.0, 0.0, 0.0, 1.0, 0.0,
|
||||
0.0, 0.0, 0.0, 0.0, 1.0
|
||||
]);
|
||||
let transl = translation(Vector3::new(0.0, 0.0, 7.0));
|
||||
let motion_proj_tfm = transl * rot * motion_orig_proj;
|
||||
|
||||
// confirm that the projection of the nudge is equivariant. we loosen
|
||||
// the comparison tolerance because the transformation seems to
|
||||
// introduce some numerical error
|
||||
const SCALED_TOL_TFM: f64 = 1.0e-9;
|
||||
let tol_sq = ((guess_orig.nrows() * guess_orig.ncols()) as f64) * SCALED_TOL_TFM * SCALED_TOL_TFM;
|
||||
assert!((motion_proj_tfm - motion_tfm_proj).norm_squared() < tol_sq);
|
||||
}
|
||||
|
||||
// at the frozen indices, the optimization steps should have exact zeros,
|
||||
// and the realized configuration should match the initial guess
|
||||
#[test]
|
||||
fn frozen_entry_test() {
|
||||
let gram = {
|
||||
let mut gram_to_be = PartialMatrix::new();
|
||||
for j in 0..2 {
|
||||
for k in j..2 {
|
||||
gram_to_be.push_sym(j, k, if (j, k) == (1, 1) { 1.0 } else { 0.0 });
|
||||
}
|
||||
}
|
||||
gram_to_be
|
||||
};
|
||||
let guess = DMatrix::from_columns(&[
|
||||
point(0.0, 0.0, 2.0),
|
||||
sphere(0.0, 0.0, 0.0, 1.0)
|
||||
]);
|
||||
let frozen = [(3, 0), (3, 1)];
|
||||
println!();
|
||||
let (config, _, success, history) = realize_gram(
|
||||
&gram, guess.clone(), &frozen,
|
||||
1.0e-12, 0.5, 0.9, 1.1, 200, 110
|
||||
);
|
||||
assert_eq!(success, true);
|
||||
for base_step in history.base_step.into_iter() {
|
||||
for index in frozen {
|
||||
assert_eq!(base_step[index], 0.0);
|
||||
}
|
||||
}
|
||||
for index in frozen {
|
||||
assert_eq!(config[index], guess[index]);
|
||||
}
|
||||
}
|
||||
}
|
1
app-proto/src/lib.rs
Normal file
|
@ -0,0 +1 @@
|
|||
pub mod engine;
|
|
@ -8,14 +8,14 @@ use rustc_hash::FxHashSet;
|
|||
use sycamore::prelude::*;
|
||||
|
||||
use add_remove::AddRemove;
|
||||
use assembly::Assembly;
|
||||
use assembly::{Assembly, ElementKey};
|
||||
use display::Display;
|
||||
use outline::Outline;
|
||||
|
||||
#[derive(Clone)]
|
||||
struct AppState {
|
||||
assembly: Assembly,
|
||||
selection: Signal<FxHashSet<usize>>
|
||||
selection: Signal<FxHashSet<ElementKey>>
|
||||
}
|
||||
|
||||
impl AppState {
|
||||
|
@ -25,9 +25,31 @@ impl AppState {
|
|||
selection: create_signal(FxHashSet::default())
|
||||
}
|
||||
}
|
||||
|
||||
// in single-selection mode, select the element with the given key. in
|
||||
// multiple-selection mode, toggle whether the element with the given key
|
||||
// is selected
|
||||
fn select(&self, key: ElementKey, multi: bool) {
|
||||
if multi {
|
||||
self.selection.update(|sel| {
|
||||
if !sel.remove(&key) {
|
||||
sel.insert(key);
|
||||
}
|
||||
});
|
||||
} else {
|
||||
self.selection.update(|sel| {
|
||||
sel.clear();
|
||||
sel.insert(key);
|
||||
});
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
fn main() {
|
||||
// set the console error panic hook
|
||||
#[cfg(feature = "console_error_panic_hook")]
|
||||
console_error_panic_hook::set_once();
|
||||
|
||||
sycamore::render(|| {
|
||||
provide_context(AppState::new());
|
||||
|
||||
|
|
|
@ -1,26 +1,178 @@
|
|||
use itertools::Itertools;
|
||||
use sycamore::{prelude::*, web::tags::div};
|
||||
use web_sys::{Element, KeyboardEvent, MouseEvent, wasm_bindgen::JsCast};
|
||||
use sycamore::prelude::*;
|
||||
use web_sys::{
|
||||
Event,
|
||||
HtmlInputElement,
|
||||
KeyboardEvent,
|
||||
MouseEvent,
|
||||
wasm_bindgen::JsCast
|
||||
};
|
||||
|
||||
use crate::AppState;
|
||||
use crate::{AppState, assembly, assembly::{Constraint, ConstraintKey, ElementKey}};
|
||||
|
||||
// this component lists the elements of the assembly, showing the constraints
|
||||
// on each element as a collapsible sub-list. its implementation is based on
|
||||
// Kate Morley's HTML + CSS tree views:
|
||||
// an editable view of the Lorentz product representing a constraint
|
||||
#[component(inline_props)]
|
||||
fn LorentzProductInput(constraint: Constraint) -> View {
|
||||
view! {
|
||||
input(
|
||||
r#type="text",
|
||||
bind:value=constraint.lorentz_prod_text,
|
||||
on:change=move |event: Event| {
|
||||
let target: HtmlInputElement = event.target().unwrap().unchecked_into();
|
||||
match target.value().parse::<f64>() {
|
||||
Ok(lorentz_prod) => batch(|| {
|
||||
constraint.lorentz_prod.set(lorentz_prod);
|
||||
constraint.lorentz_prod_valid.set(true);
|
||||
}),
|
||||
Err(_) => constraint.lorentz_prod_valid.set(false)
|
||||
};
|
||||
}
|
||||
)
|
||||
}
|
||||
}
|
||||
|
||||
// a list item that shows a constraint in an outline view of an element
|
||||
#[component(inline_props)]
|
||||
fn ConstraintOutlineItem(constraint_key: ConstraintKey, element_key: ElementKey) -> View {
|
||||
let state = use_context::<AppState>();
|
||||
let assembly = &state.assembly;
|
||||
let constraint = assembly.constraints.with(|csts| csts[constraint_key].clone());
|
||||
let other_subject = if constraint.subjects.0 == element_key {
|
||||
constraint.subjects.1
|
||||
} else {
|
||||
constraint.subjects.0
|
||||
};
|
||||
let other_subject_label = assembly.elements.with(|elts| elts[other_subject].label.clone());
|
||||
let class = constraint.lorentz_prod_valid.map(
|
||||
|&lorentz_prod_valid| if lorentz_prod_valid { "constraint" } else { "constraint invalid" }
|
||||
);
|
||||
view! {
|
||||
li(class=class.get()) {
|
||||
input(r#type="checkbox", bind:checked=constraint.active)
|
||||
div(class="constraint-label") { (other_subject_label) }
|
||||
LorentzProductInput(constraint=constraint)
|
||||
div(class="status")
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// a list item that shows an element in an outline view of an assembly
|
||||
#[component(inline_props)]
|
||||
fn ElementOutlineItem(key: ElementKey, element: assembly::Element) -> View {
|
||||
let state = use_context::<AppState>();
|
||||
let class = state.selection.map(
|
||||
move |sel| if sel.contains(&key) { "selected" } else { "" }
|
||||
);
|
||||
let label = element.label.clone();
|
||||
let rep_components = move || {
|
||||
element.representation.with(
|
||||
|rep| rep.iter().map(
|
||||
|u| {
|
||||
let u_str = format!("{:.3}", u).replace("-", "\u{2212}");
|
||||
view! { div { (u_str) } }
|
||||
}
|
||||
).collect::<Vec<_>>()
|
||||
)
|
||||
};
|
||||
let constrained = element.constraints.map(|csts| csts.len() > 0);
|
||||
let constraint_list = element.constraints.map(
|
||||
|csts| csts.clone().into_iter().collect()
|
||||
);
|
||||
let details_node = create_node_ref();
|
||||
view! {
|
||||
li {
|
||||
details(ref=details_node) {
|
||||
summary(
|
||||
class=class.get(),
|
||||
on:keydown={
|
||||
move |event: KeyboardEvent| {
|
||||
match event.key().as_str() {
|
||||
"Enter" => {
|
||||
state.select(key, event.shift_key());
|
||||
event.prevent_default();
|
||||
},
|
||||
"ArrowRight" if constrained.get() => {
|
||||
let _ = details_node
|
||||
.get()
|
||||
.unchecked_into::<web_sys::Element>()
|
||||
.set_attribute("open", "");
|
||||
},
|
||||
"ArrowLeft" => {
|
||||
let _ = details_node
|
||||
.get()
|
||||
.unchecked_into::<web_sys::Element>()
|
||||
.remove_attribute("open");
|
||||
},
|
||||
_ => ()
|
||||
}
|
||||
}
|
||||
}
|
||||
) {
|
||||
div(
|
||||
class="element-switch",
|
||||
on:click=|event: MouseEvent| event.stop_propagation()
|
||||
)
|
||||
div(
|
||||
class="element",
|
||||
on:click={
|
||||
move |event: MouseEvent| {
|
||||
if event.shift_key() {
|
||||
state.selection.update(|sel| {
|
||||
if !sel.remove(&key) {
|
||||
sel.insert(key);
|
||||
}
|
||||
});
|
||||
} else {
|
||||
state.selection.update(|sel| {
|
||||
sel.clear();
|
||||
sel.insert(key);
|
||||
});
|
||||
}
|
||||
event.stop_propagation();
|
||||
event.prevent_default();
|
||||
}
|
||||
}
|
||||
) {
|
||||
div(class="element-label") { (label) }
|
||||
div(class="element-representation") { (rep_components) }
|
||||
div(class="status")
|
||||
}
|
||||
}
|
||||
ul(class="constraints") {
|
||||
Keyed(
|
||||
list=constraint_list,
|
||||
view=move |cst_key| view! {
|
||||
ConstraintOutlineItem(
|
||||
constraint_key=cst_key,
|
||||
element_key=key
|
||||
)
|
||||
},
|
||||
key=|cst_key| cst_key.clone()
|
||||
)
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// a component that lists the elements of the current assembly, showing the
|
||||
// constraints on each element as a collapsible sub-list. its implementation
|
||||
// is based on Kate Morley's HTML + CSS tree views:
|
||||
//
|
||||
// https://iamkate.com/code/tree-views/
|
||||
//
|
||||
#[component]
|
||||
pub fn Outline() -> View {
|
||||
// sort the elements alphabetically by ID
|
||||
let elements_sorted = create_memo(|| {
|
||||
let state = use_context::<AppState>();
|
||||
state.assembly.elements
|
||||
.get_clone()
|
||||
let state = use_context::<AppState>();
|
||||
|
||||
// list the elements alphabetically by ID
|
||||
let element_list = state.assembly.elements.map(
|
||||
|elts| elts
|
||||
.clone()
|
||||
.into_iter()
|
||||
.sorted_by_key(|(_, elt)| elt.id.clone())
|
||||
.collect()
|
||||
});
|
||||
);
|
||||
|
||||
view! {
|
||||
ul(
|
||||
|
@ -31,130 +183,11 @@ pub fn Outline() -> View {
|
|||
}
|
||||
) {
|
||||
Keyed(
|
||||
list=elements_sorted,
|
||||
view=|(key, elt)| {
|
||||
let state = use_context::<AppState>();
|
||||
let class = create_memo({
|
||||
move || {
|
||||
if state.selection.with(|sel| sel.contains(&key)) {
|
||||
"selected"
|
||||
} else {
|
||||
""
|
||||
}
|
||||
}
|
||||
});
|
||||
let label = elt.label.clone();
|
||||
let rep_components = elt.rep.iter().map(|u| {
|
||||
let u_coord = u.to_string().replace("-", "\u{2212}");
|
||||
View::from(div().children(u_coord))
|
||||
}).collect::<Vec<_>>();
|
||||
let constrained = elt.constraints.len() > 0;
|
||||
let details_node = create_node_ref();
|
||||
view! {
|
||||
/* [TO DO] switch to integer-valued parameters whenever
|
||||
that becomes possible again */
|
||||
li {
|
||||
details(ref=details_node) {
|
||||
summary(
|
||||
class=class.get(),
|
||||
on:keydown={
|
||||
move |event: KeyboardEvent| {
|
||||
match event.key().as_str() {
|
||||
"Enter" => {
|
||||
if event.shift_key() {
|
||||
state.selection.update(|sel| {
|
||||
if !sel.remove(&key) {
|
||||
sel.insert(key);
|
||||
}
|
||||
});
|
||||
} else {
|
||||
state.selection.update(|sel| {
|
||||
sel.clear();
|
||||
sel.insert(key);
|
||||
});
|
||||
}
|
||||
event.prevent_default();
|
||||
},
|
||||
"ArrowRight" if constrained => {
|
||||
let _ = details_node
|
||||
.get()
|
||||
.unchecked_into::<Element>()
|
||||
.set_attribute("open", "");
|
||||
},
|
||||
"ArrowLeft" => {
|
||||
let _ = details_node
|
||||
.get()
|
||||
.unchecked_into::<Element>()
|
||||
.remove_attribute("open");
|
||||
},
|
||||
_ => ()
|
||||
}
|
||||
}
|
||||
}
|
||||
) {
|
||||
div(
|
||||
class="elt-switch",
|
||||
on:click=|event: MouseEvent| event.stop_propagation()
|
||||
)
|
||||
div(
|
||||
class="elt",
|
||||
on:click={
|
||||
move |event: MouseEvent| {
|
||||
if event.shift_key() {
|
||||
state.selection.update(|sel| {
|
||||
if !sel.remove(&key) {
|
||||
sel.insert(key);
|
||||
}
|
||||
});
|
||||
} else {
|
||||
state.selection.update(|sel| {
|
||||
sel.clear();
|
||||
sel.insert(key);
|
||||
});
|
||||
}
|
||||
event.stop_propagation();
|
||||
event.prevent_default();
|
||||
}
|
||||
}
|
||||
) {
|
||||
div(class="elt-label") { (label) }
|
||||
div(class="elt-rep") { (rep_components) }
|
||||
}
|
||||
}
|
||||
ul(class="constraints") {
|
||||
Keyed(
|
||||
list=elt.constraints.into_iter().collect::<Vec<_>>(),
|
||||
view=move |c_key: usize| {
|
||||
let c_state = use_context::<AppState>();
|
||||
let assembly = &c_state.assembly;
|
||||
let cst = assembly.constraints.with(|csts| csts[c_key].clone());
|
||||
let other_arg = if cst.args.0 == key {
|
||||
cst.args.1
|
||||
} else {
|
||||
cst.args.0
|
||||
};
|
||||
let other_arg_label = assembly.elements.with(|elts| elts[other_arg].label.clone());
|
||||
view! {
|
||||
li(class="cst") {
|
||||
input(r#type="checkbox", bind:checked=cst.active)
|
||||
div(class="cst-label") { (other_arg_label) }
|
||||
div(class="cst-rep") { (cst.rep) }
|
||||
}
|
||||
}
|
||||
},
|
||||
key=|c_key| c_key.clone()
|
||||
)
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
list=element_list,
|
||||
view=|(key, elt)| view! {
|
||||
ElementOutlineItem(key=key, element=elt)
|
||||
},
|
||||
key=|(key, elt)| (
|
||||
key.clone(),
|
||||
elt.id.clone(),
|
||||
elt.label.clone(),
|
||||
elt.constraints.clone()
|
||||
)
|
||||
key=|(_, elt)| elt.serial
|
||||
)
|
||||
}
|
||||
}
|
||||
|
|
|
@ -8,7 +8,8 @@ using Optim
|
|||
|
||||
export
|
||||
rand_on_shell, Q, DescentHistory,
|
||||
realize_gram_gradient, realize_gram_newton, realize_gram_optim, realize_gram
|
||||
realize_gram_gradient, realize_gram_newton, realize_gram_optim,
|
||||
realize_gram_alt_proj, realize_gram
|
||||
|
||||
# === guessing ===
|
||||
|
||||
|
@ -143,7 +144,7 @@ function realize_gram_gradient(
|
|||
break
|
||||
end
|
||||
|
||||
# find negative gradient of loss function
|
||||
# find the negative gradient of the loss function
|
||||
neg_grad = 4*Q*L*Δ_proj
|
||||
slope = norm(neg_grad)
|
||||
dir = neg_grad / slope
|
||||
|
@ -232,7 +233,7 @@ function realize_gram_newton(
|
|||
break
|
||||
end
|
||||
|
||||
# find the negative gradient of loss function
|
||||
# find the negative gradient of the loss function
|
||||
neg_grad = 4*Q*L*Δ_proj
|
||||
|
||||
# find the negative Hessian of the loss function
|
||||
|
@ -313,6 +314,129 @@ function realize_gram_optim(
|
|||
)
|
||||
end
|
||||
|
||||
# seek a matrix `L` for which `L'QL` matches the sparse matrix `gram` at every
|
||||
# explicit entry of `gram`. use gradient descent starting from `guess`, with an
|
||||
# alternate technique for finding the projected base step from the unprojected
|
||||
# Hessian
|
||||
function realize_gram_alt_proj(
|
||||
gram::SparseMatrixCSC{T, <:Any},
|
||||
guess::Matrix{T},
|
||||
frozen = CartesianIndex[];
|
||||
scaled_tol = 1e-30,
|
||||
min_efficiency = 0.5,
|
||||
backoff = 0.9,
|
||||
reg_scale = 1.1,
|
||||
max_descent_steps = 200,
|
||||
max_backoff_steps = 110
|
||||
) where T <: Number
|
||||
# start history
|
||||
history = DescentHistory{T}()
|
||||
|
||||
# find the dimension of the search space
|
||||
dims = size(guess)
|
||||
element_dim, construction_dim = dims
|
||||
total_dim = element_dim * construction_dim
|
||||
|
||||
# list the constrained entries of the gram matrix
|
||||
J, K, _ = findnz(gram)
|
||||
constrained = zip(J, K)
|
||||
|
||||
# scale the tolerance
|
||||
scale_adjustment = sqrt(T(length(constrained)))
|
||||
tol = scale_adjustment * scaled_tol
|
||||
|
||||
# convert the frozen indices to stacked format
|
||||
frozen_stacked = [(index[2]-1)*element_dim + index[1] for index in frozen]
|
||||
|
||||
# initialize search state
|
||||
L = copy(guess)
|
||||
Δ_proj = proj_diff(gram, L'*Q*L)
|
||||
loss = dot(Δ_proj, Δ_proj)
|
||||
|
||||
# use Newton's method with backtracking and gradient descent backup
|
||||
for step in 1:max_descent_steps
|
||||
# stop if the loss is tolerably low
|
||||
if loss < tol
|
||||
break
|
||||
end
|
||||
|
||||
# find the negative gradient of the loss function
|
||||
neg_grad = 4*Q*L*Δ_proj
|
||||
|
||||
# find the negative Hessian of the loss function
|
||||
hess = Matrix{T}(undef, total_dim, total_dim)
|
||||
indices = [(j, k) for k in 1:construction_dim for j in 1:element_dim]
|
||||
for (j, k) in indices
|
||||
basis_mat = basis_matrix(T, j, k, dims)
|
||||
neg_dΔ = basis_mat'*Q*L + L'*Q*basis_mat
|
||||
neg_dΔ_proj = proj_to_entries(neg_dΔ, constrained)
|
||||
deriv_grad = 4*Q*(-basis_mat*Δ_proj + L*neg_dΔ_proj)
|
||||
hess[:, (k-1)*element_dim + j] = reshape(deriv_grad, total_dim)
|
||||
end
|
||||
hess_sym = Hermitian(hess)
|
||||
push!(history.hess, hess_sym)
|
||||
|
||||
# regularize the Hessian
|
||||
min_eigval = minimum(eigvals(hess_sym))
|
||||
push!(history.positive, min_eigval > 0)
|
||||
if min_eigval <= 0
|
||||
hess -= reg_scale * min_eigval * I
|
||||
end
|
||||
|
||||
# compute the Newton step
|
||||
neg_grad_stacked = reshape(neg_grad, total_dim)
|
||||
for k in frozen_stacked
|
||||
neg_grad_stacked[k] = 0
|
||||
hess[k, :] .= 0
|
||||
hess[:, k] .= 0
|
||||
hess[k, k] = 1
|
||||
end
|
||||
base_step_stacked = Hermitian(hess) \ neg_grad_stacked
|
||||
base_step = reshape(base_step_stacked, dims)
|
||||
push!(history.base_step, base_step)
|
||||
|
||||
# store the current position, loss, and slope
|
||||
L_last = L
|
||||
loss_last = loss
|
||||
push!(history.scaled_loss, loss / scale_adjustment)
|
||||
push!(history.neg_grad, neg_grad)
|
||||
push!(history.slope, norm(neg_grad))
|
||||
|
||||
# find a good step size using backtracking line search
|
||||
push!(history.stepsize, 0)
|
||||
push!(history.backoff_steps, max_backoff_steps)
|
||||
empty!(history.last_line_L)
|
||||
empty!(history.last_line_loss)
|
||||
rate = one(T)
|
||||
step_success = false
|
||||
base_target_improvement = dot(neg_grad, base_step)
|
||||
for backoff_steps in 0:max_backoff_steps
|
||||
history.stepsize[end] = rate
|
||||
L = L_last + rate * base_step
|
||||
Δ_proj = proj_diff(gram, L'*Q*L)
|
||||
loss = dot(Δ_proj, Δ_proj)
|
||||
improvement = loss_last - loss
|
||||
push!(history.last_line_L, L)
|
||||
push!(history.last_line_loss, loss / scale_adjustment)
|
||||
if improvement >= min_efficiency * rate * base_target_improvement
|
||||
history.backoff_steps[end] = backoff_steps
|
||||
step_success = true
|
||||
break
|
||||
end
|
||||
rate *= backoff
|
||||
end
|
||||
|
||||
# if we've hit a wall, quit
|
||||
if !step_success
|
||||
return L_last, false, history
|
||||
end
|
||||
end
|
||||
|
||||
# return the factorization and its history
|
||||
push!(history.scaled_loss, loss / scale_adjustment)
|
||||
L, loss < tol, history
|
||||
end
|
||||
|
||||
# seek a matrix `L` for which `L'QL` matches the sparse matrix `gram` at every
|
||||
# explicit entry of `gram`. use gradient descent starting from `guess`
|
||||
function realize_gram(
|
||||
|
@ -321,7 +445,6 @@ function realize_gram(
|
|||
frozen = nothing;
|
||||
scaled_tol = 1e-30,
|
||||
min_efficiency = 0.5,
|
||||
init_rate = 1.0,
|
||||
backoff = 0.9,
|
||||
reg_scale = 1.1,
|
||||
max_descent_steps = 200,
|
||||
|
@ -352,20 +475,19 @@ function realize_gram(
|
|||
unfrozen_stacked = reshape(is_unfrozen, total_dim)
|
||||
end
|
||||
|
||||
# initialize variables
|
||||
grad_rate = init_rate
|
||||
# initialize search state
|
||||
L = copy(guess)
|
||||
|
||||
# use Newton's method with backtracking and gradient descent backup
|
||||
Δ_proj = proj_diff(gram, L'*Q*L)
|
||||
loss = dot(Δ_proj, Δ_proj)
|
||||
|
||||
# use Newton's method with backtracking and gradient descent backup
|
||||
for step in 1:max_descent_steps
|
||||
# stop if the loss is tolerably low
|
||||
if loss < tol
|
||||
break
|
||||
end
|
||||
|
||||
# find the negative gradient of loss function
|
||||
# find the negative gradient of the loss function
|
||||
neg_grad = 4*Q*L*Δ_proj
|
||||
|
||||
# find the negative Hessian of the loss function
|
||||
|
@ -420,6 +542,7 @@ function realize_gram(
|
|||
empty!(history.last_line_loss)
|
||||
rate = one(T)
|
||||
step_success = false
|
||||
base_target_improvement = dot(neg_grad, base_step)
|
||||
for backoff_steps in 0:max_backoff_steps
|
||||
history.stepsize[end] = rate
|
||||
L = L_last + rate * base_step
|
||||
|
@ -428,7 +551,7 @@ function realize_gram(
|
|||
improvement = loss_last - loss
|
||||
push!(history.last_line_L, L)
|
||||
push!(history.last_line_loss, loss / scale_adjustment)
|
||||
if improvement >= min_efficiency * rate * dot(neg_grad, base_step)
|
||||
if improvement >= min_efficiency * rate * base_target_improvement
|
||||
history.backoff_steps[end] = backoff_steps
|
||||
step_success = true
|
||||
break
|
||||
|
|
|
@ -74,4 +74,13 @@ if success
|
|||
for k in 5:9
|
||||
println(" ", 1 / L[4,k], " sun")
|
||||
end
|
||||
end
|
||||
end
|
||||
|
||||
# test an alternate technique for finding the projected base step from the
|
||||
# unprojected Hessian
|
||||
L_alt, success_alt, history_alt = Engine.realize_gram_alt_proj(gram, guess, frozen)
|
||||
completed_gram_alt = L_alt'*Engine.Q*L_alt
|
||||
println("\nDifference in result using alternate projection:\n")
|
||||
display(completed_gram_alt - completed_gram)
|
||||
println("\nDifference in steps: ", size(history_alt.scaled_loss, 1) - size(history.scaled_loss, 1))
|
||||
println("Difference in loss: ", history_alt.scaled_loss[end] - history.scaled_loss[end], "\n")
|
|
@ -64,4 +64,13 @@ else
|
|||
println("\nFailed to reach target accuracy")
|
||||
end
|
||||
println("Steps: ", size(history.scaled_loss, 1))
|
||||
println("Loss: ", history.scaled_loss[end], "\n")
|
||||
println("Loss: ", history.scaled_loss[end], "\n")
|
||||
|
||||
# test an alternate technique for finding the projected base step from the
|
||||
# unprojected Hessian
|
||||
L_alt, success_alt, history_alt = Engine.realize_gram_alt_proj(gram, guess, frozen)
|
||||
completed_gram_alt = L_alt'*Engine.Q*L_alt
|
||||
println("\nDifference in result using alternate projection:\n")
|
||||
display(completed_gram_alt - completed_gram)
|
||||
println("\nDifference in steps: ", size(history_alt.scaled_loss, 1) - size(history.scaled_loss, 1))
|
||||
println("Difference in loss: ", history_alt.scaled_loss[end] - history.scaled_loss[end], "\n")
|
|
@ -93,4 +93,13 @@ if success
|
|||
infty = BigFloat[0, 0, 0, 0, 1]
|
||||
radius_ratio = dot(infty, Engine.Q * L[:,5]) / dot(infty, Engine.Q * L[:,6])
|
||||
println("\nCircumradius / inradius: ", radius_ratio)
|
||||
end
|
||||
end
|
||||
|
||||
# test an alternate technique for finding the projected base step from the
|
||||
# unprojected Hessian
|
||||
L_alt, success_alt, history_alt = Engine.realize_gram_alt_proj(gram, guess, frozen)
|
||||
completed_gram_alt = L_alt'*Engine.Q*L_alt
|
||||
println("\nDifference in result using alternate projection:\n")
|
||||
display(completed_gram_alt - completed_gram)
|
||||
println("\nDifference in steps: ", size(history_alt.scaled_loss, 1) - size(history.scaled_loss, 1))
|
||||
println("Difference in loss: ", history_alt.scaled_loss[end] - history.scaled_loss[end], "\n")
|
|
@ -41,3 +41,25 @@ I will have to work out formulas for the Euclidean distance between two entities
|
|||
In this vein, it seems as though if J1 and J2 are the reps of two points, then Q(J1,J2) = d^2/2. So then the sphere centered at J1 through J2 is (J1-(2Q(J1,J2),0,0,0,0))/sqrt(2Q(J1,J2)). Ugh has a sqrt in it. Similarly for sphere centered at J3 through J2, (J3-(2Q(J3,J2),0000))/sqrt(2Q(J3,J2)). J1,J2,J3 are collinear if these spheres are tangent, i.e. if those vectors have Q-inner-product 1, which is to say Q(J1,J3) - Q(J1,J2) - Q(J3,J2) = 2sqrt(Q(J1,J2)Q(J2,J3)). But maybe that's not the simplest way of putting it. After all, we can just say that the cross-product of the two differences is 0; that has no square roots in it.
|
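For readability, the relations in the note above can be set off in display form; these simply restate the note's own formulas, not new results. The sphere centered at $J_1$ passing through $J_2$ is
$$S_{12} = \frac{J_1 - \left(2Q(J_1,J_2),\,0,\,0,\,0,\,0\right)}{\sqrt{2Q(J_1,J_2)}},$$
and the tangency form of the collinearity test for $J_1, J_2, J_3$ is
$$Q(J_1,J_3) - Q(J_1,J_2) - Q(J_3,J_2) = 2\sqrt{Q(J_1,J_2)\,Q(J_2,J_3)}.$$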
||||
|
||||
One conceivable way to canonicalize lines is to use the *perpendicular* plane that goes through the origin, that's uniquely defined, and anyway just amounts to I = (0,0,d) where d is the ordinary direction vector of the line; and a point J in that plane that the line goes through, which just amounts to J=(r^2,1,E) with Q(I,J) = 0, i.e. E\dot d = 0. It's also the point on the line closest to the origin.

The reason that we don't usually use that point as the companion to the direction vector is that the resulting set of six coordinates is not homogeneous. But here that's not an issue, since we have our standard point coordinates and plane coordinates; and for a plane through the origin, only two of the direction coordinates are really free, and then we have the one dot-product relation, so only two of the point coordinates are really free, giving us the correct dimensionality of 4 for the set of lines. So in some sense this says that we could take naively as coordinates for a line the projection of the unit direction vector to the xy plane and the projection of the line's closest point to the origin to the xy plane. That doesn't seem to have any weird gimbal locks or discontinuities or anything.

And with these coordinates, you can test if the point E=x,y,z is on the line (dx,dy,cx,cy) by extending (dx,dy) to d via dz = sqrt(1-dx^2 - dy^2), extending (cx,cy) to c by determining cz via d\dot c = 0, and then checking if d\cross(E-c) = 0. And you can see if two lines are parallel just by checking if they have the same direction vector, and if not, you can see if they are coplanar by projecting both of their closest points perpendicularly onto the line in the direction of the cross product of their directions, and if the projections match they are coplanar.
|
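As an illustration of the membership test described above, here is a minimal sketch in Rust (using nalgebra, which the prototype already depends on). The function name and the handling of the degenerate case dz = 0 are my own choices for the sketch, not code from the repository:

```rust
use nalgebra::Vector3;

// sketch of the test "is the point E on the line (dx, dy, cx, cy)?" from the
// notes above. hypothetical helper, not part of the prototype
fn point_on_line(e: Vector3<f64>, dx: f64, dy: f64, cx: f64, cy: f64) -> bool {
    // extend (dx, dy) to a unit direction vector, taking the branch dz >= 0
    let dz = (1.0 - dx * dx - dy * dy).max(0.0).sqrt();
    let d = Vector3::new(dx, dy, dz);
    // extend (cx, cy) to the closest point c by solving d . c = 0 for cz
    // (assumes dz != 0; the vertical-line case would need separate treatment)
    let cz = if dz != 0.0 { -(dx * cx + dy * cy) / dz } else { 0.0 };
    let c = Vector3::new(cx, cy, cz);
    // E lies on the line exactly when d x (E - c) = 0
    d.cross(&(e - c)).norm() < 1e-9
}

fn main() {
    // a point one unit along the direction from the closest point should pass
    let d = Vector3::new(0.6, 0.0, 0.8);
    let c = Vector3::new(-0.8, 0.0, 0.6);
    assert!(point_on_line(c + d, d.x, d.y, c.x, c.y));
    println!("membership test passed");
}
```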
||||
|
||||
#### Engine Conventions
|
||||
|
||||
The coordinate conventions used in the engine are different from the ones used in these notes. Marking the engine vectors and coordinates with $'$, we have
|
||||
$$I' = (x', y', z', b', c'),$$
|
||||
where
|
||||
$$
|
||||
\begin{align*}
|
||||
x' & = x & b' & = b/2 \\
|
||||
y' & = y & c' & = c/2. \\
|
||||
z' & = z
|
||||
\end{align*}
|
||||
$$
|
||||
The engine uses the quadratic form $Q' = -Q$, which is expressed in engine coordinates as
|
||||
$$Q'(I'_1, I'_2) = x'_1 x'_2 + y'_1 y'_2 + z'_1 z'_2 - 2(b'_1c'_2 + c'_1 b'_2).$$
|
||||
In the `engine` module, the matrix of $Q'$ is encoded in the lazy static variable `Q`.
|
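Written out as a matrix, this form is
$$Q' = \begin{pmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & -2 \\ 0 & 0 & 0 & -2 & 0 \end{pmatrix},$$
which matches the `Q` constant defined in the Rust engine earlier in this diff.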
||||
|
||||
In the engine's coordinate conventions, a sphere with radius $r > 0$ centered on $P = (P_x, P_y, P_z)$ is represented by the vector
|
||||
$$I'_s = \left(\frac{P_x}{r}, \frac{P_y}{r}, \frac{P_z}{r}, \frac1{2r}, \frac{\|P\|^2 - r^2}{2r}\right),$$
|
||||
which has the normalization $Q'(I'_s, I'_s) = 1$. The point $P$ is represented by the vector
|
||||
$$I'_P = \left(P_x, P_y, P_z, \frac{1}{2}, \frac{\|P\|^2}{2}\right).$$
|
||||
In the `engine` module, these formulas are encoded in the `sphere` and `point` functions.
|
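As a quick numerical check of the sphere normalization $Q'(I'_s, I'_s) = 1$, one could run a minimal sketch like the following against the engine. The crate name `dyna3` is a guess based on the repository name (the prototype does expose `pub mod engine` through `lib.rs`), so adjust the import to the actual crate name:

```rust
// a minimal sketch, not part of the prototype: check that a sphere vector
// produced by `engine::sphere` satisfies the normalization Q'(I'_s, I'_s) = 1
use dyna3::engine::{sphere, Q}; // crate name assumed; see note above

fn main() {
    let s = sphere(1.0, -2.0, 0.5, 3.0);
    let norm = s.dot(&(&*Q * &s));
    println!("Q'(I'_s, I'_s) = {norm}"); // should print a value very close to 1
}
```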