brain parallelism, computer science, cognitive aspects 🎯: fundamental concepts of PP - generative programs, traces, inference meta-programs, SMC, scaling inference by synthesizing exact, data-driven proposals
![[Pasted image 20230822130907.png]] ![[Pasted image 20230822130934.png]] ![[Pasted image 20230822130952.png]]
the commercial drone market is growing fast (one of the few settings where robotics meets the real world; growing much faster than AVs; more open-ended than warehouse robotics; a more real system than home robotics); thermal sensors (rescue, conservation, but they raise some concerns)
weather like fog adds uncertainty to SLAM: simultaneous localization and mapping
![[Pasted image 20230822131420.png]]![[Pasted image 20230822131426.png]]
what probabilistic inference can do (the drone is actually in the top right, but ...)
![[Pasted image 20230822131447.png]]![[Pasted image 20230822131537.png]]
rejuvenation - widening the search (triggering a larger hypothesis space) beyond what my motion prior alone would suggest
![[Pasted image 20230822131643.png]] ![[Pasted image 20230822132027.png]]
a trace is a record of every stochastic choice made; hd is head direction
![[Pasted image 20230822132114.png]]
fly the robot in the direction of the controls; start.dp (velocity controls)
![[Pasted image 20230822132239.png]] ![[Pasted image 20230822132343.png]] ![[Pasted image 20230822132333.png]] ![[Pasted image 20230822132442.png]] ![[Pasted image 20230822132612.png]]
laser range scanner (or construct one by computation from a classical motion sensor - vision); the noisy sensors rotate and sample uniformly from an interval (by taking the actual distance to the wall and adding randomness) - a toy sketch of this appears at the end of this block
![[Pasted image 20230822132749.png]]
where the robot was trying to go, not where it actually went
![[Pasted image 20230822133009.png]]
Gaussian sensors
![[Pasted image 20230822133314.png]]
the generated trace is keyed by angle index, with distance as the value ![[Pasted image 20230822133416.png]]
glue the sensor and motion models together (the curly model defines the initial state) - a sequence of poses and sensor readings ![[Pasted image 20230822133431.png]]
motion model () ![[Pasted image 20230822133602.png]]
each stochastic choice is a latent variable in the probability model, because it is pulled out of a concrete object in code; this can include deep learning components (because of modularity - inference meta-programs); able to scale logic that mathematics and robotics couldn't before (10 billion codes); traces are mathematical objects / data structures ![[Pasted image 20230822133532.png]]
this only gives the idea (Q: it doesn't tell you how to find it?)
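A toy plain-Python sketch (not Gen's API) of the noisy rotating range sensor and of a trace as a record of every stochastic choice, keyed by angle index; `ideal_distance`, the single wall at x = 5, and all numbers here are made-up stand-ins for the real ray-casting geometry.

```python
import math
import random

def ideal_distance(pose, angle, wall_x=5.0):
    """Hypothetical stand-in for ray casting: distance from `pose` to a
    single wall at x = wall_x along heading `angle` (None if no hit)."""
    dx = math.cos(angle)
    if dx <= 1e-6:
        return None
    return (wall_x - pose[0]) / dx

def sensor_model(pose, n_rays=8, sigma=0.1):
    """Toy generative sensor model: every stochastic choice is recorded
    in a trace, keyed by the angle index."""
    trace = {}
    for i in range(n_rays):
        angle = 2 * math.pi * i / n_rays
        d = ideal_distance(pose, angle)
        if d is not None:
            # Gaussian sensor: actual distance to the wall plus randomness
            trace[("distance", i)] = random.gauss(d, sigma)
    return trace

print(sensor_model(pose=(0.0, 0.0)))
```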
![[Pasted image 20230822134045.png]] another big idea in Gen (inference meta-programs) makes it possible to use all of these methods as alternatives (an inference library for different methods and their combinations - even transformer and diffusion-model engines) ![[Pasted image 20230822134228.png]]
simulating generative execution (update the trace with a small change and recompute = post-update evaluation, evaluating gradients of the score); incremental computation (efficiently updating outputs) ![[Pasted image 20230822134338.png]]
most scalable: an evolving set of traces (updating them and how well they fit locally, and giving them credit via weights (how close each is to the actual posterior)) ![[Pasted image 20230822134501.png]] ![[Pasted image 20230822134710.png]] ![[Pasted image 20230822134809.png]]
generate a bunch of traces along with the likelihood of any constraints from the data (the data is passed via observations into `generate`; a partial trace stores them) - can be exact (rejection sampling gets much closer) ![[Pasted image 20230822135007.png]]
for a shorter path, SIR and rejection sampling work well, but ![[Pasted image 20230822135044.png]]
iterate over time steps (reset the weights at each step; for each particle, improve the particle by perturbing its stochastic choices via a Gaussian centered on the current trace - wiggle it around) - a design space of moves; a minimal sketch of this loop appears at the end of this block ![[Pasted image 20230822135115.png]] ![[Pasted image 20230822135233.png]]
GenJAX (likelihood and hierarchical extensions - these don't give good output for clutter); at the graduate level we learn how SIR, MCMC, and rejection sampling are each instances of a much larger family of algorithms ![[Pasted image 20230822135421.png]]
a generative program that corrupts data: forward generation + posterior (missing from the figure) ![[Pasted image 20230822135533.png]] ![[Pasted image 20230822135602.png]]
repetitions - not normalized, inconsistent repetition ![[Pasted image 20230822135635.png]] ![[Pasted image 20230822135706.png]]
generate traces according to the ontology ![[Pasted image 20230822135744.png]]
90% of imputations were correct (97% were real fixes) ![[Pasted image 20230822135813.png]]
structure of the domain (link structure) + skeleton ![[Pasted image 20230822135945.png]] ![[Pasted image 20230822140000.png]] ![[Pasted image 20230822140009.png]]
add in inference hints (data-driven); let users specify hints (which subproblem is being solved) - parameterize the meta-inference; reconfigurable by end users (customized templates) ![[Pasted image 20230822140014.png]] ![[Pasted image 20230822140135.png]]
- about as bad as RL lol; data-driven works much better ![[Pasted image 20230822140201.png]]
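A minimal NumPy sketch (not the Gen/GenJAX implementation) of the particle loop described above: propagate each trace under the motion prior, weight by how well it fits the data, resample (which resets the weights to uniform), then rejuvenate each particle with a Metropolis-Hastings move whose Gaussian proposal is centered on the current trace. The 1-D pose, `SIGMA_MOVE`, `SIGMA_OBS`, and the observations are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
SIGMA_MOVE, SIGMA_OBS, N = 0.3, 0.5, 500

def log_obs(x, y):
    # Gaussian observation likelihood log p(y | x), up to a constant
    return -0.5 * ((y - x) / SIGMA_OBS) ** 2

def smc_with_rejuvenation(observations):
    particles = rng.normal(0.0, 1.0, N)                        # initial traces (1-D pose)
    for y in observations:
        prev = particles
        particles = prev + rng.normal(0.0, SIGMA_MOVE, N)      # motion prior
        logw = log_obs(particles, y)                           # credit via weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(N, size=N, p=w)                       # resample
        particles, prev = particles[idx], prev[idx]            # weights reset to uniform

        # rejuvenation: MH move, Gaussian proposal centered on the current trace
        prop = particles + rng.normal(0.0, 0.1, N)
        def log_target(x):
            # posterior at this step: motion prior term + observation term
            return -0.5 * ((x - prev) / SIGMA_MOVE) ** 2 + log_obs(x, y)
        accept = np.log(rng.uniform(size=N)) < log_target(prop) - log_target(particles)
        particles = np.where(accept, prop, particles)
    return particles

obs = [0.2, 0.5, 0.9, 1.4]
print(smc_with_rejuvenation(obs).mean())
```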
compiles substructure (uses exact inference algorithms to solve subproblems) - coupling ?? and ??, which is most ... ![[Pasted image 20230822140225.png]]
two paths (forward) = working on the highlighted blue rows (links - reconsidering them in light of that context - to improve them and fix mistakes) ![[Pasted image 20230822140326.png]] ![[Pasted image 20230822140439.png]]
- they mutually help you (choose which variables to block over - like a mini-batch (only an analogy)) ![[Pasted image 20230822140446.png]]![[Pasted image 20230822140532.png]]
translate to a discrete subproblem solved exactly - top-down (high-accuracy proposals that take account of the current state) - amortized (generate simulated data from the model and train a neural net treating all the latent variables as labels; the labels come from the actual posterior - no gap; a minimal sketch of this simulate-then-train idea appears at the end of this block) - fast and modular (exact - for the parts of the problem that are affected); exploit: it wouldn't explore the whole place but exploit; the generative model is symbolic, so it can proceed or be invalidated given new data - instead of generating labeled data and then doing ... (the principles are a superset of ...; amortized inference can be used in a submodule)
PClean is not only for data cleaning (massive spread of disinformation - confabulation from ...; human- and machine-generated confabulation) ![[Pasted image 20230822140924.png]]
ground truth combined with SMC (score unstructured text in terms of ... - filter the LLM - check its claims against PClean): if it is making a factual claim, what is the fact set that supports it; possible to build and test designs for an information immune system (claims, factuality, and language models - an audit trail - people make probabilistic judgements) - a type of probability (symbolic representation) - an immune system for neural generative models (which operate probabilistically - no answer in ...) - a fundamental escape from the immune ...; risk-reward trade-off - societal scale (also, what the rights should be; shallow parts, consumption of speech - virally) - an alternative to unregulated social media (genAI) - an enormous partnership between government and ... (painful for industry (factuality)) - work towards a step: parse text for how potentially factual it is (individually), then you can see the evidence for the facts (the entities Google uses; structured boxes with lots of facts in them; PClean's ...) - very hard; smart entrepreneurs (legal and ...) - catalyze political will
---
Colab notebook for GenJAX (tutorials); two cells (a Google machine) - the lab runs that (presented with everything, no setup) ![[Pasted image 20230822142229.png]]
explores a hierarchical sensor model (data generated from the model); a GPU-accelerated model, one scale of parallelism (play with this notebook) ![[Pasted image 20230822142419.png]]
1. make a copy and run the first two cells (configure the GPU); Gen.jl tutorials (problem sets) - Gen probabilistic graphics (including amortized, data-driven proposals) ![[Pasted image 20230822142530.png]]
tutorial: https://github.com/probcomp/gen-quickstart - cloud access will be coming; for now it is down
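A toy sketch of the amortized, data-driven proposal idea from above: forward-simulate (latent, observation) pairs from the generative model so that the latents act as free training labels, then fit a predictor that maps observations to proposed latents. A least-squares regressor stands in for the neural network; the model (standard-normal latent, Gaussian noise) is made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(n):
    """Forward-simulate the generative model; the latent is the free 'label'."""
    latent = rng.normal(0.0, 1.0, n)            # latent variable
    obs = latent + rng.normal(0.0, 0.5, n)      # noisy observation
    return obs, latent

# amortized, data-driven proposal: train a regressor (stand-in for a neural
# net) on simulated pairs, treating the latents as labels
obs, latent = simulate(10_000)
X = np.stack([obs, np.ones_like(obs)], axis=1)
coef, *_ = np.linalg.lstsq(X, latent, rcond=None)

def proposal_mean(y):
    """Data-driven proposal: predicted latent given a new observation y."""
    return coef[0] * y + coef[1]

print(proposal_mean(1.2))   # fast amortized guess, usable as an SMC/IS proposal
```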
> how does rejuvenation work?
classic resampling (after resampling, the weights are identically 1) - SMCP3 (cf. Del Moral's SMC) - pseudo-marginal MC (a general way to improve the samples when you decide to do this) https://proceedings.mlr.press/v206/lew23a/lew23a.pdf
----
![[Pasted image 20230824090611.png]] depth: floor, walls, boxes ![[Pasted image 20230824085618.png]]
the relative location of the box and the wall is given (the rays are given); SLAM (given a map, use the sensor (ray casting)) - observations as in GPS localization (moving the position along, draw the map based on the previous step's observations); the wall is blue, the box is orange; inliers come from the wall ![[Pasted image 20230824085757.png]]
there are two models, constrained and baseline ![[Pasted image 20230824090505.png]]
Y is a deterministic ray-casting operation, X is sampled from it / with P(X|Y), update sigma and omega - hd: heading (angle)
@ is for aliasing: alias the Python variable x to the Gen variable x - so choices are labeled clearly in the trace ![[Pasted image 20230824091648.png]]
tweaking X @ sensor ![[Pasted image 20230824092028.png]]![[Pasted image 20230824092132.png]] ![[Pasted image 20230824092307.png]] ![[Pasted image 20230824092714.png]]
the goal is to rule out outliers: how likely, under the sensor model, are the readings that bounced off the walls and boxes (for an outlier the ratio comes out low - when doing importance sampling - the likelihood is tiny) - is it better to have more outliers (or not)? a small sketch of this inlier/outlier score, swept with vmap, appears at the end of this block ![[Pasted image 20230824094106.png]] ![[Pasted image 20230824094117.png]]
u is "key: PRNGKey"; model.importance(keys[0], ch0, ...) - the second parameter is the value to be fixed (chm: a choice map used as the constraint) ![[Pasted image 20230824094646.png]] ![[Pasted image 20230824094849.png]]
ch1 without the mixture: outlier = 0 has a higher likelihood ![[Pasted image 20230824094956.png]]
vmap takes an argument that decides which input to vectorize over (vectorize over sig, out) ![[Pasted image 20230824095718.png]]
x, p, sigma, omega, Y: 1. Y -> omega, sigma 2. X -> omega, sigma ![[Pasted image 20230824100259.png]]
lru_cache (for the simulation function): function_cache caches the value when the function is called; instructions and function values are also cached (not possible at the CPU level architecturally; the byte-code interpreter - the CPU caches Python's executable); lru_cache is like assembly executed on the VM's CPU ![[Pasted image 20230824101457.png]]![[Pasted image 20230824101515.png]]
take Y and infer the sensor parameters; then X can be drawn (the calibrated sensor value; infer X and sigma given Y) - pose inference (infer the pose from the calibrated sensor) ![[Pasted image 20230824095237.png]]
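A small JAX sketch, not the GenJAX model from the notebook: a per-ray log-likelihood p(X | Y) assuming the usual Gaussian-inlier / uniform-outlier mixture, with `jax.vmap`'s `in_axes` choosing whether to sweep sigma or the outlier probability. The values of Y, X, and MAX_RANGE are hypothetical.

```python
import jax
import jax.numpy as jnp
from jax.scipy.stats import norm

Y = 3.0           # deterministic ray-cast distance to the nearest wall/box
X = 3.2           # observed (noisy) sensor reading
MAX_RANGE = 10.0  # sensor range, used for the uniform outlier component

def log_mixture(sigma, p_outlier):
    """log p(X | Y) under an inlier/outlier mixture:
    inlier ~ Normal(Y, sigma), outlier ~ Uniform(0, MAX_RANGE)."""
    log_in = jnp.log(1.0 - p_outlier) + norm.logpdf(X, Y, sigma)
    log_out = jnp.log(p_outlier) - jnp.log(MAX_RANGE)
    return jnp.logaddexp(log_in, log_out)

# in_axes picks which input is vectorized: sweep sigma with a fixed p_outlier,
# then sweep p_outlier with a fixed sigma
sigmas = jnp.linspace(0.05, 1.0, 20)
print(jax.vmap(log_mixture, in_axes=(0, None))(sigmas, 0.1))

p_outs = jnp.linspace(0.01, 0.5, 10)
print(jax.vmap(log_mixture, in_axes=(None, 0))(0.2, p_outs))
```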