/robowaifu/ - DIY Robot Wives

Advancing robotics to a point where anime catgrill meidos in tiny miniskirts are a reality.

Build Back Better

More updates on the way. -r


Have a nice day, Anon!


Open file (13.41 KB 750x600 Lurking.png)
Lurk Less: Tasks to Tackle Robowaifu Technician 02/13/2023 (Mon) 05:40:18 No.20037 [Reply]
Here we share ideas on how to help the development of robowaifus. You can look for tasks that improve the board, or ones that move development forward. You can also post a task that needs work and ask for help: use the pattern below, replace the parts in <brackets> with your own text, and post it.

>Pattern to copy and adjust for adding a task to the thread:
Task: <Description, general or very specific, and target thread for the results>
Tips: <Link additional information and add tips on how to achieve it.>
Constraints and preferences: <Things to avoid>
Results: Post your results in the prototypes thread if you designed something >>18800, or into an on-topic thread from the catalog if you found something or created a summary or diagram.

General Disclaimer: Don't discuss your work on tasks in this thread. Make a posting in another thread, or several of them, and then post here linking to it. We have a thread for prototypes >>18800, the current meta >>18173, and many others in the catalog https://alogs.space/robowaifu/catalog.html - the thread where you post the result might also be the best place to discuss things.

>General suggestions where you might be able to help:
- Go through threads in the catalog https://alogs.space/robowaifu/catalog.html and make summaries and diagrams, as pointed out starting here >>10428
- Work on parts instead of trying to develop and build a whole robowaifu
- Work on processes you find in some thread in the catalog https://alogs.space/robowaifu/catalog.html
- Test existing mechanisms shared on this board, prototypes >>18800
- Try to work on sensors in some kind of rubber skin and in parts >>95 >>242 >>419
- Keep track of other sites and similar projects, for example on YouTube, Twitter, or Hackaday.
- Copy useful pieces of information from threads on other sites and boards talking about "sexbots", "chatbots", AI, or something similar. Pick the right thread here: https://alogs.space/robowaifu/catalog.html


Edited last time by Chobitsu on 05/08/2023 (Mon) 11:17:16.
19 posts and 4 images omitted.
<<placeholder for task description. To be expanded later>>

Welcome to /robowaifu/ Anonymous 09/09/2019 (Mon) 00:33:54 No.3 [Reply]
Why Robowaifu? Most of the world's modern women have failed their men and their societies, feminism is rampant, and men around the world have been looking for a solution. History shows there are cultural and political solutions to this problem, but we believe that technology is the best way forward at present – specifically the technology of robotics. We are technologists, dreamers, hobbyists, geeks and robots looking forward to a day when any man can build the ideal companion he desires in his own home. However, we are not content to wait for the future; we are bringing that day forward. We are creating an active hobbyist scene of builders, programmers, artists, designers, and writers using the technology of today, not tomorrow. Join us!

NOTES & FRIENDS
>Notes:
-This is generally a SFW board, given our primarily engineering focus. On-topic NSFW content is OK, but please spoiler it.
-Our bunker is located at: https://trashchan.xyz/robowaifu/catalog.html Please make note of it.
-Library thread (good for locating terms/topics) (>>7143)
>Friends:
-/clang/ - currently at https://8kun.top/clang/ - toaster-love NSFW. Metal clanging noises in the night.
-/monster/ - currently at https://smuglo.li/monster/ - bizarre NSFW. Respect the robot.
-/tech/ - currently at >>>/tech/ - installing Gentoo Anon? They'll fix you up.
-/britfeel/ - currently at https://trashchan.xyz/britfeel/ - some good lads. Go share a pint!
-/server/ - currently at https://trashchan.xyz/server/ - multi-board board. Eclectic thing of beauty.
-/f/ - currently at https://trashchan.xyz/f/res/4.html#4 - doing flashtech old-school.
-/kind/ - currently at https://wapchan.org/kind - be excellent to each other.


Edited last time by Chobitsu on 04/03/2024 (Wed) 03:57:55.

Emmy The Robot Robowaifu Technician 04/15/2024 (Mon) 20:31:05 No.30919 [Reply]
Welcome, all Nandroid fans, to the Emmy thread, for discussing and posting about EtR. Please refrain from posting off-topic things.
---
Also, be sure to check out Emmy-Pilled's project thread! (>>25306)
Important Community Links:
Boorus, etc.:
https://nandroid.booru.org/index.php
https://emmytherobot.art/ (Jumbo controlled, be careful.)
Google Docs:
https://docs.google.com/spreadsheets/d/1mXuNh9ESedCiDZclVuz9uiL7nTNk3U9SgCE_CRHi3Us/htmlview#
Webtoons:
https://m.webtoons.com/en/canvas/emmy-the-robot/list?title_no=402201
>previous threads:
>>27481
>>26629
Edited last time by Kiwi_ on 04/17/2024 (Wed) 18:33:09.
15 posts and 8 images omitted.
Open file (99.41 KB 1638x2048 GLzKk_9W4AAwJEq.jpeg)
Open file (228.86 KB 1661x1080 GLykSvPbAAAUGSK.jpeg)
Open file (100.54 KB 1626x2033 GLzKk_-W8AAHABB.jpeg)
>>31003 looks jewish
New Pemmy (Pirate Emmy or Missy) Java Skin available
New HD/Bedrock Aria skin - Updated tops and bottoms
>>31011
>japanese
>looks jewish
Heh that's funny

Open file (93.53 KB 800x540 TypesOfMotors.jpg)
Open file (436.87 KB 1664x2048 MotionBaseServos.jpeg)
Open file (78.13 KB 922x1396 Femisapien.jpg)
Open file (2.25 MB 2500x1778 MechaMadoka.jpg)
Actuators For Waifu Movement Part 3 Kiwi 12/06/2023 (Wed) 01:18:16 No.27021 [Reply] [Last]
(1st thread >>406, 2nd thread >>12810)
Kiwi back again with a thread for discussing actuators to move your waifu! Part Three! Let's start with a quick introduction to common actuators!

1. DC motors: these use brushes to switch the ferrous-core electromagnets on a rotor, rotating its magnetic field relative to the surrounding magnets. They're one of the cheapest options, with an average efficiency range of 30 to 90%. Larger DC motors and motors with higher turn counts are more efficient.

1.5. Coreless DC motors: by removing ferrous materials, losses from hysteresis are almost eliminated, dramatically increasing efficiency to nearly 90% even in small motors. Eliminating the ferrous materials reduces flux focusing, resulting in weaker fields and higher speeds.

2. Brushless DC motors (BLDC): these use a controller to switch the electromagnets on a stator, rotating the magnets of a rotor. Without brushes, they have the potential to be more efficient, with higher power density than DC motors. Their efficiency and behavior vary depending on the algorithm and sensors used to control them. Coreless brushless motors exist but are rare and only used for very niche applications.

3. AC motors: a wide and incredibly varied category. They all rely on AC's frequency for control, with single-phase AC motors relying on shaded poles, capacitors, or some other method to induce a rotating magnetic field. Three-phase AC motors naturally have a rotating field, which usually gives them higher efficiency and power density. Notably, most AC motors are brushless. The most commonly used brushed AC motor is the universal motor, which can also run on DC and shows up in vacuums and power tools.

4. Stepper motors: brushless motors with ferrous teeth to focus magnetic flux. This allows for incredible control (stepping) at the cost of greater mass, subsequently giving them higher rotary inertia. Usually 50 to 80% efficient depending on the control algorithm, speed, and quality of the stepper. Due to their increasing mass production (& ubiquitous low-cost controllers), they have appeal as a lower-cost alternative to BLDC motors if one carefully designs around them.

5. Coiled nylon actuators! These things have an efficiency rating so low it's best to just say they aren't efficient (0.01% typical, 2% achieved under extremely specific conditions in a lab). Though they are exciting due to their incredibly low cost of fabrication, they're far too slow and the energy requirements are nonsensical.
https://youtu.be/S4-3_DnKE9E
https://youtu.be/wltLEzQnznM

6. Hydraulics! These rely on the distribution of pressure in a working liquid to move things like pistons. Though popular in large-scale industry, their suitability for waifus has yet to be proven. (Boston Dynamics' Atlas runs on hydraulics, but it's a power guzzler and heavy.) Efficiency varies wildly depending on implementation. They would work great for a giantess!

7. Pneumatics, hydraulics' lighter sister! This time the fluid is air! This has the advantage in weight. They aren't capable of the same power loads hydraulics are, but who wants their waifu to bench press a car? (Too loud and inefficient for mobile robotics.)

8. Wax motors: hydraulic systems where the working fluid is expanding melted wax (commonly paraffin)! Cheap, low power, and they produce incredible forces! Too bad they're slow and hard to control.

9. Explosion! Yes, you can move things through explosions! Gas engines work through explosions! Artificial muscles can be made by exploding a hydrogen and oxygen mixture in a piston, then using electrolysis to turn the water back into hydrogen and oxygen. None of this is efficient or practical, but it's vital we keep our minds open!

Though there are more actuators, most are derivatives of, or built around, these examples. Things like pulleys need an actuator to move them. Now, let's share, learn, and get our waifu moving!
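To make those efficiency percentages concrete: efficiency is just mechanical power out (torque times angular velocity) over electrical power in (volts times amps). A minimal C++ sketch, with made-up numbers for a small motor:

```cpp
// Motor efficiency check: P_mech = torque * omega, P_elec = V * I.
// All figures here are hypothetical, just to ground the 30-90% ranges above.
#include <cstdio>

int main() {
    const double torque_Nm = 0.008;   // 8 mN*m at the shaft (placeholder)
    const double speed_rpm = 6000.0;
    const double volts = 12.0, amps = 0.5;

    const double omega  = speed_rpm * 2.0 * 3.14159265 / 60.0; // rad/s
    const double p_mech = torque_Nm * omega;                   // watts out
    const double p_elec = volts * amps;                        // watts in
    printf("P_mech = %.2f W, P_elec = %.2f W, efficiency = %.0f%%\n",
           p_mech, p_elec, 100.0 * p_mech / p_elec);
    return 0;
}
```

With these stand-in numbers you get about 5 W out for 6 W in, i.e. roughly 84%, which is coreless-DC territory.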


Edited last time by Chobitsu on 12/06/2023 (Wed) 03:06:55.
131 posts and 41 images omitted.
>>30996
This. The rare-earths are used to greatly 'amp up' the electromagnetic properties of the basic electromagnets themselves. Kind of like how you need trace (tiny amounts of) otherwise-often-harmful-micronutrients (like cyanide) for your cells to properly manufacture the incredibly-yuge swath of proteins vital for your metabolic processes, these trace amounts of rare earths significantly enhance magnets -- especially when they are at cold temps! (Say, around liquid nitrogen realms.)

>>30999
Good luck aydoll! If I haven't given you the 'official' greeting yet, welcome! :^)

>>31004
As a country boy, I'm quite familiar with block & tackle. It's absolutely amazing the force multiplier that can be achieved with this simple machine.
>>30999 For some reason the crotch reminds me of a robosapien... I think it is the slight angle where the codpiece meets the first hip joint.
>>31012
>For some reason the crotch reminds me of a robosapien... I think it is the slight angle where the codpiece meets the first hip joint.
IMHO, this is literally the most critical design spot on the entire robowaifu. Everything else depends on the pelvis/hip confluence being correct to function well as an overall bipedal locomotive system (not to mention snu-snu working properly).
>>31013 Excellent resources, Kiwi. Thanks! :^)

Open file (1.76 MB 2560x1600 wp8232537.png)
Open file (427.25 KB 1920x1200 wp3421764.jpg)
Open file (368.53 KB 1680x1050 bRSM5J.jpg)
/robowaifu/meta-9: Wintertime will be sublime. Chobitsu Board owner 10/30/2023 (Mon) 00:42:15 No.26137 [Reply] [Last]
/meta, offtopic, & QTDDTOT
>---
General /robowaifu/ team survey (please reply ITT) (>>15486)
>---
Mini-FAQ
>A few hand-picked posts on various /robowaifu/-related topics
-Why is keeping mass (weight) low so important? (>>4313)
-How to get started with AI/ML for beginners (>>18306)
-"The Big 4" things we need to solve here (>>15182)
-HOW TO SOLVE IT (>>4143)
-Why we exist on an imageboard, and not some other forum platform (>>15638, >>17937)
-This is madness! You can't possibly succeed, so why even bother? (>>20208, >>23969)
-All AI programming is done in Python. So why are you using C & C++ here? (>>21057, >>21091, >>27167, >>29994)


Edited last time by Chobitsu on 02/29/2024 (Thu) 06:43:57.
469 posts and 152 images omitted.
>>30932
>so it can be changed later, right?
Yes.
>This one?
Yeah, whatever's the latest revision. IIRC, there were some issues being fixed till then.
I keep having fantasies of my robot wife while we listen to EDM. Anyone have an idea how we can make a robot enjoy music? My current idea is a stream of consciousness (one frame every 0.1 sec) that gets put through a liquid time constant RNN or equivalent, trained on emotion and music. There are some GitHub repos I can get started with if no one else has a better idea.
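For reference, here's a minimal sketch of a single liquid-time-constant-style neuron (loosely after Hasani et al.'s LTC formulation), Euler-stepped at one frame every 0.1 s to match the idea above. The weights and the input feature are placeholders, not a trained model:

```cpp
// Simplified LTC-style state update: dh/dt = -h/tau + f(x,h) * (A - h),
// integrated with Euler steps at 10 Hz. Weights are made-up placeholders.
#include <cmath>
#include <cstdio>

int main() {
    const float dt  = 0.1f;  // one sensory frame every 0.1 s
    const float tau = 1.0f;  // base time constant
    const float A   = 1.0f;  // target/reversal value
    float h = 0.0f;          // hidden state

    for (int step = 0; step < 50; ++step) {
        float x = std::sin(0.5f * step * dt);  // stand-in audio feature
        float gate = 1.0f / (1.0f + std::exp(-(2.0f * x + 0.5f * h))); // f(x,h)
        h += dt * (-h / tau + gate * (A - h)); // Euler integration step
        if (step % 10 == 0) printf("t=%.1fs h=%.3f\n", step * dt, h);
    }
    return 0;
}
```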
>>30992
How do you "train" on emotion? Do you mean you pick a specific genre, say EDM, then assign a positive weight to it, and assign negative values to other genres? Like in RL? So your gf will say positive things when hearing EDM?
>>30993
Assign a value of -10 to +10. Let sensory information dictate that value.
>positive weight to it, and assign negative values to other genres?
If you force someone to like something, do they really like it? Emotions are influenced by hormones for humans which, when represented in pure data, would be numeric noise for a robot. Ask yourself, "Who are you with when you're listening to the music? What did you eat before you listened to the music? What kind of place are you listening to the music in?". There's lots of background information that your brain processes on top of the hormones it releases to color the world with emotion. All those factors contribute to the overall enjoyment of music and how you feel about it when you hear it. I got my dick sucked while listening to https://www.youtube.com/watch?v=XL1BDku3GLM. I'd heard the song before, but it hit a little different after that experience, and whenever I hear it now, an emotion (along with a memory tied to that emotion) resonates with me. I don't think a robot can enjoy music the same way a human can, not without a much more sophisticated brain than what we currently have available. At best, it could mimic a human's emotions if it saw the human was enjoying music. If you just want the robot to say "I enjoy electronic funk", that's easy enough to write into its definitions/personality.
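A toy sketch of that -10 to +10 valence idea, where context drives the score instead of a fixed per-genre label. The feature names and weights here are invented for illustration, not a real affect model:

```cpp
// Toy affect model: one valence score in [-10, 10], updated from sensory
// context. Features and weights are hypothetical placeholders.
#include <algorithm>
#include <cstdio>

struct Context {
    float music_novelty;  // 0..1, how unfamiliar the track is
    float owner_present;  // 0 or 1
    float memory_assoc;   // -1..1, valence of memories tied to this song
};

float valence(const Context& c, float prev) {
    float v = 0.3f * prev             // mood carries over between moments
            + 4.0f * c.memory_assoc   // past experience colors the present
            + 2.0f * c.owner_present  // company matters
            + 1.5f * c.music_novelty;
    return std::clamp(v, -10.0f, 10.0f);
}

int main() {
    Context c{0.4f, 1.0f, 0.8f};      // novel song, owner nearby, good memories
    float v = 0.0f;
    for (int t = 0; t < 5; ++t) {
        v = valence(c, v);
        printf("t=%d valence=%.2f\n", t, v);
    }
    return 0;
}
```

The point is the same one made above: the memory-association term dominates, so the same song lands differently depending on what it's tied to.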
>>30992
>>30993
>>30998
May I introduce you Anons to Snowball, the dancing cockatoo? I'm sure his little birb-brain is quite smol compared to a human's, yet as a Nephesh creature (like all birbs), he is capable of emotions.
https://www.youtube.com/watch?v=N7IZmRnAo6s
>===
-minor edit
Edited last time by Chobitsu on 04/23/2024 (Tue) 16:03:50.

LLM & Chatbot General Robowaifu Technician 09/15/2019 (Sun) 10:18:46 No.250 [Reply] [Last]
OpenAI/GPT-2
This has to be one of the biggest breakthroughs in deep learning and AI so far. It's extremely skilled in developing coherent humanlike responses that make sense, and I believe it has massive potential. It also never gives the same answer twice.
>GPT-2 generates synthetic text samples in response to the model being primed with an arbitrary input. The model is chameleon-like—it adapts to the style and content of the conditioning text. This allows the user to generate realistic and coherent continuations about a topic of their choosing
>GPT-2 displays a broad set of capabilities, including the ability to generate conditional synthetic text samples of unprecedented quality, where we prime the model with an input and have it generate a lengthy continuation. In addition, GPT-2 outperforms other language models trained on specific domains (like Wikipedia, news, or books) without needing to use these domain-specific training datasets.
Also, the current public model shown here only uses 345 million parameters; the "full" AI (which has over 4x as many parameters) is being withheld from the public because of its "potential for abuse". That is to say, the full model is so proficient in mimicking human communication that it could be abused to create new articles, posts, advertisements, even books, and nobody would be able to tell that there was a bot behind it all.
<AI demo: talktotransformer.com/
<Other links:
github.com/openai/gpt-2
openai.com/blog/better-language-models/
huggingface.co/


Edited last time by Kiwi_ on 01/16/2024 (Tue) 23:04:32.
267 posts and 84 images omitted.
>>30813 If you want advice, I still suggest /g/'s /lmg/. They're quite helpful.
Some guy (Morgan Millipede) started to reverse engineer Neuro-Sama: https://youtu.be/uLG8Bvy47-4 - basically just a humorous introduction on how to do this (he has a $4k computer, though, and she's slower in her responses at the beginning). 4chan responded: https://youtu.be/PRAEuS-PkAk - Her response time improved since the first video.
>>30821 Lol. Thanks NoidoDev, I'll try to make time to look these over. Cheers. :^)
>llama3-70b on Groq runs at 300 tokens/s for 7k tokens
>mixtral-8x7b at 550 tokens/s for 7k tokens
>my tinyllama-1.1b model extended to 12k tokens runs at 0.5 tokens/s
I don't feel so good, bros. How do we make faster models?

I have an idea to use Matryoshka representation learning to reduce the hidden dimension size dynamically: https://arxiv.org/abs/2205.13147 but even if I truncate the model's 2048 dimensions down to 512 dimensions, it will perform at 8 tokens/s at best. And who knows how much slower it will be once I get to 32k context. If it's possible to reduce 90% of the tokens to 64 dimensions, then it might get 70 tokens/s at the very most, but GPU latency will probably fuck that down to 20 tokens/s.

I could also prune a few layers of the model, quantize it to 4-bits, and implement mixture of depths https://arxiv.org/abs/2404.02258 but that will only give a tiny speed up and I don't want the accuracy to drop further than it already has. With the much smaller model size, though, I could convert it into a sparse mixture-of-experts model https://arxiv.org/abs/2401.04088 with 16 experts to make up for the loss in accuracy without sacrificing speed. The model will eventually be finetuned with self-rewarding ORPO too, hopefully providing a boost in usefulness to overcome its barebones compute, although I'll likely use Llama3-70b to bootstrap the reward labels until it's capable of consistently self-improving on its own.
Odds ratio preference optimization (ORPO): https://arxiv.org/abs/2403.07691
Self-rewarding LMs: https://arxiv.org/abs/2401.10020

The T5 efficient model worked fine with a hidden dimension size of 512 after finetuning: https://arxiv.org/abs/2109.10686 And Matryoshka representation learning also worked well using a 16-dimension embedding for a 1k-class classification task. I forget the paper, but I remember reading one years ago where they found some layers in transformers are only making a decision between a few choices, so a large hidden size might not be necessary in those cases.

To convert the model's hidden states to Matryoshka I plan to add importance biases to parameters and train the biases with the rest of the parameters frozen, then take the softmax over them and top-k. After training, the parameters could be sorted and the importance biases pruned, and then the model parameters could be finetuned. I may have to train an even smaller model from scratch, though, since TinyLlama uses 32 attention heads.
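A toy sketch of that importance-bias mechanism: softmax over learned per-dimension biases, then top-k to decide which hidden dimensions survive truncation. The vectors below are tiny placeholders; in the real model the biases would be trained with everything else frozen, as described above:

```cpp
// Matryoshka-style truncation sketch: keep the top-k hidden dimensions
// ranked by softmax(importance biases), zero the rest. All values are
// hypothetical stand-ins for trained parameters.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <numeric>
#include <vector>

int main() {
    const int hidden = 8;  // stand-in for a 2048-dim hidden state
    const int k = 4;       // keep only the top-k dimensions
    std::vector<float> h          = {0.9f, -1.2f, 0.3f, 2.0f, -0.1f, 0.7f, -0.4f, 1.1f};
    std::vector<float> importance = {2.1f,  0.3f, -0.5f, 3.0f, -1.0f, 1.2f,  0.1f, 0.8f};

    // Softmax over the importance biases (stable form).
    std::vector<float> w(hidden);
    float maxv = *std::max_element(importance.begin(), importance.end());
    float sum = 0.f;
    for (int i = 0; i < hidden; ++i) { w[i] = std::exp(importance[i] - maxv); sum += w[i]; }
    for (auto& x : w) x /= sum;

    // Top-k by weight; zero everything else.
    std::vector<int> idx(hidden);
    std::iota(idx.begin(), idx.end(), 0);
    std::partial_sort(idx.begin(), idx.begin() + k, idx.end(),
                      [&](int a, int b) { return w[a] > w[b]; });
    std::vector<float> truncated(hidden, 0.f);
    for (int i = 0; i < k; ++i) truncated[idx[i]] = h[idx[i]];

    for (int i = 0; i < hidden; ++i) printf("dim %d: %.3f\n", i, truncated[i]);
    return 0;
}
```

After training, sorting parameters by these weights and pruning the biases would give the nested (Matryoshka) layout where a prefix of the hidden vector is a usable smaller model.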
>>31006 >use Matryoshka representation learning to reduce the hidden dimension size dynamically This seems both interesting & promising, Anon. Good luck with your research. Cheers. :^)

Open file (46.39 KB 458x620 eve preview.jpg)
My Advanced Realistic Humanoid Robot Project - Eve Artbyrobot 04/18/2024 (Thu) 17:44:09 No.30954 [Reply]
So far I have plans to build Adam, Eve, and Abel robots. All of these are Bible characters. This thread will cover the Eve robot.

Eve will have no "love holes" because adding those would be sinful and evil. It is a robot, not a biological woman, after all, and I will view her with all purity of heart and mind instead of using her to fulfill the lusts of my body. Instead I will walk by the Spirit, no longer fulfilling the lusts of the flesh as the Bible commands. Eve will be beautiful because making her beautiful is not a sinful thing to do. However, I will dress her modestly as God commands of all women everywhere. This would obviously include robot women, because otherwise the robot woman would be a stumbling block to men which could cause them to lust after her, which would be a sin. To tempt someone to sin is not loving and is evil, and so my robot will not do this. To dress her in a miniskirt, for example, would be sinful and evil, and all people who engage in sinfulness knowingly are presently on their way to hell. I don't wish this for anyone. My robot will dress in a way that is a good example to all women and is aimed toward not causing anybody to lust as a goal.

My robot will have a human bone structure. It will use either a PVC medical skeleton or fiberglass fabricated hollow bones. My robot will look realistic and move realistically. It will be able to talk, walk, run, do chores, play sports, dance, rock climb, and do gymnastics. It will also be able to build more robots just like itself and manufacture other products and inventions. I realized that with just a head and arm, a robot can build the rest of its own body, so that is my intention.

My robot will use BLDC motors for drones, RC, and scooters that are high speed and low-ish torque, but I will downgear those motors with an Archimedes pulley system that will be custom made from custom fabricated pulleys that will be bearing-based. By downgearing with pulleys instead of gears, I will cut down the noise the robot makes so it will be as silent as possible for indoor use. By downgearing, I convert the high speed motors into moderate speeds with great torque. BLDC motors with large torque generally are too large in diameter for a human form factor and take up too much volumetric area to be useful, which is why I go with the high speed, smaller diameter type motors but just heavily downgear them 32:1 and 64:1.

My robot will have realistic silicone skin. Thom Floutz, the LA-based painter, sculptor, and make-up artist, is my inspiration as it pertains to realistic skin. The skin for my robots has to be at his level to be acceptable. It must be nearly impossible to tell the robot is not human to be acceptable. I will have a wireframe mesh exoskeleton that simulates the volumes and movements of muscle underneath the skin, which will give the skin its volumetric form like muscles do. Within these hollow wireframe mesh frameworks will be all the electronics and their cooling systems.

All of my motor controllers will be custom made since I need them VERY small to fit into the confined spaces I have to work with. I need LOADS of motors to replace every pertinent muscle of the human body in such a way that the robot can move in all the ways humans move and have near human level of strength and speed.
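For reference, here's what that 32:1 and 64:1 downgearing trades away and gains: output speed divides by the ratio, output torque multiplies by it (minus drivetrain losses). The motor figures in this sketch are hypothetical stand-ins for a small high-speed BLDC:

```cpp
// Reduction trade-off: omega_out = omega_in / N, tau_out = tau_in * N * eff.
// Motor numbers are placeholders, not measurements from this project.
#include <cstdio>

int main() {
    const double motor_rpm    = 20000.0;  // high-speed, low-torque BLDC
    const double motor_torque = 0.02;     // N*m at the motor shaft
    const double eff          = 0.95;     // bearing-based pulleys lose a little

    for (double ratio : {32.0, 64.0}) {
        double out_rpm    = motor_rpm / ratio;
        double out_torque = motor_torque * ratio * eff;
        printf("%.0f:1 -> %.0f rpm, %.2f N*m\n", ratio, out_rpm, out_torque);
    }
    return 0;
}
```

With these stand-in numbers, 32:1 gives roughly 625 rpm at 0.61 N*m and 64:1 gives roughly 312 rpm at 1.22 N*m, which is the "moderate speed, great torque" region described above.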


5 posts and 9 images omitted.
Here is a 3D model I made of the motor controller design. I felt this would help me really perfect the layout and visualize how I wanted it, the wiring routing, etc. I also did a 2D schematic in Photoshop and finally in KiCad. I plan to etch my own flat-flex PCBs for aspects of this motor controller.
Here's the Arduino Mega barebones CAD design I made. This will use flat flex ribbon cable soldered directly to the pins of the chip to make the form factor volumetrically as small as possible. I'll have at least 30 of these in the robot, controlling the motors and reading in sensor input for current (amps), strain gauges, gyroscopes/accelerometers, potentiometers, etc. These will hold my code for the low-level stuff and manage the motor movements directly through the MOSFET systems. They will all report back updates to the main-brain PC, which will then know the progress of the movement commands it sent out to the network of Arduinos doing the low-level stuff.
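A bare-bones sketch of that node idea: each Arduino streams its local sensor readings to the PC and accepts simple motor commands back. The pin assignments and the one-line serial protocol here are placeholder assumptions for illustration, not the actual design:

```cpp
// One low-level node: report sensors as CSV over serial at ~100 Hz,
// accept "M<0-255>\n" from the PC to set motor power via a MOSFET.
// Pins and protocol are hypothetical placeholders.
#include <Arduino.h>

const int CURRENT_PIN = A0;  // current-sense amplifier output (assumed)
const int POT_PIN     = A1;  // joint potentiometer (assumed)
const int MOTOR_PWM   = 9;   // MOSFET gate via PWM (assumed)

void setup() {
    Serial.begin(115200);
    pinMode(MOTOR_PWM, OUTPUT);
}

void loop() {
    // Report sensor state as one CSV line: current,position
    int current  = analogRead(CURRENT_PIN);
    int position = analogRead(POT_PIN);
    Serial.print(current);
    Serial.print(',');
    Serial.println(position);

    // Accept a motor power command back from the main-brain PC.
    if (Serial.available()) {
        String cmd = Serial.readStringUntil('\n');
        if (cmd.startsWith("M")) {
            analogWrite(MOTOR_PWM, constrain(cmd.substring(1).toInt(), 0, 255));
        }
    }
    delay(10);  // ~100 Hz update
}
```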
Here's a progress shot of my Arduino Mega barebones prototyping, using flat flex cable soldered directly to the pins.
Very nice. Please continue to show us your work as you progress.
>===
-sp edit
Edited last time by Chobitsu on 04/23/2024 (Tue) 15:32:35.
Welcome, Artbyrobot! I would suggest you look around the board as a courteous greeting, but I have the feeling you've already done so. Your project seems well thought out; I wish you good success with it. I'm particularly glad to see you intend to program your AI (et al?) in C++. I probably need to do little to explain to you why this is important for success in this monumental set of tasks ahead of us all. Heh, I notice you seem to be positioning your project as coming from the Christian worldview. Nice! I too desire to see God glorified through these projects. I personally intend -- among other projects -- to work on a Christ-chan project: basically a plug-in module approach to our general robowaifus' personalities that Anons can add into their own waifus. Looking forward to seeing your progress with this project(s), Anon. Cheers. :^)

Robot Vision General Robowaifu Technician 09/11/2019 (Wed) 01:13:09 No.97 [Reply] [Last]
Cameras, Lenses, Actuators, Control Systems

Unless you want to deck out your waifubot in dark glasses and a white cane, learning about vision systems is a good idea. Please post resources here.

opencv.org/
https://archive.is/7dFuu

github.com/opencv/opencv
https://archive.is/PEFzq

www.robotshop.com/en/cameras-vision-sensors.html
https://archive.is/7ESmt
Edited last time by Chobitsu on 09/11/2019 (Wed) 01:14:45.
117 posts and 53 images omitted.
>>30877 I know this is a stupid question, but can you strip those components right out of the support frame and have them simply connected to the wires?
>>30879 Zoom in to the hole in the centre. Looks like there is a circuit board under there. If one were to take it out of the frame, it would require adding wires and attaching them back to the circuit board, I imagine.
>>30879 >>30880 I expect the physical positioning of the 3 camera components is tightly registered. Could be recalibrated I'm sure, but it would need to be done.
>>30879
>Depth Perception
From what I know, these systems work such that the distance between the two cameras is known and fixed in hardware. If you want to do this yourself, your system would need to know the distance. I think Kudan SLAM is a software doing that: >>29937 and >>10646
>Kudan Visual SLAM
>This tutorial tells you how to run a Kudan Visual SLAM (KdVisual) system using ROS 2 bags as the input containing data of a robot exploring an area
https://amrdocs.intel.com/docs/2023.1.0/dev_guide/files/kudan-slam.html
>The Camera Basics for Visual SLAM
>"Simultaneous Localization and Mapping usually refers to a robot or a moving rigid body, equipped with a specific sensor, that estimates its motion and builds a model of the surrounding environment, without a priori information [2]. If the sensor referred to here is mainly a camera, it is called Visual SLAM."
https://www.kudan.io/blog/camera-basics-visual-slam/
>... ideal frame rate ... 15 fps: for applications with robots that move at a speed of 1~2 m/s
>The broader the camera's field of view, the more robust and accurate SLAM performance you can expect, up to some point.
>... the larger the dynamic range is, the better the SLAM performance.
>... global shutter cameras are highly recommended for handheld, wearables, robotics, and vehicle applications.
>Baseline is the distance between the two lenses of the stereo cameras. This specification is essential for use-cases involving Stereo SLAM using stereo cameras.
>We defined Visual SLAM to use the camera as the sensor, but it can additionally fuse other sensors.
>Based on our experience, frame skip/drop, noise in images, and IR projection are typical pitfalls to watch out for.
>Color image: Greyscale images suffice for most SLAM applications
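To make the baseline relationship concrete: for a calibrated stereo pair, depth is Z = f * B / d, where f is the focal length in pixels, B the baseline between the lenses, and d the disparity. A minimal OpenCV sketch; the file names and calibration numbers are placeholders:

```cpp
// Stereo depth sketch: block-match disparity, then Z = f * B / d.
// f_px and B_m are placeholder calibration values.
#include <opencv2/opencv.hpp>
#include <cstdio>

int main() {
    cv::Mat left  = cv::imread("left.png",  cv::IMREAD_GRAYSCALE);
    cv::Mat right = cv::imread("right.png", cv::IMREAD_GRAYSCALE);
    if (left.empty() || right.empty()) return 1;

    // StereoBM returns fixed-point disparities (16 * d) as CV_16S.
    auto bm = cv::StereoBM::create(/*numDisparities=*/64, /*blockSize=*/15);
    cv::Mat disp16;
    bm->compute(left, right, disp16);

    const double f_px = 700.0;  // focal length in pixels (from calibration)
    const double B_m  = 0.06;   // 6 cm baseline between the two lenses

    short d16 = disp16.at<short>(disp16.rows / 2, disp16.cols / 2);
    if (d16 > 0) {
        double d = d16 / 16.0;
        printf("center-pixel depth: %.2f m\n", f_px * B_m / d);
    }
    return 0;
}
```

Note the practical consequence: a wider baseline resolves distant depth better but loses near-field overlap, which is why the spec matters for a robowaifu's working distance.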


Open file (225.52 KB 1252x902 kinectxie.jpg)
>>30877 The Kinect was cheap at $12, and I scaled it to the full-sized robot head in GIMP. I can use the main camera in the middle as the aperture, and the two projector/IR camera lenses as the eye shines. It won't look like this in the final robot head, but it will be positioned in this manner.

Open file (329.39 KB 850x1148 Karasuba.jpg)
Open file (423.49 KB 796x706 YT_AI_news_01.png)
Open file (376.75 KB 804x702 YT_AI_news_02.png)
General Robotics/A.I./Software News, Commentary, + /pol/ Funposting Zone #4 NoidoDev ##eCt7e4 07/19/2023 (Wed) 23:21:28 No.24081 [Reply] [Last]
Anything in general related to the Robotics or A.I. industries, and any social or economic issues surrounding it (especially of robowaifus). -previous threads: > #1 (>>404) > #2 (>>16732) > #3 (>>21140)
389 posts and 167 images omitted.
>>30718
>supremacist (n.)
>"one who believes in the inherent superiority of one race or sex or social group," by 1892, in white supremacist, originally with reference to political campaigns and candidates in the U.S. South (Louisiana), from supremacy + -ist. Compare supremist. Related: Supremacism.
>1892
https://www.etymonline.com/word/supremacist
>>30722 Which means "white privilege" is an expression of white supremacy, a contemporary expression of "white man's burden" (the idea that whites must help non-whites develop civilization and prosper).
>>30976
>It's just usual content farming all big YouTubers do.
I believe that's just called 'clickbait', isn't it Anon? :^)
>I have never in the wild seen anyone care beyond just feeling sorry someone feels that lonely.
Then I think it likely you haven't broached this topic clearly, with any women who consider themselves still to have SMV (today that's probably even up to 50yo+ grannies, lol). Or with a hard-core Leftist/Filthy-Commie. They -- all of them -- hate the very idea itself. Most of the ones I've engaged with in any way also threaten physical violence against robowaifus if/when they ever see one. We'll see how that all works out for them. :^)

The Filthy Commies go one step further and threaten physical attack against robowaifu owners too, since that's how Filthy Commies behave, after all (think: Pantyfags, F*men, etc.) -- under the bribery directives of their Globohomo puppetmasters, ofc. LOL. Heh, we all need to look up Stickman (is he out yet?) and give him a complimentary Model A robowaifu! :^)

Blacks just destroy things simply b/c they're blacks, by all appearances. They also will be involved with this violence to be directed against robowaifus/owners; but mindlessly, not for the agenda-driven motives of the first two groups mentioned here. Once they see the GH media glorifying physical attacks against robowaifus, I'm sure they'll be all-in with it for a while, too. (And -- like these other Leftists -- they too have their paid rabble-rousers [cf. the paper-hangin', pregnant-woman-abusin', multi-feloner Fentanyl Floyd's overdose-death's -- mostly-peaceful, mind you -- Burn Loot Murder 'honorarium' """protests""", et al]).

All this type of clickbait (cf. >>30975, et al) is literally just GH predictive-programming attempting to prepare the masses for violence, come the day. TOP KEK! May the stones that they are preparing to roll down on us all roll back upon their own heads instead, in Jesus' name!! [1] :DD

Make no mistake: this is a broad cultural war already going on within our so-called """society""" of today. Robowaifus will amp that up to 12. I'm sure /cow/ and their ilk will be delighted, once the time is ripe. So get your popcorn ready, kids! :D
>t. Noooticer. :^)


Edited last time by Chobitsu on 04/21/2024 (Sun) 15:03:44.
>>30986 This is completely and abundantly true. The programality (program-reality; is this even a word? If not, it should be) of it all is baked in. Like those fools who buy pit bulls and tell everyone it's all in how you raise them. Of course the nice doggies rip the skin off their children's heads.
>>30989 All pit bulls should be destroyed, outright. I love doggos in general (I've had several), but not those demonic little sh*tes. In fact, in some bizarre spiritual sense, they seem almost allegorical in their natures (for demons, ofc) to me.

Robot skin? Possible sensitivity? Robowaifu Technician 09/15/2019 (Sun) 07:38:17 No.242 [Reply] [Last]
The Anki VECTOR has a skin-like touch sensor on it, could we incorporate it into our robogirls?
74 posts and 16 images omitted.
I thought about doing casting for the skin but it might not be necessary. The skin might be easier than I thought.
>>24712 Then again, if the waifus are to be produced in bulk, casting is still the better approach.
>>24712 I think this approach may in fact serve us very well for a robowaifu's face, boobas, and vagoo. Good luck with your research! :^)
I was looking for the link on using optical fibers linked to camera chips for sensors (this is one of the greatest ideas in history). The link for using cameras and fiber for sensors is:
https://hackaday.com/2019/08/30/fibergrid-an-inexpensive-optical-sensor-framework/
While looking for this I found a new link that fits in really well with the other, which I had not seen before:
https://hackaday.com/2011/10/21/building-optical-flex-sensors/
A little thought and you will see this guy has figured out a way to make very good touch/pressure sensors that could be waterproof, reliable, and self-contained. If you combine this with the first idea (cameras used to sense the values of light coming back from sensors), you have a complete sensor package. You can use optical encoders fed by fiber (light) for motion and use this second idea for pressure and touch sensing; with the first, cameras for sensing, you have the whole thing in one nice package. If the touch sensors are in an X-Y grid you have position sensing, which could be combined with the light attenuation to give touch pressure values with point locality. Since camera sensors these days have huge pixel counts, you could have very precise position sensing.

And the whole thing in a waifu could be self-calibrating. You plug all these fibers and/or sensors into a package. The waifu bends its limbs in sequence under control while monitoring the feedback from the sensors. After it calibrates which joints map to which pixels on the camera, it starts squeezing its body parts, calibrating the amount of squeeze against pressure and which pixels change and by how much. The whole thing done in a few minutes. It could even do so periodically to make sure calibration stays correct with wear.

Now the trick is to get away from fiber. Stringing together all these fibers is labor intensive. The second link shows a path to this. In it he has a hollow tube with a filler. What if you made a trough for one side of a skin section and, on the other side, a hill that fits into the trough? The outsides would be a dark rubber. So you flex it, and the light hits the dark rubber and is absorbed, as opposed to going straight through the clear thin film that you made the trough and hill out of. When I say film, I'm saying you make this structure, like a big flexible nervous system, out of a cast plastic. Possibly cast the clear interior, then spray over, paint, or cast the dark exterior surface. I suspect there will be some trickery and art involved in making this film, trough, and hill structure to get the maximum sensitivity without blocking the light altogether.

You would have to separate the joint-bending light attenuation from the pressure/touch-sensing light attenuation. I suppose you could do bend calibration, then pressure testing, then touch calibration. Since it knows it's bending joints and knows this value, any further attenuation of light would be assumed to be touch. Ever since I heard about the camera fiber sensor idea I've been trying to think of a way to make this THE method of sensing for all facets of the robowaifu. I think "this is the way", though I have no doubt there are lots of niggling bugaboos that could cause trouble. But once you get some small section worked out and functioning properly, it's only a matter of multiplying that small section onto the whole of the waifu.
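A minimal sketch of that self-calibration loop using OpenCV: sample each fiber's spot on the camera at rest for a baseline, then read pressure as attenuation relative to it. The ROI positions and the linear pressure scaling are assumptions for illustration:

```cpp
// FiberGrid-style readout sketch: one camera watches the fiber bundle ends;
// brightness drop below a per-fiber baseline reads as touch pressure.
// ROI grid positions and scaling are hypothetical placeholders.
#include <opencv2/opencv.hpp>
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    cv::VideoCapture cap(0);  // camera watching the fiber bundle ends
    if (!cap.isOpened()) return 1;

    // One small ROI per fiber end (placeholder positions).
    std::vector<cv::Rect> rois = {{100, 100, 8, 8}, {120, 100, 8, 8}, {140, 100, 8, 8}};

    // Calibration pass: mean brightness of each spot with nothing touching.
    cv::Mat frame, gray;
    cap >> frame;
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    std::vector<double> baseline;
    for (const auto& r : rois) baseline.push_back(cv::mean(gray(r))[0]);

    // Runtime: attenuation below baseline reads as normalized pressure (0..1).
    while (cap.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        for (size_t i = 0; i < rois.size(); ++i) {
            double now = cv::mean(gray(rois[i]))[0];
            double pressure = std::max(0.0, (baseline[i] - now) / baseline[i]);
            printf("fiber %zu: %.2f  ", i, pressure);
        }
        printf("\n");
    }
    return 0;
}
```

The same loop extends naturally to the joint-bend case: run it once with limbs moving and no touch to learn which pixels track which joints, then treat any attenuation beyond the bend-predicted level as touch.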


>>30980 Thanks! Great content, Grommet. Cheers. :^)
