Comfyui reuse seed not working. Please keep posted images SFW. May 25, 2024 · Understanding ComfyUI seed. Then, if there are no issues, enable only the ComfyUI-Manager and test again. Follow the ComfyUI manual installation instructions for Windows and Linux. WASasquatch closed this as completed on Mar 24, 2023. 2. I showed two possible solutions: Updating existing workflows to use the Oct 10, 2023 · You signed in with another tab or window. The Seed of the following images (2 and 3) is different. ago. It's a clear comfyui windows portable, with RTX4060 8GB,win10 ,not work But it works well at other computers (with RTX3060,win10) Traceback (most recent call last): File "D GlobalSeed does not require a connection line. I'm also on Linux (Manjaro), but I'm using Python 3. ltdrdata commented on Sep 4, 2023. The seed changes so you can copy it down and reuse it if you like the results. File "E:\ComfyUI\comfy\model_base. When I render an image using same settings in WebUI and ComfyUI The results are different Did I do somethings wrong? Which part of the inference makes them different? thank you very much! Checkpoint: v1-5-pruned-emaonly. Dec 9, 2023 · Check the Network tab in the browser's developer tools to see if openpose. However its a different story with my tablet. However, when I use ComfyUI and your "Seed (rgthree)" node as an input to KSampler, the saved images are not reproducible when image batching is used. Some third-party seed nodes offer a control_before_generate approach instead of control_after_generate. 1. samplers' (C:_ComfyUi\ComfyUI\comfy\samplers. Setting CFG to 0 means that the UNET will denoise the latent based on that empty conditioning. A node like that really needs the ability to open the operating system's file browser. You signed out in another tab or window. e. py in the diffusionmodels folder, when replaced with the previous version, fixes the issue also. 
Step 5 - [Fixed seed] Hit queue and move that single image through the rest of the workflow. Apr 13, 2024 · You signed in with another tab or window. It’s an input for a Boolean (true/false) you can switch it to a dropdown instead. Closed Tobe2d opened this issue Feb 14, Once I disable comfyui_dagthomas it work and restart is working just like it was Navigating the ComfyUI User Interface. Ferniclestix. Maybe someone else can check, but for me this errors out on batch sizes greater than 1. Development. py) WAS Node Suite: OpenCV Python FFMPEG support is enabled Feb 23, 2024 · You signed in with another tab or window. "Negative Prompt" just re-purposes that empty conditioning value so that we can put text into it. This appears to be because your nodes (and other nodes available for ComfyUI) do not save the correct seed. exe -V. In ComfyUI the prompt strengths are also more sensitive because they are not normalized. It provides several ways of distributing seed numbers to other nodes all without the connecting lines! You just have to set "control_after_generate" widget on nodes to "fixed" for it to work. For example, (from the workflow image below): Original prompt: "Portrait of robot Terminator, cybord, evil, in dynamics, highly detailed, packed with hidden details, style, high dynamic range, hyper Install the ComfyUI dependencies. Welcome to the unofficial ComfyUI subreddit. I restarted as prompted. Download The custom node will analyze your Positive prompt and Seed and incorporate additional keywords, which will likely improve your resulting image. (For Windows users) If you still cannot build Insightface for some reasons or just don't want to install Visual Studio or VS C++ Build Tools - do the following: (ComfyUI Portable) From the root folder check the version of Python: run CMD and type python_embeded\python. Please share your tips, tricks, and workflows for using this software to create your AI art. Award. 
Moving the folder with ComfyUI from drive C to drive D worked for me. Based on the current logs, it's not certain whether the issue lies with the ComfyUI-Manager. And I do wonder why they are so extremely different. Extension: ComfyUI Inspire Pack. 2) (best:1. ckpt_name% it's literally printed and not converted. Feb 24, 2024 · ComfyUI is a node-based interface to use Stable Diffusion which was created by comfyanonymous in 2023. Nov 7, 2023 · However, ComfyUI and Automatic1111 seem to be processing Stable Diffusion quite differently. 6 days ago · ltdrdata > comfyui-impact-pack Wildcard Processor - Convert seed to input does not work about comfyui-impact-pack HOT 1 CLOSED vtoaster commented on June 6, 2024 Wildcard Processor - Convert seed to input does not work. Actually not even first, jesus @comfyanonymous what's up with batches? Jul 27, 2023 · Like this: Right click on the KSampler node to turn "Seed" into an input You can then use a seed Node with fixed output OR the KSampler will take any INT input. The way ComfyUI is built up, every image or video saves the workflow in the metadata, which means that once an image has been generated with ComfyUI, you can right click on the node and click convert force_inpaint to widget. Related Issues (20) Welcome to the unofficial ComfyUI subreddit. If I understood you right you may use groups with upscaling, face restoration etc. . Some commonly used blocks are Loading a Checkpoint Model, entering a prompt, specifying a sampler, etc. Click run_nvidia_gpu. ERROR:root:Failed to validate prompt for output 10: ERROR:root:* StyleAlignedReferenceSampler 45: ERROR:root: - Value not in list: share_attn: '1' not in ['q+k', 'q+k+v', 'disabled'] ERROR:root: - Value 0 smaller than min of 1: batch_size ERROR:root: - Failed to convert an input value to a INT value: noise_seed, None, int() argument must be a Welcome to the unofficial ComfyUI subreddit. I think this is due to more restrictive permissions on the system disk. 
randint (1,4294967294) I've used this approach in my integration, and I can confirm that it works wonderfully. , when I use the checkpoint 'variable' -> %CheckpointLoaderSimple. . Whereas traditional frameworks like React and Vue do the bulk of their work in the browser, Svelte shifts that work into a compile step that happens when you build your app. If you want to reuse the seed, hit the green recycle button and it will stick and you'll have to manually change it back to -1. Same result in Safari and Chrome. I converted variation_seed on the Hijack node to input because this node has no "control_after_generate" option, and added a Variation Seed node to feed it with the variation seed instead. So the short answer to the question is: if you want to set a seed to create a reproducible process then do what you have done and set the seed once; however, you should not set the seed before every random draw because that will start the pseudo-random process again from the beginning. The old behavior was that when you clicked randomize, it would set the seed value to -1, which allows for a new random seed to be used every new generation queued. Q&A. Sep 18, 2023 · Sure - here's an example of a PNG that won't load whether selected through the 'load' menu or brought in via drag and drop. just to check, close server and chrome tab, Go to "update" folder of the crashing instance of comfyUI incase you have more, and launch Feb 14, 2024 · Restart not working #410. Sytan's SDXL Workflow will load: Oct 29, 2023 · Been using ComfyUI for the last 4~5 days, without any issue at all in the first 3 days, some minor slow downs here and there, but no freeze/crash/reboot whatsoever dummy_seed: DALL-E3 does not currently provide a way to specify a seed value. Mar 18, 2024 · ApexHigh commented on Apr 3. Found it ! From right to left ie : Group seed corresponds to furthest image to the right, you can then ad 1 for each image to the left for the seed you want to concentrate on. 
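The `random.randint(1, 4294967294)` approach quoted above can be sketched end-to-end. This is a minimal illustration, not the official client: the node id `"3"` and the workflow fragment are made-up examples, and you would substitute your own exported API-format workflow.

```python
import json
import random

def randomize_seeds(workflow: dict) -> dict:
    """Set a fresh random seed on every node that has a 'seed' input.

    Mirrors what the control_after_generate=randomize widget does in the UI,
    but for API-format workflow JSON sent to the server programmatically.
    """
    for node in workflow.values():
        inputs = node.get("inputs", {})
        if "seed" in inputs:
            # Same range as the randint(1, 4294967294) call quoted above.
            inputs["seed"] = random.randint(1, 4294967294)
    return workflow

# Minimal API-format workflow fragment (node id "3" is just an example):
workflow = json.loads(
    '{"3": {"class_type": "KSampler", "inputs": {"seed": 0, "steps": 20}}}'
)
randomize_seeds(workflow)
```

To actually queue it, you would POST `{"prompt": workflow}` to the ComfyUI server's `/prompt` endpoint; record the seed you sent if you want to reproduce a result later.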
In this video, I will first introduce the concept of the variation Open Settings (small gear icon in the top right corner of control panel) and change Widget Value Control Mode to "before". safetensors' not in [] Apr 19, 2023 · No milestone. This subreddit is not designed for promoting your content and is instead focused on helping people make games, not promote them. And this is with all what I want to use: Seed, steps, CFG. Now, ComfyUI: 2046a38b9b Mar 31, 2024 · I'm getting the same issue as OP, and like they did, I completely re-installed ComfyUI and then ComfyUI_IPAdapter_plus. When that still didn't work, I also used the "Try update" button to try updating IPAdapter_plus from within the ComfyUI Manager, with the same result. You can right click on a node and change many selections to an input. The a1111 ui is actually doing something like (but across all the tokens): Oct 19, 2023 · I. You can construct an image generation workflow by chaining different blocks (called nodes) together. Yes that sounds like you are using a SD1. before launching the workflow and you get actual values used in the launch, until you hit the Queue Prompt button again. If you drag a noodle off it will give you some node options that have that variable type as an output. Reload to refresh your session. i have selected Latent as upscale_type so didnt think i would need a pixel upscaler, neither am i using control net in this workflow. Nesting the selected nodes. Jul 18, 2023 · Wildcard node for ComfyUI. The downside is that you cannot cherry-pick your models to compare, as the control_after_generate option will just select the next model from the list after each generated image. Such as: prompt ["3"] ["inputs"] ["seed"] = random. Adding a Node: Simply right-click on any vacant space. The page is loading but there only is a grey canvas visible (no grid or menu on it). This issue thread provides a detailed solution to fix the problem and enjoy the powerful features of Reactor node. 
When the action is set to increment, decrement, or randomize, it modifies the value of GlobalSeed and applies the same value uniformly to the seeds of all nodes. I'm not sure if this is what you want: fix the seed of the initial image, and when you adjust the subsequent seed (such as in the upscale or facedetailer node), the workflow would resume from the point of alteration. Launch ComfyUI by running python main. It will change if you run your workflow again. Aug 10, 2023 · You signed in with another tab or window. Coders can take advantage of its built in scripting language, "GML" to design and create fully-featured, professional grade games. start the next job in the queue to evaluate another random seed) Sep 16, 2023 · Hey there! Installed this via the package manager to the latest ComfyUI. 8. Make sure you put your Stable Diffusion checkpoints/models (the huge ckpt/safetensors files) in: ComfyUI\models\checkpoints. 0 the Mar 20, 2024 · ComfyUI is a node-based GUI for Stable Diffusion. and 'ctrl+B' or 'ctrl+M' that groups when you Jan 27, 2024 · But to fix the crash if you don't want to reset everything in case don't have space or testing new modules. Selecting the nodes to nest. This is a custom node for ComfyUI to add in wildcard functionality from files. 3. TripleCLIPLoader: Value not in list: clip_name1: 'clip_g_sdxl_base. This is beneficial if you want to replicate the same image, without replacing the wildcard Update ComfyUI and all your custom nodes first and if the issue remains disable all custom nodes except for the ComfyUI manager and then test a vanilla default workflow. Getting this: Prompt outputs failed validation. It might seem daunting at first, but you actually don't need to fully learn how these are connected. If the prompt and resolution are the same and dummy_seed has not changed, this Node will reuse the cached output (i. Zooming or scrolling doesnt reveal anything. Jan 11, 2024 · 1. That's how the prompt adherence function works. 
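The increment/decrement/randomize behavior described for GlobalSeed can be sketched in a few lines. This is an illustrative reimplementation, not the Inspire Pack's actual code; the function and dictionary names are hypothetical.

```python
import random

def next_seed(current: int, mode: str) -> int:
    """Apply one of the seed-control modes to the current seed value.

    'fixed' keeps the seed so a result stays reproducible; the other
    modes change it for each queued generation.
    """
    if mode == "fixed":
        return current
    if mode == "increment":
        return current + 1
    if mode == "decrement":
        return max(current - 1, 0)  # clamp: seeds are non-negative
    if mode == "randomize":
        return random.randint(0, 2**32 - 1)
    raise ValueError(f"unknown mode: {mode}")

# A GlobalSeed-style controller computes one value and pushes the same
# value to every node whose own seed widget is set to "fixed":
seed = next_seed(42, "increment")
node_seeds = {"ksampler_1": seed, "ksampler_2": seed, "detailer": seed}
```

Pushing one value uniformly is what makes the whole workflow reproducible from a single number, without any connecting lines.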
If you have trouble extracting it, right click the file -> properties -> unblock. Just delete the custom-node folder for the specific node, Launch it will probably be a blank canvas. A very short example is that when doing (masterpiece:1. Does anyone know how I can get my exact Auto1111 results in Comfy? I noticed that reducing the cfg scale in ComfyUI gets me more pleasing results, but it's still not at all the same as in Automatic1111. Data. OP • 10 mo. But controlling the seed can help you generate reproducible images, experiment with other parameters, or prompt variations. The x% variation applied to images 2 and 3 I use the Global Seed (Inspire) node from the ComfyUI-Inspire-Pack by Lt. py", line 155, in sdxl_pooled return args["pooled_output"] The text was updated successfully, but these errors were encountered: Follow the ComfyUI manual installation instructions for Windows and Linux. Belittling their efforts will get you banned. PortraitMaster: Failed to convert an input value to a FLOAT value: iris_details, -, could not convert string to float: '-'. Mar 5, 2024 · Hello and thanks for making this into a ComfyUI node! I just tried to install using the ComfyUI Manager Menu -> Install Custom Nodes. You don't need to come up with the number yourself because it is randomly generated when not specified. 3 participants. I fixed it and added it to the main repo as advanced->model->RescaleCFG. Doesn't appear in the browser even though it's listed on startup: Import times for custom nodes: 0. Oct 29, 2023 · You signed in with another tab or window. If that works out, you can start re-enabling your custom nodes until you find the bad one or hopefully find out the problem resolved itself. EDIT: I just update all of comfy but it didn't make any difference. , the image will not change); if dummy_seed has changed, it will generate a new image. Not a stupid question because it is very confusing if you've come from Auto1111 to understand how seeds work in ComfyUI. 
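The `dummy_seed` caching described above (reuse the cached output when prompt, resolution, and dummy_seed are unchanged) amounts to a cache keyed on those three inputs. A minimal sketch, with `expensive_generate` standing in for the real API call (DALL-E 3 itself accepts no seed, so `dummy_seed` only controls this cache):

```python
_cache: dict = {}
calls = []  # track how often the "API" is actually hit

def expensive_generate(prompt, width, height):
    calls.append(prompt)
    return f"image<{prompt}:{width}x{height}:{len(calls)}>"

def generate_cached(prompt: str, width: int, height: int, dummy_seed: int):
    """Reuse the cached image when prompt/resolution/dummy_seed are unchanged."""
    key = (prompt, width, height, dummy_seed)
    if key not in _cache:
        _cache[key] = expensive_generate(prompt, width, height)
    return _cache[key]

a = generate_cached("a cat", 1024, 1024, dummy_seed=1)
b = generate_cached("a cat", 1024, 1024, dummy_seed=1)  # cache hit, same image
c = generate_cached("a cat", 1024, 1024, dummy_seed=2)  # changed seed, new image
```

Bumping `dummy_seed` is therefore just a cache-buster: it forces a new API call even though the remote model never sees the number.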
You can also use to highlight nodes. Dec 30, 2023 · You signed in with another tab or window. 👍 1. [1] ComfyUI looks Jun 25, 2023 · You signed in with another tab or window. This dummy_seed value is a parameter to control caching. Simply download, extract with 7-Zip and run. Dec 19, 2023 · Step 4: Start ComfyUI. But the new nodes are not available. After making a selection, on any of the selected nodes and select and choose a name that won't conflict with any other existing node. Select multiple nodes by using on the desired nodes to nest. Nov 8, 2023 · I rechecked the paper and it's supposed to be done on the vpred model output not the predicted noise which is why it wasn't working properly. Step 6 - Switch back to [random seed] and go back to Step 1. safetensors' not in [] Value not in list: clip_name2: 'clip_l_sdxl_base. Here’s a concise guide on how to interact with and manage nodes for an optimized user experience. Basically, the file should contain <upcounting numer:0001>_<seed>_<steps>_<sampler>_<cfg> and all images should be saved in a folder with the used checkpoint as name. 3) (quality:1. cguillou. A lot of people are just discovering this technology, and want to show off what they created. Nov 17, 2023 · If you have trouble getting Reactor node to work with ComfyUI, you are not alone. Just to be clear, though, no PNGs work at all on the problematic installation. GameMaker Studio is designed to make developing games fun and easy. Repeating the same Seed with the same Ksampler will produce the same 3 images. 10. You can set the noise seed manually by right clicking on the sampler, and under bypass, choose convert seed to input. Many users have reported issues such as import failed, missing modules, or incompatible versions. If you are facing difficulties after the update, this one is for you. Join the largest ComfyUI community. You Jul 17, 2023 · Hi I am new to ComfyUI. 
0 seconds: /Users/zebra/D Nov 2, 2023 · Cannot import C:_ComfyUi\ComfyUI\custom_nodes\efficiency-nodes-comfyui module for custom nodes: cannot import name 'CompVisVDenoiser' from 'comfy. The idea of uploading images isn't anything more than the idea of opening files Feb 19, 2024 · Have a fresh install of comfyUI, and installed all the custom nodes. And above all, BE NICE. Dr. Once you're set up, 95% of the work does not need zoom out pan zoom in adjust zoom out pan zoom in adjust zoom out queue zoom in wait. to the corresponding Comfy folders, as discussed in ComfyUI manual installation. Mar 27, 2023 · I think this is why there is now optional input for the seed input, so you can use a single INT and input it into any seed input, effectively having a global seed. py --force-fp16. Comments (1) ltdrdata commented on June 6, 2024 #325. 3. Generating noise on the GPU vs CPU does not affect performance in any way. When I try to reproduce an image, I get a different image. Solve the broken IPAdapter (missing) nodes after the IPAdapterV2 upgrade. If you double click and start typing 'seed', you'll find a couple seed generation nodes to use. A Deep Dive into ComfyUI Nodes. Nodes work by linking together simple operations to complete a larger complex task. The Seed of the Ksampler is assigned to all the images generated in a batch size. com) btw, this should work with a1111 images as well. The most important thing about seed is that Yea, I think there is a ton of potential there. This extension provides various nodes to support Lora Block Weight and the Impact Pack. You switched accounts on another tab or window. As of right now, it seems there is only support for doing that with images. Nodes that have failed to load will show as red on the graph. The image that I'm using was previously…. Github View Nodes. js is loading correctly. This reference Seed is the Seed from the first image. 001, more noticeable as you increase it to 0. 0 replies. 
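The "batch index = picture number minus one" rule mentioned above is easiest to see in code. A sketch of Latent From Batch-style selection, using plain strings as stand-ins for latent tensors:

```python
def latent_from_batch(batch, batch_index: int, length: int = 1):
    """Pick `length` samples starting at `batch_index` from a batched latent.

    batch_index is zero-based, so the second picture of a batch of four is
    batch_index=1 (picture number minus one).
    """
    return batch[batch_index:batch_index + length]

# A batch of four "latents":
batch = ["latent0", "latent1", "latent2", "latent3"]
picked = latent_from_batch(batch, batch_index=1)  # the second image
```

With `length=1` you carry a single chosen image through the rest of the workflow while the seed stays fixed, which is exactly the cherry-picking workflow described in the steps above.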
They have since hired Comfyanonymous to help them work on internal tools. In case you are still looking for a solution (like i did): I just published my first custom node for comfy that is loading the generation metadata from the image: tkoenig89/ComfyUI_Load_Image_With_Metadata (github. I don't understand myself, tbh. The batch index should match the picture index minus 1. 11. Apr 25, 2024 · You signed in with another tab or window. Also, after reaching the last model, it does not return to the first. 4) girl. That way each image would have it's own seed number 🏼🤖🎨. I really liked this one for A1111, so I wrote a node for ComfyUI. • 10 mo. The GREAT length some people take to "clean" their noodle flows is excessive. 01 and by the time you are at 1. Failed to convert an input value to a FLOAT value: circular_iris, -, could not convert string to float Extension: ComfyUI-Chibi-Nodes Nodes:Loader, Prompts, ImageTool, Wildcards, LoadEmbedding, ConditionText, SaveImages, Authored by chibiace Share, discover, & run thousands of ComfyUI workflows. Install the ComfyUI dependencies. I'm trying to use face detailer and it asks me to connect something to 'force inpaint' and it doesn't render. If you increase this above 1, you'll get more images from your batch up to the max # in your original batch. I'm on Windows 11 ComfyUI: 202136f7fa Manager: V2. Aug 2, 2023 · i thought there is something like -1 like A1111's api, btw thanks for the help !! :) You can feed it any seed you want on this line, including a random seed. Seed does not work consistently - if I generate a single image with a fixed seed I get a different first image to when I use the same seed but for a batch size of 4. 
It won't change or set the seed number in nodes with to any of the other If can also be a fixed seed in the sampler, if you have a fixed seed and don't modify the prompt or any other parameter, after 1 queue comfyui won't generate the image, because it would be exactly the same as the one already rendered before. Reply. I got comfyui running on the local network and can access the ui without any problems on my android phone. bat and ComfyUI will automatically open in your web browser. Raising CFG means that the UNET will incorporate more of your prompt conditioning into the denoising process. But if you choose increment/decrement, which would be the delta (rate of change), you can't change how much it increments/decrements. Step 3 - [Fixed seed] Turn on upscaling and additional groups. Generate from Comfy and paste the result in Photoshop for manual adjustments, OR. Target sizes: [1024]. It works as you would expect minor changes with a default strength of 0. " Command window: Traceback (most recent call last): File "C:\Stable_Diffusion\ComfyUI_windows_portable Mar 24, 2023 · I could fork and distribute a new UI but I'd like to keep working on ComfyUI. Nov 10, 2023 · In the recent "Inspire Pack", various nodes related to the variation seed have been added. Oct 21, 2023 · Been messing with the last update and cant seem to use the HighRes-Fix script anymore as it keeps asking for a pixel_upscaler and a control_net_name. The length should be 1 in this case. ComfyUI’s graph-based design is hinged on nodes, making them an integral aspect of its interface. Jan 15, 2024 · Disable all custom nodes and test it. This node creates a Seed, and can be wired to the KSampler to reuse the same seed. Click the Load button and select the . py; Note: Remember to add your models, VAE, LoRAs etc. The underlying number is still -1, it's just covered by the seed that they used. 
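The "fixed seed means ComfyUI won't re-generate an identical image" behavior can be pictured as a fingerprint check on a node's inputs. This is a simplified model of that idea, not ComfyUI's actual per-node cache:

```python
import hashlib
import json

_last_fingerprint = None

def should_execute(node_inputs: dict) -> bool:
    """Return True only when the inputs differ from the previous run.

    With a fixed seed and an unchanged prompt the fingerprint matches,
    so the sampler can be skipped: the output would be identical anyway.
    """
    global _last_fingerprint
    fp = hashlib.sha256(
        json.dumps(node_inputs, sort_keys=True).encode()
    ).hexdigest()
    changed = fp != _last_fingerprint
    _last_fingerprint = fp
    return changed

first = should_execute({"seed": 7, "prompt": "a cat"})   # True: nothing cached yet
second = should_execute({"seed": 7, "prompt": "a cat"})  # False: identical inputs
third = should_execute({"seed": 8, "prompt": "a cat"})   # True: the seed changed
```

This is why changing nothing and hitting Queue Prompt again appears to do nothing, and why bumping the seed (or any other input) forces a re-run.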
Instead of using techniques like virtual DOM diffing, Svelte writes code that surgically updates the DOM when the state of your app changes. This is the prompt ComfyUI is giving me: Aug 19, 2023 · Suggestion: use a fixed seed in your KSampler to compare models reusing the same seed number. To give you an idea of how powerful it is: StabilityAI, the creators of Stable Diffusion, use ComfyUI to test Stable Diffusion internally. 🙏. This is how things would be done in a node network, and should be. The selected nodes will be replaced with a new node Feb 10, 2024 · I've found another thread on another git page where someone had a similar issue but I'm not sure if that's relevant and if so how to translate that solution to this, ModelSurge/sd-webui-comfyui#184, here is the link if that could possibly be helpful. Mar 21, 2023 · But another thing - how does it work with batches?! Like wth? I have fixed seed, bach size 4. json workflow file you downloaded in the previous step. Step 4 - [Fixed seed] use 'latent from batch' to pick the image from Step 1 you liked and want to process. For instance if you did a batch of 4 and really just want to work on the second image, the batch index would be 1. An optional hacky solution is to allow the node to update after it's been ran, it runs, sets a new RETURN_TYPE or uses a method, and outputs that new node type, and UI updates output label accordingly. x model which won't work. Now, when clicking randomize it sets the seed to -1 for a split second, b Sep 13, 2023 · The method mentioned above is a solution within the current default ComfyUI implementation. Mar 1, 2024 · A user over on reddit pointed out the file openaimodel. x embedding on an SD2. Category. I like image 3, and want to work/tweak it, so reduce batch to 1, and good luck to me? incrementing seed 2 times or whatever, I can't get images apart from first. ComfyUI breaks down a workflow into rearrangeable elements so you can easily make your own. 
Hello r/comfyui , I just published a video on how to fix the missing or broken IPAdapter node after the IPAdapter V2 update. from comfyui-impact-pack. Provides many easily applicable regional features and applications for Variation Seed. Seed in Stable Diffusion is a number used to initialize the generation. Authored by ltdrdata. Draw in Photoshop then paste the result in one of the benches of the workflow, OR. Nov 26, 2023 · When launching comfyui portable on windows 11 with everything up to date: ComfyUI web interface: "When loading the graph, the following node types were not found: VHS_VideoCombine. No branches or pull requests. Unlike other Stable Diffusion tools that have basic text fields where you enter values and information for generating an image, a node-based interface is different in the sense that you’d have to create nodes to build a workflow to generate images. Note that --force-fp16 will only work if you installed the latest pytorch nightly. Combine both methods: gen, draw, gen, draw, gen! Always check the inputs, disable the KSamplers you don’t intend to use, make sure to have the same resolution in Photoshop than in May 30, 2023 · You signed in with another tab or window. If you have another Stable Diffusion UI you might be able to reuse the dependencies. (This is a REMOTE controller!!!) When set to control_before_generate, it changes the seed before starting the workflow from the queue prompt. safetensors Prompt: a woman with blonde hair and a white shirt is looking at the viewer, masterpiece Dec 19, 2023 · ComfyUI was created in January 2023 by Comfyanonymous, who created the tool to learn how Stable Diffusion works. Problem with SD3 triple Clip loader (comfy is up to date) So I load the basic workflow from their huggingface example but I get the following error: Prompt outputs failed validation. Direct link to download. 
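Seeding also matters for wildcard nodes: replaying the same seed should expand `__wildcard__` tokens the same way, which is what makes a wildcard prompt reproducible. A hedged sketch (the `__name__` syntax and option lists here are illustrative, not the exact file format of any particular wildcard node):

```python
import random
import re

def expand_wildcards(prompt: str, options: dict, seed: int) -> str:
    """Replace __name__ wildcards with a choice driven by the seed.

    Using random.Random(seed) (an instance-local RNG, not the global one)
    makes the expansion reproducible: the same seed picks the same words.
    """
    rng = random.Random(seed)

    def pick(match):
        return rng.choice(options[match.group(1)])

    return re.sub(r"__(\w+)__", pick, prompt)

options = {"color": ["red", "blue", "green"], "animal": ["cat", "fox"]}
a = expand_wildcards("a __color__ __animal__", options, seed=123)
b = expand_wildcards("a __color__ __animal__", options, seed=123)  # identical
```

Copying down the seed of an image you like therefore recovers not just the noise but also the wildcard picks, as long as the wildcard files haven't changed.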
If your ComfyUI is accessed through an nginx proxy with a prefix URL, this issue may occur because the openpose editor uses absolute paths to access the JS files. It seems like there are two competing things at play here: a seed defines a value for a deterministic "random" process, which means it can be repeated in the future. This way the values will randomize/increment etc. I'm guessing that's why the LoadLatent node is still in testing. Mar 15, 2023 · comfyanonymous commented on Mar 15, 2023. RuntimeError: The expanded size of the tensor (1024) must match the existing size (768) at non-singleton dimension 0. Look at the preview to determine if it's a good seed; if it's a good seed, run latent upscaling (or something else that is more time-consuming); if it's not a good seed, stop the job (i.e. start the next job in the queue to evaluate another random seed). Hope that helps. My workflow: suuuuup, :D so, with Set Latent Noise Mask it is trying to turn that blue/white sky into a space ship; this may not be enough for it, and a higher denoise value is more likely to work in this instance. Also, if you want to creatively inpaint, then inpainting models are not as good, as they want to use what exists to make an image more Jul 20, 2023 · Random seed for each run; Run txt2img; Pause here, wait for human input.