Running iSolve on GPU

sanand wrote (#1):

    I'm trying to run a simulation with iSolve on a GPU (NVIDIA GeForce RTX 3090) on a Linux machine running Ubuntu 20.04.6 LTS. I have a license that allows a simulation to run on a single GPU. I've followed the installation instructions for iSolve in the manual and added the following lines to my .bashrc:

    export AX_USE_UNSUPPORTED_CARDS=1
    export SEMCADX_CUDA_ADDITIONAL_CARDS="GeForce RTX 3090"
    export CUDA_VISIBLE_DEVICES=1
    export LD_LIBRARY_PATH=[path to iSolve]:$LD_LIBRARY_PATH
    export LM_LICENSE_FILE=[path to license file]
    

    However, when I run a simulation using iSolve, it uses CPU by default. Is there a way I can specify that I want the simulations to run on GPU? I am testing a very small simulation based on a tutorial, so I don't think memory is an issue. Thanks!
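
    As a sanity check (generic commands, not from the iSolve manual), the card's visibility to CUDA can be verified with those exports in place:

    # List the GPUs the driver can see
    nvidia-smi -L
    # Show which device index is exposed to CUDA; note that CUDA device
    # indices start at 0, so a machine with a single RTX 3090 normally uses device 0
    echo "CUDA_VISIBLE_DEVICES=$CUDA_VISIBLE_DEVICES"
    nvidia-smi --query-gpu=index,name,driver_version,memory.total --format=csv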

brown (ZMT) wrote (#2):

      In the Solver settings for your simulation, did you change the Kernel? By default it is set to Software, which runs on the CPU. Depending on your license, you can change it to aXware or CUDA to run on the GPU.
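
      Once the Kernel is set to CUDA, an easy way to confirm that the solver is really using the card (a generic check, independent of iSolve) is to watch the GPU while the simulation runs:

      # GPU utilization and memory use should rise, and iSolve should appear
      # in the process list once the CUDA kernel starts
      watch -n 1 nvidia-smi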

sanand wrote (#3):

         Thank you! I hadn't changed the Kernel; I've now set it to CUDA. However, I get an "out of memory" error. Is there really not enough GPU memory (24 GB) to run a simulation of ~576k cells? The simulation is the IEEE-SCC34 tutorial. See the screenshot below.
         [Screenshot attached: Screen Shot 2024-05-01 at 10.45.09 AM.png, showing the out-of-memory error]
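
         For reference, the free GPU memory right before launching the solver can be checked with a generic nvidia-smi query:

         # Total / used / free memory on the card; ~576k cells should need
         # only a small fraction of the 24 GB
         nvidia-smi --query-gpu=memory.total,memory.used,memory.free --format=csv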

brown (ZMT) wrote (#4):

           You definitely have enough memory here; a ~576k-cell model needs only a small fraction of the 24 GB on that card. It's likely that the solver could not use the device for some other reason. Can you try updating your graphics drivers?
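
           On Ubuntu 20.04 the installed driver can be checked and updated along these lines (a generic sketch using Ubuntu's driver tooling, not an official iSolve procedure):

           # The top line of the output shows the installed driver and CUDA version
           nvidia-smi
           # List the driver packages Ubuntu recommends for the RTX 3090
           ubuntu-drivers devices
           # Install the recommended driver, then reboot to load it
           sudo ubuntu-drivers autoinstall
           sudo reboot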
