In the focuser case, addPollPeriodControl() appears to be called by the base Focuser::initProperties() call. You can set a driver-specific default polling rate of 1000 ms with setDefaultPollingPeriod(1000); and it is also saved with the options, so I don't see the need to take away the control in case someone wants to use it, for example to make the period longer.
It is not making the period longer that is the issue, it is being able to set it to less than 1000 ms. The focuser in question does not like to be polled any quicker than 1000 ms, so, as I originally stated, I either want to override the minimum value of the control or hide it completely.
I have used the setDefaultPollingPeriod as you suggested, but the Polling period control is still there and overrides it.
So I have hard-coded POLLMS = 1000 in the TimerHit() override. My issue with this is that the GUI is now lying to the user: regardless of what they set, it gets changed to 1000 ms, which to me is just wrong.
Ah, it would then be useful to add something like setMinimumPollingPeriod() and perhaps setMaximumPollingPeriod() to DefaultDevice to set the range for that control; currently the limits are somewhat generous (10-600000 ms), and most serial devices wouldn't like 10 ms either.
I totally agree, min and max overrides would be the way to go.
How do we go about trying to get this implemented? Should it go in the general wish list, though I believe that is for end user features not developer options? Or will a core developer happen along at some point and spot this issue?
I have a very hard attitude to untested code: it doesn't work, it is broken.
It remains so until it has been tested and demonstrated to work.
Publishing untested, aka broken, code isn't a good idea. It delegates ensuring that it works to others, and in a way that makes it much more labour-intensive to fix.
I'm also uneasy about a 'fix' to help with one device being applied to the default driver, because it is then applied to every driver.
Couldn't the existing limits, inherited from the default driver, be modified in the hardware-specific driver? And if an additional control really needed to be made available to the user, couldn't that have been done in the driver too?
I would have done what the OP originally did, set the polling interval in the driver and ignore the property in the default driver. Maybe see if there was a way to remove it from the default UI.
Normally I would agree, though in this case the risk is very low: the code doesn't get called unless the device actually opts in to use it, and it's literally three trivial lines of code that set new minimum and maximum values over the defaults. I would have tested it in my own drivers, but I had other things to do during the weekend, so I thought it would be better to get it out early for others to use. As for device-specific hacks, I much prefer the base classes to be flexible enough not to need to be worked around, as workarounds just lead to code rot and an incoherent user experience, especially when the fix is as trivial as this one.
Just tested, and it does work: I set the range for my ScopeDome driver to 1000-3000 ms and it showed up correctly. The device doesn't like intervals that are too long either, as it has a watchdog mechanism that resets the serial connection if there hasn't been traffic for a while.