[QUOTE=c10ck3r;366794]You missed the joke; the joke being that only Curtis Cooper is assigned primes for LL.[/QUOTE]
Ha! I totally missed that! :redface: I did read the text, but I took it as a complaint that the guy is actually hoarding exponents (which he [U]did[/U] in the past, and still does occasionally). I have never liked the guys who only do LL; it looks very selfish to me, kinda like Davieddie but with more computing power, expecting other people to do your dirty work, like TF and P-1, and using the university's electricity/money. You can see I am full of envy too :razz:
Rather than start a new thread for a bug report, I'll just note it here:
GPU72 does a bait-and-switch when selecting LLTF WMS assignment. It shows a 66.5M assignment in the preview, then gives out something else altogether as the actual assignment.
[QUOTE=axn;367364]Rather than start a new thread for a bug report, I'll just note it here:
GPU72 does a bait-and-switch when selecting LLTF WMS assignment. It shows a 66.5M assignment in the preview, then gives out something else altogether as the actual assignment.[/QUOTE] I have found that it gives me what I select, even though the example exponents shown do not match what I requested ...
[QUOTE=axn;367364]Rather than start a new thread for a bug report, I'll just note it here:
GPU72 does a bait-and-switch when selecting LLTF WMS assignment. It shows a 66.5M assignment in the preview, then gives out something else altogether as the actual assignment.[/QUOTE] Too bad it's not April 1st :)
[QUOTE=axn;367364]Rather than start a new thread for a bug report, I'll just note it here:
GPU72 does a bait-and-switch when selecting LLTF WMS assignment. It shows a 66.5M assignment in the preview, then gives out something else altogether as the actual assignment.[/QUOTE] I can verify this. I was shown a preview for 66M 71->74 (it should be to 73 anyway according to the new rules) and was given a 60M 73->74.
[QUOTE=garo;367383]I can verify this. I was shown a preview for 66M 71->74 (it should be to 73M anyway according to the new rules) and was given a 60M 73->74.[/QUOTE]
I can also attest to this particular behavior.
I too am getting 60M 73 to 74. Is this really the best use of our resources at this time? Releasing 60M at 73 seems much less of a bad thing than releasing unfactored, no-P-1 69M.
"Let GPU72 decide" is giving me 66M 71->73
The thing axn reported, I can verify too; it is not an April Fools' joke. You request some exponents, you see them in the sample window, then you "take" them, and you are given a (very) different range/type of exponents. Yes, I know it says that what you see may change if someone else takes work in the meantime, but the change was huge, and moreover, the shown exponents were still available after I took the assignment. The assigned exponents were still OK for me, so I queued them. But if you are really unhappy in such a case, you can use the "[U]unassign all of these[/U]" button.
Normally I don't care; I let misfit and GPU72 decide, so it does not matter too much. Only very seldom do I run "experiments" and need a few exponents manually assigned in different ranges.
[QUOTE=axn;367364]Rather than start a new thread for a bug report, I'll just note it here:
GPU72 does a bait-and-switch when selecting LLTF WMS assignment. It shows a 66.5M assignment in the preview, then gives out something else altogether as the actual assignment.[/QUOTE] Thanks for the bug report. And sorry guys; not intentional. Yet another SPE -- I updated the preview code, but didn't update the actual assignment code. (Yes, it's different code; I really need to refactor that (and sleep more)....)
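For anyone curious, the bug pattern described above (preview logic and assignment logic maintained as two separate copies that drift apart) has a standard fix: route both paths through one shared selection function. Here's a minimal sketch in Python; all names and the selection rule are invented for illustration, this is not the actual GPU72 source.

```python
def select_candidates(pool, min_exponent, limit):
    """Shared selection logic: pick available exponents at or above
    the requested level, lowest first. Both preview and assignment
    call this, so they can no longer disagree."""
    eligible = sorted(e for e in pool if e >= min_exponent)
    return eligible[:limit]

def preview(pool, min_exponent, limit):
    # Preview just shows what the shared selector would pick.
    return select_candidates(pool, min_exponent, limit)

def assign(pool, min_exponent, limit):
    # Assignment uses the *same* selector, then removes the picks
    # from the pool so they cannot be handed out twice.
    picked = select_candidates(pool, min_exponent, limit)
    for e in picked:
        pool.remove(e)
    return picked

# Toy example: request two exponents at 66M or above.
pool = [66_500_033, 66_500_077, 60_000_011]
shown = preview(pool, 66_000_000, 2)
given = assign(pool, 66_000_000, 2)
assert shown == given  # preview and assignment now always match
```

The preview can still legitimately differ from the final assignment if someone else grabs work in between, but with a single selector the only source of divergence is pool state, not duplicated code.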
I have been trying to get exponents from 320m and up, but without success.
Anyone else had the same problem?