Hello guys. I'm working with the SonarQube API and I want to download all issues from our project.
My request looks like `https://servername.com/api/issues/search?componentKeys=<ProjectName>&facetMode=effort&facets=types&types=CODE_SMELL&ps=500&p=20`
It returns the 20th page of issues (the 9500–10000 range).
When I try to request page 21, I get: `Can return only the first 10000 results. 10500th results asked.`
From my end it looks really odd that I can't request more than 10k issues with pagination. There can be different types of smells, and on a big project it definitely makes sense to be able to download more than 10k. Also, I can't request all the issues at once. Why? I think the product should provide this possibility rather than block the end user because of internal performance issues.
For example, in my case I'm happy to have one 10-second curl request instead of multiple short ones.
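For reference, a minimal sketch of the pagination loop described above, assuming the standard `p`/`ps` parameters and the server-side 10,000-result cap; `fetch_page` is a hypothetical stand-in for the actual HTTP GET to `/api/issues/search`, which is left out here:

```python
from typing import Callable, Dict, List

MAX_RESULTS = 10_000  # server-side cap: only the first 10k results are pageable
PAGE_SIZE = 500       # maximum allowed page size (ps)

def fetch_all_issues(fetch_page: Callable[[int, int], Dict]) -> List[Dict]:
    """Collect issues page by page, stopping before the 10k cap is exceeded.

    `fetch_page(p, ps)` stands in for a GET to
    /api/issues/search?...&p=<p>&ps=<ps> and must return the decoded
    JSON body, e.g. {"total": ..., "issues": [...]}.
    """
    issues: List[Dict] = []
    page = 1
    while True:
        body = fetch_page(page, PAGE_SIZE)
        issues.extend(body["issues"])
        # Stop at the last real page, or just before p*ps would pass the cap,
        # since requesting beyond 10k triggers the error quoted above.
        if page * PAGE_SIZE >= min(body["total"], MAX_RESULTS):
            break
        page += 1
    return issues
```

To actually get past the cap, one commonly suggested workaround is to split the query into slices that each return fewer than 10k results (for example, date windows via `createdAfter`/`createdBefore`, or one request per issue type) and merge the pages from each slice.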