Take the RobotsTxt User-Agent from the Request #294

Merged
merged 10 commits into from
Apr 29, 2024

Conversation

@adonig (Contributor) commented Apr 11, 2024

This pull request updates the RobotsTxt middleware to dynamically use the User-Agent from each request instead of relying on a hardcoded value. It supersedes an earlier attempt, ensuring that the changes merge cleanly without the previous issues.
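To illustrate the idea, here is a minimal sketch of what a request-aware RobotsTxt middleware could look like. The module name and the `Gollum.crawlable?/2` call match the test below, but the header lookup, the fallback user agent, and the exact `run/3` shape are assumptions, not the merged implementation:

```elixir
defmodule RobotsTxtSketch do
  # Sketch only: a Crawly-style middleware that takes the User-Agent
  # from the request's headers instead of a hardcoded value.
  require Logger

  @default_user_agent "Crawly"

  def run(request, state, _opts \\ []) do
    # Find the User-Agent header set by an earlier middleware
    # (e.g. Crawly.Middlewares.UserAgent); fall back to a default.
    user_agent =
      Enum.find_value(request.headers, @default_user_agent, fn
        {header, value} ->
          if String.downcase(to_string(header)) == "user-agent", do: value
      end)

    case Gollum.crawlable?(user_agent, request.url) do
      :uncrawlable ->
        Logger.debug("Dropping request #{request.url}: disallowed by robots.txt")
        {false, state}

      _ ->
        {request, state}
    end
  end
end
```

The key point is that the agent string is resolved per request at pipeline time, so whatever `Crawly.Middlewares.UserAgent` put on the request is what gets checked against robots.txt.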

@adonig (Contributor, Author) commented Apr 15, 2024

Maybe there's a way to squash all those commits into one 😅

@oltarasenko (Collaborator) left a comment


Could I ask you to add a test to the code, so I can merge it?

@adonig (Contributor, Author) commented Apr 25, 2024

Do you believe this test is sufficient?

```elixir
test "Respects the User-Agent header when evaluating robots.txt" do
  # Mock Gollum so that only "My Custom Bot" is allowed to crawl.
  :meck.expect(Gollum, :crawlable?, fn
    "My Custom Bot", _url -> :crawlable
    _ua, _url -> :uncrawlable
  end)

  # With the UserAgent middleware setting the header, the request passes.
  middlewares = [
    {Crawly.Middlewares.UserAgent, user_agents: ["My Custom Bot"]},
    Crawly.Middlewares.RobotsTxt
  ]

  req = @valid
  state = %{spider_name: :test_spider, crawl_id: "123"}

  assert {%Crawly.Request{}, _state} =
           Crawly.Utils.pipe(middlewares, req, state)

  # Without the UserAgent middleware, the mocked robots.txt drops it.
  middlewares = [Crawly.Middlewares.RobotsTxt]

  assert {false, _state} = Crawly.Utils.pipe(middlewares, req, state)
end
```

@adonig adonig requested a review from oltarasenko April 29, 2024 07:06
@oltarasenko oltarasenko merged commit 711dba0 into elixir-crawly:master Apr 29, 2024
1 check failed
@adonig adonig deleted the robotstxt-user-agent branch April 29, 2024 19:45