ShieldSquare Feed-Based Protection provides a mechanism to take action on bots at the IP level without making a synchronous API call to the ShieldSquare service. Feed-Based Protection comes in very handy when the complete web page content is cached at the CDN level and requests don’t reach the server layer before being rendered. Feed-Based Protection can be consumed at the application level, server level, WAF level, or CDN level, depending on your needs.
The documentation below explains ShieldSquare's bot-blocking procedure using Feed-Based Protection, covering prerequisites, service description, response formats, and the implementation process.
- Blacklisted signatures: These are the signatures that have bot characteristics and have been detected by ShieldSquare.
- Deleted signatures: Every blacklisted signature has an expiration time. Throughout this document, a deleted signature refers to a signature whose blacklist entry has expired.
- Cron Job: The software utility cron is a time-based job scheduler in Unix-like operating systems such as UNIX, Linux, FreeBSD, and Darwin (Mac OS X). People who set up and maintain software environments use cron to schedule jobs to run periodically at fixed times, dates, or intervals.
- IP Feed: Used interchangeably with ‘Feed’ and ‘ShieldSquare IP Feed’ throughout this document.
- Sign up for the ShieldSquare service at this link.
- Integrate the ShieldSquare service (in Monitor mode) using any of the cloud connectors, the REST API, or the server plugins at this link (this step enables bot detection on your site).
- In the ShieldSquare Dashboard at this link, switch the toggle button on the right side of the screen from Monitor to Active and choose the appropriate bot responses based on your business rules. No further code changes are needed to activate Feed-Based Protection.
- Reach out to firstname.lastname@example.org to enable the Feed-Based Protection feature for your Subscriber ID.
Feed-Based Protection Summary
The endpoint for the Feed-Based Protection is as follows:
Services provided by IP Feed:
- getipfeed: Returns a list of JSON objects containing blacklisted and deleted signatures related to the Subscriber ID along with additional information.
- getfeedcount: Returns the count of feed entries for the Subscriber ID.
- getfeedbackup: Returns the last feed sent by the getipfeed service. This can be invoked if the last feed was lost.
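As a sketch of how the three services might be invoked, the snippet below builds request URLs and fetches the JSON packet over HTTP. The base URL and the query-parameter names (`service`, `sid`) are illustrative placeholders, not the actual ShieldSquare endpoint or parameters; substitute the values from your integration details.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical endpoint -- replace with the actual Feed-Based
# Protection endpoint provided by ShieldSquare.
FEED_BASE_URL = "https://feed.example.com/ipfeed"

FEED_SERVICES = ("getipfeed", "getfeedcount", "getfeedbackup")


def feed_url(service: str, subscriber_id: str) -> str:
    """Build the request URL for one of the three feed services."""
    if service not in FEED_SERVICES:
        raise ValueError(f"unknown feed service: {service}")
    # Parameter names 'service' and 'sid' are assumptions for illustration.
    query = urllib.parse.urlencode({"service": service, "sid": subscriber_id})
    return f"{FEED_BASE_URL}?{query}"


def fetch_feed(service: str, subscriber_id: str) -> dict:
    """Call the feed service and decode the returned JSON packet."""
    with urllib.request.urlopen(feed_url(service, subscriber_id)) as resp:
        return json.load(resp)


# Example: URL for pulling the full IP feed for a subscriber.
url = feed_url("getipfeed", "SUB-12345")
```

A cron job could call `fetch_feed("getipfeed", ...)` on a fixed interval and fall back to `getfeedbackup` if the previous pull was lost.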
Feed-Based Protection Details
This feed returns a JSON packet containing the signatures that were blacklisted or deleted, along with related information.
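As a sketch of consuming such a packet, the snippet below parses a hand-written sample and separates blacklisted entries from expired ones. The key names (`Operation`, `IP`, `TTL`, and so on) mirror the node descriptions in this document, but the exact JSON spelling and top-level layout are assumptions and should be checked against a real feed response.

```python
import json

# A hand-written sample packet; the "signatures" wrapper and the
# field names are assumptions based on the node table in this document.
sample = json.loads("""
{
  "signatures": [
    {"Operation": "ADD", "IP": "203.0.113.7", "TTL": 3600,
     "Bot-Type": "DATACENTER_BOT", "Updated Time": "01/01/2024-10:15:30"},
    {"Operation": "DEL", "IP": "198.51.100.9",
     "Bot-Type": "BAD_UA_BOT", "Updated Time": "01/01/2024-09:00:00"}
  ]
}
""")


def split_feed(packet: dict):
    """Separate blacklisted (ADD) entries from expired (DEL) entries."""
    added = [s for s in packet["signatures"] if s["Operation"] == "ADD"]
    deleted = [s for s in packet["signatures"] if s["Operation"] == "DEL"]
    return added, deleted


added, deleted = split_feed(sample)
# Each ADD entry's IP should be blocked for TTL seconds;
# each DEL entry can be removed from the local blocklist.
```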
Parameters to be passed to the feed: Subscriber ID. Refer to the template below:
The feed will return a JSON packet with the list of blacklisted or deleted signatures (parent node), each signature carrying its description (child node). On a successful call, the JSON response will have the following nodes:
| Field | Description |
| --- | --- |
| Description | A short description of whether the signature was blacklisted or deleted by ShieldSquare. This field is for informational purposes only. |
| Operation | ADD if the signature was blacklisted, DEL if it was deleted. |
| IP | The IP address associated with the blacklisted or deleted signature. |
| Updated Time | The time at which the signature was blacklisted, or at which its blacklist entry expired. The time format is DD/MM/YYYY-HH:MM:SS. |
| Rule | Present for backward-compatibility reasons and can be ignored. |
| TTL | Acronym for 'Time To Live': the time in seconds for which the corresponding signature should be blocked. Present only for blacklisted signatures. |
| Bot-Type | One of the bot categories listed below: |
- DATACENTER_BOT – These are bots with malicious intent operating from Data centers.
- BAD_UA_BOT – Bots that do not have a legitimate user agent are classified as Bad UA bots.
- INTEGRITY_FAILED_BOT – These are bots which fail some of the integrity checks we perform, such as Browser integrity check, HTTP header check, Referral check, etc.
- MONITORING_BOT - These are the bots which monitor the system health of their customers' websites. (e.g. Pingdom).
- AGGREGATOR_BOT – Bots which collate information from other websites (e.g. WikioFeedBot). Market intelligence purpose crawlers are also included in this category.
- SOCIAL_NETWORK_BOT - These are bots which are run by social network sites. (e.g. twitterbot).
- BACKLINK_CHECKER_BOT - These are bots which check the back-links of URLs (e.g. UASlinkChecker).
- PARTNER_BOT - These are the hits from partner sites which are useful to the website. (e.g. Paypal IPN, Ad Networks, etc.)
|Preferred action|The action to be taken for the bot. The possible values are based on the preferences set on the Bot Response List page in the ShieldSquare Dashboard.|
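Putting the Operation and TTL fields together, a feed consumer can maintain a local IP blocklist that honours each signature's expiry. The sketch below is illustrative only: the entry field names mirror the table above, and the clock is injected so the expiry logic can be exercised without waiting.

```python
import time


class FeedBlocklist:
    """Local IP blocklist driven by ADD/DEL feed entries and TTLs."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self._expiry = {}  # IP -> absolute expiry time in seconds

    def apply(self, entries):
        """Apply a batch of feed entries (dicts with Operation/IP/TTL)."""
        now = self._clock()
        for entry in entries:
            ip = entry["IP"]
            if entry["Operation"] == "ADD":
                # Block the IP for TTL seconds from now.
                self._expiry[ip] = now + entry["TTL"]
            elif entry["Operation"] == "DEL":
                # The blacklist entry has expired upstream; drop it.
                self._expiry.pop(ip, None)

    def is_blocked(self, ip: str) -> bool:
        """True while the IP's TTL has not yet elapsed."""
        expiry = self._expiry.get(ip)
        if expiry is None:
            return False
        if self._clock() >= expiry:
            del self._expiry[ip]  # lazily purge expired entries
            return False
        return True
```

At request time, a server, WAF, or CDN layer would consult `is_blocked(client_ip)` and serve the preferred bot response for blocked addresses.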