Legacy perimeter security mechanisms are easily evaded. It’s disappointing, but true. Well-crafted malware and advanced persistent threats (APTs) can slip past even the strongest signature-based security solutions currently deployed across industries. This has pushed IT companies to think beyond prevention and design effective detection strategies. Companies have recently begun analyzing traffic logs, through deployments of technology as well as professional services, to detect attacks that are under way. But even though traffic log analysis can help identify malware activity, companies may not benefit much from it: the on-premises approach is incomplete, inefficient, and expensive all at once.
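The detection idea above can be sketched in a few lines. This is a minimal, hypothetical example of traffic-log analysis, not any vendor's product: it flags internal hosts that repeatedly contact domains on a threat-intelligence blocklist, which catches activity a file-signature scanner would miss. The log format and the blocklist entries are both assumptions made up for illustration.

```python
from collections import Counter

# Assumed threat-intelligence feed of known command-and-control domains.
BLOCKLIST = {"evil-c2.example", "dropper.example"}

def flag_suspicious(log_lines):
    """Return internal source IPs seen contacting blocklisted domains,
    with a count of how many times each one connected."""
    hits = Counter()
    for line in log_lines:
        # Assumed proxy-log format: "<timestamp> <src_ip> <dest_domain> <bytes>"
        parts = line.split()
        if len(parts) != 4:
            continue  # skip malformed entries
        _, src_ip, domain, _ = parts
        if domain in BLOCKLIST:
            hits[src_ip] += 1
    return dict(hits)

sample = [
    "2024-01-01T00:00:01 10.0.0.5 evil-c2.example 512",
    "2024-01-01T00:00:02 10.0.0.7 cdn.example 2048",
    "2024-01-01T00:01:01 10.0.0.5 evil-c2.example 512",
]
print(flag_suspicious(sample))  # {'10.0.0.5': 2}
```

Repeated, evenly spaced connections like the two above are a classic beaconing pattern; real deployments layer timing analysis and anomaly detection on top of simple blocklist matching.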
Web scraping, data mining, cookie tracking… these are just some of the tactics businesses use to collect information from other websites for financial gain. So if a company uses software to “scrape” information from a third-party website, is it legal? Is it ethical? Google offers web-scraping tools, while the courts have consistently ruled against those who use them. Either way, the practice raises serious security concerns.
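To make the mechanics concrete, here is a minimal sketch of what a scraper actually does: parse a page's HTML and extract structured data (here, link targets) using only Python's standard library. The HTML string is a stand-in for a page a scraper would fetch from a third-party site.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag encountered in the document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Stand-in for HTML fetched from a third-party website.
page = '<html><body><a href="/pricing">Pricing</a><a href="/contact">Contact</a></body></html>'

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/pricing', '/contact']
```

The technology itself is trivial, which is why the questions above are legal and ethical rather than technical: whether scraping is permitted usually turns on a site's terms of service and its robots.txt directives, not on how the parser is written.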