Malware

In the past, viruses were created with the sole purpose of wreaking havoc on infected systems. A large fraction of today’s malware, on the other hand, is designed to generate revenue for its creators. Spyware, botnets, and keyloggers steal information from your system or take control of it so that someone else can profit. In other words, the incentive to create malware is stronger than ever.

Keyloggers can reveal your usernames, passwords, PINs, and other authentication information to their creators by recording your keystrokes. This information can then be used to break into various accounts: credit cards, payment services (like PayPal), online banks, and others. Unsurprisingly, keyloggers are among the favourite tools of identity thieves.

Much like the viruses of old, most present-day malware drains the resources, such as memory and hard disk space, of contaminated systems, sometimes forcing them to crash. It can also degrade network performance and, in extreme cases, may even cause a total collapse.

If that’s not daunting enough, imagine an outbreak across your entire organisation. The damage could easily cost thousands of euros to repair, and that’s before counting the value of missed business opportunities.

Entry points for malware include optical discs, flash drives, and, of course, the Internet. That means your doors could be wide open to these attacks at this very moment.

Now, we’re not here to promise total invulnerability, as only an unplugged computer locked up in a vault will ever be totally safe from malware. Instead, this is what we’ll do:

  • Perform an assessment of your computer usage practices and security policies. Software and hardware alone won’t do the trick.
  • Identify weak points as well as poor practices and propose changes wherever necessary. Weak points and poor practices range from the use of perennial passwords and keeping old, unused accounts to poorly configured firewalls.
  • Install malware scanners and firewalls and configure them for maximal protection with minimal effect on network and system performance.
  • Implement regular security patches.
  • Conduct a regular inspection on security policy compliance as well as a review of the policies to see if they are up to date with the latest threats.
  • Keep an audit trail for future use in forensic activities.
  • Establish a risk management system.
  • Apply data encryption where necessary.
  • Implement a backup system to make sure that, in a worst case scenario, archived data is safe.
  • Propose data replication so as to mitigate the after effects of data loss and to ensure your company can proceed with ‘business as usual’.
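As a simple illustration of the backup-verification idea above, here is a minimal sketch (hypothetical paths, not our actual tooling) that records SHA-256 checksums of archived files so a later run can confirm the archive hasn’t been corrupted or tampered with:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def record_checksums(backup_dir: Path) -> dict[str, str]:
    """Map each file in the backup directory to its checksum at archive time."""
    return {str(p): sha256_of(p) for p in backup_dir.rglob("*") if p.is_file()}

def verify_backup(recorded: dict[str, str]) -> list[str]:
    """Return the files whose current checksum no longer matches the record."""
    return [name for name, digest in recorded.items()
            if sha256_of(Path(name)) != digest]
```

Running `record_checksums` at archive time and `verify_backup` before a restore gives an early warning that an archive is no longer trustworthy, which is exactly when you want to know.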

Once we’ve worked with you to make all these happen, you’ll be able to sleep better.


How the Dodd-Frank Act affects Investment Banking

The regulatory reform known as the Dodd-Frank Act has been hailed as the most revolutionary, comprehensive financial policy implemented in the United States since the years of the Great Depression. Created to protect consumers and investors, the Dodd-Frank Act is made up of a set of regulations and restrictions overseen by a number of specific government departments. As a result of this continuous scrutiny, banks and financial institutions are now subject to more-stringent accountability and full-disclosure transparency in all transactions.

The Dodd-Frank Act was also created to keep checks and balances on mega-giant financial firms that were considered too big to crash or default. This was especially deemed crucial after the collapse of the powerhouse financial institution Lehman Brothers in 2008. The intended result is to bring an end to the recent rash of bailouts that have plagued the U.S. financial system.

Additionally, the Dodd-Frank Act was created to protect consumers from unethical, abusive practices in the financial services industry. In recent years, reports of many of these abuses have centered on unethical lending practices and astronomically high interest rates from mortgage lenders and banks.

Originally created by Representative Barney Frank, Senator Chris Dodd and Senator Dick Durbin, the Dodd-Frank Wall Street Reform and Consumer Protection Act, as it is officially called, originated as a response to the problems and financial abuses that had been exposed during the nation’s economic recession, which began to worsen in 2008. The bill was signed into law and enacted by President Obama on July 21, 2010.

Although it may seem complicated, the Dodd-Frank Act can be more easily comprehended if broken down to its most essential points, especially the points that most affect investment banking. Here are some of the component acts within the Dodd-Frank Act that directly involve regulation for investment banks and lending institutions:

• Financial Stability Oversight Council (FSOC): The FSOC is a committee of nine member departments, including the Securities and Exchange Commission, the Federal Reserve and the Consumer Financial Protection Bureau. With the Treasury Secretary as chairman, the FSOC determines whether or not a bank is getting too big. If it is, the Federal Reserve can require the bank to increase its reserve requirement, the funds held in reserve rather than used for business or lending costs. The FSOC also has contingencies for banks in case they become insolvent.

• The Volcker Rule: The Volcker Rule bans banks from investing, owning or trading any funds for their own profit. This includes sponsoring hedge funds, maintaining private equity funds, and any other similar trading or investing. As an exception, banks are still allowed to trade under certain conditions, such as currency trading to offset their own foreign currency holdings. The primary purpose of the Volcker Rule is to prohibit banks from trading for their own financial gain rather than for the benefit of their clients. It also prohibits banks from putting their own capital into high-risk investments, particularly since the government guarantees their deposits. Banks have been given a two-year grace period to restructure their funding so as to comply with this rule.

• Commodity Futures Trading Commission (CFTC): The CFTC regulates derivative trades and requires them to be made in public. Derivative trades, such as credit default swaps, are regularly transacted among financial institutions, but the new regulation ensures that all such trades must now be done under full disclosure.

• Consumer Financial Protection Bureau (CFPB): The CFPB was created to protect customers and consumers from unscrupulous, unethical business practices by banks and other financial institutions. One way the CFPB works is by providing a toll-free hotline for consumers with questions about mortgage loans and other credit and lending issues. The 24-hour hotline also allows consumers to report any problems they have with specific financial services and institutions.

• Whistle-Blowing Provision: As part of its plan to eradicate corrupt insider trading practices, the Dodd-Frank Act has a proviso allowing anyone with information about these types of violations to come forward. Consumers can report these irregularities directly to the government, and may be eligible to receive a financial reward for doing so.

Critics of the Dodd-Frank Act feel that these regulations are too harsh, and speculate that the enactment of these restrictions will only serve to send more business to European investment banks. Nevertheless, there is general agreement that the Dodd-Frank Act became necessary because of the unscrupulous behaviour of the financial institutions themselves. Although these irregular and ultimately unethical practices resulted in the downfall of some institutions, others survived or were bailed out at the government’s expense.

Because of these factors, there was more than the usual bi-partisan support for the Dodd-Frank Act. As a means of checks and balances, the hope is that the new regulations will make the world of investment banking a safer place for the consumer.

Contact Us

  • (+353)(0)1-443-3807 – IRL
  • (+44)(0)20-7193-9751 – UK
Without Desktop Virtualisation, you can’t attain True Business Continuity

Even if you’ve invested in virtualisation, off-site backup, redundancy, data replication, and other related technologies, I’m willing to bet your BC/DR program still lacks an important ingredient. I bet you’ve forgotten about your end users and their desktops.

Picture this. A major disaster strikes your city and brings your entire main site down. No problem. You’ve got all your data backed up on another site. You just need to connect to it and, voilà, you’ll be back up and running in no time.

Really?

Do you have PCs ready for your employees to use? Do those machines already have the necessary applications for working on your data? If you still have to install them, then that’s going to take a lot of precious time. When your users get a hold of those machines, will they be facing exactly the same interface that they’ve been used to?

If not, more time will be wasted as they try to familiarise themselves. By the time you’re able to declare ‘business as usual’, you’ll have lost customer confidence (or even customers themselves), missed business opportunities, and dropped potential earnings.

That’s not going to happen with desktop virtualisation.

The beauty of virtualisation

Virtualisation in general is a vital component in modern Business Continuity/Disaster Recovery strategies. For instance, by creating multiple copies of virtualised disks and implementing disk redundancy, your operations can continue even if a disk breaks down. Better yet, if you put copies on separate physical servers, then you can likewise continue even if a physical server breaks down.

You can take an even greater step by placing copies of those disks on an entirely separate geographical location so that if a disaster brings your entire main site down, you can still gain access to your data from the other site.

Because you’re essentially just dealing with files and not physical hardware, virtualisation makes the implementation of redundancy less costly, less tedious, greener, and more effective.
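The redundancy idea above can be sketched in a few lines. This is an illustrative example only (the paths and the `.img` disk-image naming convention are made up, and a real deployment would replicate to a remote site, not a local folder): it copies virtual-disk files to a second location and confirms each replica is byte-for-byte identical to its source.

```python
import filecmp
import shutil
from pathlib import Path

def replicate_disks(primary: Path, replica: Path) -> list[Path]:
    """Copy every virtual-disk image from the primary site to the replica,
    verifying each copy against its source."""
    replica.mkdir(parents=True, exist_ok=True)
    copied = []
    for disk in primary.glob("*.img"):      # hypothetical disk-image extension
        target = replica / disk.name
        shutil.copy2(disk, target)          # copy2 preserves file metadata
        if not filecmp.cmp(disk, target, shallow=False):
            raise IOError(f"replica of {disk.name} does not match the source")
        copied.append(target)
    return copied
```

Because the disks are just files, the same few lines work whether the replica directory is a second array, a second server, or a mount point at another site.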

But virtualisation, when used for BC/DR, is mostly focused on the server side. As we’ve pointed out earlier in the article, server side BC/DR efforts are not enough. A significant share of business operations are also dependent on the client side.

Desktop virtualisation (DV) is very similar to server virtualisation, and it comes with nearly the same benefits. That means a virtualised desktop can be copied just like an ordinary file, and if you have a copy of a desktop, you can easily fall back on it if the active copy is destroyed.

In fact, if the PC on which the desktop is running becomes incapacitated, you can simply move to another machine, stream or install a copy of the virtualised desktop there, and get back into the action right away. If all your PCs are incapacitated after a disaster, rapid provisioning of your desktops will keep customers and stakeholders from waiting.

In addition to that, DV will enable your user interface to look like the one you had on your previous PC. This particular feature is actually very important to end users. You see, users normally have their own way of organising things on their desktops. The moment you put them in front of a desktop not their own, even if it has the same OS and the same set of applications, they’ll feel disoriented and won’t be able to perform optimally.

8 Reasons why you Need to Undertake Technical and Application Assessments

Are your information assets enabling you to operate more cost-effectively, or are they exposing you to more risks than you realise? To find out, you need a clearer picture of those assets: one that shows whether your IT investments are delivering the benefits you expected and where improvements should be made.

The best way to get the answers to those questions is through technical and application assessments. In this post, we’ll identify 8 good reasons why it is now imperative to undertake such assessments.

1. Address known issues – Perhaps the most common reason that drives companies to undertake a technology/application assessment is to identify the causes of existing issues such as those related to data accessibility, hardware and software scalability, and performance.

2. Cut down liabilities and risks – Unless you know what and where the risks are, there is no way you can implement an appropriate risk mitigation strategy. A technology and application assessment will enable you to thoroughly test and examine your information systems to see where your business-critical areas and points of failure are and subsequently allow you to act on them.

3. Discover emerging risks – Some risks may not yet be as threatening as others. But it would certainly be reassuring to be aware if any exist. That way, you can either nip them in the bud or keep them monitored.

4. Comply with regulations – Regulations like SOX require you to establish adequate internal controls to achieve compliance. Other regulations call for the protection of personally identifiable information. Assessments will help you pinpoint processes that lack controls, identify data that need protection, and areas that don’t meet regulatory requirements. This will enable you to act accordingly and keep your company away from tedious, time-consuming and costly sanctions.

5. Enhance performance – Poor performance is not always caused by ageing hardware or an overloaded infrastructure. Sometimes the culprits are unsuitable configuration settings, inappropriate security policies, or misplaced business logic. A well-executed assessment can provide enough information to lead to a more cost-effective action plan and help you avoid an expensive but useless purchase.

6. Improve interoperability – Disparate technologies working completely separate from each other may be preventing you from realising the maximum potential of your entire IT ecosystem. If you can examine your IT systems, you may be able to discover ways to make them interoperate and in turn harness untapped capabilities of already existing assets.

7. Ensure alignment of IT with business goals – An important factor in achieving IT governance is the proper alignment of IT with business goals. IT processes need to be assessed regularly to ensure that this alignment continues to exist. If it does not, then necessary adjustments can be made.

8. Provide assurance to customers and investors – Escalating cases of data breaches and identity theft are making customers and investors more conscious of a company’s ability to preserve the confidentiality of sensitive information. By conducting regular assessments, you can show them the concrete steps you take to keep sensitive information confidential.
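To make point 4 above concrete, here is a minimal, hedged sketch of the kind of automated check an assessment might include. It scans text files for strings that look like e-mail addresses, one simple proxy for personally identifiable information; the file layout and the pattern are illustrative only, and a real assessment would cover many more identifier types.

```python
import re
from pathlib import Path

# Simple pattern for strings that look like e-mail addresses (illustrative only).
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def files_with_possible_pii(root: Path) -> dict[str, int]:
    """Return {file path: match count} for text files containing
    e-mail-like strings anywhere under the given root directory."""
    hits = {}
    for path in root.rglob("*.txt"):
        matches = EMAIL_RE.findall(path.read_text(errors="ignore"))
        if matches:
            hits[str(path)] = len(matches)
    return hits
```

A report like this tells you which data stores need encryption or access controls first, which is the practical outcome an assessment is meant to deliver.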

Ready to work with Denizon?