Hello everyone, today I will be discussing Bitcoin. Some of you may already know of Bitcoin, but the large majority will never have heard of it. So I will be explaining what it is, how you obtain it and what you can use it for.

What are Bitcoins?

Bitcoin is an online currency (also known as e-cash) that was created in January 2009 by Satoshi Nakamoto. Satoshi is a cryptographer who came up with the Bitcoin protocol. Bitcoin is similar to a local currency (such as those towns that have their own currency for use in local shops) in the sense that the worth of a bitcoin is decided by the online community. The worth of a bitcoin will fluctuate depending on the number and size of transactions, though, unlike national currencies, its value can quickly recover. At the time of writing, 1 bitcoin (BTC) is equal to £8.239 GBP or $13.32 USD. Bitcoin amounts come in three common units: the bitcoin (1), the bitcent (0.01) and the satoshi (0.00000001). All of these can be earned and traded over the internet like normal currency, and they are kept in your Bitcoin wallet on your computer. Each wallet has its own unique address (like a bank account number) that is used to make transactions.

How does it work?

Like all currencies, Bitcoin relies on the community's trust in the exchange for it to have value. If the community trusts Bitcoin, people will use, buy and sell it; if no one trusts it, no one will use it. Unlike normal currencies, Bitcoin doesn't have a centralised body that issues coins. Instead, bitcoins are rewarded to "miners", who then sell their bitcoins on to the community. The miners themselves are individuals connected to the Bitcoin network who work either by themselves or in groups (this will be discussed later). The number of bitcoins that can be "mined" is capped at 21 million (at the time of writing there are roughly 11 million bitcoins in circulation). Once 21 million bitcoins have been mined, no more will be created. This is outlined in the protocol that Satoshi came up with. People abiding by the protocol and trusting in the currency is what gives bitcoins their worth and makes the system work. To make counterfeiting impossible, all transactions are stored in an online database called the block chain. The block chain is made up of blocks: each block is a list of all the transactions that took place in a 10-minute window. Once those 10 minutes are up, the block is added to the block chain and a new block is started. Once 6 blocks have been stacked on top of one another, the consensus solidifies and it becomes impractical to alter the transactions for your own gain. Blocks must meet constraints, dictated by the network, before they are added to the block chain, which makes it very hard for a person to cheat the system.

How do you obtain bitcoins?

You can obtain bitcoins in three ways: exchanging actual currency for them, trading, and mining. Exchanging actual money is how you get started in the Bitcoin exchange. Once you have some bitcoins you can start trading with others in the community, much as city traders do: you can buy and sell bitcoins and then exchange them for real currencies.
If you own a business you can join the 1,000 merchants already signed up to Bitcoin and start trading your goods for bitcoins. The main method for obtaining bitcoins is mining. This is the process of finding a solution to a difficult proof-of-work problem, which confirms transactions and prevents double spending (there is a small illustrative sketch of this idea at the end of this post). The work is done by a node (typically a graphics card) on a dedicated machine (it has to be dedicated as it requires a lot of electricity). These transactions are put into files called blocks every 10 minutes and then added to the block chain (an online public database that contains a list of all the transactions). The block chain only accepts one block per 10 minutes, and that block must meet stringent constraints dictated by the network. If the block hasn't formed into the shape it should, the block chain will reject it. The one block that gets accepted onto the block chain earns its owner 50 bitcoins as a reward for the work they've done. The downside to mining is the time, effort and resources it takes to get the reward. Mining uses a lot of electricity and data and takes place on graphics cards, so it can be impractical and costly for the individual miner. For this reason, some miners pool their resources together and then split the reward depending on the work each has done. The only downside to this is that you have to give up some of the reward, but it can be beneficial in the long run. I hope this has given you a better understanding of the world of Bitcoin. Thank you for reading!
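P.S. To make the proof-of-work idea a little more concrete, here is a toy sketch in PowerShell. It simply searches for a nonce so that the SHA-256 hash of some block data starts with a required number of zeros. The data, the difficulty value and the way the hash is built are simplified assumptions for illustration only; real Bitcoin mining hashes a proper block header, uses a numeric target and a far higher difficulty.

# Toy proof-of-work: find a nonce so that SHA-256(blockData + nonce) starts with N leading zeros.
# Illustrative only - not the real Bitcoin algorithm or parameters.
$blockData  = "list of transactions from the last 10 minutes"
$difficulty = 4                                   # leading hex zeros required (real difficulty is far higher)
$sha256     = [System.Security.Cryptography.SHA256]::Create()
$nonce      = 0
do {
    $bytes = [System.Text.Encoding]::UTF8.GetBytes("$blockData$nonce")
    $hash  = -join ($sha256.ComputeHash($bytes) | ForEach-Object { $_.ToString("x2") })
    $nonce++
} until ($hash.StartsWith("0" * $difficulty))
"Found nonce $($nonce - 1) with hash $hash"

The loop has to try nonces one after another because there is no shortcut to a hash with the required prefix; that brute-force effort is exactly the "work" that miners are rewarded for.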

Hello there. Today I will be discussing why small to medium sized businesses should invest in a fully integrated cloud infrastructure rather than a traditional IT model. Cloud technology has been around for a couple of years now and, as with all things technical, businesses have been slow to adopt it. Most big businesses have taken the step to the cloud, yet small to medium sized businesses are still choosing the traditional IT option (physical desktops, on-site storage etc.). So below I've made a list of reasons why small businesses should move to the cloud.

  1. Flexibility Flexibility is a very important aspect of small and medium sized businesses. Being flexible allows them to manoeuvre and adapt to the ever-changing business and economic climate we live in. So why don't they apply that to their IT? The traditional IT model is only as flexible as the money you are willing to pay for it. For example: as the owner of a business, you need a server to do specific tasks. However, to make the most of that server you will need to think 5 years ahead when deciding on its requirements. So, to make it flexible in the future, you have it run on up-to-date hardware, the latest software and a lot of disk space for expansion. By doing this, you now have a high-end server that should be able to meet future requirements, but to get that ability you are now down a large sum of money. Also, given that technology advances at a fast rate, you'll find that after a couple of years your server will struggle to meet the demands of the tasks you give it. You'll then have to pay more money to either improve performance or upgrade to a new server. In comparison, a cloud server gives you that flexibility from the word go. A cloud server can be changed and adapted to your needs in a matter of minutes: more disk space or extra software can be added in a few clicks. This gives you a degree of flexibility that you wouldn't be able to achieve with a traditional IT model. By taking the cloud route, you can adapt your IT to whatever your requirements are in a short amount of time and at a low cost. Flexibility also applies to maintaining your IT systems. A problem with a physical desktop takes longer to fix than one with a cloud desktop, as a technician would either have to try to fix the problem remotely or come on site. If the technician has to come on site then the downtime is a lot longer, as you have to wait for them to arrive. With your IT systems in the cloud, the technician can take control of your computer and have the issue fixed in a shorter period of time.
  2. Security Your data is the most important asset of your company. Should your data become compromised or destroyed, your company will struggle to recover. With the traditional IT method, you'll set up your computers and servers to back up regularly, encrypt any data you store and have anti-virus installed. These measures should protect you against most threats, such as viruses and unforeseen technical malfunctions. However, will they protect you from a major disaster? Should your company experience a major disaster (such as a flood or a fire), you're looking at major downtime for your business and lost or unrecoverable data. This could be an event that your business won't be able to recover from. By utilising the cloud you can avoid these threats. Having all your data stored in the cloud takes away the risk of localised threats. For example: when using a cloud desktop you connect to the desktop over the internet, and all the work done on that cloud desktop stays in the cloud. Should your PC break, all you have to do is go to another computer with an internet connection and access your cloud desktop from there. Everything you were working on will be left exactly the way it was when your PC broke, so should your whole office go down, none of the work that you or your employees were doing will be lost. The same principle applies to data storage: with your servers in the cloud you will be able to access your data easily from any site once it is set up.
  3. Cost There are two main components when it comes to the cost of IT: the cost to run those IT systems (electricity usage) and the cost to maintain them. With the traditional IT model both costs are far higher. Even with the advancement of technology, your standard desktop still uses a lot of electricity. The same goes for your on-site servers, which have to be kept running 24/7. The cost of running these IT systems will increase each year as electricity prices go up and your PCs go out of date. Over those years your PCs and servers will also degrade. The standard life span of a desktop is 3 years, and in those 3 years the cost of maintenance will go up as the desktop struggles to cope with the tasks it's given. This changes when you move to the cloud. A number of providers offer thin clients to access your cloud desktop. These are, essentially, compact desktops that run on minimal hardware. The hardware itself is barely used, as all the work is done on the cloud desktop itself. By using these you reduce electricity costs, as far less electricity is being used. Maintenance costs also go down because a thin client has far less to go wrong with it. Since the hardware is barely being used, the thin client degrades at a far slower rate, meaning it lasts far longer.

I hope you have found this blog article interesting. Thank you for reading.

Today I have compiled a list of 5 technologies to look forward to this year. I believe the items on this list will be game changing in their technological area:

5) Synaptics' new range of touch interfaces

Synaptics is a touch pad manufacturer that has been around since 1995. If you've owned any device with a touch interface, the chances are you've used one of their products. This year they will be releasing a new range of touch products that offer more than the current products on the market: the ForcePad, the ThinTouch and the ClearPad. The ForcePad is their latest touch pad device. It uses pressure-tracking sensors instead of the traditional mechanical switches in current use. It measures the pressure that your finger is exerting on the pad: light movement will move the cursor around on the screen and applying pressure will select. This is similar to standard touch pads; however, the pressure-tracking sensors allow for extra features because they can track multiple fingers. For example: with your current touch pad, to right-click an icon you have to use the right-click button at the bottom of the touch pad. Synaptics have done away with the buttons at the bottom and made it so that tapping the pad with two fingers is a right click. The extra functions are designed to work in conjunction with the new Windows 8 operating system (which is designed for touch input first). The ForcePad supports all 8 of the touch commands that come with Windows 8 and has the technology to expand in the future. Next in their range is the ThinTouch. The ThinTouch is Synaptics' new range of keyboards, originally designed for ultrabooks and thin notebooks. The main difference is the thickness of the keys. Modern keys use a scissor mechanism and range from 6mm to 3.5mm thick. In comparison, the ThinTouch is 2.5mm at its highest point. This allows for very thin laptops or larger batteries. But this isn't the coolest thing about the ThinTouch: the whole keyboard is equipped with capacitive touch sensors, which allow you to perform gestures with the keyboard as well as the touch pad. Since each key has a capacitive touch sensor installed, there is an electric field over the surface of the keyboard. This also allows for near-field gestures (waving your fingers over the keyboard rather than touching the keys). The last of their products is the ClearPad. This is designed with smart phones, tablets and notebooks with up to 17" displays in mind. It uses a single chip (a combination of display controller and touch controller) to do the work. This reduces the energy consumed and the cost of the chip, and cuts latency as well (latency being the time between command and response). The result is much quicker response times to your touch commands, longer battery life for the device and a lower cost. Overall, the reason I'm looking forward to these devices is that they will change the way we interact with the devices we use, allowing much more fluid control over the touch device and improving the human-PC interface.

4) The rise of OLED

When it comes to display technology there are 4 technologies in use: Liquid Crystal Display (LCD), Plasma, Cathode Ray Tube (still around but not in mainstream use) and Organic Light-Emitting Diode (OLED). The last one on that list is the latest in display technology. It has been around for a while and is used in small-screen devices such as the PS Vita and the range of Samsung smart phones. However, LG have very recently released their new 55" TV that uses OLED. The great thing about OLED is that it is much more energy efficient than LCD and plasma. It also allows for much thinner displays. Combining both of these qualities makes it great for portable devices, as it offers greater battery life (or, in the case of static displays, less electricity being used, which is great when we are all going green) and will either make the device thinner or free up space inside the device for other systems. The most interesting thing about OLED is that a number of people believe it could herald flexible, "bendy" devices. The reason is that OLED doesn't require a glass screen, which means that devices made with OLED will be lighter, more durable and can be folded away or even worn. Samsung are already in the final stages of making their flexible OLED phone, which should be released in the first couple of months of 2013.

3) Google Glasses

Google Glasses are Google's latest mobile device, running on the Android operating system. They are a pair of glasses which display data on the inside of the lenses as a HUD (Heads Up Display). They are controlled by voice command and can be used on the move. Even though they aren't the first pair of HUD glasses, they will probably be an innovation in their own right (much as the iPad wasn't the first tablet but did kick-start tablet technology). The glasses give you a constant stream of data whilst you are on the move, so unlike a smart phone (which you have to hold in your hand and look at), all the information will already be there in front of your eyes. There are also rumours that Google are considering adding phone capability to the glasses. With this capability they could completely change the playing field when it comes to portable devices.

2) Leap Motion

Leap Motion is a small box that sits in front of your keyboard and adds a completely new dimension to controlling your computer. Traditionally you'd use a keyboard and either a touch pad, touch screen or mouse to select items. Leap Motion allows you to control the computer by waving your hand over it: it senses the gestures your hand makes whilst over the device and then translates and inputs them into the computer. To get a better understanding, I highly recommend you watch the video on their website: https://leapmotion.com/

1) Oculus Rift

As a fan of video games I'm very much looking forward to the Oculus Rift. What most gamers want and enjoy about gaming is immersion, the feeling that they are actually inside and are a part of the game they are playing. This is mostly achieved by getting the biggest screen possible (or, in the case of a number of avid computer gamers, multiple screens), having a good surround sound system and, if you are very serious about your gaming, a head tracker (which tracks the movements of your head and then uses them in your game). The downside to all of this is that it costs a lot of money and the feeling of immersion is still limited, in the sense that what you see on the screen has a limited field of view. The Oculus Rift takes immersion to the next level. It's a pair of goggles that you wear on your head, which then display the game inside the goggles, giving you peripheral vision and a sense of depth, neither of which can be achieved by a normal screen. It links up to your computer using a DVI connector and there are plans for it to be used on consoles in the future as well. They have a video on their website that demonstrates the Oculus Rift: http://www.oculusvr.com/ I hope you have found this blog interesting and will look forward to these new technologies with the same anticipation as I do. Thank you for reading.

Hello everyone, today I will be typing up 3 ways cloud technology can benefit accountants.

  1. Can be used remotely As an accountant you'll probably be required to move around a bit, either going to different customers or working from home. Doing this means you'll have to take all the documents you need, and the applications you do your work in, with you. That would mean having a laptop capable of running all the required applications, with those applications installed on it and plenty of space to store the data. It will cost a lot of money to get a decent laptop for this work, and more again for the software licences. This is an ideal use case for a Virtual Desktop Infrastructure (VDI). A VDI is a virtual computer that runs on a server on the internet. You can access it from anywhere, as long as you have a device with an internet connection. The VDI will have all the applications you require to work and access to all the data you need (such as shared work drives), and it can be accessed from a cheap laptop with WiFi. The VDI becomes what you do your work on, essentially your office desktop, and all that work can be easily accessed from your client's site or from home.
  2. Can be used effectively in the office In an office that utilises the traditional IT model (i.e. desktop computers, servers etc.) a lot of money is spent on running and maintaining the IT systems. Everyone requires a PC and you also need multiple servers to run services (in a large office this can add a lot to the electricity bill). Your IT support costs would also be high. Desktops have a life span of 3 years, meaning that plenty of money needs to be spent replacing them every couple of years, and even more to roll out new applications and software to the desktops. If you have multiple branches then you will also require VPNs to be set up so that all the workers are connected to each other and have access to the data they need. Adoption of cloud technology can change this. Instead of traditional desktops you would use a thin client (an eighth of the size of a standard PC), which is used to access your VDI. The great thing about the thin client is that it has far fewer parts inside it than a desktop. With fewer parts it uses less electricity and its life span increases to 5 years. This takes a good chunk out of the electricity bill and maintenance costs. All servers can be moved to the cloud as virtual servers, which frees up space and takes away the cost of running and maintaining server rooms. With all your workers running on VDIs they would be able to access the same data, as the VDIs would be set up on the same network. This means that workers all around the country would have access to the same network drives. The centralisation of the VDIs also makes it easier for IT support to access and fix them. An IT technician can monitor, control and remotely access the VDIs in seconds, and from a central management console they can roll out updates and applications in minutes, compared to a technician going round the office and installing software on each individual computer. This means that downtime of the office computers is greatly reduced, so less money is lost to maintenance.
  3. Securely store data in an easy to access place Accountants have to handle their clients' most sensitive data. With the rise of cyber crime this data should be made very secure, because if it fell into the wrong hands it could cripple the company. However, securing it can make it harder for those who need to access it. Having it in the cloud caters for both worlds. Cloud servers are typically housed with the IT technicians, giving them direct access to the server should anything go wrong. The data would be backed up and secured on encrypted servers at the data centre (protecting it from cyber criminals and other hazards) but, at the same time, the people who need it would still have easy access. You would be able to access the data remotely as long as you know the passwords and have the permissions to view the files. This allows accountants to do their work whilst keeping out the unwanted.
Thank you for reading.

Today I will be discussing how schools, colleges and universities can benefit from cloud technology. Some of these places are the size of large companies. Making use of an efficient, cost-effective and integrated IT system can benefit schools, students and staff immensely. The cloud does all of that, so I'm going to explain how moving to the cloud can benefit education systems.

  1. Access to all software that a student needs for their course From secondary school upwards, students become more varied in the courses they do (in the same way that different students need different books for their courses). With these courses comes the software they need to use (i.e. a computing student will require Visual Studio, an engineering student CAD, etc.). With the current systems in place this is very difficult to achieve. Most large schools have an Active Directory system in place. This allows students to log in to any of the computers on the school domain, but they won't be able to access the software unless it's installed on that particular computer. With a VDI they would still be able to log in to any of the computers, but could also do work (with the software required for it) outside of class.
  2. Quicker and easier to roll out the latest technology Keeping up to date is important in education. Any new technology that comes out and is shown to greatly improve the quality of education should be installed on the school system as soon as possible. The only problem is that it takes a while for the on-site technicians to go and install the software on each of the computers. With cloud desktops you can roll out the new technology to all of the students' virtual desktops within minutes. The same applies to specialist software (i.e. something that only students of a specific course need access to). Normally you would have to install that on the computers in the classrooms used by that course. With a cloud desktop you can quickly roll it out to the students who need it, and they will then be able to access the software from any desktop via their cloud desktop. This allows the school to remain up to date with the latest technology and could potentially improve the learning standards of the students.
  3. Students can access all their work from home or on their own devices at school More and more students are bringing their own devices into schools, colleges and universities. Depending on the school's stance, BYOD (Bring Your Own Device) will either be discouraged or outright banned. However, cloud desktops could work to the student's advantage when combined with BYOD. Cloud desktops can be set up to run on any device that has access to the internet (i.e. laptops, tablets, even smart phones). The student can then use their cloud desktop to type notes in class or lectures and to help them with research tasks. In doing so, the student would be able to use their own device to actually benefit their studies rather than it being a distraction. Adding to the point that cloud desktops can be accessed from any device with internet access: in recent times students have had to do more work outside of school (either homework or coursework) or can't get to school due to turbulent weather (i.e. heavy snowfall). Sometimes the homework or coursework will require the student to have access to applications they do not have. They would also have to bring that work into school (either by email or USB), which puts the school IT systems at risk from viruses. Using a cloud desktop means the student has access to all the applications and files they need whilst doing the work on a protected system.
  4. All work saved and stored in one easy to access place Where studies are concerned, it is very important for data to be easily accessed by teachers and students from a central point. In a number of schools this is achieved by using an intranet which can be logged into via the school website. From there the teachers can upload work and students can download it. However, the problem is that, since teachers and students are using their own computers, there can be a wide range of applications in use. This means that not everyone will be able to read what the teacher has uploaded, as their PC either has a different version of the application or does not have it at all. With a VDI, all work is saved and stored there. Students can be set up with shared drives (depending on the course they are doing) to which their teachers can save work, revision notes etc. for them to view. Given that they are all doing the work on VDIs, they would all have the same applications and software, so there would be no compatibility gap between students and teachers.

I hope you have found this interesting to read. Thank you for reading.

Hello there, my name is Ben and today I will be talking you through how to set a user's permissions on Office 365 using PowerShell: specifically, the steps to configure an Office 365 user's permissions so they can view another user's mailbox.

To configure a user to view another user's mailbox, follow these steps:

1) This step will show you what the execution policy is on your computer. Your execution policy determines which scripts you can run. Therefore, it is important to do this step before starting, so that you are sure you can actually execute the scripts you're about to use.

Get-ExecutionPolicy
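The original steps don't cover changing the policy, but if the command above reports Restricted, script execution will be blocked. One common option (an assumption on my part) is to relax the policy for the current session only:

Set-ExecutionPolicy RemoteSigned -Scope Process   # only affects this PowerShell session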

2) This step collects the credentials of the administrator you are logging in as; these are used to connect to Office 365 so you can manage all the users that the administrator is in control of and change their settings. (The $LiveCred variable name below is only a placeholder; any name will do as long as you reuse it in step 3.)

$LiveCred = Get-Credential   # placeholder variable name; stores the administrator credentials you are prompted for

3) This step creates the remote session so that, when you import it, you will get all the cmdlets required from the Outlook/Exchange Online server. (As above, $Session is only a placeholder variable name.)

$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.outlook.com/powershell/ -Credential $LiveCred -Authentication Basic -AllowRedirection   # creates the remote session using the credentials from step 2

4) This step imports the session and the cmdlets that you are about to use.

Import-PSSession $Session

5) This step sets the permissions. "-Identity" is the mailbox of the person you want the other user to be able to view, and "-User" is the user you are giving the permissions to. "-AccessRights" controls what the other user can view; giving them FullAccess (as in this script) gives the other user full access to the mailbox. Remember to replace the placeholder addresses with the respective users' email addresses.

Add-MailboxPermission -Identity <mailbox owner's email address> -User <delegate's email address> -AccessRights FullAccess -InheritanceType All

6) This step is optional, but it will confirm whether the permissions you've just added have been applied.

Get-MailboxPermission -Identity <mailbox owner's email address>
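For convenience, here is one possible end-to-end version of the steps above gathered into a single sequence. The $LiveCred and $Session variable names and the example addresses are placeholders to swap for your own, and the final clean-up line is an optional extra that the steps above don't mention.

# 1) Check that scripts are allowed to run
Get-ExecutionPolicy
# 2) Collect your Office 365 administrator credentials
$LiveCred = Get-Credential
# 3) Create the remote session to Exchange Online
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.outlook.com/powershell/ -Credential $LiveCred -Authentication Basic -AllowRedirection
# 4) Import the Exchange cmdlets into this session
Import-PSSession $Session
# 5) Give the delegate full access to the mailbox (placeholder addresses)
Add-MailboxPermission -Identity owner@yourdomain.com -User delegate@yourdomain.com -AccessRights FullAccess -InheritanceType All
# 6) Optional: confirm the permission was applied
Get-MailboxPermission -Identity owner@yourdomain.com
# Optional: tidy up the remote session when you are finished
Remove-PSSession $Session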

Thank you for reading this blog. I hope it has helped you with allowing an Office 365 user to view another's mailbox.

In this blog I am going to talk about cloud computing and explain exactly what it is.

What is the Cloud?

The Cloud, much like Web 2.0, is a term that has been given to many different technologies all grouped together. It's a term which managers and marketing people like to use, as it is the current buzzword of IT. In reality it is a shift back towards mainframe/centralised computing. The main driving force behind this has been the advent of computer virtualisation. We have got to a point where server hardware is far more powerful than is needed to run the majority of systems, which resulted in servers sitting idle and not doing any work. Virtualisation is the process where you slice up a physical server into multiple virtual servers. These can run multiple different operating systems all at the same time on the same physical server. This means that instead of servers using 15% of their resources on average, you can push this up to 80-90%. In one company we managed to reduce 70+ physical servers down to 6 physical servers using virtualisation. This technology is heavily used by companies such as Amazon to provide their Web Services. Services such as Amazon's let customers quickly provision additional servers as required. For example, if a company has a new launch happening which means they will have far greater demand on their website than their normal servers can cope with, additional servers can be started on Amazon's infrastructure and the load spread across them all. Once the demand has dropped off these servers can be removed.
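To give a feel for how routine carving out a new virtual server has become, here is a small illustrative sketch using Microsoft's Hyper-V PowerShell module. The VM name, memory size and disk path are made-up example values, and other virtualisation platforms (VMware, KVM, Amazon EC2) have their own equivalents.

# Create and start a new virtual machine on a Hyper-V host (example values only)
New-VM -Name "WebServer01" -MemoryStartupBytes 4GB -Generation 2 -NewVHDPath "D:\VMs\WebServer01.vhdx" -NewVHDSizeBytes 60GB
Start-VM -Name "WebServer01"

A few commands like these, or a few clicks in a cloud provider's console, replace what used to be weeks of ordering, racking and installing a physical server.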

The next side of cloud computing is providing software as a service. This is where instead of installing software on your local computer, programs are run through a web browser and delivered over the internet. The best example of this is Google Docs which allows you to have a fully functional replacement to Microsoft Office run from inside your web browser.

Essentially, cloud computing is the process of utilising other people's hardware to run your systems, or shifting the management of your software to another company to run for you.

Why is it important?

The best way I have heard the importance of cloud computing described was at an Ubuntu Cloud event. They started out by talking about the early years of electricity, when people who wanted electricity had to have their own generators at home. As time went by, the national grid was set up and electricity was turned into a commodity. At that point, rather than producing their own energy, people just paid for it as a service. This process has been repeated for many different innovations in the past, such as telephones.

This relates to computing because, up until now, the computer industry has been an extremely young one and has still been in the innovation phase of its existence. We are finally at a switch-over point where computing is going to evolve into a commodity. There is no longer any need for people to constantly update hardware and software when this can now all be maintained at the supplier's end. All people will need is a simple machine and monitor, and all the hard work will be performed on the suppliers' servers rather than on people's local computers. These can be upgraded constantly without ever affecting customers.

The other main reason for the switch is to save money. As suppliers can run huge data centres to provide software, they benefit from huge economies of scale, and thus the price of running a system drops. Also, without the need to run servers locally, large amounts of money are saved on energy and hardware costs. You also get the resilience of being able to run your servers from multiple data centres around the world, or have your files backed up to multiple locations on different continents.

What will it mean in the future?

So what will your system look like in the future? At the moment the main innovation coming out soon is Google's Chrome OS. This is essentially an entire operating system wrapped around Google's Chrome web browser. It will automatically store all of your files on Google's servers, and whenever you log in to any computer running Chrome OS you will get your own interface and files.


Another example of cloud computing innovation is a company called OnLive. They allow you to stream computer games over the internet, which means you don't have to have a cutting-edge PC to play all the latest games. They spend the time setting up their systems to get the best graphics and then just stream the images to your local computer over the internet.

This link has a video which explains how it works: http://www.onlive.com/service/cloudgaming?autoplay=force

Thanks for reading this blog and feel free to contact us if you would like to find out how we can help you with Cloud Computing.

Author: Luke Whitelock

UPDATE: A tool has been published which will let you get the key to decrypt your files for free: https://www.decryptcryptolocker.com/

This week one of our clients was infected by CryptoWall. The attack occurred on a laptop protected by McAfee and fully up to date at the time of the attack. In the space of 5 hours, CryptoWall managed to encrypt the user's data and data on network shares across 4 different servers. Over 8,000 files were encrypted. In each encrypted folder were the following files:

DECRYPT_INSTRUCTION.TXT
DECRYPT_INSTRUCTION.HTML
DECRYPT_INSTRUCTION.URL

What is inside these files:

What happened to your files ?
All of your files were protected by a strong encryption with RSA-2048 using CryptoWall. More information about the encryption keys using RSA-2048 can be found here: http://en.wikipedia.org/wiki/RSA_(cryptosystem)

What does this mean ?
This means that the structure and data within your files have been irrevocably changed, you will not be able to work with them, read them or see them, it is the same thing as losing them forever, but with our help, you can restore them.

How did this happen ?
Especially for you, on our server was generated the secret key pair RSA-2048 – public and private. All your files were encrypted with the public key, which has been transferred to your computer via the Internet. Decrypting of your files is only possible with the help of the private key and decrypt program, which is on our secret server.

What do I do ?
Alas, if you do not take the necessary measures for the specified time then the conditions for obtaining the private key will be changed. If you really value your data, then we suggest you do not waste valuable time searching for other solutions because they do not exist. For more specific instructions, please visit your personal home page, there are a few different addresses pointing to your page below:
1. https://xxxxxxxxxxx.torexplorer.com/xxxx
2. https://xxxxxxxxxxx.tor2web.org/xxxx
3. https://xxxxxxxxxxx.onion.to/xxxx
If for some reasons the addresses are not available, follow these steps:
1. Download and install tor-browser: http://www.torproject.org/projects/torbrowser.html.en
2. After a successful installation, run the browser and wait for initialization.
3. Type in the address bar: kpai7ycr7jxqkilp.onion/5L1m
4. Follow the instructions on the site.
IMPORTANT INFORMATION:
Your personal page: https://xxxxxxxxxxx.torexplorer.com/xxxx
Your personal page (using TOR): xxxxxxxxxxx.onion/xxxx
Your personal identification number (if you open the site (or TOR's) directly): xxxx

What to do if you get infected:
– look at the owner of these files to track the infected machine(s)
– disconnect the infected machine from the network
– disconnect all shares